US6954224B1 - Camera control apparatus and method - Google Patents


Info

Publication number
US6954224B1
US6954224B1 (application US09/550,038, US55003800A)
Authority
US
United States
Prior art keywords
camera
cameras
section
designated
location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/550,038
Other languages
English (en)
Inventor
Susumu Okada
Eimei Nanma
Shinji Nojima
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Panasonic Intellectual Property Corp of America
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Assigned to MATSUSHITA ELECTRIC INDUSTRIAL, CO., LTD. reassignment MATSUSHITA ELECTRIC INDUSTRIAL, CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NANMA, EIMEI, NOJIMA, SHINJI, OKADA, SUSUMU
Application granted
Publication of US6954224B1
Assigned to PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA reassignment PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PANASONIC CORPORATION
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices

Definitions

  • the present invention relates to a system comprising an image-transmitter for transmitting camera images captured by a plurality of cameras whose angles can be controlled, and an image-receiver for displaying the thus-transmitted camera images; more particularly, to the image-receiver, e.g., a camera control system capable of controlling a camera angle.
  • Japanese Patent Unexamined Publication 7-123390/(1995) describes a camera control system which enables an image-receiving end to control a camera connected to an image-transmitting end.
  • the image-receiving end is equipped with a monitor screen having a plurality of windows for indicating a plurality of camera images (also called “camera image windows”), a camera selection button, and a camera control panel.
  • By pressing the camera selection button while viewing images on the camera image windows, an operator selects the camera which he desires to control.
  • the operator presses a control button provided in the camera control panel, to thereby transmit a control command to the image-transmitting end and enable control of operation of the camera connected to the image-transmitting end.
  • the camera control system comprises cameras 2701 for capturing images; image transmitters 2710 for transmitting the images captured by the cameras 2701 ; an image receiver 2720 ; a display 2704 for displaying images; and an input device 2705 for entering a command which enables the image receiver 2720 to control the camera 2701 connected to the selected image transmitter 2710 .
  • Each of the image transmitters 2710 comprises an image data import section 2711 for importing an image captured by the corresponding camera 2701 ; an image data transmission section 2712 for transmitting image data to the image receiver 2720 ; a control command receiving section 2713 for receiving a camera control command transmitted from the image receiver 2720 ; and a control command transmission section 2714 for transmitting the camera control command to the camera 2701 .
  • the image receiver 2720 comprises an image data receiving section 2721 for receiving a plurality of images transmitted from the image transmitters 2710 ; an image data playback section 2722 for displaying the plurality of image data sets on the display 2704 ; a command load section 2723 for loading a camera control command entered by way of the input device 2705 ; and a control command transmission section 2724 for transmitting, to the image transmitters 2710 , the camera control command loaded by way of the command load section 2723 .
  • the image captured by the camera 2701 is imported into the image data import section 2711 , and the image data import section 2711 delivers to the image data transmission section 2712 data pertaining to the thus-imported image.
  • the image data transmission section 2712 transmits the image data to the image data receiving section 2721 of the image receiver 2720 .
  • the image data receiving section 2721 receives a plurality of image data sets from the plurality of image data transmission sections 2712 and delivers the thus-received image data sets to the image data playback section 2722 .
  • the image data playback section 2722 displays a plurality of images on the display 2704 .
  • FIG. 28 shows an example camera control panel for controlling the cameras 2701 and example images captured thereby to be displayed on the display 2704 .
  • a display screen 2800 comprises image display areas 2801 , 2802 and 2803 for displaying images captured by the plurality of cameras 2701 ; a camera control panel display area 2810 ; and a control camera selection display area 2830 .
  • the camera control panel display area 2810 comprises an UP button 2811 , a DOWN button 2812 , a LEFT button 2813 , and a RIGHT button 2814 for panning the camera 2701 vertically or horizontally; an IN button 2817 and an OUT button 2818 for causing the camera 2701 to zoom in and out; and a focusing button 2819 and a defocusing button 2820 .
  • the control camera selection display area 2830 comprises camera selection buttons 2831 , 2832 and 2833 .
  • the operator selects the camera he desires to control by pressing any one of the camera selection buttons 2831 , 2832 , and 2833 , and then operates it by pressing any of the buttons 2811 through 2820 .
  • the command load section 2723 loads a control command assigned to the camera selected by means of the camera selection button and delivers the thus-loaded control command to the control command transmission section 2724 .
  • the control command transmission section 2724 transmits the control command to the control command receiving section 2713 of the image transmitter 2710 corresponding to the camera selection button selected from the camera selection buttons 2831 , 2832 , and 2833 .
  • upon receipt of the control command, the control command receiving section 2713 delivers the thus-received control command to the corresponding control command transmission section 2714 .
  • the control command transmission section 2714 delivers the control command to the corresponding camera 2701 , whereupon the camera 2701 performs the operation instructed by way of the input device 2705 .
  • an impediment may block the camera from capturing a desired image.
  • the present invention is aimed at controlling the camera capable of most quickly capturing an image situated at a desired location, in a case where the camera control system is equipped with a plurality of cameras and where one or more of the cameras are controlled.
  • angles required for cameras to train on a position designated by an operator are calculated from the designated position and directions in which the cameras are currently oriented.
  • a camera requiring the minimum angle from among the plurality of cameras is determined as the camera capable of being trained on the designated position most quickly.
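The minimum-angle rule above can be sketched as follows. This is an illustrative sketch only: the camera tuples, the 2-D map coordinate convention, and the function names are assumptions, not part of the patent.

```python
import math

def required_pan_angle(cam_pos, cam_heading_deg, target):
    """Angle (degrees) the camera must pan to be trained on the target
    point, given its current heading in map coordinates."""
    dx, dy = target[0] - cam_pos[0], target[1] - cam_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed difference between current heading and bearing.
    diff = (bearing - cam_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff)

def quickest_camera(cameras, target):
    """cameras: list of (camera_id, (x, y), heading_deg).  Returns the id
    of the camera needing the smallest pan angle, i.e. the one that can
    be trained on the designated position most quickly at equal pan
    speeds."""
    return min(cameras,
               key=lambda c: required_pan_angle(c[1], c[2], target))[0]
```

For example, a camera at the origin already facing 90° needs no pan to reach a target at (0, 10), so it wins over a camera that must turn 45°.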
  • a camera capable of being trained toward the designated direction can be selected from the plurality of cameras.
  • the user can operate the thus-selected camera without a necessity of pressing any one of UP, DOWN, LEFT, and RIGHT buttons.
  • the present invention yields an advantage of shortening the time from when the operator desires to view a screen until a scene captured by a camera is displayed.
  • a camera which is hindered by an impediment from capturing a position designated by the operator is not considered a candidate camera-to-be-operated.
  • the present invention yields an advantage of preventing occurrence of a case where an impediment blocks a camera selected and controlled by the operator from capturing a desired scene.
  • Two factors, that is, the angle through which the camera must pan until it is trained on the designated location and the adjustment required to focus the camera on the designated location, are converted into factors which can be compared across cameras; that is, the time required for the camera to pan and the time required for the camera to achieve focusing.
  • the camera to be controlled is selected on the basis of these factors.
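A minimal sketch of that conversion, assuming the pan and focus drives run concurrently so the slower one dominates (use the sum instead if the drives are sequential); the speed constants are hypothetical:

```python
def time_to_shoot(pan_angle_deg, focus_shift,
                  pan_speed_deg_s=30.0, focus_speed=5.0):
    """Convert the two incomparable factors (pan angle, focus shift)
    into times, which CAN be compared across cameras.  With concurrent
    pan and focus, total readiness time is the slower of the two."""
    pan_time = pan_angle_deg / pan_speed_deg_s
    focus_time = abs(focus_shift) / focus_speed
    return max(pan_time, focus_time)
```

The camera with the smallest `time_to_shoot` value is then chosen as the camera to be controlled.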
  • the operator designates a desired location and the direction from which the desired location is to be viewed. From among cameras which cover the desired direction, there is selected the camera which can be trained on the designated location most quickly.
  • cameras which can capture an image at the designated location in the desired direction can be automatically selected. From among the thus-selected cameras, a camera which can most quickly be trained on the designated location can be automatically selected.
  • a camera is selected on the basis of the time required from when a range which covers the designated location and is instructed directly by the operator is loaded into a camera until the camera is panned to the designated location, as well as on the basis of the time required until an image of the instructed range captured by the camera is displayed.
  • the user enables a camera to capture an image by designation of a desired range.
  • An image captured by a camera which is in operation is displayed in an enlarged manner.
  • the operator can ascertain which camera is in operation.
  • the operator can monitor the image at the designated location in detail while viewing a screen.
  • Two or more cameras which can be quickly trained on the designated location are controlled in decreasing sequence of quickness, thus simultaneously shooting an image situated at a single designated location.
  • the combination of a plurality of cameras enables the operator to simultaneously grasp a detailed image of the designated location and the situation of the surroundings of the designated location. Further, a plurality of cameras can be operated by means of entry of a single command.
  • the present invention provides a camera control apparatus comprising: an image data receiving section for receiving from an image transmitter image data captured by cameras; an image data playback section for displaying, on a screen, the received images; a camera control area display section for displaying camera symbols, which correspond to information representing the locations of the cameras, and the directions in which the cameras are oriented, as a control region for controlling the cameras connected to the image transmitter; a command load section for loading the coordinates of a location in the control region designated by an operator; a camera-to-be-operated determination section for determining a camera optimal for shooting the designated location; a control command conversion section for converting information about the coordinates loaded by the command load section into a control command signal capable of being used for controlling the cameras; and a control command transmission section for transmitting the converted control command signal to the image transmitter.
  • the positions of the cameras and the shooting directions thereof are displayed in the camera control region.
  • the operator specifies a location—which the operator desires to shoot—in the area where the positions and shooting directions of the cameras are displayed, through use of a mouse. From among the plurality of cameras, the camera optimal for shooting the designated location is selected.
  • the apparatus of the present invention eliminates superfluous operations, thereby shortening the time from when the operator decides to monitor a certain scene until the scene captured by the camera appears on the display.
  • the present invention provides a camera control method comprising the steps of: displaying images captured by a plurality of cameras, a map relating to a location whose image is captured by the plurality of cameras, camera symbols representing the locations of the cameras in the map, and the directions in which the cameras are oriented; selecting a camera optimal for shooting a location designated by an operator; and controlling the selected camera such that the camera is panned toward the designated location.
  • the positions of the cameras and the shooting directions thereof are displayed in the camera control region.
  • the operator specifies a location—which the operator desires to shoot—in the area where the positions and shooting directions of the cameras are displayed, through use of a mouse. From among the plurality of cameras, the camera optimal for shooting the designated location is selected.
  • the operator can operate the camera without involvement of actual operation of UP, DOWN, RIGHT, and LEFT buttons.
  • the system of the present invention eliminates superfluous operations, thereby shortening the time from when the operator decides to monitor a certain scene until the scene captured by the camera appears on the display.
  • the camera-to-be-operated determination section determines a camera to be panned, on the basis of an angle between an imaginary line connecting the center of the camera symbol with the designated location and the direction in which the camera is currently oriented.
  • a camera involving a minimum angle between the direction in which the camera is currently oriented and the imaginary line connecting the center of the camera symbol with the designated location is selected from among the plurality of cameras.
  • the camera control system comprises an employable camera survey section which stores information about the positions of impediments existing in the area to be shot by the plurality of cameras and which eliminates a camera incapable of shooting the designated location from candidates considered by the camera-to-be-operated determination section.
  • the system can be set so as to avoid selection of that camera, thereby preventing a situation in which an impediment blocks the camera directed toward the designated location.
  • the camera which is blocked by an impediment and cannot shoot the designated location is eliminated from candidates for selection of a camera to be operated.
  • the impediment is displayed.
  • the operator can ascertain the location of the impediment from the display.
  • the camera control system further comprises:
  • the focuses of the respective cameras have been grasped beforehand.
  • for each camera, there are calculated the time required for the camera to pan toward the designated location, as well as the time required for the camera to attain a focus on the designated location.
  • the time required for the camera to pan toward and attain a focus on the designated location is calculated from these time periods.
  • a camera which can shoot the designated location within the minimum period of time is selected on the basis of the time required for the camera to pan toward the designated location from the direction in which the camera is currently oriented and the time required for the camera to zoom into the designated location, and the selected camera is panned toward the designated location and attains focus on the designated location.
  • the operator can ascertain the location on which the camera is currently being focused.
  • the camera control system comprises: a view-point direction survey section for storing the direction in which the operator desires to shoot the designated location, wherein the camera-to-be-operated determination section determines a camera to be operated, from information as to whether or not an image can be shot in the direction designated by the view-point survey section, as well as from the angle between the current shooting direction of the camera and the direction of an imaginary line connecting the designated location with the center of the camera symbol.
  • Cameras capable of shooting an image of the designated location from a desired location can be automatically selected, and a camera capable of being panned most quickly to the desired direction and location can be automatically selected from among those cameras.
  • cameras incapable of shooting an image from a direction desired by the operator are eliminated from the candidates for the camera-to-be-operated.
  • the operator can ascertain the direction in which the operator views the designated location, from the display.
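The view-point direction survey can be sketched as a filter followed by the minimum-angle rule. The tolerance `tol_deg` and the camera tuple layout are illustrative assumptions:

```python
import math

def _bearing(src, dst):
    """Map-coordinate bearing (degrees) from src toward dst."""
    return math.degrees(math.atan2(dst[1] - src[1], dst[0] - src[0]))

def _angle_diff(a, b):
    """Smallest absolute difference between two headings, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def select_for_direction(cameras, target, view_dir_deg, tol_deg=45.0):
    """Keep only cameras whose sight line to the target lies within
    tol_deg of the direction the operator wants the shot taken from,
    then pick the candidate needing the smallest pan angle.
    cameras: list of (camera_id, (x, y), heading_deg)."""
    candidates = [c for c in cameras
                  if _angle_diff(_bearing(c[1], target), view_dir_deg) <= tol_deg]
    if not candidates:
        return None
    return min(candidates,
               key=lambda c: _angle_diff(_bearing(c[1], target), c[2]))[0]
```

Cameras that cannot shoot from the desired direction never enter the pan-angle comparison, matching the two-stage selection described above.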
  • the camera control system comprises: an angular-shift-time calculation section for calculating the time required for the camera to pan toward the designated location; a zoom storage section for grasping the degree of zoom of a plurality of cameras; a zoom-shift time calculation section for calculating the time required for a camera to zoom in order to display an image of the designated range; and a zoom range display section for displaying, in the camera control region, a range to be zoomed, wherein the camera-to-be-operated determination section determines a camera to be operated, from the time required for the camera to pan toward the designated location after the operator has designated a desired range in the control region and the time required for the camera to zoom in or out for attaining focus on the designated range.
  • a camera optimal for shooting the designated range can be automatically selected by means of calculating the time required for a camera to pan toward a designated direction from information about the current shooting direction of the camera; calculating the time required for the camera to zoom into the designated range from information about the distance from the currently zoomed location to the designated range; and comparing the cameras in terms of the thus-calculated times, to thereby select the camera capable of most quickly panning toward the designated direction and zooming into the designated range.
  • a camera which can shoot the designated range within the minimum period of time is selected on the basis of the time required for the camera to pan toward the designated range from the direction in which the camera is currently oriented after the camera has received an instruction for designating a desired range from the operator, and the time required for the camera to attain focus on the designated range from the range on which the camera is currently focused; the selected camera is then panned toward the designated location, to thereby attain focus on the designated range.
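Combining the angular-shift time and the zoom-shift time might look like the sketch below. The camera tuple layout, the sequential pan-then-zoom timing model, and the speed constants are illustrative assumptions:

```python
import math

def _bearing(src, dst):
    return math.degrees(math.atan2(dst[1] - src[1], dst[0] - src[0]))

def _angle_diff(a, b):
    return abs((a - b + 180.0) % 360.0 - 180.0)

def time_for_range(cam, target, zoom_needed,
                   pan_speed=30.0, zoom_speed=0.5):
    """cam: (camera_id, (x, y), heading_deg, current_zoom).  Pan time
    and zoom time are summed, assuming the two motions run in
    sequence."""
    pan = _angle_diff(_bearing(cam[1], target), cam[2]) / pan_speed
    zoom = abs(zoom_needed - cam[3]) / zoom_speed
    return pan + zoom

def camera_for_range(cameras, target, zoom_needed):
    """Select the camera that can pan to and zoom on the designated
    range in the least total time."""
    return min(cameras,
               key=lambda c: time_for_range(c, target, zoom_needed))[0]
```

A camera already trained on the range but needing a large zoom change competes on equal terms with a camera at the right zoom that must pan, because both costs are expressed as times.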
  • an image captured by the camera selected by the camera-to-be-operated determination section is displayed larger than images captured by other cameras.
  • the image of the camera selected by the camera-to-be-operated determination section is enlarged, and the images of the other cameras which are not to be operated are scaled down.
  • the operator can readily ascertain the camera which is currently in an operating state and can view an enlarged image of the designated location in detail.
  • the camera control system comprises:
  • Images of the desired location are captured simultaneously through use of two or more cameras.
  • the combined use of cameras enables the operator to simultaneously obtain a detailed image of the designated location and grasp the condition of surroundings of the designated location.
  • the present invention enables operation of a plurality of cameras through entry of a single command, thus realizing more-effective shooting of an image while involving less operation.
  • images captured by the cameras are displayed at respective scales, in the sequence in which the cameras are arranged.
  • FIG. 1 is a block diagram showing the configuration of a camera control system according to a first embodiment of the present invention
  • FIG. 2 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the first embodiment
  • FIG. 3 is a flowchart showing processing required for determining which of the cameras is to be operated according to the first embodiment
  • FIG. 4 is a schematic representation illustrating an example camera control region according to the first embodiment
  • FIG. 5 is a plot showing an example calculation of an angle according to the first embodiment
  • FIG. 6 is a block diagram showing the configuration of a camera control system according to a second embodiment of the present invention
  • FIG. 7 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the second embodiment
  • FIG. 8 is a schematic representation illustrating an example camera control region according to the second embodiment
  • FIG. 9 is a flowchart showing processing required for determining which of the cameras is to be operated according to the second embodiment
  • FIG. 10 is a block diagram showing the configuration of a camera control system according to a third embodiment of the present invention
  • FIG. 11 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the third embodiment
  • FIG. 12 is a flowchart showing processing required for determining which of the cameras is to be operated according to the third embodiment
  • FIG. 13 is a block diagram showing the configuration of a camera control system according to a fourth embodiment of the present invention
  • FIG. 14 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the fourth embodiment
  • FIG. 15 is a schematic representation illustrating an example camera control region according to the fourth embodiment
  • FIG. 16 is a flowchart showing processing required for determining which of the cameras is to be operated according to the fourth embodiment
  • FIG. 17 is a block diagram showing the configuration of a camera control system according to a fifth embodiment of the present invention
  • FIG. 18 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the fifth embodiment
  • FIG. 19 is a flowchart showing processing required for determining which of the cameras is to be operated according to the fifth embodiment
  • FIG. 20 is a schematic representation illustrating an example camera control region according to the fifth embodiment
  • FIG. 21 is a block diagram showing the configuration of a camera control system according to a sixth embodiment of the present invention
  • FIG. 22 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the sixth embodiment
  • FIG. 23 shows the flow of processing for producing an enlarged image according to the sixth embodiment
  • FIG. 24 is a block diagram showing the configuration of a camera control system according to a seventh embodiment of the present invention
  • FIG. 25 is a schematic representation showing the layout of a display screen displayed on an image receiver according to the seventh embodiment
  • FIG. 26 shows the flow of processing for producing an enlarged image according to the seventh embodiment
  • FIG. 27 is a block diagram showing the configuration of a prevailing camera control system
  • FIG. 28 is a schematic representation showing the layout of a display screen displayed on an image receiver of the prevailing camera control system
  • Preferred examples of the present invention will be described hereinbelow by reference to FIGS. 1 through 26 .
  • the present example is directed to a system comprising an image transmitter for transmitting images captured by a plurality of cameras, and an image receiver which receives the images over a network and displays those camera images.
  • the system enables the image receiver to control angles of the cameras connected to the image transmitter.
  • FIG. 1 shows the configuration of the system which embodies the camera control method.
  • FIG. 2 shows an example layout of a display screen connected to the image receiver in a case where cameras are controlled according to the camera control method of the present invention.
  • reference numerals 101 , 102 , 103 , and 104 designate cameras; 110 designates an image transmitter for transmitting images; 120 designates an image receiver for receiving images; 105 designates a display for displaying the images received by the image receiver 120 ; and 106 designates an input device connected to the image receiver 120 capable of controlling the cameras 101 , 102 , 103 , and 104 .
  • reference numeral 200 designates a screen of the display 105 connected to the image receiver 120 ; 201 designates an image display area for displaying the image captured by the camera 101 ; 202 designates an image display area for displaying an image captured by the camera 102 ; 203 designates an image display area for displaying the image captured by the camera 103 ; 204 designates an image display area for displaying the image captured by the camera 104 ; 210 designates a camera control area for controlling the cameras 101 , 102 , 103 , and 104 connected to the image transmitter 110 ; 211 designates a camera symbol depicting the position of the camera 101 and the shooting direction thereof; 212 designates a camera symbol depicting the position of the camera 102 and the shooting direction thereof; 213 designates a camera symbol depicting the position of the camera 103 and the shooting direction thereof; 214 designates a camera symbol depicting the position of the camera 104 and the shooting direction thereof; and 220 designates a map showing a region in which the cameras 101 , 102 , 103 , and 104 are installed.
  • the image transmitter 110 shown in FIG. 1 comprises an image import section 111 for importing images captured by the cameras 101 , 102 , 103 , and 104 ; an image data transmission section 112 for transmitting to the image receiver 120 data corresponding to the thus-imported image data; a control command receiving section 113 for receiving a camera control request from the image receiver 120 ; and a control command transmission section 114 for delivering, to the cameras 101 through 104 , the camera control request (hereinafter also called a “camera control command”) received by the control command receiving section 113 .
  • the image receiver 120 shown in FIG. 1 comprises an image data receiving section 121 for receiving the image data transmitted from the image transmitter 110 ; an image data playback section 122 for displaying the thus-received images on the screen 200 ; a camera control area display section 123 ; a command load section 124 for loading coordinates of a location which the operator desires to monitor and designates through use of the input device 106 ; a camera-to-be-operated determination section 125 ; a camera angle storage section 126 for storing the angles of the respective cameras 101 , 102 , 103 , and 104 ; a control command conversion section 127 for converting the coordinate information loaded by way of the command load section 124 into a signal (a control command) which enables control of the cameras 101 through 104 ; and a control command transmission section 128 for transmitting the thus-converted command to the image transmitter 110 .
  • the camera control area display section 123 displays, on the display 105 , the camera symbols 211 through 214 included in the camera control area 210 shown in FIG. 2 , and the shooting directions 221 through 224 of the cameras 101 through 104 .
  • the camera-to-be-operated determination section 125 determines an angle between the shooting direction 221 of the camera 101 and an imaginary extension extending from the center of the camera symbol 211 to the location designated by the pointer 230 ; an angle between the shooting direction 222 of the camera 102 and an imaginary extension extending from the center of the camera symbol 212 to the location designated by the pointer 230 ; an angle between the shooting direction 223 of the camera 103 and an imaginary extension extending from the center of the camera symbol 213 to the location designated by the pointer 230 ; and an angle between the shooting direction 224 of the camera 104 and an imaginary extension extending from the center of the camera symbol 214 to the location designated by the pointer 230 .
  • the camera assigned the camera symbol having the smallest angle is determined as the camera to be operated.
  • the image data import section 111 imports data sets pertaining to the images captured by the cameras 101 through 104 , and these image data sets are transmitted in a bundle to the image data transmission section 112 .
  • the image data transmission section 112 receives the plurality of images imported by the image data import section 111 and transmits the thus-received image data sets to the image receiver 120 .
  • the image data receiving section 121 receives the thus-transmitted image data sets.
  • the image data playback section 122 determines display areas on the screen 200 and displays, on the display 105 , the plurality of image data sets received by the image data receiving section 121 in the display areas assigned to the respective image data sets. The display areas may have been determined in advance or may have been determined by the user.
  • the camera control area display section 123 displays, on the display 105 , the camera symbols 211 through 214 representing respective locations of the cameras 101 through 104 , and the shooting directions 221 through 224 of the cameras 101 through 104 .
  • FIG. 3 is a flowchart showing processing required for the camera-to-be-operated determination section 125 to determine which of the cameras 101 through 104 connected to the image transmitter 110 is to be operated.
  • FIG. 4 schematically illustrates the location designated by the operator through use of the pointer 230 and the x-y coordinates of positions of the camera symbols 211 through 214 . The flow of determination of the camera to be operated will now be described by reference to FIGS. 3 and 4 .
  • the camera-to-be-operated determination section 125 selects, from among the plurality of cameras 101 through 104 , a camera for which there has not yet been examined an angle through which the camera must be panned such that the designated location falls on the center of a frame (step 301 ). In the example shown in FIG. 4 , the camera 101 designated by the camera symbol 211 is selected. The camera-to-be-operated determination section 125 then determines an angle 401 between the shooting direction of the camera 101 —which is obtained at the time when the operator has designated the location and is stored in the camera angle storage section 126 —and an imaginary line extending from the camera 101 to the designated location loaded by the command load section 124 (step 302 ).
  • FIG. 5 depicts determination of an angle which is to be measured for determining the camera to be operated.
  • the illustration shows the location designated by the operator through use of the pointer 230 , an x-axis 501 , a y-axis 502 , the point of origin 503 serving as the center of the camera symbols, the direction 504 in which the camera is currently performing a shooting operation, and an angle 505 between the shooting direction 504 and the line extending from the point 230 to the point of origin 503 .
  • the angle 505 is calculated from the coordinates (S,T) of the designated location, as well as from the shooting direction (S′, T′) of the camera.
  • angle 505 = tan⁻¹(T/S) − tan⁻¹(T′/S′)   (1)
  • Another method may also be employed for calculating an angle.
  • the angle 401 relating to the camera symbol 211 is calculated to be 45°.
  • the angle 402 relating to the camera symbol 212 , the angle 403 relating to the camera symbol 213 , and the angle 404 relating to the camera symbol 214 are also calculated in the same manner (step 303 ).
  • the camera-to-be-operated determination section 125 determines the minimum camera angle from the thus-calculated angles 401 , 402 , 403 , and 404 .
  • the camera corresponding to the camera symbol assigned the thus-determined minimum angle is taken as the camera to be used for shooting the designated location (step 304 ).
  • the minimum angle is the angle 403 , and hence the camera 103 corresponding to the camera symbol 213 is taken as the camera to be operated.
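The selection procedure of steps 301 through 304 can be sketched in Python. This is an illustrative reading of the flowchart, not the patent's literal implementation; the function and camera names are assumptions, and the pan angle follows Equation (1), computed with `atan2` to avoid the quadrant ambiguity of a plain arctangent.

```python
import math

def pan_angle(camera_xy, shooting_dir, target_xy):
    """Angle between the camera's current shooting direction and the line
    from the camera to the designated location (steps 302 and 303)."""
    to_target = (target_xy[0] - camera_xy[0], target_xy[1] - camera_xy[1])
    a = math.atan2(to_target[1], to_target[0]) - math.atan2(shooting_dir[1], shooting_dir[0])
    a = abs(a) % (2 * math.pi)
    return min(a, 2 * math.pi - a)  # magnitude of the required pan, in [0, pi]

def camera_to_operate(cameras, target_xy):
    """Pick the camera with the smallest pan angle (step 304).
    cameras: mapping of name -> (position, shooting-direction vector)."""
    return min(cameras, key=lambda n: pan_angle(cameras[n][0], cameras[n][1], target_xy))
```

For instance, with one camera aimed along the x-axis and another aimed along the y-axis, designating a point on the y-axis selects the latter, since it needs no pan at all.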
  • the camera-to-be-operated determination section 125 sends, to the control command conversion section 127 , information (also called “angle information”) about the camera to be operated and the angle through which the camera is to be panned.
  • the control command conversion section 127 converts the angle information received from the camera-to-be-operated determination section 125 into a command for panning the camera toward the designated location loaded by way of the command load section 124 .
  • the angle is specified by use of a numerical value, and the camera is panned through the designated angle through use of a turntable.
  • the control command conversion section 127 converts the angular information into a command which enables rotation of the turntable.
  • Information about the angle of the camera after the camera has been panned according to the command is transmitted to the camera angle storage section 126 .
  • the camera angle storage section 126 transmits, to the camera control area display section 123 , the angle of the camera after the camera has been panned.
  • the control command transmission section 128 transmits over the network, to the image transmitter 110 , the command converted by the control command conversion section 127 .
  • the present embodiment has described an example in which the area where the plurality of cameras are disposed is viewed in the direction perpendicular to the ground.
  • the area of the cameras may be viewed in various directions, such as a vertical, parallel, or oblique direction.
  • the operator can select a camera capable of shooting the designated location most quickly, even when the area of cameras is viewed from a direction parallel to the ground or oblique relative to the ground.
  • the camera control area display section 123 displays, in the camera control region 210 and on the map 220 , the positions of the cameras and the shooting directions thereof.
  • the operator specifies a location—which the operator desires to shoot—in the map 220 through use of a mouse.
  • the camera-to-be-operated determination section 125 selects, from among the plurality of cameras, the camera optimal for shooting the designated location.
  • the operator can operate the camera without involvement of actual operation of UP, DOWN, RIGHT, and LEFT buttons.
  • the system of the present example eliminates superfluous operations, thereby shortening the time from when the operator decides to monitor a certain scene until the scene captured by the camera appears on the display.
  • the present invention yields a great practical effect.
  • the camera-to-be-operated determination section 125 determines, as a camera to be operated, the camera capable of most quickly panning toward a location designated by the operator, and produces a command used for actually panning the thus-determined camera.
  • described next is a method of panning another camera rather than the thus-selected camera in the event of presence of an impediment along an imaginary extension between the camera and the designated location.
  • FIG. 6 shows the configuration of a system embodying the method.
  • FIG. 7 shows an example impediment 701 on the map 220 provided in the camera control region 210 displayed on the screen 200 of the image receiver 120 .
  • reference numerals 101 , 102 , 103 , and 104 designate cameras; 110 designates an image transmitter for transmitting image data; 120 designates an image receiver for receiving the image data and playing back images; 105 designates a display for displaying the images; and 106 designates an input device connected to the image receiver 120 .
  • the image receiver 120 shown in FIG. 6 corresponds to the image receiver 120 of the first embodiment additionally provided with an employable-camera survey section 601 .
  • the employable-camera survey section 601 eliminates a camera incapable of shooting the designated location from candidates considered by the camera-to-be-operated determination section 125 .
  • the camera-to-be-operated determination section 125 selects a camera on the basis of a determination made by the employable-camera survey section 601 as to whether or not the camera can shoot the designated location, as well as on the basis of the angle between the current shooting direction of the camera and the designated location.
  • in other respects, the system is identical in configuration with that of the first embodiment shown in FIG. 1 .
  • a comparison is made between the angles 401 through 404 ; that is, the angle 401 between the current shooting direction of the camera 101 and an imaginary extension extending from the center of the camera symbol 211 to the location designated by the pointer 230 ; the angle 402 between the current shooting direction of the camera 102 and an imaginary extension extending from the center of the camera symbol 212 to the location designated by the pointer 230 ; the angle 403 between the current shooting direction of the camera 103 and an imaginary extension extending from the center of the camera symbol 213 to the location designated by the pointer 230 ; and the angle 404 between the current shooting direction of the camera 104 and an imaginary extension extending from the center of the camera symbol 214 to the location designated by the pointer 230 .
  • the camera corresponding to the minimum angle is determined as the camera to be operated.
  • FIG. 8 shows an example layout including the location designated by the operator by way of the pointer 230 ; the locations of the camera symbols 211 , 212 , 213 , and 214 assigned to the respective cameras 101 , 102 , 103 , and 104 ; and the position of the impediment 701 .
  • FIG. 9 shows the flow of processing through which, on the basis of the designated location and the coordinates of an impediment, the employable-camera survey section 601 eliminates, from candidates considered in the camera angle examination performed by the camera-to-be-operated determination section 125 , cameras incapable of shooting the designated location, even when panned, because of presence of an impediment.
  • a camera symbol 211 is selected.
  • the employable-camera survey section 601 connects the location designated by the pointer 230 and the center of the camera symbol 211 through use of an imaginary line, and calculates the position of the imaginary line in terms of a linear relation between “x” and “y” (Equation 2) (step 902 ).
  • the employable-camera survey section 601 assigns the “x” coordinate of each of the four points 801 , 802 , 803 , and 804 of the impediment 701 to variable “x” of Equation 2 (step 903 ).
  • the employable-camera survey section 601 compares the four calculation results with the “y” coordinates of the respective points 801 , 802 , 803 , and 804 (step 904 ). If all the calculation results are greater than the “y” coordinates, or if all the calculation results are less than the “y” coordinates, the employable-camera survey section 601 takes the camera as a candidate for angle comparison performed by the camera-to-be-operated determination section 125 (step 906 ).
  • the employable-camera survey section 601 examines whether or not an impediment is present between the location designated by the pointer 230 and the camera (step 905 ).
  • a “y” coordinate (6) of the point 802 is greater than a calculation result (4) obtained as a result of assigning the “x” coordinate of the point 802 to Equation 2.
  • In a case where no impediment is present between the designated location and the camera, no impediment blocks the field of view of the camera, and hence the camera is subjected to angle examination performed by the camera-to-be-operated determination section 125 (step 906 ). In the event of an impediment being present between the designated location and the camera, the impediment blocks the camera from shooting the designated location, and hence the camera is eliminated from the candidates for angle examination performed by the camera-to-be-operated determination section 125 (step 907 ).
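The visibility check of steps 902 through 907 can be sketched as follows. This is an illustrative reading of the flowchart, not the patent's literal implementation: the signed-area (cross-product) form of the same-side test replaces solving y = ax + b and substituting x coordinates, which avoids division and handles vertical lines, and the "is the impediment between the camera and the location?" decision of step 905 is approximated by an extent-overlap check. The corner coordinates in the example are assumptions inferred from the figures quoted in the text.

```python
def _side(p, a, b):
    """Signed area: positive or negative according to which side of the
    line a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def camera_can_see(camera_xy, target_xy, corners):
    """True if the camera, once panned, can shoot the designated location
    without the rectangular impediment (given by its corners) blocking it."""
    signs = [_side(c, camera_xy, target_xy) for c in corners]
    if all(s > 0 for s in signs) or all(s < 0 for s in signs):
        return True  # all corners on one side of the line: candidate (step 906)
    # the line crosses the impediment; it blocks the view only if the
    # impediment lies between the camera and the location (step 905)
    lo_x, hi_x = sorted((camera_xy[0], target_xy[0]))
    lo_y, hi_y = sorted((camera_xy[1], target_xy[1]))
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    between = (max(xs) >= lo_x and min(xs) <= hi_x and
               max(ys) >= lo_y and min(ys) <= hi_y)
    return not between  # blocked -> eliminated (step 907)

# Figures from FIG. 8: camera symbol 211 at (1, 15), pointer 230 at (6, 10);
# impediment 701 corners 801..804 assumed at (9, 6), (12, 6), (9, 3), (12, 3).
print(camera_can_see((1, 15), (6, 10), [(9, 6), (12, 6), (9, 3), (12, 3)]))  # True
```

With these figures the line from the camera symbol 211 to the pointer does cross the impediment's plane, but the impediment lies beyond the pointer, so the camera 101 remains a candidate, matching the outcome described in the text.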
  • the coordinates of the center of the camera symbol 211 are (1, 15) and the coordinates of the pointer 230 designated by the operator are (6,10).
  • the camera 101 assigned the camera symbol 211 becomes a candidate for angle examination performed by the camera-to-be-operated determination section 125 .
  • All the cameras 101 through 104 which are present in the map 220 shown in FIG. 7 are subjected to the foregoing processing (step 908 ).
  • the “y” coordinate (6) of the point 801 is less than a calculation result (77/5) obtained as a result of assigning the “x” coordinate of the point 801 to Equation 3;
  • the “y” coordinate (6) of the point 802 is less than a calculation result (104/5) obtained as a result of assigning the “x” coordinate of the point 802 to Equation 3;
  • the “y” coordinate (3) of the point 803 is less than a calculation result (77/5) obtained as a result of assigning the “x” coordinate of the point 803 to Equation 3;
  • the “y” coordinate (3) of the point 804 is less than a calculation result (104/5) obtained as a result of assigning the “x” coordinate of the point 804 to Equation 3. Therefore, the camera 102 assigned the camera symbol 212 is taken as a candidate for selection as a camera to be operated (step 906 ).
  • the “y” coordinate (6) of the point 801 is less than a calculation result (35/3) obtained as a result of assigning the “x” coordinate of the point 801 to Equation 4;
  • the “y” coordinate (6) of the point 802 is less than a calculation result (40/3) obtained as a result of assigning the “x” coordinate of the point 802 to Equation 4;
  • the “y” coordinate (3) of the point 803 is less than a calculation result (35/3) obtained as a result of assigning the “x” coordinate of the point 803 to Equation 4;
  • the “y” coordinate (3) of the point 804 is less than a calculation result (40/3) obtained as a result of assigning the “x” coordinate of the point 804 to Equation 4. Therefore, the camera 103 assigned the camera symbol 213 is taken as a candidate for selection as a camera to be operated (step 906 ).
  • the “y” coordinate (6) of the point 801 is less than a calculation result (7) obtained as a result of assigning the “x” coordinate (9) of the point 801 to Equation 5;
  • the “y” coordinate (3) of the point 803 is less than a calculation result (7) obtained as a result of assigning the “x” coordinate (9) of the point 803 to Equation 5;
  • the “y” coordinate (3) of the point 804 is less than a calculation result (4) obtained as a result of assigning the “x” coordinate (12) of the point 804 to Equation 5. However, the “y” coordinate (6) of the point 802 is greater than the calculation result (4) obtained as a result of assigning the “x” coordinate (12) of the point 802 to Equation 5, and hence the camera 104 assigned the camera symbol 214 is subjected to the processing pertaining to step 905 .
  • the camera 104 is eliminated from candidates for selection performed by the camera-to-be-operated determination section 125 (step 907 ).
  • the camera-to-be-operated determination section 125 selects a camera to be operated from among the cameras determined as being candidates for selection performed by the camera-to-be-operated determination section 125 .
  • the method of determining a camera to be operated and the flow of operation of the cameras 101 , 102 , 103 , and 104 connected to the image transmitter 110 are the same as those employed in the first embodiment.
  • the employable-camera survey section 601 delivers the coordinates of the impediment 701 to the camera control area display section 123 , and the camera control area display section 123 displays the impediment 701 on the map 220 within the camera control region 210 .
  • display of the impediment 701 may be omitted.
  • when the impediment 701 is displayed, the operator can ascertain the location of the impediment 701 .
  • when display of the impediment 701 is omitted, the processing load on the image receiver 120 is reduced by the amount required for displaying the impediment, thus increasing processing speed.
  • the image receiver 120 performs all the operations required for determining a camera to be operated in consideration of information about an impediment. Therefore, the present example can yield an advantage of eliminating the necessity of the operator being aware of an impediment.
  • the employable-camera survey section 601 eliminates, from candidates for selection performed by the camera-to-be-operated determination section 125 , a camera which is hindered by an impediment from shooting the location designated by the pointer 230 .
  • the system can be set so as to avoid selection of that camera, thereby preventing a situation in which an impediment blocks the camera directed toward the designated location.
  • the present example yields a large practical effect.
  • the image receiver 120 selects a camera capable of panning toward the designated location most quickly, on the basis of the angles of the cameras.
  • two factors are employed as conditions for selecting a camera to be operated; that is, the angle and focus of a camera.
  • these factors are converted into two factors capable of being compared; that is, the time required for the camera to pan toward the designated location, and the time required for the camera to attain a focus on the designated location.
  • the configuration of a system embodying the method of the present invention is shown in FIG. 10 .
  • the image receiver 120 of the present example corresponds to the image receiver of the first embodiment additionally provided with an angular-shift-time calculation section 1001 for calculating the time required for the camera to pan toward the designated location; a focus storage section 1002 for grasping the focus of a plurality of cameras; and a focus-shift-time calculation section 1003 for calculating the time required for the camera to attain a focus on the designated location.
  • the camera-to-be-operated determination section 125 determines a camera to be operated from the angle between the current shooting direction of the camera and the direction of an imaginary line connecting the center of the camera symbol with the designated location. In contrast, in the third embodiment, the camera-to-be-operated determination section 125 determines, as a camera to be operated, a camera which is directed toward the designated location and attains a focus on the designated location most quickly.
  • the system of the present example is identical in configuration with the system of the first embodiment.
  • FIG. 11 shows an example display indicated on the screen of the display 105 shown in FIG. 10 , in which the direction of an arrow depicts the shooting direction of a camera and the length of the arrow depicts the focus of the camera. From the lengths of respective arrows 1101 , 1102 , 1103 , and 1104 depicting the shooting directions of the cameras, the operator can grasp the locations on which the cameras are focused.
  • FIG. 12 shows the flow of determination of a camera to be operated on the basis of the two factors; that is, the angle between the shooting direction of a camera and an imaginary line connecting the center of the camera symbol with the location designated in FIG. 11 , and the focus of the camera.
  • the flow of determination will now be described by reference to FIGS. 10 through 12 , as well as FIG. 4 , which is taken as an example layout of camera symbols.
  • the command load section 124 loads the location on the map 220 .
  • the command load section 124 loads the coordinates (6, 10) of the location designated by the operator by way of the pointer 230 .
  • The angular-shift-time calculation section 1001 and the focus-shift-time calculation section 1003 receive the coordinates (6, 10) of the point which are designated by the operator through use of the pointer 230 and are loaded by way of the command load section 124 .
  • the angular-shift-time calculation section 1001 shown in FIG. 10 selects one from the camera symbols which are present in the camera control region 210 shown in FIG. 10 (step 1201 ).
  • the camera symbol 211 is selected in the example shown in FIG. 4 .
  • the angular-shift-time calculation section 1001 loads, from the camera angle storage section 126 , the shooting direction of the camera selected in step 1201 , and calculates the time required for the camera to pan toward the designated location (step 1202 ).
  • the camera-to-be-operated determination section 125 loads the time of three seconds calculated in step 1202 , and the focus storage section 1002 delivers, to the focus-shift-time calculation section 1003 , the focus length of the camera selected in step 1201 .
  • the focus-shift-time calculation section 1003 calculates the time required for a camera to attain a focus on the designated location, from the coordinates of the designated location and the length of the focus (step 1203 ).
  • shifting the focus of a camera by one unit length takes one second in the present example and that the focus length of the camera symbol 211 is four in the example shown in FIG. 4
  • the camera-to-be-operated determination section 125 receives the period of 1.8 sec. calculated in step 1203 .
  • the camera-to-be-operated determination section 125 selects the greater of the time required for the camera to pan toward the designated location and the time required for the camera to attain a focus on the designated location (step 1204 ).
  • the time that the camera 101 assigned the camera symbol 211 requires to pan, as calculated in step 1202 , is 3 sec.
  • the time calculated in step 1203 is 1.8 sec.
  • the camera 101 assigned the camera symbol 211 takes 3 sec. to pan toward the designated location and to attain a focus on the designated location. All the other cameras are subjected to similar time calculation operations (step 1205 ).
  • the camera 102 assigned the camera symbol 212 takes 4 sec. to pan toward the location designated by the pointer 230 ; the camera 103 assigned the camera symbol 213 takes 2 sec. to pan toward the same location; and the camera 104 assigned the camera symbol 214 takes 3 sec. to pan toward the same location.
  • the camera 102 takes 2.2 sec. to attain a focus on the designated location; the camera 103 takes 2.8 sec. to attain a focus on the same location; and the camera 104 takes 2.6 sec. to attain a focus on the same location.
  • the camera 102 takes 4 sec. to pan toward and attain a focus on the designated location; the camera 103 takes 2.8 sec.; and the camera 104 takes 3 sec.
  • each camera is assigned a single time factor, and a camera assigned the minimum time factor can pan toward and attain a focus on the designated location most quickly.
  • This camera is determined as the camera to be operated (step 1206 ).
  • the camera-to-be-operated determination section 125 determines the camera 103 assigned the camera symbol 213 as a camera to be operated.
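The timing comparison of steps 1202 through 1206 amounts to taking, per camera, the larger of the pan time and the focus time (the two motions proceed concurrently, so readiness is governed by the slower one) and then choosing the camera with the smallest such value. A sketch using the figures quoted in this example (function and variable names are illustrative, not from the patent):

```python
def readiness_time(pan_seconds, focus_seconds):
    """A camera is ready when the slower of its two concurrent motions,
    panning and focusing, has finished (step 1204)."""
    return max(pan_seconds, focus_seconds)

# (pan time, focus time) in seconds, as quoted for cameras 101 through 104
times = {
    101: readiness_time(3.0, 1.8),
    102: readiness_time(4.0, 2.2),
    103: readiness_time(2.0, 2.8),
    104: readiness_time(3.0, 2.6),
}

# the camera with the smallest readiness time is the one to operate (step 1206)
best = min(times, key=times.get)
print(best, times[best])  # 103 2.8
```

Camera 103 wins even though camera 101 would finish focusing sooner, because only the slower of the two motions matters for each camera.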
  • the control command conversion section 127 receives information about the angle of the camera 103 and converts the information into a control command to be used for moving the camera 103 .
  • the control command is delivered to the control command transmission section 128 , and the control command receiving section 113 of the image transmission section 110 receives the control command.
  • the flow of subsequent processing is the same as that employed in the first embodiment.
  • the arrows 1101 , 1102 , 1103 , and 1104 depicting the shooting directions of the cameras 101 , 102 , 103 , and 104 illustrate the depths of their focuses.
  • the operator can ascertain the locations on which the cameras are focused.
  • alternatively, display of the focal depths may be omitted; in that case, the processing load on the image receiver 120 is reduced by the amount required by the camera control area display section 123 for displaying the focal depths, thus increasing processing speed.
  • the image receiver 120 performs all the operations required for determining a camera to be operated in consideration of information about the focuses. Therefore, the present example can yield an advantage of eliminating the necessity of the operator to be aware of the focal depths of the cameras.
  • the focuses of respective cameras have been grasped beforehand.
  • for each respective camera, there are calculated the time required for the camera to pan toward the designated location, as well as the time required for the camera to attain a focus on the designated location.
  • the time required for the camera to pan toward and attain a focus on the designated location is then determined from these two time periods.
  • the image receiver 120 determines, from the information about the angles of cameras, the camera capable of panning toward the designated location most quickly, and determines as well a command for actually panning the camera.
  • the operator may be dissatisfied with the image, if the image is not captured from the direction desired by the operator.
  • in the present example, after having designated a desired location, the operator specifies a desired shooting direction, thus enabling selection of a camera which can capture an image from the direction desired by the operator.
  • FIG. 13 shows the configuration of a system embodying the present example.
  • reference numerals 101 , 102 , 103 , and 104 designate cameras; 110 designates an image transmitter; 120 designates an image receiver; 105 designates a display; and 106 designates an input device.
  • the image receiver 120 of the present example corresponds to an image receiver of the first embodiment additionally provided with a view-point direction survey section 1301 which stores the direction in which the operator desires to shoot his designated location; and a view-point direction display section 1302 for displaying the desired direction together with the camera control region display section 123 .
  • the camera-to-be-operated determination section 125 determines the camera to be operated, from information as to whether or not an image can be shot in the direction designated by the view-point direction survey section 1301 , as well as from the angle between the current shooting direction of the camera and the direction of an imaginary line connecting the designated location with the center of the camera symbol.
  • the system of the present example is identical in configuration with that of the first embodiment.
  • FIG. 14 shows an example of the screen 200 to be displayed on the display 105 when the view-point direction display section 1302 displays a desired direction designated by the operator.
  • after having specified a desired location on the map 220 by way of the pointer 230 , the operator specifies the direction in which the location designated by the pointer 230 is to be viewed, by means of rotation of an arrow 1401 around the pointer 230 .
  • FIG. 15 shows the flow of selection of a camera capable of shooting an image in a direction close to a desired shooting direction when the operator designates a desired direction in which the designated location is to be shot.
  • the command load section 124 shown in FIG. 13 loads the position and direction of the designated location.
  • the view-point direction survey section 1301 selects from all the cameras the camera optimal for shooting the designated location and direction. First, one camera is selected from all the cameras (step 1501 ). There is examined the direction of the camera which would result if the thus-selected camera were panned toward the designated location.
  • FIG. 16 shows an example angle between the designated direction and the direction of the camera when the camera is panned toward the location designated by the pointer 230 .
  • the drawing shows the location designated by the pointer 230 ; the arrow 1401 depicting the direction in which the operator desires to shoot the designated location; the direction 221 in which the camera 101 assigned the camera symbol 211 would shoot the location designated by the pointer 230 ; the direction 223 in which the camera 103 assigned the camera symbol 213 would shoot the location designated by the pointer 230 ; an angle 1601 between the direction 221 of the camera symbol 211 and the direction of the arrow 1401 ; and an angle 1602 between the direction 223 of the camera symbol 213 and the direction of the arrow 1401 .
  • the angle 1601 between the camera symbol 211 and the arrow 1401 is measured (step 1502 ), and a determination is made as to whether or not the thus-measured angle 1601 is greater than a certain angle (step 1503 ).
  • the certain angle is set to 90 degrees.
  • the magnitude of the certain angle may assume any value. If the measured angle is smaller than the certain angle (here, 90 degrees), the camera is deemed capable of shooting the designated location in the designated direction and is subjected to the examination performed by the camera-to-be-operated determination section 125 as to whether or not the camera can be panned toward the designated location most quickly (step 1504 ). If the measured angle is greater than the certain angle, the camera is deemed incapable of shooting the designated location in the designated direction, and the camera is eliminated from candidates for selection of a camera performed by the camera-to-be-operated determination section 125 (step 1505 ). The angles of all the cameras are examined in this manner (step 1506 ).
  • the angle 1601 of the camera symbol 211 assumes 130 degrees, which is greater than 90 degrees, and hence the camera 101 assigned the camera symbol 211 is eliminated from candidates for examination performed by the camera-to-be-operated determination section 125 .
  • the angle 1602 of the camera symbol 213 assumes 30 degrees, which is less than 90 degrees, and hence the camera 103 assigned the camera symbol 213 is selected as a candidate for the examination performed by the camera-to-be-operated determination section 125 .
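The directional screening of steps 1501 through 1506 can be sketched as follows. This is an illustrative reading, with assumed names: a camera panned toward the designated location would shoot it along the vector from the camera to that location, and the camera is kept only when that vector deviates from the operator's desired viewing direction (the arrow 1401) by less than the threshold angle.

```python
import math

def direction_angle(camera_xy, target_xy, desired_dir):
    """Angle in degrees between the camera's prospective shooting direction
    and the operator's desired viewing direction (step 1502)."""
    v = (target_xy[0] - camera_xy[0], target_xy[1] - camera_xy[1])
    a = math.atan2(desired_dir[1], desired_dir[0]) - math.atan2(v[1], v[0])
    a = abs(a) % (2 * math.pi)
    return math.degrees(min(a, 2 * math.pi - a))

def view_candidates(cameras, target_xy, desired_dir, threshold_deg=90.0):
    """Keep only cameras below the threshold angle (steps 1503 through 1506).
    cameras: mapping of name -> position."""
    return [name for name, pos in cameras.items()
            if direction_angle(pos, target_xy, desired_dir) < threshold_deg]
```

For example, with the target at the origin and the desired direction pointing along +x, a camera west of the target (which shoots eastward) passes the test, while a camera east of the target (which shoots westward, 180 degrees off) is eliminated.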
  • the result of the examination of camera angles (i.e., the camera 103 assigned the camera symbol 213 having been determined as a candidate for the examination and the camera 101 assigned the camera symbol 211 having been eliminated from the candidates for the examination) is reported to the camera-to-be-operated determination section 125 .
  • the camera-to-be-operated determination section 125 selects one from the cameras which have been determined to be candidates for examination. Subsequent processing, up to the process in which the cameras 101 , 102 , 103 , and 104 are operated, is the same as that employed in the first embodiment.
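The elimination rule of steps 1502 to 1505 can be sketched as follows. This is a minimal sketch, assuming each camera's shooting direction toward the designated location is expressed as a bearing in degrees; the record fields and function names are hypothetical, and the "certain angle" defaults to the 90 degrees used in the example.

```python
def angle_between(a_deg, b_deg):
    # Smallest absolute angle between two bearings, in degrees.
    d = abs(a_deg - b_deg) % 360
    return min(d, 360 - d)

def filter_candidates(cameras, desired_direction_deg, certain_angle_deg=90.0):
    # Keep only cameras whose shooting direction toward the designated
    # location deviates from the desired viewing direction by less than
    # the certain angle; the rest are eliminated from the candidates.
    return [c for c in cameras
            if angle_between(c["direction_deg"], desired_direction_deg)
               < certain_angle_deg]
```

With the example's values (camera symbol 211 at 130 degrees from the arrow 1401 and camera symbol 213 at 30 degrees), only the camera assigned the symbol 213 survives the filter.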
  • the command load section 124 reports to the view-point direction display section 1302 the direction in which the designated location is to be shot.
  • the view-point direction display section 1302 displays, on the camera control region 210 shown in FIG. 14 , the arrow 1401 representing the designated direction.
  • Although the view-point direction display section 1302 displays the desired direction 1401 in the present example, such display is not indispensable.
  • the operator can ascertain from the display the direction in which the designated location is to be viewed.
  • If the desired direction 1401 is not displayed, the processing load on the image receiver 120 is reduced by the amount required to display the desired direction 1401 , thus increasing processing speed.
  • the image receiver 120 receives a command for specifying the direction in which the designated location is to be shot, as well as the designated location.
  • the image receiver 120 then eliminates from the candidates for selection those cameras incapable of shooting the designated location in the designated direction.
  • the view-point direction survey section 1301 can automatically narrow the candidates for selection of a camera capable of shooting an image of the designated location in the desired direction.
  • a camera capable of being panned most quickly to the desired direction and location can be automatically selected, thus yielding a great practical advantage.
  • a camera having the minimum angle between the shooting direction and an imaginary line connecting the designated location with the center of the camera symbol is determined as the camera capable of being panned toward the designated location most quickly, and the thus-determined camera is operated.
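The minimum-angle rule just described can be sketched as follows, under the assumption that camera-symbol positions and current shooting directions are available as plane coordinates and bearings; the data layout and names are hypothetical.

```python
import math

def bearing_deg(cam_xy, target_xy):
    # Bearing of the imaginary line connecting the camera-symbol
    # center with the designated location.
    dx = target_xy[0] - cam_xy[0]
    dy = target_xy[1] - cam_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360

def quickest_pan_camera(cameras, target_xy):
    # Pick the camera whose current shooting direction makes the
    # smallest angle with the line toward the designated location.
    def deviation(c):
        d = abs(c["direction_deg"] - bearing_deg(c["pos"], target_xy)) % 360
        return min(d, 360 - d)
    return min(cameras, key=deviation)
```

Under the patent's simplifying assumption that pan speed is uniform, the camera with the smallest deviation is the one that can be panned toward the designated location most quickly.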
  • the camera is selected in consideration of the zoom of the camera as well as the angle of the camera. Accordingly, the operator directly specifies a desired range rather than a desired location.
  • FIG. 17 shows the configuration of a system embodying such a method.
  • reference numerals 101 , 102 , 103 , and 104 designate cameras; 110 designates an image transmitter; 120 designates an image receiver for receiving an image; 105 designates a display for displaying a received image; and 106 designates an input device connected to the image receiver 120 .
  • the image receiver 120 of the present example corresponds to the image receiver 120 of the first embodiment additionally provided with the angular-shift-time calculation section 1001 of the third embodiment for calculating the time required for a camera to pan toward the designated location; a zoom range storage section 1701 for storing the degree of zoom of a plurality of cameras; a zoom-shift time calculation section 1702 for calculating the time required for a camera to zoom in order to display an image of the designated range; and a zoom range display section 1703 for displaying, in the camera control region 210 , a range to be zoomed.
  • the camera-to-be-operated determination section 125 determines the camera to be operated, from the time required for the camera to pan toward the designated location and the time required for the camera to zoom in or out for capturing an image of the range designated by the operator.
  • the system of the present example is identical in configuration with that of the first embodiment shown in FIG. 1 .
  • FIG. 18 shows an example screen layout in which the zoom range display section 1703 specifies a camera zoom range in the camera control region 210 .
  • the screen layout is formed by addition, to the camera control region 210 of the first embodiment, of a zoom range 1801 in which the user specifies a desired range.
  • the screen layout is identical with that of the first embodiment shown in FIG. 2 .
  • FIG. 19 shows the flow of selection of a camera capable of panning toward a designated direction and zooming into a range designated by the operator when the operator designates a range in which he desires to view an image.
  • FIG. 20 shows an example in which the operator specifies a zoom range in the camera control region 210 .
  • reference numerals 211 , 212 , 213 , and 214 designate camera symbols; 221 , 222 , 223 , and 224 designate current shooting direction of cameras; 2001 , 2002 , 2003 , and 2004 designate ranges currently shot by cameras; and 1801 designates an image display range designated by the operator.
  • the ranges being currently shot by the cameras 101 , 102 , 103 , and 104 are displayed in the camera control region 210 .
  • display of the ranges may be omitted.
  • the ranges 2001 , 2002 , 2003 , and 2004 currently shot by the cameras are expressed as width-by-height ranges: (6×6), (6×6), (2×2), and (8×8), respectively.
  • the zoom-shift time calculation section 1702 selects a camera which has not yet been subjected to calculation of the time required for the camera to zoom (step 1901 ). In the present example, the camera 101 is selected.
  • the zoom-shift time calculation section 1702 receives, from the zoom range storage section 1701 , the range 2001 (6×6) of the camera 101 assigned the camera symbol 211 ( 1902 ).
  • the zoom-shift time calculation section 1702 loads the designated range 1801 (3×3) received by the command loading section 124 .
  • the zoom-shift time calculation section 1702 accepts the designated zoom range from the operator ( 1903 ) and calculates the time required for the camera to zoom in or out from the current shooting range to the designated range ( 1904 ). Provided that changing the shooting range by one unit length requires one second, the camera 101 takes three seconds to zoom into the designated range. The required zoom-shift time is calculated for each camera ( 1905 ).
  • the current shooting range of the camera 102 assigned the camera symbol 212 is (6×6); the current shooting range of the camera 103 assigned the camera symbol 213 is (2×2); and the current shooting range of the camera 104 assigned the camera symbol 214 is (8×8). Therefore, it takes three seconds for the camera 102 to zoom into the designated range; it takes one second for the camera 103 to zoom into the designated range; and it takes five seconds for the camera 104 to zoom into the designated range.
  • the camera-to-be-operated determination section 125 receives the angular-shift times of 3 sec., 4 sec., 2 sec., and 3 sec. from the angular-shift-time calculation section 1001 , and zoom-shift times of 3 sec., 3 sec., 1 sec., and 5 sec. from the zoom-shift time calculation section 1702 . For each camera, the larger of the angular-shift time and the zoom-shift time is taken as the time required for the camera to pan toward the designated direction and zoom into the designated range.
  • the camera-to-be-operated determination section 125 determines that the camera 103 requires the least time, thus determining the camera 103 to be the camera capable of most quickly panning toward the designated direction and the zooming into the designated range.
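The selection rule of the preceding steps can be sketched as follows. It uses the example's one-second-per-unit zoom rate and the stated rule that, for each camera, the larger of the angular-shift time and the zoom-shift time governs (panning and zooming are taken to proceed in parallel); the function and variable names are hypothetical.

```python
def zoom_shift_time(current_range, desired_range, unit_seconds=1.0):
    # Time to zoom from the current shooting range to the designated
    # range, assuming one second per unit change of range width.
    return abs(current_range - desired_range) * unit_seconds

def select_camera(pan_times, current_ranges, desired_range):
    # For each camera, take the larger of pan time and zoom time,
    # then pick the camera with the smallest combined time.
    best_id, best_time = None, float("inf")
    for cam_id, pan_t in pan_times.items():
        zoom_t = zoom_shift_time(current_ranges[cam_id], desired_range)
        total = max(pan_t, zoom_t)
        if total < best_time:
            best_id, best_time = cam_id, total
    return best_id, best_time
```

Feeding in the example's angular-shift times (3, 4, 2, and 3 seconds) and current range widths (6, 6, 2, and 8) with a designated width of 3 reproduces the patent's outcome: the camera 103 , at two seconds, is selected.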
  • the control command conversion section 127 receives the angle of rotation of the camera to be operated and the degree of zoom thereof. The control command conversion section 127 then delivers the rotation angle of the camera to the camera angle storage section 126 and the degree of zoom of the camera to the zoom range storage section 1701 . The camera angle storage section 126 delivers the angle of the camera to the camera control region display section 123 . The zoom range display section 1703 displays the designated zoom range. Subsequent processing up to rotation of the cameras 101 , 102 , 103 , and 104 is the same as that employed in the first embodiment.
  • a camera optimal for shooting the designated range can be automatically selected by means of calculating the time required for a camera to pan toward a designated direction from information about the current shooting direction of the camera; calculating the time required for the camera to zoom into the designated range from information about the distance from the currently zoomed location to the designated range; and comparing the cameras in terms of the thus-calculated times, to thereby select the camera capable of most quickly panning toward the designated direction and zooming into the designated range.
  • the present example yields a large practical effect.
  • FIG. 21 shows the configuration of a system embodying the present example.
  • reference numerals 101 , 102 , 103 , and 104 designate cameras; 110 designates an image transmitter; 120 designates an image receiver for receiving an image; 105 designates a display for displaying a received image; and 106 designates an input device for controlling the cameras 101 , 102 , 103 , and 104 .
  • the image receiver 120 corresponds to the image receiver 120 of the first embodiment additionally provided with an image size conversion section 2101 for changing the size of an image to be displayed on the display 105 .
  • the system of the present example is identical in configuration with that shown in FIG. 1 .
  • FIG. 22 shows an example screen 200 provided with the image size conversion section 2101 .
  • the image size conversion section 2101 displays, in the image display region of the screen 200 , an enlarged image 2201 captured by the camera selected by the camera-to-be-operated determination section 125 . Images 2202 , 2203 , and 2204 captured by the other cameras are displayed in a uniformly-reduced form.
  • the camera symbol 211 of the camera currently shooting the enlarged image is displayed by the camera control region display section 123 shown in FIG. 21 . However, such a function of the camera control region display section 123 may be omitted.
  • FIG. 23 shows the flow of processing in which the image receiver 120 displays an enlarged image of the image captured by the camera determined by the camera-to-be-operated determination section 125 .
  • the image size conversion section 2101 receives image data from the image data receiving section 121 . Further, the image size conversion section 2101 receives, from the camera-to-be-operated determination section 125 , information about which camera is to be operated. In the present example, the camera 101 is determined to be the camera to be operated.
  • the camera-to-be-operated determination section 125 sends to the image size conversion section 2101 information indicating that the camera 101 is to be operated.
  • the image size conversion section 2101 examines which of a plurality of images is an image captured by the camera to be operated (step 2301 ).
  • the camera 101 corresponds to the camera to be operated, and an image 2201 of the camera 101 is enlarged (step 2302 ).
  • the scaling factor may be determined in advance or may be designated by the operator.
  • the image size conversion section 2101 counts the number of cameras (step 2303 ) and scales down the images 2202 , 2203 , and 2204 captured by the cameras 102 , 103 , and 104 which are not to be operated (step 2304 ).
  • the scaling factor may be determined in advance or may be designated by the operator.
  • the image size conversion section 2101 determines the locations of the thus-enlarged/reduced images 2201 , 2202 , 2203 , and 2204 on the screen.
  • the image data playback section 122 receives, from the image size conversion section 2101 , information about the locations at which the images are to be displayed, and then displays the images (step 2305 ).
  • the display positions of the images may be determined in advance or designated by the operator.
  • the image data playback section 122 receives the reduced image data and displays, in the image display areas of the display 105 , an enlarged image of the image 2201 and reduced images of the images 2202 , 2203 , and 2204 .
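The enlargement/reduction flow above can be sketched as follows. The base image size and the scale factors are assumed values, since the patent leaves both to be preset or operator-designated; names are hypothetical.

```python
def layout_images(camera_ids, selected_id, base=(320, 240),
                  scale_up=2.0, scale_down=0.5):
    # Return per-camera display sizes: the image of the camera to be
    # operated is enlarged, and all other images are uniformly reduced.
    w, h = base
    sizes = {}
    for cam_id in camera_ids:
        s = scale_up if cam_id == selected_id else scale_down
        sizes[cam_id] = (int(w * s), int(h * s))
    return sizes
```

The only constraint the patent imposes is that the finally-selected camera's image be displayed larger than the others; the concrete scales and display positions may be set arbitrarily.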
  • the camera control region display section 123 displays the camera control region 210 in the same manner as employed in the first embodiment.
  • the camera control region display section 123 receives, from the camera-to-be-operated determination section 125 , information indicating which camera is to be operated, and displays the name of the camera to be operated (i.e., the camera 221 ) in the camera control region 210 .
  • the image size conversion section 2101 enlarges the image of the camera finally selected by the camera-to-be-operated determination section 125 and scales down the other images to equal size. So long as the image of the finally-selected camera is displayed larger than the images of the other cameras, the scales of individual images and the display positions of the images may be set arbitrarily.
  • the image of the camera selected by the camera-to-be-operated determination section 125 is enlarged, and the images of the other cameras which are not to be operated are scaled down.
  • the operator can readily ascertain the camera which is currently in an operating state and can view an enlarged image of the designated location in detail, thus yielding a large practical effect.
  • when the present invention is applied to a monitoring system of a skyscraper which uses a plurality of monitor cameras, it is evident that the present invention provides a degree of convenience in proportion to the number of monitor cameras.
  • the camera which makes the smallest angle between the shooting direction and the imaginary line connecting the designated location and the center of the camera symbol is operated as the camera capable of panning toward the designated location most quickly.
  • not only the camera that can be panned toward the designated location most quickly, but also one or more other cameras, are simultaneously controlled in a given sequence, by arranging the cameras in descending sequence of panning speed.
  • the number of cameras to be controlled is not limited to any specific number.
  • the configuration of a system embodying the example is shown in FIG. 24 .
  • reference numerals 101 , 102 , 103 , and 104 designate cameras; 110 designates an image transmitter; 120 designates an image receiver; 105 designates a display; and 106 designates an input device.
  • the image receiver 120 of the seventh embodiment corresponds to the image receiver 120 of the first embodiment additionally provided with a zoom-scale determination section 2401 for determining the zoom scale of a camera to be operated.
  • the camera-to-be-operated determination section 125 determines a plurality of cameras to be operated and sends the descending sequence of panning speed at which a plurality of cameras are panned toward the designated location.
  • the system of the present example is identical in configuration with that of the first embodiment shown in FIG. 1 .
  • FIG. 25 shows an example screen for the processing in which the cameras are arranged in descending sequence of panning speed and, instead of only the camera that can be panned toward the designated location most quickly, one or more other cameras are simultaneously controlled in a given sequence.
  • An image 203 depicts a zoomed-in image of the designated location ( 2501 ), and an image 204 depicts a zoomed-out image captured from the designated direction.
  • FIG. 26 shows the flow of processing in which not only the camera that can be panned toward the designated location most quickly but also one or more other cameras are simultaneously controlled in given sequence, by arranging the cameras in descending sequence of panning speed.
  • the present example describes a case where two cameras are controlled simultaneously, but the number of cameras to be controlled simultaneously is not limited to any specific number.
  • the flow of processing in which the zoom-scale determination section 2401 examines cameras to be controlled will be described by reference to FIG. 26 .
  • the camera-to-be-operated determination section 125 examines a sequence in which the cameras 101 , 102 , 103 , and 104 can be panned toward the designated location (step 2601 ).
  • the zoom-scale determination section 2401 receives, from the camera-to-be-operated determination section 125 , the descending sequence of panning speed at which the plurality of cameras can be panned toward the designated location.
  • the zoom-scale determination section 2401 selects one from the plurality of cameras and examines the camera as to whether or not the camera can be panned toward the designated location most quickly (step 2602 ).
  • the zoom-scale determination section 2401 instructs the camera that can be panned toward the designated location most quickly to zoom in the designated location (step 2603 ).
  • the zoom-scale determination section 2401 examines one of the remaining cameras as to whether or not the camera can be panned toward the designated location at the second-highest speed (step 2604 ). If the camera can be panned toward the designated location at the second-highest speed, the zoom-scale determination section 2401 instructs the camera to zoom out so as to shoot the surroundings of the designated location (step 2605 ).
  • the camera 103 can be panned toward the designated location most quickly, and the camera 104 can be panned toward the designated location at the second-highest speed.
  • the zoom-scale determination section 2401 instructs the camera 103 to zoom in the designated location and the camera 104 to zoom out from the same.
  • the camera-to-be-operated determination section 125 receives, from the zoom-scale determination section 2401 , an instruction for causing the camera 103 to zoom in the designated location and an instruction for causing the camera 104 to zoom out from the same.
  • the camera-to-be-operated determination section 125 determines which cameras are to be operated. Under this method of determining the cameras to be operated, a plurality of cameras which can be panned toward the designated location are operated in descending sequence of panning speed.
  • the method of examining the speed at which the camera is panned is the same as that employed in the first embodiment.
  • the camera control command conversion section 127 receives, from the camera-to-be-operated determination section 125 , information about the identification of a camera to be controlled, the angle through which the camera is to be panned, and a zooming in/out scale. Subsequently, processing identical to that employed in the first embodiment is performed until the cameras 101 , 102 , 103 , and 104 are operated.
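The zoom-scale determination section's rule can be given as a minimal sketch, assuming per-camera pan times toward the designated location are already known (e.g., from the angular-shift-time calculation section); identifiers and the command strings are hypothetical.

```python
def assign_zoom_scales(pan_times):
    # Order cameras by ascending pan time; instruct the quickest to
    # zoom in on the designated location and the second quickest to
    # zoom out to capture its surroundings. Remaining cameras are
    # left unassigned, matching the two-camera example.
    ordered = sorted(pan_times, key=pan_times.get)
    commands = {}
    if len(ordered) >= 1:
        commands[ordered[0]] = "zoom_in"
    if len(ordered) >= 2:
        commands[ordered[1]] = "zoom_out"
    return commands
```

The example controls two cameras, but the same ordering extends to any number of simultaneously controlled cameras, as the text notes.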
  • images of the desired location are captured simultaneously through use of two or more cameras.
  • the combined use of cameras enables the operator to simultaneously obtain a detailed image of the designated location and grasp the condition of surroundings of the designated location.
  • the present invention enables operation of a plurality of cameras through entry of a single command, thus realizing more-effective shooting of an image while involving less operation.
  • the present invention yields a large practical effect.
  • an operator can operate a camera optimal for shooting a location designated by the operator selected from among the plurality of cameras, without involvement of actual operation of UP, DOWN, RIGHT, and LEFT buttons.
  • the system of the present embodiment eliminates superfluous operations, thereby yielding an advantage of shortening the time from when the operator decides to monitor a certain scene until the scene captured by the camera appears on the display.
  • the present invention prevents a situation in which an impediment blocks the camera directed toward the designated location. Even if an impediment blocks the view field of a certain camera and hinders the camera from shooting the location designated by the operator, a system of the present invention can be set so as to avoid selection of that camera.
  • for each camera, there are calculated the time required for the camera to pan toward the designated location, from information about the current shooting direction of the camera, as well as the time required for the camera to attain a focus on the designated location, from information about the distance between the location on which the camera is currently focused and the designated location.
  • from the thus-calculated times, there is selected a camera capable of panning toward and attaining a focus on the designated location most quickly, thus enabling selection of a camera optimal for shooting.
  • an image receiver receives a command for specifying the direction in which the designated location is to be shot, as well as the designated location.
  • the image receiver then eliminates from the candidates for selection those cameras incapable of shooting the designated location in the designated direction.
  • Cameras capable of shooting an image of the designated location from a desired location can be automatically selected, and a camera capable of being panned most quickly to the desired direction and location can be automatically selected from among those cameras.
  • a camera optimal for shooting the designated range can be automatically selected by means of calculating the time required for a camera to pan toward a designated direction from information about the current shooting direction of the camera; calculating the time required for the camera to zoom into the designated range from information about the distance from the currently zoomed location to the designated range; and comparing the cameras in terms of the thus-calculated times, to thereby select the camera capable of most quickly panning toward the designated direction and zooming into the designated range.
  • the image of a camera to be operated is enlarged, and the images of the other cameras which are not to be operated are scaled down.
  • the operator can efficiently ascertain, on a screen, the camera which is currently in an operating state. Further, the operator can view an enlarged image of a location designated by the operator while viewing the screen display.
  • images of the desired location are captured simultaneously through use of two or more cameras.
  • the combined use of cameras enables the operator to simultaneously obtain a detailed image of the designated location and grasp the condition of surroundings of the designated location.
  • the present invention enables operation of a plurality of cameras through entry of a single command.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Control Of Position Or Direction (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)
US09/550,038 1999-04-16 2000-04-14 Camera control apparatus and method Expired - Lifetime US6954224B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP10934199A JP4209535B2 (ja) 1999-04-16 1999-04-16 カメラ制御装置

Publications (1)

Publication Number Publication Date
US6954224B1 true US6954224B1 (en) 2005-10-11

Family

ID=14507779

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/550,038 Expired - Lifetime US6954224B1 (en) 1999-04-16 2000-04-14 Camera control apparatus and method

Country Status (6)

Country Link
US (1) US6954224B1 (zh)
EP (1) EP1045580A3 (zh)
JP (1) JP4209535B2 (zh)
KR (1) KR100633465B1 (zh)
CN (1) CN1204739C (zh)
TW (1) TW498689B (zh)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030202102A1 (en) * 2002-03-28 2003-10-30 Minolta Co., Ltd. Monitoring system
US20040179121A1 (en) * 2003-03-12 2004-09-16 Silverstein D. Amnon System and method for displaying captured images according to imaging device position
US20050225634A1 (en) * 2004-04-05 2005-10-13 Sam Brunetti Closed circuit TV security system
US20060174203A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Viewfinder for shared image device
US20080129825A1 (en) * 2006-12-04 2008-06-05 Lynx System Developers, Inc. Autonomous Systems And Methods For Still And Moving Picture Production
US20100070172A1 (en) * 2008-09-18 2010-03-18 Ajith Kuttannair Kumar System and method for determining a characterisitic of an object adjacent to a route
US7683937B1 (en) * 2003-12-31 2010-03-23 Aol Inc. Presentation of a multimedia experience
US20100090135A1 (en) * 2008-10-10 2010-04-15 Ajith Kuttannair Kumar System and method for determining characteristic information of an object positioned adjacent to a route
US20100321473A1 (en) * 2007-10-04 2010-12-23 Samsung Techwin Co., Ltd. Surveillance camera system
US20110043628A1 (en) * 2009-08-21 2011-02-24 Hankul University Of Foreign Studies Research and Industry-University Cooperation Foundation Surveillance system
US20110199386A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Overlay feature to provide user assistance in a multi-touch interactive display environment
US20110199517A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US20110199314A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Gestures on a touch-sensitive display
US20110199495A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20110199516A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US20120075466A1 (en) * 2010-09-29 2012-03-29 Raytheon Company Remote viewing
US20140015920A1 (en) * 2012-07-13 2014-01-16 Vivotek Inc. Virtual perspective image synthesizing system and its synthesizing method
US8836802B2 (en) 2011-03-21 2014-09-16 Honeywell International Inc. Method of defining camera scan movements using gestures
US8902320B2 (en) 2005-01-31 2014-12-02 The Invention Science Fund I, Llc Shared image device synchronization or designation
US8964054B2 (en) 2006-08-18 2015-02-24 The Invention Science Fund I, Llc Capturing selected image objects
US8988537B2 (en) 2005-01-31 2015-03-24 The Invention Science Fund I, Llc Shared image devices
US9001215B2 (en) 2005-06-02 2015-04-07 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US20150109452A1 (en) * 2012-05-08 2015-04-23 Panasonic Corporation Display image formation device and display image formation method
US9041826B2 (en) 2005-06-02 2015-05-26 The Invention Science Fund I, Llc Capturing selected image objects
US9076208B2 (en) 2006-02-28 2015-07-07 The Invention Science Fund I, Llc Imagery processing
US9082456B2 (en) 2005-01-31 2015-07-14 The Invention Science Fund I Llc Shared image device designation
US9124729B2 (en) 2005-01-31 2015-09-01 The Invention Science Fund I, Llc Shared image device synchronization or designation
US9191611B2 (en) 2005-06-02 2015-11-17 Invention Science Fund I, Llc Conditional alteration of a saved image
US9325781B2 (en) 2005-01-31 2016-04-26 Invention Science Fund I, Llc Audio sharing
US9451200B2 (en) 2005-06-02 2016-09-20 Invention Science Fund I, Llc Storage access technique for captured data
US9489717B2 (en) 2005-01-31 2016-11-08 Invention Science Fund I, Llc Shared image device
US9591272B2 (en) 2012-04-02 2017-03-07 Mcmaster University Optimal camera selection in array of monitoring cameras
US9621749B2 (en) 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
US9819490B2 (en) 2005-05-04 2017-11-14 Invention Science Fund I, Llc Regional proximity for shared image device(s)
US9910341B2 (en) 2005-01-31 2018-03-06 The Invention Science Fund I, Llc Shared image device designation
US9942511B2 (en) 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US10003762B2 (en) 2005-04-26 2018-06-19 Invention Science Fund I, Llc Shared image devices
US10097756B2 (en) 2005-06-02 2018-10-09 Invention Science Fund I, Llc Enhanced video/still image correlation

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
PT102667B (pt) * 2001-09-27 2003-09-30 Inesc Inovacao Inst De Novas T Sistema de direccionamento de camaras de video de qualquer gama do espectro
GB0208654D0 (en) * 2002-04-16 2002-05-29 Koninkl Philips Electronics Nv Image processing for video or photographic equipment
DE10358017A1 (de) * 2003-12-11 2005-07-21 Siemens Ag 3D Kamerasteuerung
GB2412519B (en) * 2004-03-23 2010-11-03 British Broadcasting Corp Monitoring system
WO2006080316A1 (ja) * 2005-01-25 2006-08-03 Matsushita Electric Industrial Co., Ltd. カメラ制御装置およびこの装置におけるズーム倍率制御方法
DE102006012239A1 (de) * 2006-03-16 2007-09-20 Siemens Ag Video-Überwachungssystem
JP5264274B2 (ja) * 2008-04-30 2013-08-14 キヤノン株式会社 通信装置、カメラシステム、制御方法およびプログラム
JP5192442B2 (ja) * 2009-05-21 2013-05-08 日本電信電話株式会社 映像生成装置、映像生成方法、映像生成プログラムおよびそのプログラムを記録したコンピュータ読み取り可能な記録媒体
JP4547040B1 (ja) 2009-10-27 2010-09-22 パナソニック株式会社 表示画像切替装置及び表示画像切替方法
JP5005080B2 (ja) * 2010-09-06 2012-08-22 キヤノン株式会社 パノラマ画像の生成方法
CN103365589B (zh) * 2011-04-05 2018-03-30 霍尼韦尔国际公司 在触敏显示器上示出视频的方法
CN102118611B (zh) * 2011-04-15 2013-01-02 中国电信股份有限公司 运动目标的数字视频监控方法与系统、数字视频监控平台
CN102331792A (zh) * 2011-06-24 2012-01-25 天津市亚安科技电子有限公司 一种控制云台预置位的方法及系统
JP5895389B2 (ja) * 2011-07-27 2016-03-30 株式会社Jvcケンウッド 画像表示装置、画像表示方法及び画像表示プログラム
JP5806147B2 (ja) * 2012-03-05 2015-11-10 Toa株式会社 カメラ制御装置及びそのコンピュータプログラム
JP5955130B2 (ja) * 2012-06-26 2016-07-20 キヤノン株式会社 カメラ制御装置及びカメラ制御方法
CN103500331B (zh) * 2013-08-30 2017-11-10 北京智谷睿拓技术服务有限公司 提醒方法及装置
CN103501406B (zh) 2013-09-16 2017-04-12 北京智谷睿拓技术服务有限公司 图像采集系统及图像采集方法
EP3038344B1 (en) * 2013-10-07 2019-10-23 Sony Corporation Information processing device, imaging system, method for controlling image processing device, and program
JP6310243B2 (ja) * 2013-12-06 2018-04-11 能美防災株式会社 自動火災報知設備
GB2539387B (en) * 2015-06-09 2021-04-14 Oxford Metrics Plc Motion capture system
US20180262659A1 (en) * 2017-03-13 2018-09-13 Sling Media Pvt Ltd Device mobility in digital video production system
CN111163283A (zh) * 2018-11-07 2020-05-15 浙江宇视科技有限公司 一种监控方法及装置
US11368743B2 (en) * 2019-12-12 2022-06-21 Sling Media Pvt Ltd Telestration capture for a digital video production system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07123390A (ja) * 1993-08-30 1995-05-12 Canon Inc Video conference system and control device for image input device

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4992866A (en) * 1989-06-29 1991-02-12 Morgan Jack B Camera selection and positioning system and method
EP0529317A1 (en) 1991-08-22 1993-03-03 Sensormatic Electronics Corporation Surveillance system with master camera control of slave cameras
US5583565A (en) * 1993-10-20 1996-12-10 Videoconferencing Systems, Inc. Method for automatically adjusting the pan and tilt of a video conferencing system camera
EP0723374A1 (en) 1994-07-01 1996-07-24 HYUGA, Makoto Communication method and apparatus therefor
EP0715453A2 (en) 1994-11-28 1996-06-05 Canon Kabushiki Kaisha Camera controller
US20020067412A1 (en) * 1994-11-28 2002-06-06 Tomoaki Kawai Camera controller
EP0729275A2 (en) 1995-02-24 1996-08-28 Canon Kabushiki Kaisha Image input system
US6597389B2 (en) * 1996-01-30 2003-07-22 Canon Kabushiki Kaisha Camera control system and camera control apparatus
WO1997037494A1 (en) 1996-03-29 1997-10-09 Barry Katz Surveillance system having graphic video integration controller and full motion video switcher
US6359647B1 (en) * 1998-08-07 2002-03-19 Philips Electronics North America Corporation Automated camera handoff system for figure tracking in a multiple camera system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Patent Abstracts of Japan, Ito Miki, "Controller for Video Conference System and Image Input Device", Publication No. 07123390, Publication Date: May 12, 1995, 1 page.

Cited By (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030202102A1 (en) * 2002-03-28 2003-10-30 Minolta Co., Ltd. Monitoring system
US20040179121A1 (en) * 2003-03-12 2004-09-16 Silverstein D. Amnon System and method for displaying captured images according to imaging device position
US7683937B1 (en) * 2003-12-31 2010-03-23 Aol Inc. Presentation of a multimedia experience
US8913143B2 (en) 2003-12-31 2014-12-16 Turner Broadcasting System, Inc. Panoramic experience system and method
US9740371B2 (en) 2003-12-31 2017-08-22 Turner Broadcasting System, Inc. Panoramic experience system and method
US20050225634A1 (en) * 2004-04-05 2005-10-13 Sam Brunetti Closed circuit TV security system
US9124729B2 (en) 2005-01-31 2015-09-01 The Invention Science Fund I, Llc Shared image device synchronization or designation
US9910341B2 (en) 2005-01-31 2018-03-06 The Invention Science Fund I, Llc Shared image device designation
US9489717B2 (en) 2005-01-31 2016-11-08 Invention Science Fund I, Llc Shared image device
US9325781B2 (en) 2005-01-31 2016-04-26 Invention Science Fund I, Llc Audio sharing
US8902320B2 (en) 2005-01-31 2014-12-02 The Invention Science Fund I, Llc Shared image device synchronization or designation
US9082456B2 (en) 2005-01-31 2015-07-14 The Invention Science Fund I Llc Shared image device designation
US9019383B2 (en) 2005-01-31 2015-04-28 The Invention Science Fund I, Llc Shared image devices
US8988537B2 (en) 2005-01-31 2015-03-24 The Invention Science Fund I, Llc Shared image devices
US20060174203A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Viewfinder for shared image device
US10003762B2 (en) 2005-04-26 2018-06-19 Invention Science Fund I, Llc Shared image devices
US9819490B2 (en) 2005-05-04 2017-11-14 Invention Science Fund I, Llc Regional proximity for shared image device(s)
US9451200B2 (en) 2005-06-02 2016-09-20 Invention Science Fund I, Llc Storage access technique for captured data
US9001215B2 (en) 2005-06-02 2015-04-07 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US10097756B2 (en) 2005-06-02 2018-10-09 Invention Science Fund I, Llc Enhanced video/still image correlation
US9967424B2 (en) 2005-06-02 2018-05-08 Invention Science Fund I, Llc Data storage usage protocol
US9621749B2 (en) 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
US9191611B2 (en) 2005-06-02 2015-11-17 Invention Science Fund I, Llc Conditional alteration of a saved image
US9041826B2 (en) 2005-06-02 2015-05-26 The Invention Science Fund I, Llc Capturing selected image objects
US9942511B2 (en) 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US9076208B2 (en) 2006-02-28 2015-07-07 The Invention Science Fund I, Llc Imagery processing
US8964054B2 (en) 2006-08-18 2015-02-24 The Invention Science Fund I, Llc Capturing selected image objects
US11317062B2 (en) 2006-12-04 2022-04-26 Isolynx, Llc Cameras for autonomous picture production
US10701322B2 (en) 2006-12-04 2020-06-30 Isolynx, Llc Cameras for autonomous picture production
US20080129825A1 (en) * 2006-12-04 2008-06-05 Lynx System Developers, Inc. Autonomous Systems And Methods For Still And Moving Picture Production
US9848172B2 (en) * 2006-12-04 2017-12-19 Isolynx, Llc Autonomous systems and methods for still and moving picture production
US20100321473A1 (en) * 2007-10-04 2010-12-23 Samsung Techwin Co., Ltd. Surveillance camera system
US8508595B2 (en) * 2007-10-04 2013-08-13 Samsung Techwin Co., Ltd. Surveillance camera system for controlling cameras using position and orientation of the cameras and position information of a detected object
US20100070172A1 (en) * 2008-09-18 2010-03-18 Ajith Kuttannair Kumar System and method for determining a characteristic of an object adjacent to a route
US8712610B2 (en) 2008-09-18 2014-04-29 General Electric Company System and method for determining a characteristic of an object adjacent to a route
US7772539B2 (en) 2008-10-10 2010-08-10 General Electric Company System and method for determining characteristic information of an object positioned adjacent to a route
US20100090135A1 (en) * 2008-10-10 2010-04-15 Ajith Kuttannair Kumar System and method for determining characteristic information of an object positioned adjacent to a route
US8564667B2 (en) * 2009-08-21 2013-10-22 Empire Technology Development Llc Surveillance system
US20110043628A1 (en) * 2009-08-21 2011-02-24 Hankul University Of Foreign Studies Research and Industry-University Cooperation Foundation Surveillance system
US20110199386A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Overlay feature to provide user assistance in a multi-touch interactive display environment
US20110199516A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US20110199314A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Gestures on a touch-sensitive display
US8570286B2 (en) 2010-02-12 2013-10-29 Honeywell International Inc. Gestures on a touch-sensitive display
US20110199495A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20110199517A1 (en) * 2010-02-12 2011-08-18 Honeywell International Inc. Method of showing video on a touch-sensitive display
US8638371B2 (en) 2010-02-12 2014-01-28 Honeywell International Inc. Method of manipulating assets shown on a touch-sensitive display
US20120075466A1 (en) * 2010-09-29 2012-03-29 Raytheon Company Remote viewing
US8836802B2 (en) 2011-03-21 2014-09-16 Honeywell International Inc. Method of defining camera scan movements using gestures
US9942468B2 (en) 2012-04-02 2018-04-10 Mcmaster University Optimal camera selection in array of monitoring cameras
US9591272B2 (en) 2012-04-02 2017-03-07 Mcmaster University Optimal camera selection in array of monitoring cameras
US20150109452A1 (en) * 2012-05-08 2015-04-23 Panasonic Corporation Display image formation device and display image formation method
US10051244B2 (en) * 2012-05-08 2018-08-14 Panasonic Intellectual Property Management Co., Ltd. Display image formation device and display image formation method
US20140015920A1 (en) * 2012-07-13 2014-01-16 Vivotek Inc. Virtual perspective image synthesizing system and its synthesizing method
CN103546720A (zh) * 2012-07-13 2014-01-29 Vivotek Inc Processing system for synthesizing virtual perspective images and processing method thereof

Also Published As

Publication number Publication date
EP1045580A2 (en) 2000-10-18
CN1271233A (zh) 2000-10-25
EP1045580A3 (en) 2003-12-03
KR20000071677A (ko) 2000-11-25
CN1204739C (zh) 2005-06-01
KR100633465B1 (ko) 2006-10-16
JP4209535B2 (ja) 2009-01-14
TW498689B (en) 2002-08-11
JP2000307928A (ja) 2000-11-02

Similar Documents

Publication Publication Date Title
US6954224B1 (en) Camera control apparatus and method
EP0884909B1 (en) Camera control system
US6697105B1 (en) Camera control system and method
US6677990B1 (en) Control device for image input apparatus
US6760063B1 (en) Camera control apparatus and method
US7551200B2 (en) Camera controller and zoom ratio control method for the camera controller
US20040179121A1 (en) System and method for displaying captured images according to imaging device position
JP2009284452A (ja) Hybrid video camera imaging apparatus and system
US20130314547A1 (en) Controlling apparatus for automatic tracking camera, and automatic tracking camera having the same
KR20060083883A (ko) Camera control apparatus, camera system, electronic conference system, and camera control method
JP4378636B2 (ja) Information processing system, information processing apparatus and method, program, and recording medium
US20040189804A1 (en) Method of selecting targets and generating feedback in object tracking systems
JP3880734B2 (ja) Camera control system
JP2000341574A (ja) Camera apparatus and camera control system
JPH09116886A (ja) Image information communication apparatus
US20050117025A1 (en) Image pickup apparatus, image pickup system, and image pickup method
KR102009988B1 (ko) Image correction method for a lens-distortion-correcting camera system using an ultra-wide-angle camera, and TVI device applying the same
JP2005130390A (ja) Display screen control device and remote monitoring device using the same
JP2992617B2 (ja) Method for determining imaging position of remote-controlled camera and camera remote control system
JPH11331824A (ja) Camera operation control device
JPH04317288A (ja) Monitoring system
JPH08163411A (ja) Camera system
JP3826506B2 (ja) Information display method
JPH1188728A (ja) Information display device for television camera
JP2002077880A (ja) Remote monitoring system

Legal Events

Date Code Title Description
AS Assignment

Owner name: MATSUSHITA ELECTRIC INDUSTRIAL, CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OKADA, SUSUMU;NANMA, EIMEI;NOJIMA, SHINJI;REEL/FRAME:010732/0976

Effective date: 20000404

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
REIN Reinstatement after maintenance fee payment confirmed
FP Lapsed due to failure to pay maintenance fee

Effective date: 20131011

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PMFG); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

PRDP Patent reinstated due to the acceptance of a late maintenance fee

Effective date: 20140530

AS Assignment

Owner name: PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:032970/0261

Effective date: 20140527

FPAY Fee payment

Year of fee payment: 8

STCF Information on status: patent grant

Free format text: PATENTED CASE

SULP Surcharge for late payment
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12