WO2005076620A1 - Detection Area Adjustment Device - Google Patents
Detection Area Adjustment Device
- Publication number
- WO2005076620A1 (PCT/JP2005/000544)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- detection area
- detection
- area
- terminal
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19606—Discriminating between target movement or movement in an area of interest and other non-significative movements, e.g. target movements induced by camera shake or movements of pets, falling leaves, rotating fan
-
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
-
- G08B13/19654—Details concerning communication with a camera
- G08B13/19656—Network used to communicate with a camera, e.g. WAN, LAN, Internet
-
- G08B13/19697—Arrangements wherein non-video detectors generate an alarm themselves
-
- G08B29/00—Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
- G08B29/02—Monitoring continuously signalling or alarm systems
- G08B29/04—Monitoring of the detection circuits
-
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
Definitions
- the present invention relates to a detection region adjustment device that adjusts a region to be photographed using a plurality of cameras, and particularly relates to a device suitable for a monitoring system using a plurality of cameras.
- Patent Documents 1 and 2 disclose such typical conventional devices using a plurality of cameras.
- FIG. 1 shows an apparatus for automatically adjusting a shooting area of a camera described in Patent Document 1.
- in the detection camera device 9010, the camera 9011 and the reflecting mirror 9012 photograph a wide imaging area, the moving object extraction unit 9013 extracts the detection target from the captured image, and the position information extraction unit 9014 extracts the position of the detection target.
- in this way, the detection camera device 9010 acquires position information of the detection target over a wide detection area.
- the camera control unit 9022 controls the turning angle, the depression angle, and the zoom ratio of the camera 9021 based on the position information of the detection target, and the judgment camera device 9020 shoots an enlarged image of the detection target.
- the camera device 9020 acquires detailed information of the detection target.
- FIG. 2 is a diagram showing detection areas of the detection camera device 9010 and the determination camera device 9020.
- a black circle indicates the installation position of the detection camera device 9010, and the detection camera device 9010 is a fixed camera.
- the circle or hexagon indicates the detection area of each detection camera device 9010.
- if each detection camera device 9010 is installed by a human in a regular arrangement in this way, the entire detection target area to be monitored can always be detected without blind spots.
- FIG. 3 shows an apparatus for automatically adjusting a shooting area of a camera described in Patent Document 2.
- a moving object detection camera 9211, whose purpose is to photograph the detection target over a wide photographing region, changes its own photographing region by means of a posture control means 9212, and a monitoring camera 9221, whose purpose is to photograph an enlarged image of the detection target, changes its own photographing region by means of a posture control means 9222.
- the shooting area of each camera is determined based on the position of the detection target extracted by the moving object detection camera 9211 and on information stored in advance in the camera angle-of-view storage means 9231 and 9232.
- FIGS. 4 to 6 are diagrams used to explain the method of determining the shooting region of each camera; they show an image taken by the moving object detection camera 9211 divided into several block images.
- the shooting area of the moving object detection camera 9211 is determined as follows: when the detection target is in the hatched block in FIG. 4, the posture of the moving object detection camera 9211 is changed so that the camera moves in the direction of the arrow written in the corresponding block in FIG. 5, thereby changing the shooting area of the camera.
- the photographing area of the moving object detection camera 9211 corresponding to each block position is determined by a human in advance, and the information is set in the camera view angle storage unit 9231 in advance.
- the shooting area of the monitoring camera 9221 is determined as follows.
- the posture of the monitoring camera 9221 is changed so that the shooting area shown by the broken line is obtained, and the shooting area of the camera is changed.
- the shooting area of the monitoring camera 9221 corresponding to each block position is determined by a human in advance, and the information is set in the camera view angle storage unit 9232 in advance.
- each camera detects the detection target, or acquires detailed information about it, based on the image captured within its shooting range. The shooting range of each camera is therefore identical to the range over which that camera can detect the target and acquire information. Hereinafter, the shooting range of a camera is referred to as the detection range of the camera.
- in the conventional device shown in Patent Document 1, the detection camera device 9010 detects the detection target over a wide detection area, and in the conventional device shown in Patent Document 2, the moving object detection camera 9211 plays this role.
- likewise, the judgment camera device 9020 in Patent Document 1 and the surveillance camera 9221 in Patent Document 2 play the role of acquiring detailed information about the detected target, such as an enlarged image of it.
- in both devices, each camera is assigned a fixed, predetermined role: one camera monitors the entire detection target area, and the other camera acquires detailed information.
- the detection area of the moving object detection camera 9211 is changed in response to a situation change in which the detection target comes to be located in the upper-left block.
- in other words, the detection area of each camera is determined and adjusted based on table-format information, created in advance by a human, that describes the detection area corresponding to each assumed situation change on a one-to-one basis.
- since the detection area of each camera is determined and adjusted from such table-format information, a human must, for every camera, anticipate each possible situation change and describe the corresponding detection area on a one-to-one basis. Because this information depends on the position and size of the detection target area, the situation changes the human anticipates, and the positions and number of the installed cameras, the information must be recreated by hand whenever any of these change. This work becomes more complicated as the number of cameras increases, and its cost and burden become enormous; in a surveillance system using cameras in a building, using more than ten cameras is very common.
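The table-driven scheme criticized here can be made concrete with a short sketch. All block names, camera names, and pose values below are invented for illustration; they only show how a human-authored one-to-one table couples every assumed situation to a fixed pose for every camera.

```python
# One entry per (block, camera): pan/tilt/zoom chosen by a human in advance.
# Every name and number here is a hypothetical illustration.
POSE_TABLE = {
    ("block_upper_left", "detection_camera"):  {"pan": -30, "tilt": -10, "zoom": 1.0},
    ("block_upper_left", "monitoring_camera"): {"pan": -25, "tilt": -15, "zoom": 3.0},
    ("block_center", "detection_camera"):      {"pan": 0, "tilt": 0, "zoom": 1.0},
    ("block_center", "monitoring_camera"):     {"pan": 0, "tilt": -5, "zoom": 3.0},
}

def poses_for_target(block):
    """Return the pre-stored pose of every camera for the block holding the target."""
    return {cam: pose for (b, cam), pose in POSE_TABLE.items() if b == block}

# Adding a camera, or changing the monitored area, forces a human to rewrite
# the whole table by hand; this is the maintenance burden noted above.
poses = poses_for_target("block_center")
```

The table grows with (number of blocks) x (number of cameras), which is why the text calls the authoring cost enormous for ten or more cameras.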
- Patent Document 1 Japanese Patent No. 3043925
- Patent Document 2 Japanese Patent No. 3180730
- the present invention solves the above conventional problems; its object is to provide a detection region adjustment device that requires no human to predict situation changes and create tables in advance, and that can keep photographing the entire detection target area without fail even if a camera breaks down.
- a detection area adjustment device comprises a plurality of camera terminals connected by a communication path, and adjusts a detection area which is an imaging area of the plurality of camera terminals.
- a plurality of camera terminals each of which captures a detection area included in the detection target area and changes a position of the detection area.
- a communication unit that transmits detection area information specifying the detection area to another camera terminal via the communication path, and receives detection area information from the other camera terminal.
- and an adjusting means for controlling the camera of the camera terminal to adjust the position of its detection area so that, based on the detection area information of the camera terminal and the detection area information of the other camera terminals received by the communication unit, the area obtained by summing the detection areas of the plurality of camera terminals covers the entire detection target area.
- the adjusting means may adjust the position of the detection area so that no non-detection area, belonging to none of the detection areas of the plurality of camera terminals, occurs in the peripheral region in contact with the detection area of the camera terminal.
- since each camera terminal has the same function and operates autonomously and cooperatively while communicating with the other camera terminals whose detection areas adjoin its own, no human needs to predict situation changes and create tables in advance, and even if a camera breaks down, the entire detection target area is photographed without blind spots.
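The autonomous cooperative behavior described above can be illustrated with a simplified one-dimensional sketch (not the patent's exact procedure): each terminal repeatedly nudges its detection interval so that the overlap with each neighbor, or with the boundary of the detection target area, approaches a small positive constant. All coordinates and constants are invented examples.

```python
TARGET = (0.0, 10.0)   # detection target area on the X axis
C = 0.5                # desired overlap width with each neighbour
GAIN = 0.3             # step size of one adjustment round

def adjust_round(intervals):
    """One cooperative round: every terminal shifts toward equal overlap C."""
    intervals = sorted(intervals)
    adjusted = []
    for i, (lo, hi) in enumerate(intervals):
        # right edge of the left neighbour (or a virtual edge at the target
        # boundary, so the leftmost terminal is pulled to cover TARGET[0])
        left_edge = intervals[i - 1][1] if i > 0 else TARGET[0] + C
        # left edge of the right neighbour (or the virtual right boundary)
        right_edge = intervals[i + 1][0] if i + 1 < len(intervals) else TARGET[1] - C
        # positive error: too much overlap on the left, so shift right, and vice versa
        error = ((left_edge - lo) - C) - ((hi - right_edge) - C)
        shift = GAIN * error / 2
        adjusted.append((lo + shift, hi + shift))
    return adjusted

views = [(0.0, 4.0), (2.0, 6.0), (5.5, 9.5)]   # initial detection areas
for _ in range(200):
    views = adjust_round(views)
# after many rounds the union of the intervals covers TARGET without gaps
```

No table is consulted: each terminal uses only its own interval and its neighbors' communicated intervals, so removing or adding a terminal simply changes which neighbors each one reacts to.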
- the camera has means for changing the spatial resolution of the detection area
- the detection area information includes information for specifying the spatial resolution of the detection area
- it is further preferable that, based on the detection area information of the camera terminal and the detection area information of the other camera terminals received by the communication means, the adjustment means controls the camera of the camera terminal to adjust the position and the spatial resolution of its detection region so that the area obtained by summing the detection areas of the plurality of camera terminals covers the entire detection target area and the spatial resolutions of the detection regions of the plurality of camera terminals become substantially equal.
- in addition to the camera orientation (pan, tilt, etc.), each camera terminal also controls its zoom ratio, so that both the position and the size of the detection area are adjusted, and the entire detection target area is imaged throughout with uniform spatial resolution.
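The resolution-levelling preference can likewise be sketched in one dimension. Here spatial resolution is taken to be inversely proportional to the width of a detection area (a fixed pixel count spread over the area), and each terminal pulls its width toward the average of its neighbors' widths; the widths and gain are illustrative values, not the patent's equations.

```python
GAIN = 0.5   # adjustment step per round

def level_resolution(widths, rounds=100):
    """Pull each detection-area width toward its neighbours' average width."""
    for _ in range(rounds):
        widths = [
            w + GAIN * ((widths[max(i - 1, 0)] + widths[min(i + 1, len(widths) - 1)]) / 2 - w)
            for i, w in enumerate(widths)
        ]
    return widths

# three terminals with unequal zoom settings, hence unequal widths/resolutions
widths = level_resolution([2.0, 6.0, 4.0])
# the widths converge toward a common value, i.e. a uniform spatial resolution
```

This is a plain averaging consensus: repeated local averaging drives all widths to a common value without any terminal knowing the global state.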
- the present invention is not limited to a distributed control configuration in which each camera terminal is provided with an adjustment unit; it can also be implemented as a centralized control configuration in which a common adjustment unit adjusts the detection areas of all camera terminals.
- the present invention can also be implemented as a detection area adjustment method and a program that causes a computer to execute the method.
- the present invention can also be realized as a device that adjusts the detection area of a sensor capable of detecting some other physical quantity, such as a microphone, instead of a detection area photographed by a camera.
- the program according to the present invention can be distributed via a recording medium such as a CD-ROM or a transmission medium such as the Internet.
- the camera detection region of each camera terminal is automatically adjusted so that the sum of the detection regions of the cameras of all camera terminals covers the entire predetermined detection target region. Therefore, humans no longer need to create detection area information corresponding to each situation change for each camera as in the past, and even if some of the cameras fail, the predetermined detection target area is efficiently covered without blind spots.
- the present invention guarantees that an arbitrary space can be detected without blind spots, and has high practical value particularly as a system for monitoring a suspicious individual in a school or a building.
- FIG. 1 is a configuration block diagram according to a conventional technique 1.
- FIG. 2 is an explanatory diagram showing a camera field-of-view range according to Conventional Technique 1.
- FIG. 3 is a configuration block diagram according to the conventional technique 2.
- FIG. 4 is an explanatory diagram of an operation in the conventional technique 2.
- FIG. 5 is an explanatory diagram of an operation in the conventional technique 2.
- FIG. 6 is an explanatory diagram of an operation in the conventional technique 2.
- FIG. 7 is an explanatory diagram of an operation in the conventional technique 2.
- FIG. 8 is a configuration block diagram of a detection area adjustment device according to Embodiment 1 of the present invention.
- FIG. 9 is a diagram showing an example of a detection area and an overlapping area.
- FIG. 10 is a configuration block diagram of a camera P.
- FIG. 11 is a flowchart showing a process performed by an adjustment unit.
- FIG. 12 is an explanatory diagram showing adjacent detection regions.
- FIG. 13 is an explanatory diagram showing a function U ().
- FIG. 14 is an explanatory diagram showing another function U ().
- FIG. 15 is a block diagram showing a configuration of a centralized control type detection region adjustment device in which one adjustment unit adjusts all detection regions.
- FIG. 16 is a diagram illustrating an example in which the present invention is applied to a microphone.
- FIG. 17 is a diagram showing a method of applying the processing on the X axis also on the Y axis.
- FIG. 18 is a configuration block diagram of a detection area adjustment device according to Embodiment 2 of the present invention.
- FIG. 19 is a configuration block diagram of a camera PR.
- FIG. 20 is a flowchart showing a process performed by the adjustment unit.
- FIG. 21 is an explanatory diagram for detection area position calculation in Supplementary Explanation 1 of the present invention.
- FIG. 22 is a configuration block diagram of a detection area adjustment device in Supplementary Explanation 1 of the present invention.
- FIG. 23 is a configuration block diagram of a camera in Supplementary Explanation 1 of the present invention.
- FIG. 24 is a diagram showing a method of determining a region adjacent to a detection region in Supplementary Explanation 2 of the present invention.
- FIG. 25 is a diagram showing a method of determining an area adjacent to a detection area in Supplementary Explanation 2 of the present invention.
- FIG. 26 is a configuration block diagram of a detection area adjusting device in Supplementary Explanation 3 of the present invention.
- FIG. 27 is a flowchart showing a process performed by the adjustment unit in Supplementary Explanation 4 of the present invention.
- FIG. 28 is a flowchart showing a process performed by the adjustment unit in supplementary explanation 4 of the present invention.
- FIG. 29 is a configuration block diagram of a detection area adjusting device in Supplementary Explanation 4 of the present invention.
- FIG. 30 is a configuration block diagram of another detection area adjusting device in Supplementary Explanation 4 of the present invention.
- FIG. 31 is a diagram showing an example of a composite image.
- FIG. 32 is a block diagram showing a configuration of a monitoring system including mobile camera terminals.
- FIG. 33 is a diagram showing an operation state of the mobile camera terminal in the monitoring system.
- FIG. 34 is a diagram showing how a mobile camera terminal moves on a track of a rail installed in a monitoring area.
- Embodiment 1 of the present invention will be described.
- a detection area adjusting device in which the camera detection area of each camera terminal is automatically adjusted, so that the area obtained by summing the detection areas of the cameras of all camera terminals completely covers the predetermined detection target area, will be described with reference to the drawings.
- FIG. 8 is a configuration block diagram of a detection area adjustment device according to Embodiment 1 of the present invention.
- This detection area adjustment device includes a plurality of camera terminals P110A-C connected via a network 112.
- the plurality of camera terminals P110A-C are camera terminals that operate autonomously and cooperatively while communicating with each other, and include the same components (camera P101, adjustment unit 102, and communication unit 103).
- Camera P101 is a camera with a variable detection area position, which is the position of the detection area of the camera.
- Adjustment unit 102 is a processing unit that adjusts the detection area position of camera P101.
- Communication unit 103 is a processing unit that communicates the detection area position information of camera P101.
- Adjustment section 102 controls the camera P101 of its own camera terminal to adjust the position of its detection area so that, based on the detection area position information of its own camera terminal and the detection area position information of the other camera terminals received by communication section 103, the area obtained by summing the detection areas of the plurality of camera terminals P110A-C covers the entire detection target area. More specifically, the adjustment unit 102 adjusts the position of the detection area of its own camera terminal so that no non-detection area, belonging to none of the detection areas of the plurality of camera terminals P110A-C, occurs in the peripheral area in contact with the detection area of its own camera terminal.
- the operation terminal 111L and the operation terminal 111R are terminals that acquire an instruction from the user and notify the camera terminals P110A-C of the instruction, and each includes a communication unit 103 that communicates detection target position information.
- the network 112 is a network line used for communication via the communication unit 103 in each of the camera terminals P110A-110C, the operation terminal 111L, and the operation terminal 111R.
- Each camera terminal P110A-110C communicates the detection area position information of its camera P101 through the network 112, and the operation terminal 111L and the operation terminal 111R communicate the detection target area to each camera terminal P110A-110C through the same network 112.
- the X-axis 120 and the axis 122 are defined in real space.
- the real space plane 125 is the plane in real space on which each camera terminal P110A-110C performs detection; for example, when each of the camera terminals P110A-110C is installed facing downward from the ceiling, it is a surface such as the floor. In the first embodiment, this plane coincides with the X-axis 120.
- the detection target area 130 is the entire area to be detected in the present invention, and its position is represented by its left and right end coordinates on the X-axis.
- the non-detection target area 131 is an area that is not targeted for detection.
- the broken line emitted from each camera P101 indicates the end of the detection area of each camera P101.
- the detection area 140A is the detection area of the camera terminal P110A, and its position is represented by its left and right end coordinates on the X-axis.
- the detection area 140B is the detection area of the camera terminal P110B, and its position is likewise represented by its left and right end coordinates on the X-axis.
- the detection area 140C is the detection area of the camera terminal P110C.
- the overlapping area 141TA is the area where the detection area 140A of the camera terminal P110A and the non-detection target area 131 overlap, and the amount indicating its size is the width of that overlap on the X-axis.
- the overlapping area 141AB is the area where the detection area 140A of the camera terminal P110A and the detection area 140B of the camera terminal P110B overlap, and the amount indicating its size is the width of that overlap on the X-axis.
- the overlapping area 141BC is the area where the detection area 140B of the camera terminal P110B and the detection area 140C of the camera terminal P110C overlap, and the amount indicating its size is the width of that overlap on the X-axis.
- the overlapping area 141CT is the area where the detection area 140C of the camera terminal P110C and the non-detection target area 131 overlap, and the amount indicating its size is the width of that overlap on the X-axis.
- for simplicity of explanation, the detection areas and the overlapping areas are represented on the one-dimensional X-axis 120.
- FIG. 9(a) is a diagram showing the detection areas of the present embodiment in two dimensions,
- FIG. 9(b) shows an example of the detection areas (squares) and the overlapping areas (shaded) when the detection areas are rectangular, and
- FIG. 9(c) shows an example of the detection areas (circles) and the overlapping areas (shaded) when the detection areas are circular.
- the detection area adjustment apparatus of the present embodiment is applicable not only when the detection areas lie on a plane but also when they are three-dimensional; for simplicity, however, the following explanation is limited to the X-axis.
- FIG. 10A is a diagram showing an internal configuration of the camera P101.
- the camera P101 includes a lens 201, an imaging surface 202, an image processing unit 203, and a posture control unit 204.
- the lens 201 is a lens for forming an image
- the imaging surface 202 is an element such as a CCD for capturing an image formed by the lens 201
- the image processing unit 203 is a processing unit that processes the image captured on the imaging surface 202
- the posture control unit 204 is a processing unit that controls the posture of the lens 201 and the imaging surface 202, and the distance between the lens 201 and the imaging surface 202.
- the control of the posture of the lens 201 and the imaging surface 202 performed by the posture control unit 204 is what is generally called pan or tilt: the lens 201 and the imaging surface 202 are rotated together about a point or an axis. The control of the distance between the lens 201 and the imaging surface 202 performed by the posture control unit 204 is what is generally called zoom: the interval between the lens 201 and the imaging surface 202 increases or decreases.
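Under a simple pinhole model, the relation between these controls and the detection area can be made explicit. The following sketch (flat floor, one pan axis; the height, pan angle, sensor size, and focal length are made-up values) shows how panning shifts the detection interval on the X-axis while zooming, i.e. a longer lens-to-sensor distance, narrows it.

```python
import math

def detection_interval(height, pan_rad, sensor_half_width, focal_length):
    """Interval seen on the floor by a downward ceiling camera panned by pan_rad.

    A longer focal length (zooming in) gives a narrower field of view and a
    smaller, higher-resolution detection area.
    """
    half_fov = math.atan(sensor_half_width / focal_length)
    return (height * math.tan(pan_rad - half_fov),
            height * math.tan(pan_rad + half_fov))

wide = detection_interval(height=2.5, pan_rad=0.3, sensor_half_width=0.018, focal_length=0.05)
tele = detection_interval(height=2.5, pan_rad=0.3, sensor_half_width=0.018, focal_length=0.10)
# panning shifts the interval along the X axis; zooming in narrows it
```

This is the geometric link the adjustment unit exploits: commanding a pose and focal length fully determines the detection area position it reports back.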
- the camera P101 has the internal configuration shown in FIG. 10(a).
- the image formed by the lens 201 shown in FIG. 10A is converted into an image signal on the imaging surface 202, and the image processing unit 203 performs general image processing and image recognition on that image signal, detecting the detection target and extracting information about it.
- the camera P101 performs its detection operation, i.e. detection of the detection target and information extraction, using as its detection area the imaging range in real space determined by the posture of the lens 201 and the imaging surface 202 and the distance between them.
- the detected information of the detection target is sent to the adjustment unit 102 in FIG.
- widely known examples of such general image processing and image recognition techniques include the background subtraction method and the frame difference method.
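For illustration, both techniques reduce to thresholding a per-pixel difference; the toy frames below are plain Python lists standing in for grayscale images, and the threshold is an arbitrary example value (a real system would operate on camera images, often with a learned background model).

```python
THRESH = 25   # arbitrary example threshold on the grayscale difference

def background_subtraction(frame, background):
    """Mark pixels that differ from a reference background image."""
    return [abs(p - b) > THRESH for p, b in zip(frame, background)]

def frame_difference(frame, prev_frame):
    """Mark pixels that changed since the previous frame (moving objects)."""
    return [abs(p - q) > THRESH for p, q in zip(frame, prev_frame)]

background = [10, 10, 10, 10]
frame = [10, 200, 10, 10]        # an object appears at pixel index 1
mask = background_subtraction(frame, background)
# mask flags pixel 1 as belonging to the detection target
```

Background subtraction finds anything absent from the reference scene, while frame differencing finds only what is currently moving; surveillance systems commonly combine both.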
- the posture control unit 204 shown in FIG. 10A controls the posture of the lens 201 and the imaging surface 202, or the distance between them, so that the position of the detection area of the camera P101 is adjusted to the detection area position indicated by the adjustment unit 102.
- the posture control unit 204 also sends to the adjustment unit 102 the current position information of the detection area of the camera P101, determined from the current posture of, and spacing between, the lens 201 and the imaging surface 202.
- the position of the detection area of the camera P101 is controlled by the adjustment unit 102, and the current position information of the detection area of the camera P101 is sent to the adjustment unit 102.
- the posture and spacing of the lens 201 and the imaging surface 202 can be changed using, for example, stepping motors, from which the current posture and spacing can also be read.
- the adjustment unit 102 periodically transmits the detection area position information of the camera P101, sent from the camera P101, to the adjustment units 102 of the other camera terminals P via the communication unit 103 and the network 112. The adjustment unit 102 likewise receives the detection area position information of the cameras P101 of the other camera terminals P, periodically transmitted from their adjustment units 102. Further, in the operation terminal 111L and the operation terminal 111R, the communication unit 103 periodically transmits the position information of the detection target area 130 to the adjustment unit 102 of each of the camera terminals P110A-110C via the network 112.
- in this way, each adjustment unit 102 periodically updates the detection area position information of the camera P101 of its own camera terminal and of the other camera terminals P, as well as the position information of the detection target area.
- each adjustment unit 102 thereby holds the end coordinates of the detection area 140A of the camera terminal P110A and, likewise, those of the detection areas of the camera terminals P110B and P110C.
- the adjustment unit 102 does not generate a non-detection area that does not belong to any of the detection areas of the plurality of camera terminals P110A-C in a peripheral area in contact with the detection area of the self-powered camera terminal.
- first, from the information indicating the detection area positions of the cameras P101 of its own camera terminal P and the other camera terminals P, the adjustment unit 102 selects the detection area of another camera terminal P, or the non-detection target area, that is adjacent to the detection area of its own camera terminal P (step 301). An example of this selection method will be described below with reference to FIG. 12.
- FIG. 12 (a) shows the positions on the X-axis when each detection area is rectangular as shown in FIG. 12 (b).
- first, the center position of the detection area of its own camera terminal P, the center positions of the detection areas of the other camera terminals P, and the center positions of the non-detection target areas are calculated.
- the detection area of another camera terminal, or the non-detection target area, whose center position is numerically smaller than the center position of the detection area of its own camera terminal P is selected as the adjacent detection area on the left, and the one whose center position is numerically larger is selected as the adjacent detection area on the right.
- when there are several candidates, the area whose center position is closest to the center position of the detection area of its own camera terminal P is selected. For this reason, the camera terminal P110A selects the non-detection target area 131 as its left neighbor and the detection area 140B as its right neighbor; the camera terminal P110B selects the detection area 140A as its left neighbor and the detection area 140C as its right neighbor; and the camera terminal P110C selects the detection area 140B as its left neighbor and the non-detection target area 131 as its right neighbor. It should be noted that there are several methods of selecting an adjacent area other than the one described above. Other methods are described in Supplementary Explanation 2 below.
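- the neighbor-selection rule of step 301 can be sketched as follows. This is a minimal one-dimensional illustration under assumed names and interval values, not the patent's implementation: each area is treated as an (x_left, x_right) interval and compared by center position.

```python
def select_neighbors(own, others):
    """Select the left and right neighbors of the interval `own`.

    `others` holds the detection areas of the other camera terminals and
    the non-detection target areas.  An area whose center position is
    numerically smaller than that of `own` is a left candidate, one whose
    center is larger is a right candidate; among the candidates, the one
    whose center is closest to the center of `own` is chosen.
    """
    center = lambda a: (a[0] + a[1]) / 2.0
    c_own = center(own)
    left = [a for a in others if center(a) < c_own]
    right = [a for a in others if center(a) > c_own]
    closeness = lambda a: abs(center(a) - c_own)
    return (min(left, key=closeness) if left else None,
            min(right, key=closeness) if right else None)

# Terminal P110A's area flanked by a non-detection target area on the
# left and terminal P110B's area on the right (illustrative numbers).
area_a, area_b, area_c = (-5.0, 25.0), (15.0, 45.0), (35.0, 65.0)
non_left, non_right = (-20.0, 0.0), (60.0, 80.0)
left_nb, right_nb = select_neighbors(area_a,
                                     [area_b, area_c, non_left, non_right])
```

With these numbers, terminal P110A picks the non-detection target area as its left neighbor and terminal P110B's area as its right neighbor, matching the selection described above.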
- next, an amount indicating the size of the overlapping area, which is the area where the detection area selected in step 301 and the detection area of its own camera terminal P overlap, is calculated (step 302).
- this amount can easily be calculated from the magnitude relationship between the position of the selected area and the position of the detection area of the camera terminal P itself, as shown in FIG. 12. Thus, the camera terminal P110A calculates the amount X_TL - X_AL indicating the size of the overlapping area 141TA, which is its overlapping area on the left, and the amount X_AR - X_BL indicating the size of the overlapping area 141AB, which is its overlapping area on the right; the camera terminal P110B calculates the corresponding amounts for the overlapping areas 141AB and 141BC adjacent to it; and the camera terminal P110C calculates the amount X_BR - X_CL for the overlapping area 141BC on its left and the amount X_CR - X_TR indicating the size of the overlapping area 141CT, which is its overlapping area on the right.
- next, a function UP() is defined as an amount indicating the difference between the amount indicating the size of the overlapping area and a constant amount C that is equal to or greater than 0.
- this function is defined for each of the camera terminals P110A to 110C as shown in Equations 1 to 3 below.
- Equations 1 to 3 above are for the camera terminals P110A to 110C, respectively, and use the square value of the difference between the amount indicating the size of the overlapping area and the constant amount C as the amount indicating that difference.
- next, in Equations 4 to 6 below, the next detection area position of its own camera terminal P is calculated using the generally known steepest descent method.
- in Equations 4 to 6, X'_AL, X'_AR, X'_BL, X'_BR, X'_CL, and X'_CR indicate the next detection area positions of the camera terminals P110A-110C, and α is a constant. The detection area position of each camera P101 is then adjusted to the calculated next detection area position (step 303).
- here, X_AL and X_AR must each be independently adjustable, and the same holds for X_BL and X_BR, and for X_CL and X_CR. The functions UP() are therefore decomposed as follows:
- UP_A(X_AL, X_AR) = UP_AL(X_AL) + UP_AR(X_AR)
- UP_B(X_BL, X_BR) = UP_BL(X_BL) + UP_BR(X_BR)
- UP_C(X_CL, X_CR) = UP_CL(X_CL) + UP_CR(X_CR)
- the adjustment unit 102 sequentially performs the processing of step 301, step 302, and step 303, and returns to the processing of step 301 after the processing of step 303 is completed. The adjustment unit 102 thus adjusts the detection area of the camera P101 while constantly repeating the processing from step 301 to step 303.
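- the repeated loop of steps 301 to 303 can be sketched in one dimension as below. The geometry, the step size ALPHA, and the overlap constant C are illustrative assumptions; UP() is taken as the squared difference between each overlap amount and C, so one steepest-descent update moves each edge of a detection area toward an overlap of exactly C with its neighbor.

```python
ALPHA = 0.05  # steepest-descent step-size constant (hypothetical)
C = 5.0       # target overlap amount, C >= 0 (hypothetical)

def next_position(own, left_nb, right_nb):
    """One steepest-descent update (step 303) of `own` = (x_l, x_r).

    left_nb / right_nb are the areas chosen in step 301; UP() is the
    squared difference between each overlap amount and C, so the
    gradient step drives both overlaps toward C.
    """
    x_l, x_r = own
    overlap_l = left_nb[1] - x_l    # overlap with the left neighbor
    overlap_r = x_r - right_nb[0]   # overlap with the right neighbor
    x_l = x_l + 2.0 * ALPHA * (overlap_l - C)
    x_r = x_r - 2.0 * ALPHA * (overlap_r - C)
    return (x_l, x_r)

# Detection target area [0, 60], flanked by non-detection target areas.
areas = {"A": (0.0, 20.0), "B": (20.0, 40.0), "C": (40.0, 60.0)}
non_l, non_r = (-100.0, 0.0), (60.0, 200.0)
for _ in range(500):  # repeat steps 301-303
    a, b, c = areas["A"], areas["B"], areas["C"]
    areas = {"A": next_position(a, non_l, b),
             "B": next_position(b, a, c),
             "C": next_position(c, b, non_r)}
# After enough iterations every overlap approaches C, so the three
# areas cover the detection target area without gaps.
```

The synchronous update (all terminals read the previous positions before moving) mirrors the terminals exchanging position information over the network each cycle.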
- the operation of the detection area adjusting apparatus according to Embodiment 1 of the present invention is as described above.
- in step 303, the next detection area position of its own camera terminal P is calculated using the steepest descent method so that the amount indicating the size of the overlapping area approaches the constant amount C, which is equal to or greater than 0, and the detection area position of each camera terminal P110A-110C is adjusted by steering each camera P101 to the calculated position.
- as a result, by repeating the processing from step 301 to step 303, the detection area 140A, the detection area 140B, the detection area 140C, and the non-detection target area come to overlap one another by the constant amount C, which is equal to or greater than 0.
- since the detection areas of the camera terminals P110A-110C, including their overlaps with the non-detection target area, overlap by the constant amount C (equal to or greater than 0), the detection target area 130 is contained in the union of the detection areas of the camera terminals P110A-110C. The detection area adjustment device of the present invention can therefore detect the detection target area 130 without blind spots by using the camera terminals P110A-110C.
- as described above, the adjustment unit 102 has the effect of enabling the detection target area 130 to be detected without blind spots.
- moreover, the processing of steps 302 and 303 in this repeatedly performed procedure is carried out with respect to the detection area of the other camera terminal P, adjacent to the detection area of its own camera terminal P, that was selected in step 301. Therefore, the detection area adjustment device of the present invention can detect the detection target area 130 without blind spots using each camera terminal P while following changes in the detection area positions or the detection target area position.
- in the present embodiment, the function UP(), which indicates the difference between the amount indicating the size of the overlapping area and the constant amount C (equal to or greater than 0), was defined using the square value of that difference, as shown in Equations 1 to 3 and 7 to 9 above. However, as shown in FIG. 13, the function UP() may instead be any even power of the difference, such as the fourth, sixth, or tenth power, or the absolute value of the difference. Since each of these functions has its minimum value where the amount indicating the size of the overlapping area equals C, the steepest descent method performed in step 303 still causes that amount to approach the constant amount C, and it goes without saying that the same effect can be obtained. More generally, any function UP() indicating the difference between the amount indicating the size of the overlapping area and the constant amount C may be used, as long as it has its minimum value where that amount equals C within the range over which the detection area position can be changed.
- in the present embodiment, the adjustment units 102 are provided in a distributed manner in the camera terminals P110A-110C; however, as in the detection area adjustment device of the centralized configuration shown in FIG., there may be only a single adjustment unit 102 that adjusts all the detection area positions of the cameras P101 of the camera terminals P110A-110C, and it goes without saying that a similar effect can be obtained.
- although the present embodiment has treated the camera P101 as a general camera, the camera P101 may be a camera that detects visible light or invisible light such as infrared or ultraviolet light.
- furthermore, the same effect can be obtained with a general sensor that has a detection area for detecting various physical quantities, such as a fine-movement sensor, a pressure sensor, a temperature sensor, or an air pressure sensor, and whose detection area position is variable.
- for example, a microphone having the directional characteristic shown in Fig. 16 (a) may be used; as shown in Fig. 16 (b), the direction (region) in which sound can be detected with a sensitivity above a certain level is treated as its sensing region (detection region).
- if the position of this detection region is controlled in the same manner as the pan and tilt of the camera according to the present embodiment, a detection area adjustment device composed of a plurality of microphones can be constructed in place of, or in addition to, the cameras according to the present embodiment. That is, the present invention is not limited to cameras and can also be applied to the various sensors described above.
- although the present embodiment has treated the network 112 as a network line used for general communication, it goes without saying that the same effect can be obtained whether the network 112 is wired or wireless.
- in the present embodiment, processing on the X-axis has mainly been described; by applying the same processing on the Y-axis as well, the occurrence of blind-spot areas in a plane can be avoided. As shown in FIG. 17, the processing on the X-axis according to the present embodiment (processing for bringing the size of the overlapping area to the constant amount C, which is equal to or greater than 0) can be applied to the Y-axis in the same way.
- the present invention relates to a detection region adjustment device capable of acquiring a photographed image of a subject.
- FIG. 18 is a configuration block diagram of a detection area adjustment device according to Embodiment 2 of the present invention.
- This detection area adjustment device includes a plurality of camera terminals PR1110A-C connected via a network 1112, and two operation terminals 1111L and 1111R.
- Plurality of camera terminals PR1110A-C are camera terminals that operate autonomously and cooperatively while communicating with each other, and include the same components (camera PR1101, adjustment unit 1102, and communication unit 1103).
- Camera PR1101 is a camera whose detection area position, which is the position of its detection area, is variable, and whose detection area spatial resolution, which is the spatial resolution of its detection area, is also variable (for example, zoom control is possible).
- the adjustment unit 1102 is a processing unit that adjusts the detection area position and the detection area spatial resolution of the camera PR1101, and the communication unit 1103 is a processing unit that communicates the detection area position information and the detection area spatial resolution information of the camera PR1101.
- the adjustment unit 1102, based on the detection area position information and detection area spatial resolution information of its own camera terminal and those of the other camera terminals received by the communication unit 1103, controls the camera of its own camera terminal so as to adjust the position and spatial resolution of its detection area, so that the area obtained by adding together the detection areas of the plurality of camera terminals PR1110A-C covers the entire detection target area, and so that the spatial resolutions of the detection areas of the plurality of camera terminals PR1110A-C become substantially the same. More specifically, the adjustment unit 1102 adjusts the position and spatial resolution of the detection area of its own camera terminal so that no non-detection area, belonging to none of the detection areas of the plurality of camera terminals PR1110A-C, is generated in the peripheral area in contact with the detection area of its own camera terminal, and so that the spatial resolution of the detection area of its own camera terminal and the spatial resolutions of the detection areas of the other camera terminals adjacent to it become substantially the same.
- the operation terminal 1111L and the operation terminal 1111R are terminals that acquire instructions from the user and notify the camera terminals PR1110A-C of those instructions, and each includes a communication unit 1103 that communicates the position information of the detection target area.
- the network 1112 is a network line used for communication via the communication unit 1103 in each of the camera terminals PR1110A-1110C, the operation terminal 1111L, and the operation terminal 1111R.
- Each camera terminal PR1110A- 1110C communicates the detection area position information and the detection area spatial resolution information of the camera PR1101 in each camera terminal PR through the same network 1112, and the operation terminal 1111L and the operation terminal 1111R communicate through the same network 1112. Communicate the detection target area to each camera terminal PR1110A-1110C.
- the operation terminal 1111L, the operation terminal 1111R, and the network 1112 are the same as the operation terminal 111L, the operation terminal 111R, and the network 112 in FIG. 8 according to the first embodiment of the present invention, respectively.
- the difference from Embodiment 1 is that, in Embodiment 2, not only the detection area position but also the detection area spatial resolution of the camera PR1101 is variable, and accordingly the adjustment unit 1102 and the communication unit 1103 handle not only the detection area position information but also the detection area spatial resolution information.
- here, the spatial resolution is the spatial resolution of the image captured by the camera, and corresponds to the value obtained by dividing the area of the detection region by the number of imaging elements. This spatial resolution changes mainly through zoom control of the camera, and the resolution becomes higher (a state in which a more detailed image can be obtained) as the area of the detection region becomes smaller.
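- as a concrete reading of this definition (an assumption about units, not a formula from the patent), the spatial-resolution value can be computed as detection-area size per imaging element; zooming in shrinks the area covered by the same imaging surface, so the value becomes numerically smaller, meaning a more detailed image:

```python
def spatial_resolution(detection_area, num_elements):
    """Value obtained by dividing the area of the detection region by
    the number of imaging elements (area covered per element)."""
    return detection_area / float(num_elements)

# Same 640 x 480 imaging surface at two zoom settings (illustrative).
wide = spatial_resolution(4.0 * 3.0, 640 * 480)    # zoomed out
tele = spatial_resolution(1.0 * 0.75, 640 * 480)   # zoomed in
```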
- X_AL and X_AR, which are the positions of the detection area 1140A, X_BL and X_BR, which are the positions of the detection area 1140B, X_CL and X_CR, which are the positions of the detection area 1140C, and the amounts indicating the sizes of the overlapping areas, such as the amount X_CR - X_TR for the overlapping area 1141CT, are no different from those of Embodiment 1 of the present invention shown in FIG. 12. R_A is the spatial resolution of the detection area of the camera terminal PR1110A, R_B is the spatial resolution of the detection area of the camera terminal PR1110B, and R_C is the spatial resolution of the detection area of the camera terminal PR1110C.
- FIG. 19A is a diagram showing the internal configuration of the camera PR1101.
- the camera PR1101 includes a lens 1201, an imaging surface 1202, an image processing unit 1203, and a posture control unit 1204.
- the lens 1201 is a lens for forming an image; the imaging surface 1202 is an element such as a CCD that captures the image formed by the lens 1201; the image processing unit 1203 is a processing unit that processes the image captured on the imaging surface 1202; and the attitude control unit 1204 is a processing unit that controls the attitude of the lens 1201 and the imaging surface 1202 and the distance between them.
- the lens 1201, the imaging surface 1202, and the image processing unit 1203 are the same as the lens 201, the imaging surface 202, and the image processing unit 203 in FIG. 10A in Embodiment 1 of the present invention, respectively.
- the difference between Embodiment 1 and Embodiment 2 of the present invention is that, in Embodiment 2 shown in FIG. 19A, the attitude control unit 1204 simultaneously controls both the attitude of the lens 1201 and the imaging surface 1202 and the distance between them.
- that is, the attitude control unit 1204 simultaneously performs the control generally called pan and tilt, which controls the attitude of the lens 1201 and the imaging surface 1202, and the control generally called zoom, which controls the distance between the lens 1201 and the imaging surface 1202.
- the camera PR1101 performs detection operations, such as extraction of the detection target and its information, over its own shooting range in real space, which is determined by the attitude of the lens 1201 and the imaging surface 1202 and the distance between them.
- the detected information of the detection target is sent to the adjustment unit 1102 shown in FIG. 18.
- as in Embodiment 1, the camera PR1101 adjusts the position of its detection area to the detection area position indicated by the adjustment unit 1102 by controlling the attitude of the lens 1201 and the imaging surface 1202 and the distance between them.
- the attitude control unit 1204 sends to the adjustment unit 1102 the current position information of the detection area of the camera PR1101, which is determined from the current attitude of, and spacing between, the lens 1201 and the imaging surface 1202.
- further, the camera PR1101 adjusts its detection area spatial resolution to the detection area spatial resolution indicated by the adjustment unit 1102.
- the attitude control unit 1204 also sends to the adjustment unit 1102 the current spatial resolution of the detection area of the camera PR1101, which is determined from the current spacing between the lens 1201 and the imaging surface 1202.
- that is, as in Embodiment 1, the position of the detection area of the camera PR1101 is controlled by the adjustment unit 1102, and the current position information of the detection area of the camera PR1101 is sent to the adjustment unit 1102; in addition, in Embodiment 2 of the present invention, the spatial resolution of the detection area of the camera PR1101 is also controlled by the adjustment unit 1102, and the current spatial resolution information of the detection area of the camera PR1101 is sent to the adjustment unit 1102.
- the adjustment unit 1102 periodically transmits the position information and the spatial resolution information of the detection area of the camera PR1101, sent from the camera PR1101, to the adjustment units 1102 of the other camera terminals PR via the communication unit 1103 and the network 1112.
- the adjustment unit 1102 also receives the position information and spatial resolution information of the detection areas of the cameras PR1101 of the other camera terminals PR, periodically transmitted from their adjustment units 1102.
- further, in the operation terminals 1111L and 1111R, the communication unit 1103 periodically transmits the position information of the detection target area 1130 to the adjustment unit 1102 of each of the camera terminals PR1110A-1110C via the network 1112.
- adjustment section 1102 and communication section 1103 transmit and receive the spatial resolution of the detection area between camera terminals PR1110A-1110C.
- each adjustment unit 1102 thus periodically updates the position information and spatial resolution information of the detection areas of the cameras PR1101 of its own camera terminal PR and the other camera terminals PR, as well as the position information of the detection target area.
- each adjustment unit 1102 thereby periodically acquires, via the communication unit 1103 and the network 1112, X_AL and X_AR, which are the positions of the detection area 1140A of the camera terminal PR1110A, together with R_A, the spatial resolution of that area; X_BL and X_BR, which are the positions of the detection area 1140B of the camera terminal PR1110B, together with R_B, the spatial resolution of that area; and X_CL and X_CR, which are the positions of the detection area 1140C of the camera terminal PR1110C, together with R_C, the spatial resolution of that area.
- based on this information, each adjustment unit 1102 performs the following steps shown in FIG. 20.
- first, from the information indicating the detection area positions of the cameras PR1101 of its own camera terminal PR and the other camera terminals PR, the adjustment unit 1102 selects the detection area of another camera terminal PR, or the non-detection target area, adjacent to the detection area of its own camera terminal PR (step 1301). This process is the same as in Embodiment 1 of the present invention.
- next, the adjustment unit 1102 calculates an amount indicating the size of the overlapping area in which the detection area selected in step 1301 and the detection area of its own camera terminal PR overlap (step 1302). This processing is also the same as in Embodiment 1 of the present invention.
- next, the adjustment unit 1102 calculates an amount indicating the difference between the spatial resolution of the detection area selected in step 1301 and the spatial resolution of the detection area of its own camera terminal PR (step 1303).
- thus, the camera terminal PR1110A calculates the amount R_A - R_B indicating the difference between the spatial resolution of its own detection area 1140A and that of the adjacent detection area 1140B; the camera terminal PR1110B calculates the amount R_B - R_A for its own detection area 1140B and the adjacent detection area 1140A, and the amount R_B - R_C for its own detection area 1140B and the adjacent detection area 1140C; and the camera terminal PR1110C calculates the amount R_C - R_B for its own detection area 1140C and the adjacent detection area 1140B.
- next, the adjustment unit 1102 adjusts the detection area position of its own camera terminal PR so that the amount indicating the size of the overlapping area calculated in step 1302 approaches the constant amount C (step 1304).
- This processing is the same as in the first embodiment of the present invention.
- in step 1304, the detection area spatial resolution of its own camera terminal PR is also adjusted so that the amount indicating the difference in detection area spatial resolution calculated in step 1303 approaches 0. This adjustment method will be described below.
- first, a function UR() is defined as an amount indicating the difference in detection area spatial resolution, as shown in Equations 13 to 15 below.
- Equations 13 to 15 above are for the camera terminals PR1110A to 1110C, respectively, and use the square value of the amount indicating the difference in detection area spatial resolution as the amount indicating that difference.
- next, the next detection area spatial resolution of its own camera terminal PR is calculated using the generally known steepest descent method, as shown in the following equations.
- here, R'_A, R'_B, and R'_C indicate the next detection area spatial resolutions of the camera terminals PR1110A-1110C, respectively, and the coefficient in these equations is a constant. The detection area spatial resolution of each camera PR1101 is then adjusted to the calculated next spatial resolution.
- the adjusting unit 1102 sequentially performs the processing of step 1301, step 1302, step 1303, and step 1304, and returns to the processing of step 1301 after the processing of step 1304 is completed. Then, the adjustment unit 1102 adjusts the detection area of the camera PR1101 while constantly repeating the processing from step 1301 to step 1304.
- in step 1304, the next detection area spatial resolution of its own camera terminal PR is calculated using the steepest descent method so that the amount indicating the difference in detection area spatial resolution approaches 0; as a result, the detection area spatial resolutions of the camera terminals PR1110A-1110C come to match one another as the processing from step 1301 to step 1304 is repeated.
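- the resolution-equalizing update of step 1304 can be sketched as follows. The adjacency (A-B, B-C), the step size, and the starting values are illustrative assumptions; with UR() taken as the squared difference, the steepest-descent step pulls each terminal's spatial resolution toward those of its neighbors, so repeated updates drive all three to a common value.

```python
BETA = 0.1  # steepest-descent step-size constant (hypothetical)

def next_resolution(own, neighbor_resolutions):
    """One steepest-descent update of a terminal's detection area
    spatial resolution, with UR() the squared difference with each
    adjacent terminal (cf. Equations 13-15)."""
    return own - 2.0 * BETA * sum(own - r for r in neighbor_resolutions)

# Terminals PR1110A-PR1110C with detection areas adjacent as A-B, B-C.
r = {"A": 2.0, "B": 6.0, "C": 10.0}
for _ in range(300):  # repeat steps 1301-1304
    r = {"A": next_resolution(r["A"], [r["B"]]),
         "B": next_resolution(r["B"], [r["A"], r["C"]]),
         "C": next_resolution(r["C"], [r["B"]])}
# After enough iterations the three resolutions nearly coincide.
```

This is the standard consensus behavior of gradient descent on pairwise squared differences: each update preserves the sum of the values, so the terminals settle on a shared resolution between the initial extremes.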
- therefore, the use of the detection region adjustment device of the present invention makes it possible to acquire images taken by the cameras PR1101 of the camera terminals PR1110A-1110C at the same spatial resolution. Further, since in the present embodiment each camera terminal PR1110A-1110C has the same number of elements on its imaging surface 1202, matching detection area spatial resolutions mean that the areas of the detection regions for which the camera terminals PR1110A-1110C are responsible are also the same.
- as described above, the adjustment unit 1102 has the effect of making it possible to acquire images captured by the cameras at the same spatial resolution.
- moreover, the processing of steps 1302 and 1304 in this repeatedly performed procedure is carried out with respect to the detection area of the other camera terminal PR, adjacent to the detection area of its own camera terminal PR, that was selected in step 1301.
- in the present embodiment, the function UR() indicating the difference in detection area spatial resolution was defined as the square value of that difference, as shown in Equations 13 to 15 above. However, the function UR() may instead be any even power of the difference, such as the fourth, sixth, or tenth power, or the absolute value of the difference; each of these functions has its minimum value where the difference is 0, so the same effect is obtained. More generally, any function UR() that has its minimum value where the difference, such as R_A - R_B, is 0 within the range over which the spatial resolutions can be changed may be used.
- in the present embodiment, the adjustment units 1102 are provided in a distributed manner in the camera terminals PR1110A-1110C; however, as shown in the configuration diagram of FIG., there may be only a single adjustment unit 1102 that adjusts the detection area spatial resolutions of the cameras PR1101 of all the camera terminals PR1110A-1110C, and it goes without saying that the same effect can be obtained.
- although the present embodiment has treated the camera PR1101 as a general camera, the camera PR1101 may be a camera that detects visible light or invisible light such as infrared or ultraviolet light, and it goes without saying that the same effect can be obtained even with a general sensor that has a detection area and whose detection area position and spatial resolution are variable.
- although the present embodiment has treated the network 1112 as a network line used for general communication, it goes without saying that the same effect can be obtained whether the network 1112 is wired or wireless.
- in the present embodiment, each camera terminal communicates with the other camera terminals whose detection regions are adjacent to its own, and thereby makes the spatial resolution of its own detection region uniform with the spatial resolutions of the detection regions of the other camera terminals. However, the present invention is not limited to this method; each camera terminal may instead fix its own detection area so as to have the highest spatial resolution, without communicating with the other camera terminals.
- in that case, the adjacent detection areas have overlapping areas, and all the detection areas have the highest spatial resolution; therefore, the detection area adjustment device according to the present embodiment photographs the entire detection target area in the state in which the highest spatial resolution (the most detailed image) is obtained.
- (Supplementary Explanation 1) Here, a method for calculating the detection areas of the camera P101 and the camera PR1101 described in Embodiments 1 and 2 will be described in detail.
- FIG. 21 is a view for explaining variable detection areas of the camera P101 and the camera PR1101.
- the lens 2101 corresponds to the lens 201 shown in FIG. 10 (a) and the lens 1201 shown in FIG. 19 (a), and the imaging surface 2102 corresponds to the imaging surface 202 shown in FIG. 10 (a) and the imaging surface 1202 shown in FIG. 19 (a). The imaging surface 2102 is located at a distance f from the lens 2101 in the direction of the Z-axis 2106 and has a size of 2W × 2H.
- the X-axis 2107 corresponds to the X-axis 120 shown in FIG. 8 and the X-axis 1120 shown in FIG. 18, and the Z-axis 2109 corresponds to the Z-axis 122 shown in FIG. 8 and the corresponding Z-axis shown in FIG. 18.
- the camera 2103 is positioned at a point in the world coordinate system composed of the X-axis 2107, the Y-axis 2108, and the Z-axis 2109, and has a camera coordinate system composed of the X-axis 2104, the Y-axis 2105, and the Z-axis 2106. A point expressed in the camera coordinate system is converted into a point in the world coordinate system by Equation 19 below.
- in Equation 19, the 3 × 3 matrix whose elements are those of M represents the attitude of the camera 2103, and the position vector represents the position reference point of the camera 2103 (the position of the camera 2103). These can be determined by setting the attitude and position of the camera 2103 to an attitude reference point and a position reference point, or by using the current attitude and position of the camera 2103 as the attitude reference point and the position reference point, with a calibration method such as that shown in Document 1 below; they are calculated in advance, before the operation of the detection area adjustment device of the present invention starts.
- the 3 × 3 matrix whose elements are those of R, as shown in Equation 20 below, represents the attitude displacement of the camera 2103 from the attitude reference point, determined by the rotation angles about the respective axes; in Embodiments 1 and 2 of the present invention, these rotation angles can be read from, for example, the stepping motor that changes the attitude. Similarly, the position displacement of the camera 2103 from the position reference point can be read from the stepping motor when a mechanism using a stepping motor or the like changes the position of the camera 2103.
- by projecting each of the corner points (-W, -H, f), (W, -H, f), (-W, H, f), and (W, H, f) of the imaging surface 2102 onto the real-space plane, the detection area position of the camera 2103 can be calculated by the above Equation 22.
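- the corner projection of Equations 19 to 22 can be sketched as follows. The rotation matrix, the camera position, and the choice of Y_world = 0 as a stand-in for the real-space plane are illustrative assumptions; the code casts a ray from the lens center through each imaging-surface corner, expressed in camera coordinates, and intersects it with the plane.

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def detection_corners(M, T, W, H, f):
    """Project the four imaging-surface corners (+-W, +-H, f), given in
    camera coordinates, through the lens center T onto the ground plane
    Y_world = 0 (assumed stand-in for the real-space plane)."""
    corners = []
    for cx, cy in ((-W, -H), (W, -H), (-W, H), (W, H)):
        d = mat_vec(M, [cx, cy, f])   # ray direction in world coordinates
        t = -T[1] / d[1]              # ray parameter where Y reaches 0
        corners.append(tuple(T[i] + t * d[i] for i in range(3)))
    return corners

# Attitude matrix M for a camera looking straight down (optical axis
# along -Y_world), positioned 3 units above the plane (illustrative).
M_down = [[1.0, 0.0, 0.0],
          [0.0, 0.0, -1.0],
          [0.0, 1.0, 0.0]]
footprint = detection_corners(M_down, [0.0, 3.0, 0.0], 0.02, 0.015, 0.01)
```

With these numbers the 2W × 2H imaging surface at focal distance f maps to a rectangular footprint on the plane whose half-extents are W/f and H/f times the camera height, which is the usual pinhole-model scaling.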
- FIG. 23 is a diagram showing an internal configuration of a camera P2301 including a laser pointer.
- the camera P2301 includes a lens 2301, an imaging surface 2302, an image processing unit 2303, an attitude control unit 2304, and a laser pointer 2305.
- the lens 2301 is a lens for forming an image
- the imaging surface 2302 is an element such as a CCD for capturing an image formed by the lens 2301
- the image processing unit 2303 is a processing unit that processes the image captured on the imaging surface 2302, and the attitude control unit 2304 is a processing unit that controls the attitude of the lens 2301, the imaging surface 2302, and the laser pointer 2305, as well as the distance between the lens 2301 and the imaging surface 2302. The laser pointer 2305 changes its attitude through the attitude control unit 2304 in conjunction with the lens 2301 and the imaging surface 2302, and projects a laser beam onto the end of the detection area of the camera P2301.
- the lens 2301, the imaging surface 2302, and the image processing unit 2303 are the same as the lens 201, the imaging surface 202, and the image processing unit 203 in FIG. 10A in Embodiment 1 of the present invention, respectively.
- the difference from Embodiment 1 of the present invention is that the attitude control unit 2304 controls the attitude of the laser pointer 2305 as well as that of the lens 2301 and the imaging surface 2302, so that the laser pointer 2305 projects its laser onto the end of the detection area of the camera P2301.
- the laser pointer 2305 projects a laser beam at the end of the detection area of the camera.
- the projected laser hits the real space plane 125 shown in FIG. 8, and a light spot appears on the same plane.
- the light spot indicates the end of the detection area of the camera P2301; another camera adjacent to this camera captures the light spot, and its image processing unit 2303 extracts the position of the light spot using a general image processing method.
- the position of the light spot extracted by the image processing unit 2303 is a position in the camera coordinate system, but as described in Supplementary Explanation 1, it can be converted to a position in the world coordinate system by using the above Equation 19.
- in this way, when the camera P2301 shown in FIG. 23 is used, the detection area position can be calculated, and the detection area position information of one camera P2301 can be conveyed to the other cameras adjacent to it without using the network 112 shown in FIG. 8.
- so far, the case where the detection area is a line has been described as an example; here, a case where the detection area is a plane or a three-dimensional volume will be described as an example.
- FIG. 24 is a diagram showing in which of the regions formed by the line passing through the point (X1, Y1) and the point (X2, Y2) a given point (X, Y) exists.
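- The region test of FIG. 24 reduces to checking on which side of the line a point lies, which can be done with the sign of a cross product. A minimal sketch (the function name is an illustrative assumption):

```python
def side_of_line(x1, y1, x2, y2, x, y):
    """Sign of the cross product (P2 - P1) x (P - P1): positive when (x, y)
    lies on one side of the line through (x1, y1) and (x2, y2), negative on
    the other side, and zero when the point is exactly on the line."""
    d = (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)
    return (d > 0) - (d < 0)
```

For example, relative to the horizontal line through (0, 0) and (1, 0), points above and below it yield opposite signs.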
- FIGS. 25 (a) and 25 (b) show examples where the detection area is a plane.
- FIG. 25 (c) shows an example in which the detection area is a three-dimensional object.
- the position of each vertex of the detection area can be calculated using Equation 19 above.
- the center of gravity of the detection area of the camera P101 of another camera terminal P can be easily calculated once the vertex positions of that detection area are known; let this center-of-gravity position be (X, Y). Using the above relationship,
- when the detection area of the camera P101 of another camera terminal P is adjacent to the detection area of the own camera terminal P, the relationships of Equations 30 and 31 above are satisfied. Thus, even when the detection area is a plane, the detection area or non-detection area of another camera terminal P adjacent to the detection area of the own camera terminal P can be selected from the information indicating the detection area positions of the cameras P101 of the own camera terminal P and the other camera terminals P.
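- As a rough illustration of this selection, the centroid of the other terminal's detection area can be computed from its vertices and tested against a boundary line of the own detection area. The sketch below is a simplified stand-in for the Equations 30 and 31 test; the names and the sign convention are assumptions:

```python
def centroid(vertices):
    """Vertex-average center of gravity of a polygonal detection area."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def is_adjacent_on_side(own_edge, other_vertices):
    """Crude stand-in for the adjacency test: the other terminal's detection
    area counts as the neighbor on a given side of the own area if its
    centroid lies on the positive side of that boundary line."""
    (x1, y1), (x2, y2) = own_edge
    gx, gy = centroid(other_vertices)
    return (x2 - x1) * (gy - y1) - (y2 - y1) * (gx - x1) > 0
```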
- Fig. 25(b) is obtained by changing the way the straight lines passing through the vertices are drawn in Fig. 25(a). In this case as well, it goes without saying that the detection area or non-detection area of another camera terminal P adjacent to the detection area of the own camera terminal P can be selected from the information indicating the detection area positions of the cameras P101 of the own camera terminal P and the other camera terminals P.
- Fig. 25(c) shows an example in which the detection area is a three-dimensional object. In this case as well, in the same manner as described above, it goes without saying that the detection area or non-detection area of another camera terminal P adjacent to the detection area of the own camera terminal P can be selected from the information indicating the detection area positions of the cameras P101 of the own camera terminal P and the other camera terminals P.
- the operation terminal 111L and the operation terminal 111R in FIG. 8 have a function of transmitting the position information of the detection target area from the communication unit 103 or the communication unit 1103 to each camera terminal P 110A-110C in FIG. 8 and each camera terminal PR 1110A-1110C in FIG. 18.
- the operation terminal includes the communication unit 103 or the communication unit 1103, but such communication units also exist in each camera terminal P and each camera terminal PR. Therefore, if the communication unit 103 or the communication unit 1103 in each camera terminal P or each camera terminal PR transmits the position information of the detection target area, each camera terminal P or each camera terminal PR also serves the function of the operation terminal. In this case, no separate operation terminal is required.
- each operating terminal transmits end position information of a detection target area, and a closed area formed by each end position is set as a detection target area.
- it goes without saying that the effect of the detection area adjustment device of the present invention can also be obtained when one operation terminal transmits all the end position information of the detection target area and the closed region formed by the end positions is set as the detection target area.
- here, two operation terminals each transmit two pieces of end position information of the detection target area; if the detection target area has N ends, N operation terminals may transmit the end position information.
- in the above description, the information on the position of the detection target area transmitted by the operation terminal is a predetermined constant value.
- however, the detection area adjustment device of the present invention can cope with a change in the position of the detection target area transmitted from the operation terminal, and as a result the detection target area is still detected without blind spots. For this reason, the information on the position of the detection target area transmitted by the operation terminal may be a value that changes over time, even while the detection area adjustment device of the present invention is operating.
- the camera terminal 4101 is the camera terminal P or the camera terminal PR according to Embodiments 1 and 2 of the present invention, and exchanges information with the other camera terminals 4101 and the operation terminal 4105 via the wireless network 4102.
- the car 4103 is a car that runs on a road 4104, and is equipped with an operation terminal 4105.
- the detection area 4106A and the detection area 4106B are the detection areas at successive times for the car 4103 traveling on the road 4104. Each is an area of a certain size centered on the position of the car obtained using a GPS, a gyrocompass, or the like, and is transmitted from the operation terminal 4105.
- the operation of such a detection area adjustment device is as follows.
- the plurality of camera terminals 4101 of the detection area adjustment device of the present invention installed along the road 4104 communicate with one another using the wireless network 4102.
- the operation terminal 4105 installed on the vehicle 4103 running on the road 4104 transmits detection area position information centered on the current position of the vehicle 4103 to each camera terminal 4101 using the wireless network 4102.
- the detection area adjusting device can constantly capture an image in the detection area centered on the position of the car 4103 that changes with time, without blind spots.
- the image information captured without blind spots is provided to the driver of the car 4103 using the wireless network 4102, so that the driver can obtain information about the surroundings of the car without blind spots, which supports driving and parking.
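- As a minimal illustration, the operation terminal 4105 could derive the detection target area as a fixed-size box centered on the GPS position of the car. The 30 m half-extent below is an illustrative assumption, not a value from this description:

```python
def target_area_around_vehicle(x, y, half_w=30.0, half_h=30.0):
    """Axis-aligned detection target area of fixed size centered on the
    vehicle position (x, y) reported by GPS. The half-extents are
    illustrative assumptions; returns (x_min, y_min, x_max, y_max)."""
    return (x - half_w, y - half_h, x + half_w, y + half_h)

# As the car moves, the operation terminal would retransmit the new area.
area = target_area_around_vehicle(120.0, 45.0)
```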
- each camera terminal P 110A-110C in FIG. 8 and each camera terminal PR 1110A-1110C in FIG. 18 operate based on the procedures of the flowchart shown in FIG. 11 and the flowchart of Embodiment 2, respectively. In this supplementary explanation, it is assumed that each camera terminal P operates based on the procedure of the flowchart shown in FIG. 27, and each camera terminal PR based on that shown in FIG. 28.
- the flowchart shown in FIG. 27 is obtained by adding Step 5104 and Step 5105 to the flowchart shown in FIG. 11 described in Embodiment 1 of the present invention. If the determination in Step 5104 is No, that is, if the detection area is not specified, the processing of Steps 5101 to 5103 is repeated as in Embodiment 1 of the present invention, and it goes without saying that the effect of the detection area adjustment device of the present invention is obtained.
- the flowchart shown in FIG. 28 is obtained by adding Step 5205 and Step 5206 to the flowchart described in Embodiment 2 of the present invention. If the determination in Step 5205 is No, that is, if the detection area is not specified, the processing of Steps 5201 to 5204 is repeated as in Embodiment 2 of the present invention, so that the effect of the detection area adjustment device of the present invention is obtained.
- if the determination in Step 5104 or Step 5205 is Yes, that is, if the detection area is specified, then in Step 5105 or Step 5206 the camera terminal P or the camera terminal PR adjusts its detection area position or detection area spatial resolution to the detection area position or detection area spatial resolution specified in Step 5104 or Step 5205.
- in Step 5104 and Step 5205, the detection area position and the detection area spatial resolution may be designated by a human.
- alternatively, the image processing unit 203 in FIG. 10(a) or the image processing unit 1203 in FIG. 19(a) detects the position and size of the detection target from the images captured by the camera terminals P and PR by a general image processing method such as pattern matching. Then, the detection area position and the detection area spatial resolution are specified so that the detected object falls within the detection area centered on its position.
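- A minimal sketch of such automatic specification, using a brute-force sum-of-squared-differences template match as the "general image processing method"; the actual image processing units may use any matching technique, and all names below are illustrative:

```python
import numpy as np

def locate_target(image, template):
    """Minimal sum-of-squared-differences template match; returns the
    (row, col) center of the best-matching window. A stand-in for
    'pattern matching or the like' in the description."""
    ih, iw = image.shape
    th, tw = template.shape
    best, pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if best is None or ssd < best:
                best, pos = ssd, (r, c)
    return pos[0] + th // 2, pos[1] + tw // 2  # center of detected target

def detection_area_around(center, margin):
    """Detection area position chosen so the detected object, centered at
    `center`, fits within the area; returns (r_min, c_min, r_max, c_max)."""
    cy, cx = center
    return (cy - margin, cx - margin, cy + margin, cx + margin)
```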
- when the camera terminals P and the camera terminals PR operate based on the flowcharts shown in FIGS. 27 and 28, a camera terminal P or PR for which the detection area position or the detection area spatial resolution is specified adjusts itself to that specified detection area position or detection area spatial resolution, while the camera terminals P and PR for which no such specification is given cover the detection target area without blind spots, as in Embodiments 1 and 2 of the present invention.
- each of the camera terminals 5301A-5301E is the camera terminal P or the camera terminal PR according to Embodiments 1 and 2 of the present invention, and operates based on the flowcharts shown in FIGS. 27 and 28.
- the network 5302 is a network for transmitting information between the camera terminals 5301A to 5301E, and the detection target 5303 is a target to be detected by the camera terminals 5301A to 5301E, and exists in the detection target area 5304.
- each of the camera terminals 5301A-5301E operates based on the flowchart shown in FIG. 27 or FIG. 28. Unlike the other camera terminals, the camera terminal 5301B detects the detection target 5303, and thus its detection area position or detection area spatial resolution is specified in Step 5104 or Step 5205.
- the specified detection area position and detection area spatial resolution are centered on the position of the detection target 5303 and chosen so that the detection target 5303 falls within the detection area.
- the camera terminal 5301B is adjusted to the detection area position and the detection area spatial resolution where the detection object 5303 fits within the detection area with the position of the detection object 5303 as the center.
- since the camera terminals 5301A, 5301C, 5301D, and 5301E do not detect the detection target 5303, they adjust their detection area positions so as to maintain a certain overlap with the adjacent detection areas, as in Embodiments 1 and 2 of the present invention.
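- In one dimension, the overlap-maintaining adjustment performed by the non-detecting terminals can be sketched as nudging the shared edge of the detection interval toward a target overlap width. The gain and all names are assumptions for illustration, not the patent's control law:

```python
def adjust_for_overlap(own, neighbor, c_target, gain=0.5):
    """Move the right edge of a 1-D detection interval so that the overlap
    with the right-hand neighboring interval approaches the target width
    c_target. Simplified 1-D version of the cooperative adjustment."""
    lo, hi = own
    n_lo, n_hi = neighbor              # neighbor lies to the right: n_lo < hi
    overlap = hi - n_lo                # current overlap width
    hi += gain * (c_target - overlap)  # grow/shrink toward the target overlap
    return (lo, hi)
```

Repeating this step on every terminal drives all adjacent overlaps toward the common target width.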
- as a result, a detailed image in which the detection target 5303 falls within the detection area centered on its position is obtained automatically, and the detection target area 5304 is always detected without blind spots. It goes without saying that even if the detection target 5303 moves, only the camera terminal that detects it changes, and the operation remains the same as above.
- each of the camera terminals 5401A-5401C is the camera terminal P or the camera terminal PR according to Embodiments 1 and 2 of the present invention, and operates based on the flowcharts shown in FIGS. 27 and 28.
- the network 5402 is a network for transmitting information between the camera terminals 5401A-5401C, and the detection target 5403 is a target detected by each of the camera terminals 5401A-5401C and exists in the detection target area 5404. The above is the same as in FIG. 29. Accordingly, when the detection target 5403 exists in the detection target area 5404, the camera terminals 5401A-5401C automatically obtain a detailed image in which the detection target 5403 falls within the detection area centered on its position, and the detection target area 5404 is always detected without blind spots.
- the detection area adjustment device shown in FIG. 30 differs from the detection area adjustment device of the present invention shown in FIG. 29 in that new processing units (an image synthesis unit 5405, a display unit 5406, and an instruction unit 5407) are added.
- the image synthesizing unit 5405 is a processing unit for synthesizing each image acquired by each of the camera terminals 5401A-5401C into one image or the like
- the display unit 5406 is a processing unit for displaying the image synthesized by the image synthesizing unit 5405.
- the instruction unit 5407 is a processing unit that specifies a detection area or a detection area spatial resolution for each of the camera terminals 5401A to 5401C.
- the image combining unit 5405 receives, via the network 5402, the image captured by each of the camera terminals 5401A to 5401C and the detection area position information transmitted by each of the camera terminals 5401A to 5401C.
- the image synthesizing unit 5405 synthesizes an image captured by each camera terminal into an image in which the spatial positions of each image are continuous as shown in FIG. 31 using the detection area position information of each camera terminal.
- the synthesized image is displayed on the display portion 5406, and the image information is presented to a human.
- the positions in the world coordinate system of the pixels constituting each image obtained by the camera terminals 5401A-5401C can be calculated by Equation 19 above, and therefore the image synthesis unit 5405 can synthesize images in which the spatial positions of the various viewpoints are continuous.
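- A simplified sketch of such synthesis: each camera image is pasted onto a shared canvas at the canvas position corresponding to the world coordinates of its detection area, so that the spatial positions remain continuous. The world-to-canvas mapping and all names are illustrative assumptions:

```python
import numpy as np

def compose_mosaic(tiles, origin, scale, canvas_shape):
    """Paste each camera image onto a shared canvas at the canvas position of
    its detection area. tiles: list of (image, (world_x, world_y)) pairs;
    origin and scale map world coordinates to canvas pixels."""
    canvas = np.zeros(canvas_shape)
    for img, (wx, wy) in tiles:
        r = int((wy - origin[1]) * scale)
        c = int((wx - origin[0]) * scale)
        h, w = img.shape
        canvas[r:r + h, c:c + w] = img  # later tiles overwrite overlap regions
    return canvas

# Two 2x2 images whose detection areas sit side by side in world coordinates.
tiles = [(np.ones((2, 2)), (0.0, 0.0)), (2 * np.ones((2, 2)), (2.0, 0.0))]
canvas = compose_mosaic(tiles, (0.0, 0.0), 1.0, (2, 4))
```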
- a person viewing the composite image displayed on the display unit 5406 inputs to the instruction unit 5407 the position or spatial resolution of the desired region on the composite image.
- the position or spatial resolution of the area is indicated using a pointing device or the like.
- the instruction unit 5407, having received the position or spatial resolution of the region specified by the human, determines the camera terminal whose detection area contains that region. This determination can easily be made using the detection area information transmitted by each of the camera terminals 5401A-5401C.
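- Assuming each terminal reports its detection area as an axis-aligned rectangle, this determination is a simple containment test; the representation and names below are illustrative:

```python
def terminal_for_point(point, areas):
    """Return the id of the camera terminal whose reported detection area
    (x_min, y_min, x_max, y_max) contains the point specified on the
    composite image; None if no terminal covers it."""
    x, y = point
    for term_id, (x0, y0, x1, y1) in areas.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return term_id
    return None
```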
- the instruction unit 5407 then indicates, via the network 5402, the position or spatial resolution of the region specified by the human to the determined camera terminal as its detection area position or detection area spatial resolution.
- the camera terminal whose detection area position or detection area spatial resolution is specified adjusts the detection area position or detection area spatial resolution of the camera terminal to the specified detection area position or detection area spatial resolution.
- as a result, the human can always receive the detection target area 5404 as image information without blind spots, from various viewpoints, with continuous spatial positions. Furthermore, by specifying the position or spatial resolution of a region based on that image information, it is possible to acquire an image of a specific region position or spatial resolution. For example, if a human instructs the instruction unit 5407 to increase the spatial resolution of a certain region, an image with a higher spatial resolution, that is, a higher-resolution image of that region, is displayed on the display unit 5406. These effects are very useful for monitoring a building having a wide detection target area.
- in the above description, for the one camera terminal capturing the object to be detected, the position and size of the detection area (the state with the highest spatial resolution) were fixed, and control was performed so that adjacent detection areas overlapped for all camera terminals including that one. As another method of obtaining a similar result, the position and size of the detection area (the state with the highest spatial resolution) of that one camera terminal may be fixed, and control may be performed so that adjacent detection areas overlap only among the camera terminals other than that one. In other words, the one camera terminal capturing the object to be detected may be excluded from the control for overlapping detection areas, as if that camera terminal had failed, and the detection areas may be overlapped only among the remaining camera terminals. As a result, an image of the object to be detected is obtained at the highest spatial resolution, while the entire detection target area is still photographed throughout.
- although the detection area adjustment device according to the present invention has been described based on the embodiments and supplementary explanations, the present invention is not limited to these examples.
- various modifications conceived by those skilled in the art without departing from the spirit of the invention, and arbitrary combinations of the components of the embodiments as long as their functions can coexist, are also included in the present invention.
- in the above description, each of the plurality of camera terminals constituting the detection area adjustment device is capable of controlling pan, tilt, and zoom. However, the configuration is not limited to this, and some or all of the camera terminals may be fixed in any of pan, tilt, and zoom.
- ADVANTAGE OF THE INVENTION: according to the present invention, the plurality of camera terminals constituting the detection area adjustment device communicate with each other and adjust autonomously and cooperatively so that adjacent detection areas overlap or spatial resolutions match. Therefore, it suffices that a camera terminal adjacent to a fixed detection area has the function of adjusting pan, tilt, and zoom; even in such a case, the entire detection target area is adjusted so that adjacent detection areas overlap or spatial resolutions match.
- the camera terminal is a fixed camera terminal fixed to a specific place, but may be a mobile camera terminal.
- FIG. 32 is a block diagram showing the configuration of a monitoring system in which the detection area adjustment device according to the present invention is applied to a system composed of mobile camera terminals.
- this surveillance system includes a plurality of mobile camera terminals 6101 connected by a communication network 6103, and is unique in that the plurality of mobile camera terminals 6101 move autonomously and cooperatively while adjusting pan and tilt so that the monitoring area 6111 can be monitored in its entirety.
- the mobile camera terminal 6101 is a camera terminal that moves while being supported by the moving unit 6102.
- the moving unit 6102 is a mechanism unit or the like that changes a shooting position of the mobile camera terminal 6101.
- the communication network 6103 is a transmission line connecting a plurality of mobile camera terminals 6101.
- the communication unit 6104 is a communication interface for the mobile camera terminal 6101 to exchange information with another mobile camera terminal via the communication network 6103.
- the adjacent shooting area specifying unit 6105 is a processing unit that estimates, from the information notified by other mobile camera terminals to the communication unit 6104, which mobile camera terminals have shooting areas adjacent to its own.
- the imaging element 6106 is a CCD camera or the like that captures an image in the monitoring area.
- the imaging area estimation unit 6107 is a processing unit that estimates the shooting area of the mobile camera terminal 6101 from the characteristics of the imaging element 6106 and the position of the moving unit 6102.
- the monitoring range storage unit 6108 is a memory or the like that stores a range of an area to be monitored by the mobile camera terminal 6101.
- the shooting position evaluation unit 6109 is a processing unit that evaluates the overlapping region between the shooting area of the mobile camera terminal 6101 and the adjacent shooting areas, and the distance to the boundary of the monitoring area.
- the shooting position changing unit 6110 is a control unit that controls the moving unit 6102 and changes the shooting position of the mobile camera terminal 6101.
- the monitoring area 6111 is an area to be monitored by the mobile camera terminal 6101.
- the photographing area 6112 is an area photographed by the mobile camera terminal 6101.
- the mobile camera terminal 6101 notifies the surrounding mobile camera terminals of information about its shooting area, estimated from its own shooting position and the characteristics of the imaging element 6106. By changing its pan, tilt, and shooting position in coordination with the surrounding mobile camera terminals so that the size of the overlapping area with the adjacent shooting areas and the distance from the boundary of the monitoring area approach a predetermined state, each terminal can move to a shooting position at which blind spots in the monitoring area are reduced during simultaneous shooting by the plurality of mobile camera terminals 6101.
- FIG. 33 shows the operation of mobile camera terminal 6101 in such a monitoring system.
- here, for simplicity, mobile camera terminals 6101 movable in the horizontal direction (one dimension) are installed on the ceiling of a room of fixed height to monitor the floor surface.
- each mobile camera terminal changes its shooting position so that the width C of the overlapping area with the adjacent shooting areas, or the distance D to the boundary of the monitoring area, approaches a predetermined value. As a result, the mobile camera terminals automatically move to positions from which the entire monitoring area can be shot simultaneously by the plurality of terminals, as shown in the lower diagram of the figure.
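- For the one-dimensional case in FIG. 33, one update step of such a terminal can be sketched as follows. Moving right shrinks both the overlap C with the left neighbor and the gap D to the right boundary, so both error terms push the position in the same direction; the gain and all names are assumptions:

```python
def step_position(x, left_neighbor_x, fov, c_target, d_target, boundary, gain=0.2):
    """One update step for a ceiling camera movable in one dimension.
    The camera at position x covers [x - fov/2, x + fov/2]; it nudges x so
    that the overlap width C with the left neighbor and the distance D to
    the right monitoring-area boundary both approach their targets."""
    c = (left_neighbor_x + fov / 2) - (x - fov / 2)  # current overlap width C
    d = boundary - (x + fov / 2)                     # current boundary gap D
    # Both positive errors call for moving right, so average them.
    x += gain * ((c - c_target) + (d - d_target)) / 2
    return x
```

Iterating this step on every terminal drives the terminals toward the evenly spread configuration in the lower diagram of FIG. 33.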
- in this way, the mobile camera terminals automatically move to positions that reduce blind spots during simultaneous shooting by a plurality of mobile camera terminals, so the burden of determining installation positions and carrying out installation work can be reduced.
- a rail is installed in the monitoring area, and the system is configured so that the mobile camera terminal moves on the track of the rail.
- the detection area adjusting device is particularly applicable to a monitoring system using a plurality of cameras, a sensing system for measuring a physical quantity using a plurality of sensor elements, and the like. It is useful as a high-performance monitoring system that monitors a wide area without causing blind spots, for example, as a monitoring system for suspicious persons in schools and buildings.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Security & Cryptography (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
- Burglar Alarm Systems (AREA)
- Geophysics And Detection Of Objects (AREA)
- Indication And Recording Devices For Special Purposes And Tariff Metering Devices (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005517639A JP3905116B2 (ja) | 2004-02-03 | 2005-01-18 | 検出領域調整装置 |
EP05703781A EP1771005B1 (en) | 2004-02-03 | 2005-01-18 | Detection range adjustment device |
AT05703781T ATE547896T1 (de) | 2004-02-03 | 2005-01-18 | Detektionsbereichs-einstelleinrichtung |
US11/115,152 US7880766B2 (en) | 2004-02-03 | 2005-04-27 | Detection area adjustment apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-027294 | 2004-02-03 | ||
JP2004027294 | 2004-02-03 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/115,152 Continuation US7880766B2 (en) | 2004-02-03 | 2005-04-27 | Detection area adjustment apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005076620A1 true WO2005076620A1 (ja) | 2005-08-18 |
Family
ID=34835883
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/000544 WO2005076620A1 (ja) | 2004-02-03 | 2005-01-18 | 検出領域調整装置 |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1771005B1 (ja) |
JP (1) | JP3905116B2 (ja) |
CN (1) | CN100566405C (ja) |
AT (1) | ATE547896T1 (ja) |
WO (1) | WO2005076620A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007068041A (ja) * | 2005-09-01 | 2007-03-15 | Matsushita Electric Ind Co Ltd | 広域監視パノラマシステム |
JP2007208387A (ja) * | 2006-01-31 | 2007-08-16 | Matsushita Electric Ind Co Ltd | センサー配置装置、センサー制御装置およびセンサー制御システム |
JP2009527149A (ja) * | 2006-02-13 | 2009-07-23 | ソニー株式会社 | 複数のビデオストリームを組み合わせるシステムおよび方法 |
CN101636769B (zh) * | 2006-12-07 | 2012-07-04 | 传感电子公司 | 用于视频监视系统视野对准的方法和装置 |
JP2015060327A (ja) * | 2013-09-17 | 2015-03-30 | 株式会社リコー | 投影装置、投影方法及び情報処理システム |
JP2016220145A (ja) * | 2015-05-25 | 2016-12-22 | キヤノン株式会社 | 画像解析装置、画像解析方法、およびプログラム |
JP2018074528A (ja) * | 2016-11-02 | 2018-05-10 | キヤノン株式会社 | 情報処理システムおよびその構成機器、実空間の監視方法 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7880766B2 (en) | 2004-02-03 | 2011-02-01 | Panasonic Corporation | Detection area adjustment apparatus |
JP4418805B2 (ja) * | 2004-02-03 | 2010-02-24 | パナソニック株式会社 | 検出領域調整装置 |
CN106950925A (zh) * | 2007-09-10 | 2017-07-14 | 费舍-柔斯芒特系统股份有限公司 | 过程控制系统中的位置依赖控制访问 |
JP5428210B2 (ja) * | 2008-06-11 | 2014-02-26 | ソニー株式会社 | 情報処理装置、撮像システム、録画制御方法及びプログラム |
WO2016040085A1 (en) * | 2014-09-10 | 2016-03-17 | Siemens Aktiengesellschaft | Gas turbine failure prediction utilizing supervised learning methodologies |
WO2016132267A2 (en) * | 2015-02-19 | 2016-08-25 | Spark S.R.L. | A video management system and method |
CN105554449B (zh) * | 2015-12-11 | 2018-04-27 | 浙江宇视科技有限公司 | 一种用于快速拼接摄像机图像的方法及装置 |
CN105956555A (zh) * | 2016-04-29 | 2016-09-21 | 广东小天才科技有限公司 | 拍照搜题的方法及装置 |
CN106202359B (zh) * | 2016-07-05 | 2020-05-15 | 广东小天才科技有限公司 | 拍照搜题的方法及装置 |
CN106454068B (zh) * | 2016-08-30 | 2019-08-16 | 广东小天才科技有限公司 | 一种快捷获取有效图像的方法和装置 |
CN106303255B (zh) * | 2016-08-30 | 2019-08-02 | 广东小天才科技有限公司 | 快速获取目标区域图像的方法和装置 |
CN111095031B (zh) * | 2017-09-12 | 2023-06-13 | 三菱电机株式会社 | 人体检测装置以及照明装置 |
CN112004028B (zh) * | 2020-09-03 | 2021-06-15 | 南京国础工程技术有限公司 | 一种基于机器视觉的智慧社区智能安防监测管理系统 |
CN112969034B (zh) * | 2021-03-01 | 2023-03-03 | 华雁智能科技(集团)股份有限公司 | 摄像装置布点方案图的验证方法、装置及可读存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000083243A (ja) * | 1998-09-04 | 2000-03-21 | Canon Inc | 撮像装置、撮像システム、撮像制御方法、及び記憶媒体 |
JP2001094975A (ja) * | 1999-09-20 | 2001-04-06 | Hitachi Ltd | 移動物体追跡方法および装置 |
JP2002077887A (ja) * | 2000-09-04 | 2002-03-15 | Mitsubishi Electric Corp | 自動監視方法および自動監視装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3618891B2 (ja) * | 1996-04-08 | 2005-02-09 | キヤノン株式会社 | カメラ制御装置及びカメラ制御情報の表示方法 |
US6778207B1 (en) * | 2000-08-07 | 2004-08-17 | Koninklijke Philips Electronics N.V. | Fast digital pan tilt zoom video |
US20030053658A1 (en) * | 2001-06-29 | 2003-03-20 | Honeywell International Inc. | Surveillance system and methods regarding same |
JP3870124B2 (ja) * | 2002-06-14 | 2007-01-17 | キヤノン株式会社 | 画像処理装置及びその方法、並びにコンピュータプログラム及びコンピュータ可読記憶媒体 |
-
2005
- 2005-01-18 JP JP2005517639A patent/JP3905116B2/ja not_active Expired - Fee Related
- 2005-01-18 CN CNB2005800040192A patent/CN100566405C/zh not_active Expired - Fee Related
- 2005-01-18 WO PCT/JP2005/000544 patent/WO2005076620A1/ja not_active Application Discontinuation
- 2005-01-18 AT AT05703781T patent/ATE547896T1/de active
- 2005-01-18 EP EP05703781A patent/EP1771005B1/en not_active Not-in-force
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000083243A (ja) * | 1998-09-04 | 2000-03-21 | Canon Inc | 撮像装置、撮像システム、撮像制御方法、及び記憶媒体 |
JP2001094975A (ja) * | 1999-09-20 | 2001-04-06 | Hitachi Ltd | 移動物体追跡方法および装置 |
JP2002077887A (ja) * | 2000-09-04 | 2002-03-15 | Mitsubishi Electric Corp | 自動監視方法および自動監視装置 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007068041A (ja) * | 2005-09-01 | 2007-03-15 | Matsushita Electric Ind Co Ltd | 広域監視パノラマシステム |
JP4527638B2 (ja) * | 2005-09-01 | 2010-08-18 | パナソニック株式会社 | 広域監視パノラマシステム |
JP2007208387A (ja) * | 2006-01-31 | 2007-08-16 | Matsushita Electric Ind Co Ltd | センサー配置装置、センサー制御装置およびセンサー制御システム |
JP2009527149A (ja) * | 2006-02-13 | 2009-07-23 | ソニー株式会社 | 複数のビデオストリームを組み合わせるシステムおよび方法 |
US9182228B2 (en) | 2006-02-13 | 2015-11-10 | Sony Corporation | Multi-lens array system and method |
CN101636769B (zh) * | 2006-12-07 | 2012-07-04 | 传感电子公司 | 用于视频监视系统视野对准的方法和装置 |
JP2015060327A (ja) * | 2013-09-17 | 2015-03-30 | 株式会社リコー | 投影装置、投影方法及び情報処理システム |
JP2016220145A (ja) * | 2015-05-25 | 2016-12-22 | キヤノン株式会社 | 画像解析装置、画像解析方法、およびプログラム |
JP2018074528A (ja) * | 2016-11-02 | 2018-05-10 | キヤノン株式会社 | 情報処理システムおよびその構成機器、実空間の監視方法 |
Also Published As
Publication number | Publication date |
---|---|
JP3905116B2 (ja) | 2007-04-18 |
EP1771005A1 (en) | 2007-04-04 |
CN1914919A (zh) | 2007-02-14 |
CN100566405C (zh) | 2009-12-02 |
EP1771005B1 (en) | 2012-02-29 |
JPWO2005076620A1 (ja) | 2008-01-10 |
EP1771005A4 (en) | 2010-03-17 |
ATE547896T1 (de) | 2012-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005076620A1 (ja) | 検出領域調整装置 | |
JP4418805B2 (ja) | 検出領域調整装置 | |
JP3887403B2 (ja) | カメラ端末および撮影領域調整装置 | |
JP3886524B2 (ja) | カメラ端末および監視システム | |
JP3968376B2 (ja) | 撮影領域調整装置 | |
JP3902222B2 (ja) | 監視システム、監視方法及びカメラ端末 | |
US7787013B2 (en) | Monitor system and camera | |
US7880766B2 (en) | Detection area adjustment apparatus | |
JP4931218B2 (ja) | 撮像装置、物体検出方法及び姿勢パラメータの算出方法 | |
JP3700707B2 (ja) | 計測システム | |
US20060017812A1 (en) | Camera link system, camera device and camera link control method | |
US20060055792A1 (en) | Imaging system with tracking function | |
JP2016177640A (ja) | 映像監視システム | |
JP7204346B2 (ja) | 情報処理装置、システム、情報処理方法及びプログラム | |
JP2009010728A (ja) | カメラ設置支援装置 | |
JP4227037B2 (ja) | 撮像システム及び校正方法 | |
KR101916093B1 (ko) | 객체 추적 방법 | |
JP3875199B2 (ja) | 撮影装置 | |
WO2021251171A1 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP2006033188A (ja) | 監視装置および監視方法 | |
KR101996907B1 (ko) | 객체 추적 장치 | |
JP2004320175A (ja) | 監視カメラシステム | |
JP4846203B2 (ja) | 画像領域設定装置及び画像領域設定方法 | |
JP4027294B2 (ja) | 移動体検出装置、移動体検出方法及び移動体検出プログラム | |
WO2023166649A1 (ja) | 移動軌跡情報処理装置、移動軌跡情報処理方法、および記録媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 11115152 Country of ref document: US |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005517639 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005703781 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580004019.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005703781 Country of ref document: EP |