WO2017199398A1 - Display control device and imaging device - Google Patents

Display control device and imaging device

Info

Publication number
WO2017199398A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
camera
unit
video data
turning
Prior art date
Application number
PCT/JP2016/064904
Other languages
English (en)
Japanese (ja)
Inventor
教敬 岸田
信幸 猪木原
康行 笠井
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2016/064904 priority Critical patent/WO2017199398A1/fr
Priority to US16/099,609 priority patent/US10440254B2/en
Priority to JP2018518021A priority patent/JP6598992B2/ja
Publication of WO2017199398A1 publication Critical patent/WO2017199398A1/fr

Classifications

    • H04N23/67 Focus control based on electronic image sensor signals
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/56 Accessories
    • G03B17/561 Support related camera accessories
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/683 Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N23/6815 Motion detection by distinguishing pan or tilt from motion

Definitions

  • the present invention relates to a display control device that displays on a monitor a video image taken by a camera that is rotated using a mechanism that transmits rotational force by meshing teeth such as a worm and a worm wheel.
  • the pan turning mechanism includes a worm 33 to which a pan motor 32 is attached and a worm wheel 34 to which the camera 10 is attached.
  • FIG. 30 is a view of the pan turning mechanism of FIG. 29 as viewed from the direction A in the drawing.
  • the teeth of the worm 33 and the teeth of the worm wheel 34 are engaged with each other, and the camera 10 is turned to a desired position by the forward or reverse rotation of the pan motor 32.
  • FIG. 31 is a diagram showing the meshing between the teeth of the worm 33 and the teeth of the worm wheel 34.
  • in the meshing between the teeth of the worm 33 and the teeth of the worm wheel 34, a backlash, that is, a gear gap, exists.
  • for example, the total number of teeth of the worm wheel 34 is 120, and the tooth angle 34a is 3 degrees.
  • the backlash, that is, the gap 33a plus the gap 33b, is set to 0.03 degrees.
  • the tooth angle 34a and the gaps 33a and 33b each indicate an angle as viewed from the rotation center of the worm wheel 34.
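The figures above are consistent with each other, which a short calculation confirms. A minimal sketch in Python, using only the example values stated in the description (the variable names are illustrative, not from the patent):

```python
# Example values from the description: the worm wheel 34 has 120 teeth,
# and the backlash (gap 33a + gap 33b) is set to 0.03 degrees.
TOTAL_TEETH = 120
BACKLASH_DEG = 0.03

# Each tooth spans an equal arc as seen from the rotation center of the
# worm wheel 34, giving the tooth angle 34a.
tooth_angle_deg = 360.0 / TOTAL_TEETH
print(tooth_angle_deg)  # 3.0

# The backlash is the angular play of the worm wheel (and of the attached
# camera 10) that appears when the worm 33 reverses its rotation direction.
print(BACKLASH_DEG)     # 0.03
```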
  • FIG. 32 is a diagram showing the meshing when the worm 33 is rotated forward.
  • FIG. 33 is a diagram showing the meshing when the worm 33 is reversed. In the reverse rotation, the worm wheel 34 rotates in the reverse rotation direction when the teeth of the worm 33 and the teeth of the worm wheel 34 come into contact with each other on the opposite gap 33b side.
  • for example, Patent Document 1 describes an image pickup apparatus that creates in advance a stop position correction value table in which measured stop position error amounts are recorded, and adjusts the cutout position from the captured image using the table.
  • thereby, an image in which the stop position error is reduced can be used as a display image.
  • however, Patent Document 1 merely reduces an apparent stop position error; the actual stop position of the camera remains shifted from a desired position, for example, a preset position. To eliminate the deviation of the camera stop position itself, an additional fine adjustment of the camera position is necessary. However, it is not preferable that the displayed video fluctuates during this fine adjustment, because a good video cannot then be provided to the user.
  • the present invention has been made in order to solve the above-described problems, and its purpose is to improve the stop position accuracy of the camera while reducing fluctuation of the displayed image, so that a good image can be provided.
  • the display control device comprises: a video cutout unit that outputs video data within a region that is part of the shooting range of a camera, the camera being turned, by rotational force transmitted through the meshing of teeth in a rotational force transmission mechanism, in a first rotational direction used for defining the camera position and in a second rotational direction opposite thereto; a cutout position calculation unit that calculates the movement amount of the shooting range corresponding to the turning amount of the camera; and a display control unit that displays the image indicated by the video data within the region.
  • the video cutout unit outputs video data of the region whose position in the shooting range is fixed at a set position until the camera turning in the second rotation direction reaches a first position.
  • from the first position until the camera further turns by a set angle in the second rotation direction and then turns by the set angle in the first rotation direction to reach a second position, the video cutout unit outputs video data of the region whose position in the shooting range is moved in a direction that cancels the movement of the region linked to the turning of the camera; and when the camera reaches the second position, it outputs video data of the region whose position in the shooting range is fixed at the set position.
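One possible reading of the three-phase cutout behavior claimed above can be sketched as a small function. This is an illustrative interpretation, not the patent's implementation; the phase labels, the linear pixels-per-degree model, and the sign convention are all assumptions:

```python
def cutout_offset_px(phase, camera_deg, first_pos_deg, px_per_deg):
    """Horizontal offset added to the fixed (set) cutout position.

    phase is one of:
      "before_first" - the camera, turning in the second rotation direction,
                       has not yet reached the first position: region fixed;
      "between"      - from the first position until the camera turns by the
                       set angle in the second direction and back by the set
                       angle in the first direction to the second position:
                       the region moves so as to cancel the image movement
                       linked to the turning;
      "after_second" - the camera is at the second position: region fixed.
    """
    if phase in ("before_first", "after_second"):
        return 0.0
    # Shift the cutout by the image displacement the turning produced,
    # with the opposite sign, so the displayed scene appears stationary.
    return -(camera_deg - first_pos_deg) * px_per_deg

print(cutout_offset_px("before_first", 119.97, 120.0, 32.0))  # 0.0
print(cutout_offset_px("between", 119.97, 120.0, 32.0))       # ~0.96
```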
  • FIGS. 2A and 2B are diagrams showing a hardware configuration example of the control unit according to Embodiment 1 of the present invention. FIG. 3 is a flowchart showing the processing of the imaging device according to Embodiment 1 of the present invention, and FIG. 4 is a timing chart corresponding to the flowchart of FIG. 3. FIG. 5 is a flowchart showing the processing of the imaging device according to Embodiment 1, and FIG. 6 is a timing chart corresponding to the flowchart of FIG. 5. FIG. 7 is a flowchart showing the processing of the imaging device according to Embodiment 1. A further figure shows the turning operation of the camera.
  • FIGS. 15A, 15B, and 15C are diagrams showing an image of the processing by the cutout position calculation unit and the video cutout unit according to Embodiment 1 of the present invention. A further figure shows the corresponding video.
  • FIG. 1 is a configuration diagram of an imaging apparatus 1 according to Embodiment 1 of the present invention.
  • the imaging device 1 according to Embodiment 1 is used for monitoring, for example.
  • the imaging device 1 includes a camera 10, a control unit 21, a current position storage unit 22, a preset position storage unit 23, an input unit 24, a pan turning unit 30, a display control unit 41, and a monitor 42.
  • the camera 10 that captures an image is attached to the worm wheel 34 of the pan turning unit 30 and rotates integrally with the worm wheel 34.
  • the camera 10 includes a lens 11, a zoom change unit 12, an imaging unit 13, a video storage unit 14, a video cutout unit 15, and a video output unit 16.
  • the zoom change unit 12 changes the zoom value of the camera 10 by changing the zoom value of the lens 11.
  • the zoom changing unit 12 is a mechanism that moves a plurality of lenses constituting the lens 11 and changes the distance between the plurality of lenses.
  • the imaging unit 13 converts the subject image formed by the lens 11 into video data and outputs the video data to the video storage unit 14.
  • the imaging unit 13 includes an imaging element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) system.
  • the video storage unit 14 stores the video data output from the imaging unit 13.
  • the video storage unit 14 is composed of various types of memory, similar to the memory 101 described later.
  • the video cutout unit 15 cuts out a part of the video data stored in the video storage unit 14 and outputs it to the video output unit 16. As a result, only video data within a part of the shooting range of the camera 10 is output to the video output unit 16.
  • the video output unit 16 is a video output interface that outputs the video data output from the video cutout unit 15 to the display control unit 41.
  • the display control unit 41 controls the monitor 42 to display on the monitor 42 the video indicated by the video data output by the video cutout unit 15 via the video output unit 16.
  • the monitor 42 is a liquid crystal display, for example.
  • the control unit 21 controls each component of the imaging device 1 such as the camera 10 and the pan turning unit 30.
  • the control unit 21 includes a pan rotation control unit 21a, a cut-out position calculation unit 21b, and a zoom control unit 21c.
  • the pan turning control unit 21a outputs signals instructing the pan driving unit 31 to rotate and stop the pan motor 32. These signals include the forward/reverse rotation signal S1 and the turning pulse signal S2.
  • the pan rotation control unit 21a stores and reads the position of the camera 10 and stores and reads the preset position with respect to the current position storage unit 22 and the preset position storage unit 23.
  • the cutout position calculation unit 21b instructs the video cutout unit 15 about the cutout position. At that time, if necessary, it calculates how much the shooting range of the camera 10 moves as the camera 10 turns.
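The movement of the shooting range for a given turning amount can be approximated from the camera's horizontal angle of view. A minimal sketch, assuming a linear mapping between pan angle and horizontal pixel shift (a simplification; real lenses introduce distortion), with illustrative values not from the patent:

```python
def shooting_range_shift_px(turn_deg, h_fov_deg, frame_width_px):
    """Approximate horizontal image shift (in pixels) produced by a pan of
    turn_deg, assuming the full frame width spans h_fov_deg linearly."""
    return turn_deg * frame_width_px / h_fov_deg

# Illustrative values: a 60-degree horizontal angle of view over a
# 1920-pixel-wide frame; a 0.03-degree pan (the example backlash) then
# shifts the image by about one pixel.
print(shooting_range_shift_px(0.03, 60.0, 1920))  # ~0.96
```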
  • the zoom control unit 21c outputs a signal instructing the operation of the zoom change unit 12 to the zoom change unit 12, and sets the zoom value.
  • the current position of the camera 10 is stored in the current position storage unit 22 by the pan turning control unit 21a.
  • the current position storage unit 22 is composed of various types of memory in the same manner as the memory 101 described later.
  • in the preset position storage unit 23, the preset position of the camera 10 is stored by the pan turning control unit 21a.
  • the preset position is stored when the preset position is registered by a user instruction signal via the input unit 24.
  • the preset position storage unit 23 is composed of various memories in the same manner as the memory 101 described later.
  • the input unit 24 receives a user operation and outputs a user instruction signal to the control unit 21.
  • the input unit 24 is, for example, an operation button.
  • the pan turning unit 30 is a mechanism that is controlled by the control unit 21 to turn the camera 10 in the pan direction.
  • the pan turning unit 30 includes a pan driving unit 31, a pan motor 32, a worm 33, a worm wheel 34, an origin generation unit 35, and an origin detection unit 36.
  • the pan driving unit 31 is a driver that controls the driving voltage and driving current supplied to the pan motor 32 in accordance with the forward / reverse rotation signal S1 and the turning pulse signal S2 output from the pan turning control unit 21a.
  • the pan motor 32 is rotated by the drive voltage and drive current supplied by the pan drive unit 31.
  • the pan motor 32 is, for example, a stepping motor.
  • the worm 33 is attached to the output shaft of the pan motor 32 and rotates integrally with the output shaft.
  • the worm wheel 34 meshes with the worm 33 and rotates as the worm 33 rotates.
  • the camera 10 is attached to the worm wheel 34, and the rotational force generated by the pan motor 32 is transmitted to the camera 10 by meshing teeth of the worm 33 and the worm wheel 34, and the camera 10 turns.
  • the worm 33 and the worm wheel 34 constitute a rotational force transmission mechanism.
  • the origin generating portion 35 is attached to an arbitrary position on the outer peripheral surface, which is the side surface on which the teeth of the worm wheel 34 are formed, without affecting the meshing between the worm 33 and the worm wheel 34.
  • the origin detection unit 36 detects the position of the origin generation unit 35 and outputs an origin detection signal S3 to the pan rotation control unit 21a of the control unit 21.
  • the origin detection unit 36 is fixed in the vicinity of the worm wheel 34 by a support mechanism (not shown).
  • the origin generator 35 is made of, for example, a sheet metal having a slit formed therein, and the origin detector 36 is made of, for example, a photo interrupter. Then, the origin detection unit 36 detects the origin generation unit 35 by passing a sheet metal in which a slit is formed between the light emitting unit and the light receiving unit of the photo interrupter.
  • a known object detection method using a magnetic sensor, a relay, or the like may be applied.
  • the video cutout unit 15, the cutout position calculation unit 21b, and the display control unit 41 constitute a display control device.
  • alternatively, the video cutout unit 15, the cutout position calculation unit 21b, and the display control unit 41 may constitute a single display control device, and the camera 10, configured without these units, may be connected so that it can communicate with the control unit 21 and the monitor 42.
  • the functions of the pan rotation control unit 21a, the cutout position calculation unit 21b, the zoom control unit 21c, the video cutout unit 15, and the display control unit 41 of the control unit 21 are realized by a processing circuit.
  • the processing circuit may be dedicated hardware or a CPU (Central Processing Unit) that executes a program stored in a memory.
  • the CPU is also called a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a processor, and a DSP (Digital Signal Processor).
  • FIG. 2A is a diagram illustrating a hardware configuration example when the functions of the respective units of the control unit 21 are realized by the processing circuit 100 which is dedicated hardware.
  • the processing circuit 100 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or a combination thereof.
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • the functions of the respective parts of the pan rotation control unit 21a, the cutout position calculation unit 21b, and the zoom control unit 21c may be realized by combining separate processing circuits 100, or the functions of the respective units may be realized by one processing circuit 100.
  • the video cutout unit 15 and the display control unit 41 can also be realized by the same hardware configuration as the control unit 21 shown in FIG. 2A.
  • FIG. 2B is a diagram illustrating a hardware configuration example in a case where the functions of the respective units of the control unit 21 are realized by the CPU 102 that executes a program stored in the memory 101.
  • the functions of the pan rotation control unit 21a, the cutout position calculation unit 21b, and the zoom control unit 21c are realized by software, firmware, or a combination of software and firmware.
  • Software and firmware are described as programs and stored in the memory 101.
  • the CPU 102 implements the functions of each unit of the control unit 21 by reading and executing the program stored in the memory 101. That is, the control unit 21 has the memory 101 for storing programs that, when executed by the CPU 102, result in the execution of each step shown in the flowcharts of FIGS. 3, 5, 7, and 10 described later.
  • the memory 101 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM, a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • the video cutout unit 15 and the display control unit 41 can also be realized by the same hardware configuration as the control unit 21 shown in FIG. 2B.
  • alternatively, a part of the functions may be implemented by dedicated hardware and a part by software or firmware.
  • for example, the functions of the pan turning control unit 21a and the cutout position calculation unit 21b can be realized by a processing circuit as dedicated hardware, while the function of the zoom control unit 21c can be realized by a processing circuit reading and executing a program stored in the memory.
  • the processing circuit can realize the functions of the control unit 21, the video cutout unit 15, and the display control unit 41 by hardware, software, firmware, or a combination thereof.
  • the pan turning control unit 21a turns the camera 10 in the forward rotation direction (step ST101). Specifically, as shown in FIG. 4, the pan turning control unit 21a outputs the turning pulse signal S2 to the pan driving unit 31 after setting the forward/reverse rotation signal S1 output to the pan driving unit 31 to the H level. Then, the pan driving unit 31 supplies a driving voltage and a driving current to the pan motor 32, and the pan motor 32 performs step rotation according to the number of pulses of the turning pulse signal S2. The rotational force output from the pan motor 32 is transmitted to the worm wheel 34 via the worm 33, and the camera 10 turns in the forward rotation direction.
  • the pan turning control unit 21a needs to output 1800 pulses of the turning pulse signal S2 to the pan driving unit 31.
  • the pan turning control unit 21a subsequently determines whether the origin generation unit 35 has been detected using the origin detection signal S3 output from the origin detection unit 36 (step ST102).
  • the origin generation unit 35 is attached to the worm wheel 34, which rotates to turn the camera 10, and rotates integrally with the worm wheel 34.
  • when the origin generation unit 35 reaches the position of the origin detection unit 36, the origin detection unit 36 changes the origin detection signal S3 output to the pan turning control unit 21a from the L level to the H level, as shown in FIG. 4.
  • then, the pan turning control unit 21a determines that the origin generation unit 35 has been detected (step ST102; YES), stops the output of the turning pulse signal S2, stops the rotation of the pan motor 32, and also stops the turning of the camera 10 (step ST103).
  • when the origin detection signal S3 remains at the L level and the origin generation unit 35 is not detected (step ST102; NO), the process returns to step ST101, and the pan turning control unit 21a continues to output the turning pulse signal S2.
  • the pan turning control unit 21a defines the position of the camera 10 in a state where the turning is stopped due to the detection of the origin generation unit 35 as the turning origin of the camera 10. Specifically, the pan turning control unit 21a stores the current position of the camera 10 in the current position storage unit 22 as 0 degrees indicating the origin (step ST104). Thereafter, the turning operation and the turning angle of the camera 10 are processed with this origin as a reference.
  • the pan turning control unit 21a determines whether the user instructs to register the current position of the camera 10 as a preset position using the user instruction signal output from the input unit 24 (step ST111). When the user does not instruct to register the current position of the camera 10 as a preset position (step ST111; NO), the pan rotation control unit 21a uses the user instruction signal output from the input unit 24 to It is determined whether or not the user instructs turning (step ST112).
  • if the user has not instructed the camera 10 to turn (step ST112; NO), the process returns to step ST111.
  • when the user instructs to turn the camera 10 (step ST112; YES), the pan turning control unit 21a sets the forward/reverse rotation signal S1 output to the pan driving unit 31 to the H level as shown in FIG. 6, then outputs the turning pulse signal S2 to the pan driving unit 31, and turns the camera 10 in the forward rotation direction (step ST113).
  • the pan rotation control unit 21a stores the current position of the camera 10 in the current position storage unit 22 in accordance with the number of pulses of the output rotation pulse signal S2 (step ST114).
  • the camera 10 turns in the forward rotation direction by the number of pulses of the turning pulse signal S2, starting from the origin stored in step ST104 of FIG. 3. Therefore, for example, when the camera 10 turns 0.05 degrees per pulse of the turning pulse signal S2, if 100 pulses of the turning pulse signal S2 are output to the pan driving unit 31 while the camera 10 is at the origin, the camera 10 turns 5 degrees in the forward rotation direction from the origin.
  • the pan turning control unit 21a calculates the turning angle of the camera 10 using the number of output pulses, and stores the angle in the current position storage unit 22 as the current position of the camera 10.
  • the current position storage unit 22 always stores an angle indicating the position of the camera 10 at that time.
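The bookkeeping of steps ST113 and ST114 amounts to integrating pulses into an angle. A minimal sketch using the figure from the description (0.05 degrees per pulse); the function name is illustrative:

```python
# Each pulse of the turning pulse signal S2 turns the camera 0.05 degrees
# (the example value given in the description).
DEG_PER_PULSE = 0.05

def current_position_deg(origin_deg, pulses, forward=True):
    """Angle to store in the current position storage unit 22 after `pulses`
    pulses of S2 have been output, counted from the stored origin."""
    step = DEG_PER_PULSE if forward else -DEG_PER_PULSE
    return origin_deg + pulses * step

# From the origin (0 degrees), 100 forward pulses turn the camera 5 degrees,
# matching the example in the description.
print(current_position_deg(0.0, 100))  # ~5.0
```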
  • step ST114 After the process of step ST114, the process returns to step ST111.
  • when the user instructs to register the current position of the camera 10 as a preset position (step ST111; YES), the pan turning control unit 21a stops the output of the turning pulse signal S2, the rotation of the pan motor 32 is stopped, and the turning of the camera 10 is also stopped.
  • the pan rotation control unit 21a reads the current position of the stopped camera 10 stored in the current position storage unit 22, and stores the current position in the preset position storage unit 23 as the preset position of the camera 10 (step ST115).
  • a plurality of preset positions may be registered. In that case, a series of processes shown in FIG. 5 may be repeatedly performed.
  • next, the preset turning operation when the current position of the camera 10 is 30 degrees and the preset position A is 120 degrees will be described using the flowchart shown in FIG. 7 and the timing chart shown in FIG. 8.
  • first, the cutout position calculation unit 21b instructs the video cutout unit 15 about the cutout position of the video data stored in the video storage unit 14 (step ST120).
  • the video data stored in the video storage unit 14 is the video data of the entire shooting range, which corresponds to the shooting angle of view of the camera 10.
  • the video data cut out by the video cutout unit 15 becomes video data in a part of the shooting range of the camera 10. That is, the cutout position calculation unit 21b instructs the video cutout unit 15 to substantially determine the position of the area in the shooting range of the camera 10.
  • FIGS. 14A to 14C show an image of processing by the cutout position calculation unit 21b and the video cutout unit 15.
  • FIG. 14A shows the video F indicated by the video data output from the imaging unit 13, that is, a landscape that falls within the shooting range of the camera 10.
  • Video data indicating the video F is stored in the video storage unit 14 by the imaging unit 13.
  • FIG. 14B shows a video portion F1 corresponding to the cutout position indicated by the cutout position calculation unit 21b for the video F.
  • the cut-out position calculation unit 21b sets the cut-out position of the video data stored in the video storage unit 14 so that the center of the video part F1 and the center of the video F are matched, for example.
  • FIG. 14C shows a video F2 indicated by the video data output by the video cutout unit 15.
  • the video F2 matches the video portion F1 in FIG. 14B.
  • the video cutout unit 15 cuts out video data instructed by the cutout position calculation unit 21b from the video data stored in the video storage unit 14 and outputs the video data.
  • the display control unit 41 displays the video F2 on the monitor 42.
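The cutout of FIGS. 14A to 14C is, in effect, a center crop of the stored frame. A minimal sketch of such a crop (the function name, the list-of-lists frame representation, and the frame sizes are illustrative assumptions, not the patent's implementation):

```python
def center_cutout(frame, out_h, out_w):
    """Return the region of `frame` whose center coincides with the frame
    center, as the cutout position calculation unit 21b sets it in FIG. 14B.
    `frame` is a list of rows; each row is a list of pixel values."""
    h, w = len(frame), len(frame[0])
    top = (h - out_h) // 2
    left = (w - out_w) // 2
    return [row[left:left + out_w] for row in frame[top:top + out_h]]

# Tiny 4x6 stand-in for the video F; each pixel records its (row, col).
frame = [[(r, c) for c in range(6)] for r in range(4)]
f1 = center_cutout(frame, 2, 2)  # stand-in for the video portion F1 / F2
print(f1)  # [[(1, 2), (1, 3)], [(2, 2), (2, 3)]]
```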
  • FIGS. 15A to 15C show an image of processing by the cutout position calculation unit 21b and the video cutout unit 15 in a case where the cutout position differs from that of FIGS. 14A to 14C.
  • FIG. 15A is a view similar to FIG. 14A.
  • FIG. 15B shows the video portion F3 corresponding to the cut-out position indicated by the cut-out position calculation unit 21b for the video F, as in FIG. 14B.
  • the cutout position calculation unit 21b sets the cutout position of the video data stored in the video storage unit 14 so that, for example, the right end of the video portion F3 in the figure coincides with the right end of the video F.
  • FIG. 15C shows the video F4 indicated by the video data output by the video cutout unit 15 as in FIG. 14C.
  • Video F4 corresponds to video portion F3 in FIG. 15B.
  • the video cutout unit 15 cuts out video data instructed by the cutout position calculation unit 21b from the video data stored in the video storage unit 14 and outputs the video data.
  • the display control unit 41 displays the video F4 on the monitor 42.
  • whether the video is cut out at the position shown in FIGS. 14A to 14C or at the position shown in FIGS. 15A to 15C should be set when the user initially configures the imaging apparatus 1, or the like.
  • when the central portion is cut out as shown in FIGS. 14A to 14C, the monitoring target appears at the center of the shooting range of the camera 10, which makes the apparatus easy to use for monitoring work.
  • the cutout position shown in FIGS. 14A to 14C is instructed by the cutout position calculation unit 21b in step ST120. This instruction in step ST120 is then maintained until the cutout position is changed by the cutout position calculation unit 21b.
  • the pan turning control unit 21a acquires the current position of the camera 10 from the current position storage unit 22 (step ST121). If the current position is Xc degrees, Xc at this time is 30. Subsequently, the pan turning control unit 21a acquires the preset position A stored in the preset position storage unit 23 (step ST122). If the preset position is X degrees, X at this time is 120.
  • the pan turning control unit 21a calculates a turning direction and a turning angle using Xc and X (step ST123). As shown in FIG. 8, in this example the camera 10 is turned from the current position of 30 degrees to the preset position A of 120 degrees, so the turning direction is the forward rotation direction and the turning angle is 90 degrees. This is because the preset position A can be reached in a shorter time by turning the camera 10 in the forward rotation direction rather than the reverse rotation direction. Subsequently, the pan turning control unit 21a determines whether the calculated turning direction is the forward rotation direction (step ST124). In this example, since it is the forward rotation direction (step ST124; YES), the processing of steps ST125 and ST126 is subsequently performed. Since the processing of steps ST125 and ST126 is the same as that of steps ST113 and ST114 of FIG. 5 already described, its description is omitted.
  • Subsequently, the pan turning control unit 21a determines whether the turning pulse signal S2 has been output by the required number of pulses (step ST127). In this example, it determines whether the 1800 pulses necessary for turning the camera 10 by 90 degrees have been output. When the turning pulse signal S2 has not been output by the required number of pulses (step ST127; NO), the process returns to step ST125. On the other hand, when the turning pulse signal S2 has been output by the required number of pulses (step ST127; YES), the pan turning control unit 21a stops the output of the turning pulse signal S2 and thereby stops the turning (step ST128). At this time, the camera 10 has reached the preset position A.
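The direction-and-pulse-count selection described in steps ST123 and ST127 can be sketched as follows. This is an illustrative sketch only, not part of the disclosed embodiment; the 0.05-degrees-per-pulse figure comes from the text, and the function name is hypothetical.

```python
# Illustrative sketch of steps ST123/ST127: choose the shorter turning
# direction and convert the turning angle into a pulse count.
DEG_PER_PULSE = 0.05  # the camera 10 turns 0.05 degrees per pulse of S2

def plan_turn(current_deg, target_deg):
    """Return (direction, pulses) for the shortest turn to the target."""
    diff = (target_deg - current_deg) % 360.0
    if diff <= 180.0:
        direction, angle = "forward", diff
    else:
        direction, angle = "reverse", 360.0 - diff
    return direction, round(angle / DEG_PER_PULSE)

# From 30 degrees to preset position A at 120 degrees: forward, 1800 pulses.
print(plan_turn(30, 120))   # -> ('forward', 1800)
# From 30 degrees to preset position B at 300 degrees: reverse, 1800 pulses.
print(plan_turn(30, 300))   # -> ('reverse', 1800)
```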
  • FIG. 16 shows the videos F10a to F13a indicated by the video data stored in the video storage unit 14 and the videos F10b to F13b indicated by the video data output by the video cutout unit 15 during the processing described above.
  • Until the camera 10 reaches the preset position A, the range of the landscape shown in the videos F10a to F13a moves in accordance with the turning of the camera 10. Since the cutout position for the video data stored in the video storage unit 14 is set to the position shown in FIGS. 14A to 14C, the range of the landscape shown in the videos F10b to F13b also moves. When the camera 10 reaches the preset position A, the video F13b at the preset position A is displayed on the monitor 42 by the display control unit 41.
  • As described above, the position of the origin is determined with the camera 10 rotated in the forward direction, and the preset position is also registered with the camera 10 rotated in the forward direction. That is, the forward rotation direction is the first rotation direction used for defining the position of the camera 10. Therefore, the position of the camera 10 at step ST128, having been rotated in the forward direction to reach the preset position A, is accurate, unaffected by the backlash generated between the worm 33 and the worm wheel 34. Accordingly, the video displayed on the monitor 42 by the display control unit 41 using the video data output from the video cutout unit 15 is also an accurate video at the preset position A.
  • Next, the preset operation when the current position of the camera 10 is 30 degrees and the preset position B is 300 degrees will be described with reference to the flowcharts of FIGS. 7 and 10 and the timing chart shown in FIG. 12.
  • First, the cutout position calculation unit 21b instructs the video cutout unit 15 on the cutout position for the video data stored in the video storage unit 14 (step ST120).
  • The following description assumes that the cutout position calculation unit 21b has instructed the cutout positions shown in FIGS. 14A to 14C in step ST120.
  • the pan turning control unit 21a acquires the current position of the camera 10 from the current position storage unit 22 (step ST121). If the current position is Xc degrees, Xc at this time is 30. Subsequently, the pan turning control unit 21a acquires the preset position B stored in the preset position storage unit 23 (step ST122). If the preset position is X degrees, X at this time is 300.
  • Next, the pan turning control unit 21a calculates a turning direction and a turning angle using Xc and X (step ST123). As shown in FIG. 11, in this example, since the camera 10 is turned from the current position of 30 degrees to the preset position B of 300 degrees, the turning direction is set to the reverse direction and the turning angle to 90 degrees. This is because turning the camera 10 in the reverse direction rather than the forward direction reaches the preset position B in a shorter time. Subsequently, the pan turning control unit 21a determines whether the calculated turning direction is the forward rotation direction (step ST124). In this example, since the direction is the reverse direction (step ST124; NO), the processing of step ST130 shown in FIG. 10 is subsequently performed.
  • the pan rotation control unit 21a sets the number of offset pulses (step ST130).
  • The number of offset pulses is set using, for example, the tooth angle 34a of the worm wheel 34 shown in FIG. Assuming that the number of pulses of the turning pulse signal S2 necessary for turning the camera 10 by 1.5 degrees, which is half of the tooth angle 34a of the worm wheel 34, is the number of offset pulses, the number of offset pulses is 30, because the camera 10 turns only 0.05 degrees per pulse of the turning pulse signal S2.
  • The number of offset pulses is not limited to half of the tooth angle 34a of the worm wheel 34; it may be based on one third of the tooth angle 34a, or on 100% of the tooth angle 34a. In short, the backlash between the worm 33 and the worm wheel 34 is the lower limit, and a value larger than it is used as the reference.
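The offset-pulse derivation above can be written out numerically. This is an illustrative sketch, not part of the disclosed embodiment: the 0.05-degrees-per-pulse figure follows from the text and the 3-degree tooth angle follows from 1.5 degrees being its half, while the concrete backlash value and the function name are assumptions.

```python
# Illustrative sketch: derive the number of offset pulses from a fraction of
# the worm-wheel tooth angle, with the backlash as the lower limit.
import math

DEG_PER_PULSE = 0.05    # camera turning angle per pulse of S2
TOOTH_ANGLE_DEG = 3.0   # tooth angle 34a (half of it is the 1.5 deg example)

def offset_pulses(fraction_of_tooth, backlash_deg):
    """Pulses for the offset turn; never less than the backlash requires."""
    angle = max(TOOTH_ANGLE_DEG * fraction_of_tooth, backlash_deg)
    return math.ceil(angle / DEG_PER_PULSE)

# Half of the 3-degree tooth angle = 1.5 degrees -> 30 offset pulses
# (backlash_deg=1.0 is an assumed value for illustration).
print(offset_pulses(0.5, backlash_deg=1.0))   # -> 30
```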
  • Next, the pan turning control unit 21a outputs the turning pulse signal S2 to the pan driving unit 31 after setting the forward/reverse rotation signal S1 output to the pan driving unit 31 to the L level. Thereby, the pan turning control unit 21a turns the camera 10 in the reverse direction (step ST131). Subsequently, the pan turning control unit 21a determines whether the turning pulse signal S2 has been output by the amount corresponding to the turning angle calculated in step ST123, here 1800 pulses (step ST132). When the turning pulse signal S2 has not been output by the amount corresponding to the turning angle (step ST132; NO), the process returns to step ST131.
  • On the other hand, when the turning pulse signal S2 has been output by the amount corresponding to the turning angle (step ST132; YES), the process proceeds to step ST133.
  • As already described, the position of the camera 10 after the turning pulse signal S2 has been output by the amount corresponding to the turning angle is shifted from the original preset position B by the backlash of the worm 33 and the worm wheel 34.
  • Hereinafter, the position of the camera 10 that deviates from the original preset position B in this way is referred to as the target position B1.
  • the target position B1 is shown in FIG.
  • Subsequently, the pan turning control unit 21a outputs the turning pulse signal S2 for the additional number of offset pulses, that is, 30 pulses here, and further turns the camera 10 in the reverse direction (step ST133).
  • As described above, half of the tooth angle 34a of the worm wheel 34 is larger than the backlash between the worm 33 and the worm wheel 34; therefore, the camera 10 continues turning in the reverse direction.
  • While the pan turning control unit 21a outputs the turning pulse signal S2 for the number of offset pulses, the cutout position calculation unit 21b changes the cutout position for the video data stored in the video storage unit 14 (step ST134). Details of step ST134 will be described later.
  • Subsequently, the pan turning control unit 21a determines whether the turning pulse signal S2 has been output by the number of offset pulses, that is, 30 pulses here (step ST135). When the turning pulse signal S2 has not been output by the number of offset pulses (step ST135; NO), the process returns to step ST133. On the other hand, when the turning pulse signal S2 has been output by the number of offset pulses (step ST135; YES), the process proceeds to step ST136.
  • The position of the camera 10, which has been turned in the reverse direction by applying 1800 + 30 pulses of the turning pulse signal S2 from the current position acquired in step ST121, overshoots the preset position B in the reverse direction by about half of the tooth angle 34a of the worm wheel 34.
  • Hereinafter, the position of the camera 10 that deviates from the original preset position B in this way is referred to as the target position B2.
  • the target position B2 is shown in FIG.
  • Subsequently, the pan turning control unit 21a turns the camera 10 in the forward direction by the number of offset pulses, that is, 30 pulses here (step ST136).
  • While the pan turning control unit 21a outputs the turning pulse signal S2 for the number of offset pulses, the cutout position calculation unit 21b continues to change the cutout position for the video data stored in the video storage unit 14, as in step ST134 (step ST137). Details of step ST137 will be described later.
  • Subsequently, the pan turning control unit 21a determines whether the turning pulse signal S2 has been output by the number of offset pulses, that is, 30 pulses here, after turning the camera 10 in the forward direction (step ST138). If the turning pulse signal S2 has not been output by the number of offset pulses (step ST138; NO), the process returns to step ST136. On the other hand, when the turning pulse signal S2 has been output by the number of offset pulses (step ST138; YES), the process proceeds to step ST139. In this manner, the camera 10 is turned additionally in the reverse direction by the number of offset pulses and then returned in the forward direction by the same number of pulses. Thereby, the worm 33 and the worm wheel 34 are brought into the meshing state of forward rotation. Therefore, the position of the camera 10 is the accurate preset position B, from which the stop position error due to backlash has been eliminated.
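Why the extra reverse turn plus an equal forward return removes the backlash error can be checked with a simplified gear simulation. This sketch is not part of the disclosed embodiment: the 1.0-degree backlash value and the slack-take-up model are illustrative assumptions, and the class name is hypothetical.

```python
# Illustrative sketch of steps ST131-ST138: a gear train in which the camera
# only moves once the slack (backlash) in the mesh has been taken up.
DEG_PER_PULSE = 0.05
BACKLASH_DEG = 1.0   # assumed backlash between worm 33 and worm wheel 34

class GearTrain:
    def __init__(self, camera_deg):
        self.camera = camera_deg
        self.slack = 0.0  # 0 = forward-rotation mesh, BACKLASH_DEG = reverse

    def pulse(self, direction):
        step = DEG_PER_PULSE
        if direction == "forward":
            take_up = min(self.slack, step)
            self.slack -= take_up
            self.camera += step - take_up
        else:
            take_up = min(BACKLASH_DEG - self.slack, step)
            self.slack += take_up
            self.camera -= step - take_up

g = GearTrain(camera_deg=30.0)
for _ in range(1800 + 30):   # reverse turn past B to target position B2
    g.pulse("reverse")
for _ in range(30):          # forward return by the offset pulses
    g.pulse("forward")
# The camera lands exactly on preset position B (300 degrees) and the
# gears end in the forward-rotation meshing state (slack taken up).
print(round(g.camera % 360, 2), g.slack == 0.0)   # -> 300.0 True
```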
  • the cutout position calculation unit 21b instructs the video cutout unit 15 to return the cutout position for the video data stored in the video storage unit 14 to the position specified in step ST120 (step ST139).
  • the video displayed on the monitor 42 by the display control unit 41 using the video data output from the video cutout unit 15 is an accurate video at the preset position B.
  • This instruction in step ST139 is then maintained until the cutout position is changed by the cutout position calculation unit 21b, and the video cutout unit 15 outputs the video data in an area whose position in the shooting range of the camera 10 is fixed at the set position.
  • FIG. 12 shows a timing chart until the camera 10 reaches the preset position B from the current position of 30 degrees as described above.
  • FIG. 17 shows the videos F15a to F22a indicated by the video data stored in the video storage unit 14 and the videos F15b to F22b indicated by the video data output by the video cutout unit 15 during the processing described above.
  • From the turning of the camera 10 in the reverse direction in step ST131 until step ST133, that is, until the position of the camera 10 reaches the target position B1 from the current position acquired in step ST121, the range of the landscape shown in the videos F15a to F18a moves in accordance with the turning of the camera 10.
  • At that time, since the cutout position for the video data stored in the video storage unit 14 has been set to the position shown in FIGS. 14A to 14C in step ST120, the range of the landscape shown in the videos F15b to F18b also moves.
  • That is, the video cutout unit 15 continues to output the video data in an area whose position in the shooting range of the camera 10 is fixed at the set position.
  • From the turning of the camera 10 in the reverse direction by the offset pulses in step ST133 until step ST139, that is, until the position of the camera 10 reaches the target position B2 and then the preset position B, the cutout position for the video data stored in the video storage unit 14 is changed as needed by the processing of steps ST134 and ST137.
  • When changing the cutout position in step ST134, the cutout position calculation unit 21b acquires the zoom value Zs of the lens 11 from the zoom control unit 21c and calculates the angle of view θs of the video F18b indicated by the video data output from the video cutout unit 15.
  • Note that a storage unit (not shown) stores the angle of view θs for the case where the zoom value Zs of the lens 11 is 1.
  • Then, the cutout position calculation unit 21b uses the calculated angle of view θs, the turning angle θ of the camera 10 per pulse of the turning pulse signal S2, and the resolution R of the video F18b indicated by the video data output by the video cutout unit 15 to calculate the number of moving pixels ΔP of the video F18b when the camera 10 turns by θ, as shown in the following equation (1). Note that the resolution R is the total number of pixels in the horizontal direction of the video F18b.
  • ΔP = θ / θs × R … (1)
  • Note that the result is the same even if the angle of view θs is replaced with the angle of view of the video indicated by the video data output from the imaging unit 13, and the resolution R is replaced with the total number of pixels in the horizontal direction of the video indicated by the video data output from the imaging unit 13.
  • The number of moving pixels ΔP of the video F18b is synonymous with the number of moving pixels of the video F18a when the camera 10 turns by θ, that is, the movement amount of the shooting range of the camera 10.
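Equation (1) can be verified against the numerical examples quoted later in the text (angle of view 6.4 degrees at zoom 10x, 3.2 degrees at zoom 20x, 1920 horizontal pixels, 0.05 degrees per pulse). The function name below is hypothetical; the figures are from the text.

```python
# Illustrative check of equation (1): moving pixels per pulse,
# Delta_P = theta / theta_s * R.
def moving_pixels(theta_per_pulse, view_angle_deg, horizontal_pixels):
    return theta_per_pulse / view_angle_deg * horizontal_pixels

# Zoom 10x: angle of view 6.4 degrees, 1920 horizontal pixels.
print(round(moving_pixels(0.05, 6.4, 1920)))   # -> 15
# Zoom 20x: angle of view 3.2 degrees.
print(round(moving_pixels(0.05, 3.2, 1920)))   # -> 30
```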
  • The cutout position calculation unit 21b instructs the video cutout unit 15 to move the position at which the video indicated by the video data output by the video cutout unit 15 is cut out by the number of moving pixels ΔP in the direction in which the range of the landscape shown in the videos F18a to F20a moves, that is, in the direction opposite to the turning direction of the camera 10.
  • Although the videos F19a and F20a indicated by the video data stored in the video storage unit 14 show that the camera 10 has turned beyond the target position B1, the videos F19b and F20b indicated by the video data output from the video cutout unit 15, whose cutout position has been moved leftward in the drawing by the number of moving pixels ΔP in accordance with the instruction from the cutout position calculation unit 21b, are the same as the video F18b at the target position B1.
  • Since the cutout position calculation unit 21b instructs such movement of the cutout position, the video data in a part of the shooting range of the camera 10 that the video cutout unit 15 outputs during that time is video data in an area whose position has been moved, using the movement amount of the shooting range, in conjunction with the turning of the camera 10.
  • Step ST137 differs in that the cutout position is moved in the direction opposite to that in step ST134, because the camera 10 is turning in the forward direction by the number of offset pulses.
  • That is, the video data in a part of the shooting range of the camera 10 that the video cutout unit 15 outputs while the camera 10 is turning in the forward direction by the offset pulses is video data in an area whose position in the shooting range has been moved, using the movement amount of the shooting range, in the direction that cancels the movement of the area in conjunction with the turning of the camera 10. Therefore, although the camera 10 is turning in the forward direction, the video that the display control unit 41 displays on the monitor 42 using the video data output from the video cutout unit 15 appears as if the camera 10 were stopped at the target position B1.
  • As described above, when the camera 10 is turned in the reverse direction to the preset position B, the camera 10 that has reached the target position B1 is turned further in the reverse direction by the number of offset pulses and then returned in the forward direction by the same number of pulses, so that the stop position error due to backlash can be eliminated. During that time, the video displayed on the monitor 42 appears as if the camera 10 were stopped at the target position B1. Therefore, the shaking of the video displayed on the monitor 42 while the camera 10 turns from the target position B1 to the preset position B is reduced, and a good video is provided to the user.
  • Note that the imaging apparatus 1 according to Embodiment 1 is applicable not only to the operation of turning the camera 10 to a preset position registered in advance but also to the operation of turning the camera 10 to an arbitrary target position.
  • In the above description, the target position B1 is shown as a specific example of the first position, and the preset position B is shown as a specific example of the second position.
  • the video cutout unit 15 outputs a part of the video data stored in the video storage unit 14.
  • The control unit 21 may change the horizontal/vertical readout timing of the imaging unit 13 so that the imaging unit 13 outputs only a part of the video data it generates. In this case, the imaging unit 13, which outputs video data in a part of the shooting range of the camera 10, functions as the video cutout unit 15.
  • Alternatively, a mechanical movable unit that moves in the horizontal and vertical directions and covers a part of the lens 11 may be provided so that the imaging unit 13 outputs video data in a part of the original shooting range of the camera 10. In this case as well, the imaging unit 13 functions as the video cutout unit 15.
  • When the camera 10 is an IP (Internet Protocol) camera, the video output unit 16 may output encoded data such as MPEG or H.264.
  • In addition, although the combination of the worm 33 and the worm wheel 34 is shown as the rotational force transmission mechanism, the rotational force may instead be transmitted from the worm 33 to the worm wheel 34 via a belt, or a rotational force transmission mechanism may be configured by combining a plurality of belts, worms 33, and worm wheels 34. In any case, the camera 10 may be turned by the rotational force transmitted through the meshing of teeth in the rotational force transmission mechanism.
  • As described above, according to the imaging apparatus 1 of Embodiment 1, even when the camera 10 is turned in the reverse direction to the preset position B, the stop position error due to backlash can be eliminated. At that time, the shaking of the video displayed on the monitor 42 while the camera 10 is turned in the reverse and forward directions by the offset pulses is reduced, and a good video is provided to the user.
  • Embodiment 2. FIG. 18 is a diagram showing the relationship between the video F30a indicated by the video data stored in the video storage unit 14 and the video F30b indicated by the video data output by the video cutout unit 15 when the zoom value of the lens 11 is, for example, 10 times.
  • When the zoom value of the lens 11 is 10 times, the angle of view θs of the video F30b is 6.4 degrees, the total number of pixels in the horizontal direction of the video F30b is 1920 pixels, equivalent to 2K, and the turning angle θ of the camera 10 per pulse is 0.05 degrees. In this case, the number of moving pixels ΔP is 15 pixels according to equation (1).
  • Therefore, the total number of moving pixels when the number of offset pulses is 30 is 15 pixels × 30, that is, 450 pixels. Further, the camera 10 turns 1.5 degrees in total by the 30 pulses of the turning pulse signal S2.
  • The video data output by the video cutout unit 15 during steps ST134 and ST137 of FIG. 10 is video data in an area whose position in the shooting range of the camera 10 has been moved. At this time, the area does not deviate from the shooting range. This is because the total number of pixels in the horizontal direction of the video F30b is 1920 pixels, the movement by the offset pulses is 450 pixels × 2 for the reverse and forward rotations, and the total of 2820 pixels is smaller than 2880 pixels.
  • FIG. 19 is a diagram showing the relationship between the video F31a indicated by the video data stored in the video storage unit 14 and the video F31b indicated by the video data output by the video cutout unit 15 when the zoom value of the lens 11 is set to, for example, 20 times.
  • In this case, since the angle of view θs of the video F31b is 3.2 degrees, the total number of moving pixels when the number of offset pulses is 30 is, according to equation (1), 30 pixels × 30, that is, 900 pixels.
  • Therefore, when processing is performed as in steps ST134 and ST137 of FIG. 10, the video data output by the video cutout unit 15 is video data in an area whose position in the shooting range of the camera 10 has been moved; however, the area then deviates from the shooting range. That is, the video indicated by the video data output by the video cutout unit 15 between steps ST134 and ST137 corresponds to a video with a total of 1920 pixels in the horizontal direction moved by 900 pixels to the right and to the left over a video with a total of 2880 pixels in the horizontal direction, so the required range grows to a total of 1920 pixels + 900 pixels × 2, that is, 3720 pixels, as shown in FIG. 19. In order to avoid such a situation, the total number of pixels in the horizontal direction of the video F31a must be larger than 3720 pixels, for example, 3840 pixels, equivalent to 4K.
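The fit condition contrasted above for the zoom values of 10x and 20x can be written out numerically. This is an illustrative sketch; the function name is hypothetical and the figures are those quoted in the text.

```python
# Illustrative sketch: does the cutout plus its back-and-forth travel by the
# offset pulses fit inside the stored video?
def fits(stored_w, cut_w, moving_px_per_pulse, offset_pulses):
    travel = moving_px_per_pulse * offset_pulses
    return cut_w + 2 * travel <= stored_w

# Zoom 10x: 1920 + 2*450 = 2820 <= 2880 -> fits.
print(fits(2880, 1920, 15, 30))   # -> True
# Zoom 20x: 1920 + 2*900 = 3720 > 2880 -> does not fit (4K storage needed).
print(fits(2880, 1920, 30, 30))   # -> False
print(fits(3840, 1920, 30, 30))   # -> True
```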
  • FIG. 20 is a configuration diagram of an imaging apparatus 1 according to Embodiment 2 of the present invention.
  • Components that are the same as or correspond to the components described in Embodiment 1 are denoted by the same reference numerals, and description thereof is omitted or simplified.
  • the imaging apparatus 1 according to Embodiment 2 includes a video pixel interpolation unit 50 and a first video switching unit 51 between the video cutout unit 15 and the video output unit 16. Further, the control unit 21 includes a switching control unit 21d.
  • the video pixel interpolation unit 50 performs pixel interpolation processing on the video data output from the video cutout unit 15 and outputs the result to the first video switching unit 51.
  • the first video switching unit 51 is a selector switch that selectively outputs the video data output from the video cutout unit 15 and the video data output from the video pixel interpolation unit 50 to the video output unit 16.
  • the video output unit 16 outputs the video data selectively output by the first video switching unit 51 to the display control unit 41.
  • the display control unit 41 causes the monitor 42 to display the video indicated by the video data.
  • the switching control unit 21d controls the switching operation of the first video switching unit 51.
  • The video pixel interpolation unit 50 and the switching control unit 21d may be realized by the processing circuit 100 as dedicated hardware as shown in FIG. 2A, or may be realized by the CPU 102 that executes a program stored in the memory 101 as shown in FIG. 2B.
  • When the video pixel interpolation unit 50 is realized by the CPU 102 that executes a program stored in the memory 101, the imaging apparatus 1 includes the memory 101 for storing a program that results in the execution of step ST145 shown in the flowchart described later. This program can be said to cause a computer to execute the procedure or method of the video pixel interpolation unit 50.
  • Likewise, when the switching control unit 21d is realized by the CPU 102 that executes a program stored in the memory 101, the imaging apparatus 1 includes the memory 101 for storing a program that results in the execution of step ST146 shown in the flowchart described later. This program can also be said to cause a computer to execute the procedure or method of the switching control unit 21d.
  • the video cutout unit 15, cutout position calculation unit 21b, display control unit 41, zoom control unit 21c, and video pixel interpolation unit 50 constitute a display control device.
  • The video cutout unit 15, the cutout position calculation unit 21b, the display control unit 41, the zoom control unit 21c, and the video pixel interpolation unit 50 may also form a single display control device that is communicably connected to the camera 10, the control unit 21, and the monitor 42, which are configured without them.
  • If the turning direction of the camera 10 is the reverse direction (step ST124; NO) as a result of the processing of steps ST120 to ST124 already described with reference to FIG. 7, the processing of step ST130 shown in FIG. 21 is subsequently performed.
  • The processing in steps ST130 to ST132 has already been described with reference to FIG. 10.
  • the cut-out position calculation unit 21b acquires the current zoom value Zc of the lens 11 from the zoom control unit 21c (step ST140).
  • the zoom value Zc is 20 times, for example.
  • Next, the cutout position calculation unit 21b calculates the number of moving pixels ΔP when the camera 10 turns by θ, as in equation (1). Further, the cutout position calculation unit 21b calculates the total number of moving pixels when the turning pulse signal S2 corresponding to the number of offset pulses is output (step ST141). Here, the total number of moving pixels is 900 pixels.
  • Next, the cutout position calculation unit 21b compares the total number of pixels in the horizontal direction of the video indicated by the video data stored in the video storage unit 14 with the number of pixels obtained by adding, to the total number of pixels in the horizontal direction of the video indicated by the video data output from the video cutout unit 15, the total number of moving pixels for the forward and reverse rotations. In this way, the cutout position calculation unit 21b determines whether the area deviates from the shooting range when the video cutout unit 15 outputs, between steps ST134 and ST137, the video data in the area whose position in the shooting range of the camera 10 has been moved (step ST142).
  • Here, the total number of pixels in the horizontal direction of the video indicated by the video data stored in the video storage unit 14 is 2880 pixels, and the total number of pixels in the horizontal direction of the video indicated by the video data output from the video cutout unit 15 is 1920 pixels. Further, the total number of moving pixels for the forward and reverse rotations is 900 pixels × 2, that is, 1800 pixels.
  • If the area does not deviate from the shooting range of the camera 10 (step ST142; NO), the process moves to step ST133 of FIG. 10. This corresponds to the case where the total number of pixels in the horizontal direction of the video indicated by the video data stored in the video storage unit 14 is larger than the total number of pixels in the horizontal direction of the video indicated by the video data output from the video cutout unit 15 plus the total number of moving pixels.
  • The processing after step ST133 is as already described with reference to FIG. 10.
  • On the other hand, if the area deviates from the shooting range (step ST142; YES), the zoom control unit 21c outputs to the zoom changing unit 12 a signal for changing the zoom value of the lens 11 to a zoom value Zw on the wide-angle side of the current zoom value Zc (step ST143). This corresponds to the case where the total number of pixels in the horizontal direction of the video indicated by the video data stored in the video storage unit 14 is smaller than the total number of pixels in the horizontal direction of the video indicated by the video data output from the video cutout unit 15 plus the total number of moving pixels. Here, the zoom value Zw is, for example, 10 times.
  • Subsequently, in order to adjust the apparent angle of view on the monitor 42, the cutout position calculation unit 21b changes, in accordance with the change from the zoom value Zc to the zoom value Zw, the cutout range instructed to the video cutout unit 15, in other words, the size of the video indicated by the video data output by the video cutout unit 15 (step ST144). Since the zoom value is always changed to the wide-angle side in step ST143, the size of the video is changed in the direction of reduction. That is, by this step ST144, the video cutout unit 15 outputs the video data in an area whose size in the shooting range of the camera 10 has been reduced.
  • For example, when the zoom value of the lens 11 is changed from 20 times to 10 times on the wide-angle side, the total number of pixels in the horizontal direction of the video indicated by the video data output from the video cutout unit 15 is changed from 1920 pixels to 960 pixels. At that time, the total number of pixels in the vertical direction of the video may be changed so that the aspect ratio is maintained. Thereby, when the center of the video indicated by the video data stored in the video storage unit 14 and the center of the video indicated by the video data output from the video cutout unit 15 coincide, as shown for example in FIGS. 14A to 14C, the center of the cutout does not change before and after the change of the video size; only the range cut out around that center changes.
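The resizing in step ST144 amounts to scaling the cutout by Zw/Zc so that the apparent angle of view on the monitor 42 is unchanged. The sketch below is illustrative only: the function name is hypothetical, and the 1080-pixel height is an assumed 16:9 value not stated in the text.

```python
# Illustrative sketch of step ST144: widen the lens from Zc to Zw and shrink
# the cutout by the same ratio, preserving the aspect ratio.
def resized_cutout(w, h, zc, zw):
    scale = zw / zc           # e.g. 10/20 = 0.5 when widening from 20x to 10x
    return round(w * scale), round(h * scale)

# 1920x1080 cutout at zoom 20x -> 960x540 at zoom 10x.
print(resized_cutout(1920, 1080, zc=20, zw=10))   # -> (960, 540)
```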
  • the video pixel interpolation unit 50 acquires the zoom value Zc and the zoom value Zw from the zoom control unit 21c, and sets pixel interpolation processing using them (step ST145).
  • When the zoom value is changed from 20 times to 10 times, the total number of pixels in the horizontal direction of the video indicated by the video data output from the video cutout unit 15 is changed from 1920 pixels to 960 pixels. Therefore, the pixel interpolation processing is set so that the total number of pixels in the horizontal direction is changed from the 960 pixels after the change back to the 1920 pixels before the change. The same applies to the pixels in the vertical direction.
  • Thereafter, the video pixel interpolation unit 50 performs the pixel interpolation processing on the video data output from the video cutout unit 15 in accordance with the setting in step ST145. Well-known methods such as bicubic, bilinear, and nearest-neighbor interpolation can be used for the pixel interpolation.
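As one of the well-known methods mentioned above, a plain bilinear upscale can be sketched as follows. This is an illustrative sketch, not the embodiment's implementation; it operates on a small grayscale array, and the function name is hypothetical.

```python
# Illustrative sketch: bilinear interpolation, as would restore a
# 960-pixel-wide cutout to 1920 pixels in step ST145.
def bilinear_resize(img, new_w, new_h):
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        fy = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0.0
        y0 = int(fy); y1 = min(y0 + 1, old_h - 1); wy = fy - y0
        row = []
        for x in range(new_w):
            fx = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0.0
            x0 = int(fx); x1 = min(x0 + 1, old_w - 1); wx = fx - x0
            # Blend the four surrounding source pixels.
            top = img[y0][x0] * (1 - wx) + img[y0][x1] * wx
            bot = img[y1][x0] * (1 - wx) + img[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

small = [[0, 100], [100, 200]]
big = bilinear_resize(small, 3, 3)
print(big[1][1])   # centre pixel is the average of the four corners -> 100.0
```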
  • FIG. 24 shows the video F32a indicated by the video data stored in the video storage unit 14, the video F32b indicated by the video data output by the video cutout unit 15, and the video F32c indicated by the video data subjected to the pixel interpolation processing by the video pixel interpolation unit 50, when the settings described in steps ST143 to ST145 have been made.
  • the switching control unit 21d controls the first video switching unit 51 to output the video data output from the video pixel interpolation unit 50 to the video output unit 16 (step ST146).
  • The switching control unit 21d outputs a control signal to the first video switching unit 51 to control the switching operation. For example, by setting the control signal to the H level, the first video switching unit 51 is caused to output the video data from the video pixel interpolation unit 50.
  • When the camera 10 reaches the preset position B through the processing of steps ST133 to ST138, the zoom value, the size of the video indicated by the video data output from the video cutout unit 15, the setting of the pixel interpolation processing, the switching operation of the first video switching unit 51, and the like, if they have been changed in steps ST143 to ST146, are restored (step ST147).
  • The processing from step ST133 to step ST138 has already been described in Embodiment 1, and a description thereof is omitted. Note that the number of moving pixels ΔP calculated by the cutout position calculation unit 21b in step ST134 is expressed by the following equation (2), which takes the pixel interpolation processing of the video pixel interpolation unit 50 into consideration.
  • FIG. 23 shows a timing chart until the camera 10 reaches the preset position B as described above.
  • As described above, with the imaging apparatus 1 according to Embodiment 2, while the position of the camera 10 is adjusted from the target position B1 to the target position B2 and from the target position B2 to the preset position B, a video as if the camera 10 were stopped at the target position B1 can be displayed on the monitor 42, without being affected by the zoom value of the lens 11, the resolution performance of the imaging unit 13, and the like.
  • In the above description, the processing of steps ST140 to ST146 is performed after the camera 10 reaches the target position B1.
  • However, these processes may be performed before the camera 10 reaches the target position B1. The same effect is obtained in this case as well.
  • In the above description, an example is shown in which the processing of steps ST140 to ST146 is performed immediately after the camera 10 reaches the target position B1. However, this processing may instead be performed during the turning operation of the camera 10 from the target position B1 to the preset position B. In this case, as long as the region from which the video cutout unit 15 outputs video data, whose position within the shooting range is moved in synchronization with the turning of the camera 10 by the offset pulses, does not deviate from the shooting range even when moved, the zoom value Zc of the lens 11 is kept fixed and the processing of steps ST142 to ST146 is not performed. When the region would deviate from the shooting range, the zoom value of the lens 11 is set slightly toward the wide-angle side and the processing of steps ST142 to ST146 is performed. The same effect is obtained in this case as well.
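The in-range check described above can be sketched as follows. This is a minimal Python illustration, not part of the patent; the coordinate convention, the function names, and the size of the wide-angle step are assumptions:

```python
def cutout_fits(frame_w, frame_h, x, y, w, h):
    # True if the moved cutout region lies entirely inside the shooting range
    return x >= 0 and y >= 0 and x + w <= frame_w and y + h <= frame_h

def zoom_for_offset_turn(zoom, frame_w, frame_h, x, y, w, h, widen_step=0.1):
    """While the cutout still fits, keep the zoom fixed and skip re-running
    steps ST142 to ST146; otherwise move the zoom slightly toward the
    wide-angle side (modeled here as a smaller zoom value) and redo them."""
    if cutout_fits(frame_w, frame_h, x, y, w, h):
        return zoom, False          # zoom unchanged, no re-setup needed
    return zoom - widen_step, True  # widen slightly, redo steps ST142 to ST146
```

For example, a 640x480 cutout at (1500, 100) in a 1920x1080 frame extends past the right edge, so the zoom would be stepped toward the wide-angle side.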
  • As described above, according to Embodiment 2, the same effect as in Embodiment 1 can be obtained without being affected by the zoom value of the lens 11, the resolution performance of the imaging unit 13, and the like.
  • FIG. 25 is a configuration diagram of the imaging apparatus 1 according to Embodiment 3 of the present invention.
  • In Embodiment 3, components that are the same as or correspond to those described in Embodiments 1 and 2 are denoted by the same reference numerals, and their description is omitted or simplified.
  • The imaging apparatus 1 according to Embodiment 3 includes a second video switching unit 60, a video holding unit 61, and a third video switching unit 62 between the video cutout unit 15 and the video output unit 16.
  • The second video switching unit 60 is a switch that switches whether the video data output from the video cutout unit 15 is supplied to the video holding unit 61.
  • The video holding unit 61 acquires the video data output from the video cutout unit 15 via the second video switching unit 60 and holds it as still video data. Like the memory 101, the video holding unit 61 is configured by various types of memory.
  • The third video switching unit 62 is a selector switch that selects either the video data output by the video cutout unit 15 or the still video data held by the video holding unit 61 and outputs it to the video output unit 16.
  • The video output unit 16 outputs the video data selected by the third video switching unit 62 to the display control unit 41.
  • The display control unit 41 causes the monitor 42 to display the video indicated by that video data.
  • The switching control unit 21d controls the switching operations of the second video switching unit 60 and the third video switching unit 62.
  • In Embodiment 3, the video holding unit 61, the third video switching unit 62, and the display control unit 41 constitute a display control device.
  • Alternatively, the video holding unit 61, the third video switching unit 62, and the display control unit 41 may constitute a single display control device that is communicably connected to the camera 10, the control unit 21, and the monitor 42, which are excluded from it.
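For illustration, the gating and selection roles of the second video switching unit 60, the video holding unit 61, and the third video switching unit 62 can be modeled in a few lines of Python. This sketch is not part of the patent; the class and attribute names are assumptions:

```python
class Pipeline:
    """Models switch 60 (gate into the holder), holding unit 61 (one-frame
    memory), and switch 62 (select live or held output)."""

    def __init__(self):
        self.held_frame = None    # video holding unit 61
        self.second_on = False    # second video switching unit 60
        self.select_held = False  # third video switching unit 62

    def process(self, live_frame):
        if self.second_on:
            self.held_frame = live_frame  # latch the current frame
        return self.held_frame if self.select_held else live_frame
```

Turning `second_on` on for exactly one frame captures it; setting `select_held` then freezes the output at that frame until it is cleared.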
  • If the turning direction of the camera 10 is determined to be the reverse rotation direction (step ST124; NO) by the processing of steps ST120 to ST124 already described with reference to FIG. 7, the processing of step ST130 shown in FIG. 26 is then performed.
  • The processing in steps ST130 to ST132 has already been described.
  • During steps ST131 and ST132, the second video switching unit 60 is off, and the third video switching unit 62 is set to output the video data from the video cutout unit 15 to the video output unit 16. Therefore, until the camera 10 reaches the target position B1, the third video switching unit 62 outputs the video data output by the video cutout unit 15.
  • When the turning pulse signal S2 has been output by an amount corresponding to the turning angle (step ST132; YES), that is, when the camera 10 reaches the target position B1, the video holding unit 61 acquires one frame of the video data output by the video cutout unit 15 and holds it as still video data (step ST150). This is performed by the switching control unit 21d outputting a control signal that turns on the second video switching unit 60. When one frame of video data has been held in the video holding unit 61, the switching control unit 21d outputs a control signal that turns off the second video switching unit 60.
  • Then, the third video switching unit 62 outputs the still video data held by the video holding unit 61 to the video output unit 16 (step ST151). This is performed by the switching control unit 21d outputting a control signal to the third video switching unit 62 to control its switching operation.
  • The switching state of the third video switching unit 62 set at this time is maintained until it is changed in step ST152 described later.
  • Next, the processing of steps ST133 and ST135 already described in Embodiment 1 is performed until the turning pulse signal S2 has been output by the number of offset pulses (step ST135; YES), that is, until the camera 10 turns from the target position B1 to the target position B2.
  • During this time, the monitor 42 displays the video indicated by the still video data acquired at the target position B1 in step ST150, through the display control unit 41 that receives the output of the video output unit 16.
  • When step ST135 results in YES, the camera 10 turns in the normal rotation direction toward the preset position B by the processing of step ST136 already described in Embodiment 1.
  • Also during this time, the third video switching unit 62 outputs the still video data held by the video holding unit 61 to the video output unit 16. Accordingly, the monitor 42 continues to display the video indicated by the still video data acquired at the target position B1 in step ST150, through the display control unit 41 that receives the output of the video output unit 16.
  • When the camera 10 has turned in the normal rotation direction and the turning pulse signal S2 has been output by the number of offset pulses (step ST138; YES), that is, when the camera 10 reaches the preset position B, the third video switching unit 62 outputs the video data output by the video cutout unit 15 to the video output unit 16 instead of the still video data held by the video holding unit 61 (step ST152). This is performed by the switching control unit 21d outputting a control signal to the third video switching unit 62 to control its switching operation. Therefore, after the camera 10 reaches the preset position B, the display control unit 41 displays a real-time video at the preset position B on the monitor 42.
  • In this way, while the camera 10 turns from the target position B1 to the preset position B, the video displayed on the monitor 42 is the video acquired when the camera 10 reached the target position B1. Therefore, shaking of the image displayed on the monitor 42 during this turning is reduced, and a good image is provided to the user.
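The sequence above (latch one frame at the target position B1 in step ST150, keep showing it during the offset turning, and return to live video at the preset position B in step ST152) can be sketched as follows. This is a minimal Python model, not part of the patent; the state labels and function name are assumptions:

```python
def monitor_output(events):
    """events: list of (camera_state, live_frame) pairs.
    'at_B1'   : target position reached; latch one frame (ST150) and show it (ST151)
    'turning' : offset turning; keep showing the held frame
    'live' / 'at_B' : live video is shown (ST152 switches back at the preset position)
    Returns the list of frames shown on the monitor."""
    held = None
    shown = []
    for state, frame in events:
        if state == 'at_B1':
            held = frame        # one frame latched into the holding unit
            shown.append(held)
        elif state == 'turning':
            shown.append(held)  # monitor keeps showing the B1 frame
        else:
            shown.append(frame)
    return shown
```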
  • In the above description, the second video switching unit 60 is turned on only when the camera 10 reaches the target position B1, so that the video holding unit 61 acquires and holds the video data output from the video cutout unit 15. However, the second video switching unit 60 may also be turned on at other times so that the video holding unit 61 acquires and holds the video data output from the video cutout unit 15, and the third video switching unit 62 may output the video data held by the video holding unit 61 to the video output unit 16. Processing in such a case is described next.
  • When the second video switching unit 60 is turned on at intervals of, for example, 1 second, and the third video switching unit 62 is set to always output the video data held by the video holding unit 61 to the video output unit 16, the video displayed on the monitor 42 by the display control unit 41 is a video thinned out at 1-second intervals.
  • In this case, it is desirable that the second video switching unit 60 be turned on at the timing when the camera 10 reaches the target position B1.
  • To this end, the turning start timing and turning speed of the camera 10 may be adjusted so that the time from when the camera 10 starts turning in the reverse rotation direction at the current position until it reaches the target position B1 equals the thinning time T × n, as in equation (3).
  • Here, the thinning time T is 1 second in this example, and n is an arbitrary positive integer.
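One way to satisfy this relation is to stretch the travel time to the next multiple of T by lowering the turning speed. The sketch below is a Python illustration, not part of the patent; the function name and the choice of adjusting speed rather than start timing are assumptions:

```python
import math

def plan_turn(distance_pulses, speed_pps, T=1.0):
    """Return a turning speed (pulses per second) reduced so that the time to
    reach the target position B1 becomes T * n for an integer n >= 1."""
    travel = distance_pulses / speed_pps  # time at the nominal speed
    n = max(1, math.ceil(travel / T))     # round up to the next tick count
    return distance_pulses / (n * T)      # arrival coincides with a T-tick
```

For example, 150 pulses at 100 pulses per second would take 1.5 s; reducing the speed to 75 pulses per second makes the travel time exactly 2 s, so the arrival lines up with a 1-second thinning tick.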
  • Then, the pan turning control unit 21a starts turning the camera 10 in the reverse rotation direction from the current position at a timing when the second video switching unit 60 is turned on.
  • Thereafter, in the same manner as the processing described above, the second video switching unit 60 is turned off, and the video data acquired by the video holding unit 61 at the timing when the camera 10 reached the target position B1 continues to be held. Accordingly, while the camera 10 turns from the target position B1 to the target position B2 and then from the target position B2 to the preset position B, the video indicated by the video data acquired at the target position B1 is displayed on the monitor 42.
  • When the camera 10 reaches the preset position B, the switching control unit 21d performs control so that the second video switching unit 60 is turned on at 1-second intervals and the third video switching unit 62 outputs the video data held by the video holding unit 61 to the video output unit 16. As a result, a real-time video at the preset position B, thinned out at 1-second intervals, is displayed on the monitor 42.
  • Also in this case, while the camera 10 turns from the target position B1 to the preset position B, the video displayed on the monitor 42 is the video acquired when the camera 10 reached the target position B1. Therefore, shaking of the image displayed on the monitor 42 during this turning is reduced, and a good image is provided to the user.
  • In the above description, the second video switching unit 60 is turned on at the timing when the camera 10 reaches the target position B1 by using equation (3) and the like. However, if the camera 10 reaches the target position B1 while the second video switching unit 60 is off for some reason, the switching control unit 21d may perform control so that the second video switching unit 60 is forcibly turned on when the camera 10 reaches the target position B1.
  • Note that, as the still video data, an intra frame of encoded data may be output.
  • FIG. 25 shows a configuration in which the video cutout unit 15 is provided.
  • However, the imaging apparatus 1 according to Embodiment 3 does not have to include the video cutout unit 15.
  • In that case, the video data handled between the video storage unit 14 and the display control unit 41 is the video data of the entire shooting range of the camera 10.
  • In either case, regardless of the presence or absence of the video cutout unit 15, the video data captured by the camera 10 is input to the video holding unit 61 and the third video switching unit 62.
  • As described above, according to the imaging apparatus 1 of Embodiment 3, even when the camera 10 turns in the reverse rotation direction to the preset position B, the stop position error due to backlash can be eliminated. In addition, shaking of the image displayed on the monitor 42 while the camera 10 turns in the reverse and normal rotation directions by the offset pulses is reduced, and a good image is provided to the user.
  • As described above, the display control device according to the present invention can reduce shaking of the displayed image while improving the stop position accuracy of the camera, and is therefore suitable for such use.
  • 1 imaging device, 10 camera, 11 lens, 12 zoom changing unit, 13 imaging unit, 14 video storage unit, 15 video cutout unit, 16 video output unit, 21 control unit, 21a pan turning control unit, 21b cutout position calculation unit, 21c zoom control unit, 21d switching control unit, 22 current position storage unit, 23 preset position storage unit, 24 input unit, 30 pan turning unit, 31 pan drive unit, 32 pan motor, 33 worm, 34 worm wheel, 35 origin generation unit, 36 origin detection unit, 41 display control unit, 42 monitor, 50 video pixel interpolation unit, 51 first video switching unit, 60 second video switching unit, 61 video holding unit, 62 third video switching unit, 100 processing circuit, 101 memory, 102 CPU.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Accessories Of Cameras (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

In a case where a camera (10) turns in the reverse rotation direction toward a preset position, the camera (10) further turns in the reverse rotation direction by a number of offset pulses and then turns in the normal rotation direction by the number of offset pulses. While the camera (10) turns in the normal or reverse rotation direction by the number of offset pulses, a video cutout unit (15) outputs data of the video of a region whose position within the shooting range of the camera (10) is moved in a direction that cancels out the movement synchronized with the turning of the camera (10).
PCT/JP2016/064904 2016-05-19 2016-05-19 Display control device and image capture device WO2017199398A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2016/064904 WO2017199398A1 (fr) 2016-05-19 2016-05-19 Display control device and image capture device
US16/099,609 US10440254B2 (en) 2016-05-19 2016-05-19 Display control device and image pickup device
JP2018518021A JP6598992B2 (ja) 2016-05-19 2016-05-19 Display control device and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/064904 WO2017199398A1 (fr) 2016-05-19 2016-05-19 Display control device and image capture device

Publications (1)

Publication Number Publication Date
WO2017199398A1 true WO2017199398A1 (fr) 2017-11-23

Family

ID=60325854

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/064904 WO2017199398A1 (fr) 2016-05-19 2016-05-19 Display control device and image capture device

Country Status (3)

Country Link
US (1) US10440254B2 (fr)
JP (1) JP6598992B2 (fr)
WO (1) WO2017199398A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7401272B2 (ja) 2019-11-26 2023-12-19 Canon Inc Imaging device, method for controlling imaging device, and program
JP7516103B2 (ja) 2020-05-15 2024-07-16 Canon Inc Control device, method, and program for imaging device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190098226A1 (en) * 2017-09-26 2019-03-28 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Smart field of view (fov) adjustment for a rearview display system in a motor vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009222222A (ja) * 2008-02-20 2009-10-01 Mitsubishi Electric Corp Worm speed reducer, and swivel camera and radio wave receiving device using the same
JP2014007481A (ja) * 2012-06-22 2014-01-16 Canon Inc Imaging apparatus and control method thereof
JP2014175769A (ja) * 2013-03-07 2014-09-22 Casio Comput Co Ltd Imaging apparatus, angle-of-view correction method, and program

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06109102A (ja) 1992-09-22 1994-04-19 Mitsubishi Heavy Ind Ltd Gear transmission device
JP2004023580A (ja) 2002-06-19 2004-01-22 Konica Minolta Holdings Inc Digital camera
JP4723817B2 (ja) 2004-04-14 2011-07-13 Panasonic Corp Imaging device with moving mechanism
JP2005303806A (ja) 2004-04-14 2005-10-27 Matsushita Electric Ind Co Ltd Imaging device with moving mechanism
JP4932501B2 (ja) 2007-01-12 2012-05-16 Mitsubishi Electric Corp Reduction gear device
JP5074951B2 (ja) 2008-02-18 2012-11-14 Hitachi Kokusai Electric Inc Television camera device and position correction method
JP5350029B2 (ja) 2008-03-14 2013-11-27 Canon Inc Optical apparatus
WO2010029697A1 (fr) * 2008-09-11 2010-03-18 Panasonic Corp Camera body, adapter, and image capture device
JP2010237251A (ja) 2009-03-30 2010-10-21 Canon Inc Imaging device and control method of imaging device
US20120098971A1 (en) * 2010-10-22 2012-04-26 Flir Systems, Inc. Infrared binocular system with dual diopter adjustment
TW201341756A (zh) * 2011-11-30 2013-10-16 Nikon Corp Shape measuring device, shape measuring method, and recording medium recording the program
US9746093B2 (en) * 2011-12-21 2017-08-29 Deka Products Limited Partnership Flow meter and related system and apparatus
JP6167599B2 (ja) * 2013-03-26 2017-07-26 Panasonic Ip Management Co Ltd Optical viewfinder
JP2015012438A (ja) 2013-06-28 2015-01-19 Canon Inc Imaging device
JP6507626B2 (ja) * 2014-12-19 2019-05-08 Aisin Seiki Co Ltd Vehicle periphery monitoring device
WO2016139875A1 (fr) * 2015-03-04 2016-09-09 Sony Corp Image capture device
JP2017079437A (ja) * 2015-10-21 2017-04-27 Nikon Corp Display control device and imaging device


Also Published As

Publication number Publication date
US20190132507A1 (en) 2019-05-02
JPWO2017199398A1 (ja) 2018-11-15
US10440254B2 (en) 2019-10-08
JP6598992B2 (ja) 2019-10-30

Similar Documents

Publication Publication Date Title
US10694109B2 (en) Imaging apparatus
JP6598992B2 (ja) Display control device and imaging device
JP5764740B2 (ja) Imaging device
JP5931619B2 (ja) Imaging device
WO2017170165A1 (fr) Imaging device, operation method, image processing device, and image processing method
JP4022595B2 (ja) Imaging apparatus
US9191576B2 Imaging apparatus having optical zoom mechanism, viewing angle correction method therefor, and storage medium
WO2013021728A1 (fr) Imaging device and imaging method
WO2010137211A1 (fr) Camera device with rotating base
JP2010028764A (ja) Imaging device, image processing method, and program
JP2007199195A (ja) Lens control device
JP2016050973A (ja) Imaging device and control method thereof
JP6172901B2 (ja) Imaging device
JP2006345147A (ja) Electronic camera having a pixel-shifted image generation function, and imaging method
JP2012147044A (ja) Digital microscope
JP3804841B2 (ja) Lens driving device for camera
JP7066395B2 (ja) Imaging device and control method thereof
JP4723817B2 (ja) Imaging device with moving mechanism
JP2007034105A (ja) Optical apparatus, imaging device, and photographing lens
US10649312B2 Lens device, imaging apparatus, lens driving method, and lens driving program
JP4778883B2 (ja) Pan head camera device
JP5790011B2 (ja) Imaging device
JP2012034309A (ja) Imaging device and control method thereof
JP5664440B2 (ja) Imaging device
WO2020170723A1 (fr) Imaging device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018518021

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16902417

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16902417

Country of ref document: EP

Kind code of ref document: A1