WO2024101153A1 - Camera system and processing method - Google Patents

Publication number: WO2024101153A1
Authority: WIPO (PCT)
Application number: PCT/JP2023/038470
Other languages: French (fr), Japanese (ja)
Inventor: 喜崇 竹川
Applicant: ソニーグループ株式会社 (Sony Group Corporation)
Prior art keywords: camera head, main body, level, unit, level information


  • This technology relates to a camera system and a processing method for it, and in particular to a camera system in which the main body and the camera head can be used separately.
  • Patent Documents 3 and 4 listed below disclose configurations equipped with a spirit level.
  • In a conventional camera, the levelness of the image sensor and of peripheral equipment such as a tripod is aligned using the bottom of the main body as a reference when shooting, so a spirit level for obtaining the levelness of the image sensor is built into the main body.
  • When the main body and the camera head are used separately, however, the image sensor is located away from the main body and the peripheral equipment, which means that appropriate spirit level information indicating the horizontality of the image sensor cannot be obtained.
  • This disclosure therefore proposes technology that enables appropriate spirit level information to be obtained in a camera system in which the camera head can be operated both while attached to and while separated from the main body.
  • the camera system of the present technology includes a main body unit that processes captured image data, a camera head unit that is detachable from the main body unit and configured to output the captured image data generated by an image sensor to the main body unit whether the camera head unit is in an undetached state attached to the main body unit or in a detached state separated from the main body unit, a first level provided on the main body unit, a second level provided integrally with a housing of the camera head unit, and a control unit that performs processing to associate first level information from the first level and second level information from the second level with the captured image data.
  • the camera system is configured with a spirit level on both the main body side and the camera head side, and the control unit selectively associates spirit level information from one or both of the spirit levels with the captured image data.
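As an illustrative sketch of the selection described above (the patent does not specify an implementation; the names `LevelInfo` and `select_level_info` are hypothetical), the control unit's choice of which spirit level reading to treat as the image sensor's level might look like:

```python
# Hypothetical sketch: choosing which spirit-level reading to associate
# with captured image data, depending on the camera head's state.
from dataclasses import dataclass

@dataclass
class LevelInfo:
    pitch_deg: float  # angle in the pitch direction relative to horizontal
    roll_deg: float   # angle in the roll direction relative to horizontal

def select_level_info(body_level: LevelInfo,
                      head_level: LevelInfo,
                      head_detached: bool) -> dict:
    """Return the level metadata to associate with a captured frame.

    In the basic (attached) state the body-side level 30 also reflects the
    image sensor's attitude; in the extended (detached) state only the
    head-side level 40 does.
    """
    if head_detached:
        # Sensor is away from the body: only the head-side level is valid
        # for the image sensor's horizontality.
        return {"sensor_level": head_level, "body_level": body_level}
    # Attached: the body-side level suffices as the sensor reference.
    return {"sensor_level": body_level, "body_level": body_level}
```

Both readings are retained here so that, as in some embodiments, information from both spirit levels can be associated with the captured image data.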
  • FIG. 1 is a perspective view of a basic state of a camera system according to an embodiment of the present technology.
  • FIG. 2 is a perspective view of the embodiment with a camera head removed from a main body.
  • FIG. 3 is a perspective view of the camera head unit according to the embodiment with an adapter removed.
  • FIG. 4 is an explanatory diagram of a case where an interchangeable lens and a finder unit are attached to the main body of the embodiment.
  • FIG. 5 is an explanatory diagram of another case where an interchangeable lens and a finder unit are attached to the main body of the embodiment.
  • FIG. 6 is a perspective view of the embodiment in an extended state.
  • FIG. 7 is a perspective view of an extension cable according to the embodiment.
  • FIG. 8 is a perspective view of the embodiment in a state where a second connector portion is detached from the base plate.
  • FIGS. 9A and 9B are explanatory diagrams of terminal surfaces of a camera head unit and a base plate according to the embodiment.
  • FIG. 10 is an explanatory diagram of terminal surfaces of the extension cable according to the embodiment.
  • FIGS. 11A and 11B are explanatory diagrams of a connection configuration in the basic state and the extended state according to the embodiment.
  • FIG. 12 is an explanatory diagram of a connection configuration in the basic state according to the embodiment.
  • FIG. 13 is an explanatory diagram of a connection configuration in the extended state according to the embodiment.
  • FIG. 14 is an explanatory diagram of another connection configuration in the basic state according to the embodiment.
  • FIG. 15 is a flowchart of a level selection process according to the first embodiment.
  • FIG. 16 is a flowchart of an association process according to the first and second embodiments.
  • FIG. 17 is a flowchart of a level selection process according to the second embodiment.
  • FIG. 18 is a flowchart of a process for selecting all of the spirit levels according to the third embodiment.
  • FIG. 19 is a flowchart of a selection process of level information to be used in the third and fourth embodiments.
  • FIG. 20 is a flowchart of an association process according to the fourth embodiment.
  • captured image data refers to data of moving images or still images captured by an image sensor, and includes not only so-called RAW image data as captured, but also image data at various stages after processes such as development, compression, and encoding, as well as image data converted into an image file or the like.
  • captured image data also includes analog signals captured by an image sensor and before being converted into digital data.
  • the camera system 1 of the present technology is applied to a video camera.
  • the scope of application of the present technology is not limited to the video camera illustrated in the example, but can be widely applied to various imaging devices, such as video cameras, still cameras, cameras with special imaging functions such as infrared cameras and specific wavelength cameras, and cameras for various purposes such as commercial, general, and surveillance cameras.
  • the directions of front, back, up, down, left and right are indicated as viewed from the cameraman when taking pictures with a video camera. Therefore, the object side is the front, and the image plane side is the rear. Note that the directions of front, back, up, down, left and right shown below are for convenience of explanation, and the implementation of the present technology is not limited to these directions.
  • the camera system 1 has a main body 2 and a camera head 3, and the camera head 3 is detachable from the main body 2 (see FIGS. 1 to 5).
  • the main body 2 may be provided as an external device for the camera head 3.
  • the camera head 3 is made detachable from the external device.
  • the camera head 3 may be provided as a standalone imaging device.
  • the main body 2 has an outer panel 4 and an outer housing 5, and the outer housing 5 is covered by the outer panel 4 (see Figure 2).
  • the outer panel 4 has a base panel portion 6, an upper panel portion 7, and a rear panel portion 8.
  • the base panel portion 6 has a bottom surface portion 9 facing in the vertical direction, and a pair of side surfaces 10, 11 that are connected to the left and right side edges of the bottom surface portion 9 and positioned at a distance from each other (see Figures 1 and 4).
  • the operation sections 12, 12, ... include, for example, a power button, a shooting button, a zoom knob, a mode switching knob, and the like.
  • Display units 13, 13 such as liquid crystal panels are disposed on the side sections 10, 11, respectively.
  • Connection terminals 14, 14, ... are arranged vertically on one side of the rear panel portion 8 (see FIGS. 4 and 5). Cables (not shown) for supplying power and transmitting/receiving signals are connected to the connection terminals 14, 14, ....
  • One of the connection terminals 14 is a video output terminal 14a that outputs a monitor image signal.
  • the monitor image signal is a signal of an image that displays an image of a subject being imaged by the camera system 1 in real time so that the photographer can check it, and is a signal of an image called a through image.
  • the upper panel portion 7 is formed in a plate shape facing the up-down direction, and both left and right ends are attached to the upper ends of the side portions 10, 11, respectively (see Figs. 1 and 2).
  • the rear panel portion 8 is formed in a plate shape facing the front-rear direction, and its outer periphery is attached to the rear end of the base panel portion 6 and the rear end of the upper panel portion 7 .
  • the upper panel portion 7 is attached to the side portions 10, 11, and the rear panel portion 8 is attached to the base panel portion 6 and the upper panel portion 7, thereby forming the outer panel 4, and the outer housing 5 is covered from above, below, left, right, and rear by the outer panel 4 (see Figure 2).
  • An adjustment base 15 is attached to the top surface of the upper panel portion 7 (see Figs. 1 and 2).
  • the adjustment base 15 is formed in a vertically long rectangular shape, with a central portion in the left-right direction formed as a concave groove portion 16, and the left and right portions on both sides of the groove portion 16 are provided as adjustment portions 17, 17.
  • a handle 80 is detachably attached to the adjustment base 15 (see FIGS. 4 and 5).
  • a finder unit 85 is also detachably attached to the adjustment base 15.
  • the finder unit 85 has a rotating arm 90 and a viewfinder main body 91. One end of the viewfinder main body 91 is provided as a finder section 91a, and the user can see a monitor image, an operation screen, and the like through the finder section 91a.
  • a battery 501 is attached to the rear surface of the main body 2 (see FIGS. 4 and 5).
  • the battery 501 serves as a power source for supplying a power supply voltage to each part of the main body 2 and the camera head 3 .
  • a camera head 3 is attached to the front side of the main body 2 (see FIG. 1).
  • the camera head 3 is detachable and can be removed from the main body 2 (see FIG. 2).
  • the camera head unit 3 is shown with an adapter 500 attached thereto.
  • the adapter 500 is detachable from the camera head unit 3.
  • the adapter 500 is used for mounting different interchangeable lenses.
  • an interchangeable lens 503 shown in FIG. 5 can be attached to the camera head unit 3 without the adapter 500 being attached.
  • by attaching the adapter 500 to the camera head unit 3, it is possible to mount a different type of interchangeable lens 502 shown in FIG. 4.
  • the camera head unit 3 has a generally rectangular, plate-like housing 140, the front side of which serves as a mounting surface 141 that is adapted for receiving an adapter 500 and an interchangeable lens 503 (see FIG. 3).
  • a plurality of screw holes 142 are provided at required positions. These screw holes 142 are formed at positions corresponding to a plurality of screw holes 550 provided in the adapter 500, and the adapter 500 can be attached to the camera head unit 3 by screwing in the screw 551 as shown in FIG.
  • the camera head unit 3 is equipped with an image sensor 300, optical elements such as an ND (neutral density) filter and an iris mechanism (not shown), and a circuit board carrying necessary circuits (see FIG. 3).
  • the camera head unit 3 is also provided with an assignable button 302 .
  • the assignable buttons 302 are operators to which the user can assign any operation function. For example, even in the main body 2, some buttons in the operation unit 12 are set as assignable buttons 12a. To each of the assignable buttons 12a and 302, the user can assign any operation function according to his/her convenience, such as a recording start/stop operation, a playback operation, a menu operation, etc. Since the camera head unit 3 is also provided with an assignable button 302, even when the camera head unit 3 is used away from the main body unit 2, the user on the camera head unit 3 side can perform the necessary operations using the assignable button 302, making it convenient to use.
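The assignable-button mechanism described above can be pictured as a simple button-to-function mapping (a hypothetical sketch; `assign` and `press` are illustrative names, not from the patent):

```python
# Hypothetical sketch of assignable-button handling: the user binds any
# operation function to the body-side button 12a or head-side button 302.
ASSIGNMENTS = {}

def assign(button: str, function) -> None:
    """Bind a user-chosen operation function to an assignable button."""
    ASSIGNMENTS[button] = function

def press(button: str):
    """Invoke whatever function the user assigned, if any."""
    handler = ASSIGNMENTS.get(button)
    return handler() if handler else None

# Example bindings mirroring the functions named in the text.
assign("302", lambda: "rec_start_stop")  # head-side assignable button
assign("12a", lambda: "playback")        # body-side assignable button
```

The same function can thus be reachable from either side, which is what makes the head-side button 302 useful when the camera head unit 3 is operated away from the main body unit 2.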
  • the camera head unit 3 is detachable from the outer casing 5 of the main body unit 2 (see FIGS. 1 and 2).
  • the main body 2 is formed with a mounting portion 18, and the camera head 3 is attached to the mounting portion 18 by a predetermined mechanism (see FIG. 2).
  • as a mechanism for attachment, corresponding screw holes may be provided in the housing 140 of the camera head unit 3 and in the mounting portion 18, and the two may be fastened together with screws.
  • a locking mechanism or a release mechanism for releasing the locking mechanism may be provided to allow easy attachment and detachment without using screws. In particular, assuming an extended use mode described later, it is desirable to allow the camera head unit 3 and the main body unit 2 to be easily attached and detached without using screws.
  • FIG. 9A shows a terminal surface 3T on the rear side of the camera head unit 3 opposite the mounting surface 141, and this terminal surface 3T is provided with a connector 3a.
  • the mounting portion 18 of the main body 2 is provided with a connector 2a.
  • the camera head unit 3 is normally used in a state where it is attached to the main body unit 2 as shown in FIG. 1, but it can also be used in a state where it is mechanically separated from the main body unit 2.
  • the terms “basic state” and “extended state” are used.
  • the “basic state” refers to a non-separated state in which the camera head unit 3 is attached to the main body unit 2. That is, this is the usage state shown in Figures 1, 4 and 5.
  • the “extended state” refers to a state in which the camera head unit 3 is mechanically separated from the main body unit 2 and is communicatively connected to the main body unit 2 via the extension cable 20 so as to be capable of transmitting signals.
  • when the terms “main body 2 side” and “camera head 3 side” are used, they also refer to the parts that are mechanically connected to the main body 2 or the camera head 3. That is, the “main body 2 side” includes the base plate 50, which will be described later, in the extended state.
  • the “camera head 3 side” includes the first connector 21, which will be described later, in the extended state.
  • the extension cable 20 described below is a cable used for electrical connection.
  • the extension cable 20 is a cable that connects between the main body 2 and the camera head 3. Any extension cable that can connect the camera head 3 and the main body 2 can be used as the extension cable 20, and it can be a relatively soft cable covered with an insulator such as vinyl, or it can be a hard tube (cylindrical pipe).
  • FIG. 6 shows a state in which the camera head unit 3 is detached from the main body unit 2 and connected via an extension cable 20.
  • the extension cable 20 is configured to include a first connector portion 21, a cable 22, and a second connector portion 23 (see FIGS. 6, 7, and 10).
  • the first connector portion 21 is a connector portion on one end side of a cable 22 that is connected to the camera head portion 3 .
  • the second connector portion 23 is a connector portion on the other end side of the cable 22 that is connected to the main body portion 2 side.
  • the cable 22 transmits signals between the first connector portion 21 and the second connector portion 23, and has a required number of transmission paths for electrical signals formed therein.
  • the camera head unit 3 can be moved, for example, forward from the main body unit 2 by the length of the cable 22 to capture images.
  • the first connector portion 21 is adapted to be joined to a terminal surface 3T (see FIG. 9A) on the rear side of the camera head portion 3, as shown in FIGS.
  • the housing 21K of the first connector unit 21 has a terminal surface 21FR (see FIG. 10) that has a contour shape of approximately the same type and size as the terminal surface 3T of the housing 140 of the camera head unit 3.
  • the housing 140 of the camera head unit 3 is a relatively thin plate, making it difficult for it to stand on its own. Also, the weight of the adapter 500 makes the balance poor. By attaching the first connector unit 21 to this, the camera head unit 3 can be made to stand on its own more easily. Furthermore, the fact that the housing 140 of the camera head unit 3 is a relatively thin plate means that it is difficult for a user to stably hold the camera head unit 3 alone and point it toward a subject. The increased thickness due to the housing of the first connector unit 21 makes the camera head unit 3 easier to hold and handle when removed from the main body unit 2.
  • a cable end 22E1 of the cable 22 is fixedly attached to the rear surface 21BK side of the housing 21K of the first connector portion 21 (see FIG. 7). Cable end 22E1 is attached so as to extend downward from its fixed portion relative to housing 21K along notch 126 provided in housing 21K. By positioning the cable end 22E1 inside the cutout portion 126, external stress is less likely to be applied to the cable end 22E1, and the cable end 22E1 is protected. Furthermore, by making the cable 22 extend downward from the cable end 22E1, the camera head unit 3 can be easily maintained in the upright position shown in FIG.
  • a connector 3a is disposed on a terminal surface 3T of the camera head unit 3 (see FIG. 9A). Also, a connector 21a is disposed on a terminal surface 21FR of the first connector unit 21 (see FIG. 10).
  • the first connector portion 21 is formed with a video output terminal 121 to which, for example, a monitor device can be connected.
  • the first connector portion 21 is also formed with an external power output terminal 120, which makes it possible to supply a power supply voltage to a device that requires an external power supply.
  • the second connector portion 23 on the other end side of the cable 22 is formed of a housing 23K having a generally rectangular parallelepiped shape with curved top and bottom.
  • a recess 127 is formed on the upper and lower surfaces of the housing 23K.
  • a pair of handles 23H are attached to the rear surface 23BK of the housing 23K. The handles 23H and the recessed portion 127 make it easy to handle the second connector portion 23.
  • the other end of the cable 22 is fixed in a state in which the cable end 22E2 projects perpendicularly from the rear surface 23BK.
  • the pair of handles 23H makes it difficult for external stress to be applied to the joint portion of the cable end 22E2.
  • the cable end 22E2 is protected by the handles 23H.
  • the second connector portion 23 is detachable from the main body portion 2.
  • the second connector portion 23 is detachable with the base plate 50 attached to the main body portion 2.
  • the base plate 50 is provided as a structure on the main body portion 2 side, and allows the extension cable 20 to be connected to the main body portion 2.
  • Fig. 8 shows the front side (terminal surface 50Ta) of the base plate 50.
  • Fig. 9B shows the back side (terminal surface 50Tb) of the base plate 50.
  • the base plate 50 has a recess 54 in the center of the terminal surface 50Ta for mounting the second connector portion 23 therein.
  • the recess 54 is sized to fit the housing 23K of the second connector portion 23 therein.
  • FIG. 6 shows a state in which the second connector portion 23 is fitted into the recess 54.
  • the housing 23K of the second connector portion 23 is fitted into the recess 54 to the extent that it is in close contact with the side surface of the recess 54, leaving only a slight protruding portion.
  • the recessed portions 127 of the housing 23K are left as gaps between the housing 23K and the recess 54.
  • the formation of the recessed portions 127 on the top and bottom of the housing 23K facilitates attachment and detachment without excessive friction between the housing 23K and the recess 54.
  • the user can easily attach and detach the second connector portion 23 to and from the base plate 50 by using the handle 23H.
  • the second connector portion 23 fits into the base plate 50, so there is less protrusion due to the joining of the second connector portion 23.
  • both the base plate 50 and the second connector portion 23 are attached to the main body portion 2, but because the second connector portion 23 fits into the base plate 50, their thicknesses do not simply add up in front of the main body portion 2, so the front-to-rear size of the main body portion 2 in the extended state can be kept small.
  • a connector 50a is disposed on the innermost surface of the recess 54 of the base plate 50 (see FIG. 8).
  • a connector 23a is disposed on the terminal surface 23FR of the second connector portion 23 (see FIG. 10).
  • the surface of the base plate 50 facing the main body 2 is a terminal surface 50Tb (see FIG. 9B).
  • the base plate 50 is detachable from the mounting portion 18 of the main body 2. That is, the base plate 50 can be attached to the mounting portion 18 that is exposed when the camera head 3 is removed from the main body 2 (see FIG. 2). For this reason, as shown in Figures 9A and 9B, the terminal surface 50Tb of the base plate 50 and the terminal surface 3T of the camera head unit 3 do not need to have exactly the same shape, but both are structured so that they can be attached to the mounting portion 18.
  • a connector 50b is provided on the terminal surface 50Tb of the base plate 50.
  • This connector 50b is a connector that can be joined to the connector 2a (see FIG. 2) in the mounting portion 18, and is formed in a position where it can be joined opposite the connector 2a when the base plate 50 is attached to the mounting portion 18. Therefore, when the base plate 50 is attached to the main body portion 2, signals are transmitted between the main body portion 2 and the base plate 50 via the connectors 50b and 2a.
  • connector 50b is attached to one side of board 55, and the above-mentioned connector 50a is attached to the other side of board 55.
  • Board 55 has wiring between each pin of connectors 50a and 50b, which forms a transmission path all the way to connector 50a.
  • an external power supply input terminal 51 is formed on the base plate 50.
  • an external power supply output terminal 120 is formed on the first connector portion 21 on the camera head portion 3 side.
  • the connectors 50a, 23a, the cable 22, and the connector 21a form a line for supplying external power.
  • by connecting a power supply device such as a power adapter to the external power supply input terminal 51, the power supply voltage can be secured on the camera head portion 3 side and output from the external power supply output terminal 120 provided in the first connector portion 21. In other words, even when separated from the main body portion 2, an external power supply voltage can be used on the camera head portion 3 side without requiring separate power supply wiring.
  • a video input terminal 53 is provided on the base plate 50, and a video output terminal 121 is provided on the first connector portion 21 correspondingly.
  • a transmission line between the video input terminal 53 and the video output terminal 121 is formed as a path formed by the connectors 50a, 23a, the cable 22, and the connector 21a.
  • the camera head unit 3 is provided with the assignable button 302, and correspondingly, the base plate 50 is provided with the assignable button 52.
  • the assignable button 52 is an operator with the same function as the assignable button 302, but by being provided on the base plate 50, it becomes possible to perform the same operations on the main body 2 side as on the camera head 3 side.
  • a spirit level 30 is provided inside the main body 2.
  • a spirit level 40 is provided on the camera head 3 side.
  • the spirit level 30 is disposed, for example, near the inner bottom surface of the main body 2, and serves as a spirit level that detects the horizontality of the bottom surface of the main body 2. Specifically, as spirit level information, angle information in the pitch direction and angle information in the roll direction based on a horizontal state are detected.
  • the level 30 need not necessarily be attached near the inner bottom surface of the main body 2, but may be attached at any location where the relative position and attitude with respect to a surface serving as a horizontal reference (for example, the bottom surface 9) does not change.
  • This spirit level 30 is a device for obtaining a horizontal reference for the camera system 1. For example, the levelness of the main body 2, of the image sensor of the camera head 3 in the basic state, and of accessory parts attached to the main body 2 is detected by the spirit level 30.
  • a spirit level 40 is also provided on the camera head unit 3 side.
  • the spirit level 40 may be built in the camera head unit 3 or the first connector unit 21.
  • the spirit level 40 may be attached to the outside of the housing of the camera head unit 3 or the first connector unit 21.
  • the spirit level 40 may be attached to a location where the position and attitude relative to the image sensor 300 in the camera head unit 3 do not change at least when the camera head unit 3 is in a separated state. That is, the level 40 detects, as level information, angle information in the pitch direction and angle information in the roll direction of the image sensor 300 based on the horizontal state.
  • Each of these spirit levels 30, 40 may be anything capable of detecting angle information in the pitch and roll directions, and may be configured, for example, by a two-axis angular velocity sensor.
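As one possible realization (an assumption, since the embodiment only requires that pitch and roll angles relative to horizontal be obtainable), the two angles can be derived from the gravity vector reported by a 3-axis acceleration sensor:

```python
# Hypothetical sketch: deriving pitch/roll level information from a
# gravity vector. The axis convention (x: right, y: forward, z: up) is
# illustrative; the patent does not fix one.
import math

def pitch_roll_from_gravity(ax: float, ay: float, az: float) -> tuple:
    """Compute pitch and roll (degrees) relative to the horizontal state
    from accelerometer readings of the gravity vector."""
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    roll = math.degrees(math.atan2(ax, az))
    return pitch, roll
```

When the device is level, both angles are zero; tilting it to the side produces a non-zero roll angle, matching the roll/pitch information the spirit levels 30 and 40 supply to the control unit.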
  • Connector connections in the above-mentioned basic state and extended state will now be described.
  • the following connectors were mentioned above: connector 3a of the camera head unit 3; connector 21a of the first connector portion 21 of the extension cable 20; connector 23a of the second connector portion 23 of the extension cable 20; connector 50a on the terminal surface 50Ta of the base plate 50; connector 50b on the terminal surface 50Tb of the base plate 50; and connector 2a of the main body 2.
  • the connector 3a of the camera head unit 3 can be joined to the connectors 2a and 21a.
  • a connector 21a of the first connector portion 21 of the extension cable 20 can be joined to the connector 3a.
  • the connector 23a of the second connector portion 23 of the extension cable 20 can be joined to the connector 50a.
  • the connector 50a on the terminal surface 50Ta of the base plate 50 can be joined to the connector 23a.
  • the connector 50b on the terminal surface 50Tb of the base plate 50 can be joined to the connector 2a.
  • the connector 2a of the main body 2 can be joined to the connectors 3a and 50b.
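The joinability relations above can be summarized in a small compatibility map (a sketch for clarity; the string keys simply mirror the connector reference numerals):

```python
# Which connector can be joined to which, per the list above.
JOINABLE = {
    "3a":  {"2a", "21a"},   # camera head unit 3
    "21a": {"3a"},          # first connector portion 21 of extension cable 20
    "23a": {"50a"},         # second connector portion 23 of extension cable 20
    "50a": {"23a"},         # base plate 50, terminal surface 50Ta
    "50b": {"2a"},          # base plate 50, terminal surface 50Tb
    "2a":  {"3a", "50b"},   # main body 2
}

def can_join(a: str, b: str) -> bool:
    """True if connectors a and b can be joined directly."""
    return b in JOINABLE.get(a, set())

# Basic state uses the single joint 2a-3a; the extended state chains
# 2a-50b, 50a-23a, and 21a-3a through the base plate and extension cable.
```

Note that joinability is symmetric: every pair listed in one direction is also listed in the other.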
  • FIG. 11A shows the base state.
  • the camera head 3 is attached to the main body 2, and connectors 2a and 3a are joined together to transmit signals between the main body 2 and the camera head 3.
  • FIG. 11B shows the extended state.
  • a base plate 50 is attached to the main body 2.
  • An extension cable 20 is attached between the camera head 3 and the base plate 50.
  • the connectors 2a and 50b are joined, the connectors 50a and 23a are joined, and the connectors 21a and 3a are joined.
  • signals are transmitted between the main body 2 and the camera head 3 via the base plate 50 and the extension cable 20.
  • All or some of these connectors are attached to a substrate and arranged as board-to-board (BtoB) connectors.
  • the number of pins (terminals) of each connector is not particularly specified, but it is considered that each connector has, for example, 100 pins or more.
  • FIG. 12 shows the electrical connection state in the basic state, and the main configuration of the main body unit 2 and the camera head unit 3 will be explained using this FIG. 12.
  • the main body 2 includes a control unit 200, a signal processing unit 202, a recording unit 203, a communication unit 204, a power supply circuit 205, and a spirit level 30.
  • the camera body includes other components such as a display control unit, a display unit, and an operation unit, but in Figs. 12 to 15, some components are omitted from the illustration in order to clarify the correspondence between the various units and to avoid cluttering the figures.
  • the camera head unit 3 includes an image sensor 300, a lens system drive unit 301, an assignable button 302, and an ID generation unit 303.
  • the camera head unit 3 also has other components, but for the same reason, some of these are not shown in the illustration.
  • the image sensor 300 has an imaging element formed by arranging photoelectric conversion pixels in a matrix, such as a CCD (Charge Coupled Device) type, a CMOS (Complementary Metal Oxide Semiconductor) type, etc.
  • Light from a subject is collected on the image sensor 300 by an optical system (not shown).
  • the optical system refers to lenses such as a zoom lens and a focus lens, an aperture mechanism, an optical filter, etc., and these can be provided in the camera head unit 3, the adapter 500, or the interchangeable lenses 502 and 503.
  • the image sensor 300 performs processes such as CDS (Correlated Double Sampling) and AGC (Automatic Gain Control) on the electrical signal obtained by photoelectric conversion in the image sensor, and then performs A/D (Analog/Digital) conversion.
  • the captured image data is then output as digital data to the main body 2.
  • the image sensor 300 outputs an image signal, for example, as so-called RAW image data.
  • the lens system driving unit 301 drives the focus lens, zoom lens, aperture mechanism, optical filter mechanism, etc. in the above optical system based on the control of the control unit 200.
  • the ID generating unit 303 generates identification information for the camera head unit 3.
  • the ID generating unit 303 can be configured as a terminal voltage setting unit corresponding to one or more connector terminals.
  • the ID generating unit 303 may be a memory or a processor that stores the identification information.
  • the identification information by the ID generating unit 303 is used, for example, by the control unit 200 of the main body unit 2 as information for distinguishing between a non-separated state and a separated state of the camera head unit 3.
  • the identification information from the ID generating unit 303 may also be used as information for identifying the model of the camera head unit 3.
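A minimal sketch of how the body-side control unit 200 might classify the head's state from the identification information follows (the bit encoding here is hypothetical; the patent leaves the format open, e.g. terminal-voltage settings or a value stored in memory):

```python
# Hypothetical sketch: interpreting the camera head's ID information.
from typing import Optional

def head_state_from_id(id_value: Optional[int]) -> str:
    """Classify the camera head unit 3's state from its ID information.

    Hypothetical encoding (the patent does not specify one): no readable
    ID means no head is connected; bit 7 is assumed to be set when the
    extension cable's first connector portion 21 is interposed
    (separated state).
    """
    if id_value is None:
        return "no_head"
    return "separated" if id_value & 0x80 else "attached"

def head_model_from_id(id_value: int) -> int:
    """Extract a model code (low 4 bits; hypothetical layout)."""
    return id_value & 0x0F
```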
  • the signal processing unit 202 in the main body 2 is configured as an image processing processor, for example, a DSP (Digital Signal Processor). This signal processing unit 202 performs various signal processing on the captured image data from the image sensor 300.
  • the signal processing unit 202 performs processes on the captured image data, such as clamping processing to clamp the R, G, and B black levels to predetermined levels, correction processing between the R, G, and B color channels, color separation processing so that the image data for each pixel has all R, G, and B color components, and processing to generate (separate) a luminance (Y) signal and a chrominance (C) signal.
  • the signal processing unit 202 performs necessary resolution conversion processing, such as resolution conversion for recording, communication output, or monitor image, on the captured image data that has been subjected to various signal processes.
  • the signal processing unit 202 also performs compression processing, encoding processing, etc. for recording or communication on the resolution-converted captured image data.
  • the signal processing unit 202 also performs processing to generate a monitor image signal for the captured image monitor display (through image display) and supplies the monitor image signal to the video output terminal 14a. This makes it possible to view the monitor image by connecting an external monitor device to the video output terminal 14a.
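The early stages of the signal processing described above can be sketched as per-pixel operations (illustrative only; the 10-bit black level of 64 and the BT.601 luminance coefficients are common conventional choices, not values stated in the patent):

```python
# Hypothetical sketch of two of the signal-processing steps: black-level
# clamping and luminance/chrominance (Y/C) separation.

def clamp_black_level(value: int, black: int = 64, max_code: int = 1023) -> int:
    """Clamp a raw 10-bit code so the black level maps to 0."""
    return min(max(value - black, 0), max_code - black)

def rgb_to_yc(r: float, g: float, b: float) -> tuple:
    """Separate a luminance (Y) and chrominance (Cb, Cr) signal from RGB
    using the BT.601 coefficients (one common choice)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564
    cr = (r - y) * 0.713
    return y, cb, cr
```

A neutral white pixel yields Y at full level with zero chrominance, which is the behavior the Y/C generation step above is meant to produce.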
  • the control unit 200 is configured by a microcomputer (arithmetic processing device) equipped with a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a flash memory, and the like.
  • the CPU executes programs stored in the ROM, flash memory, or the like to centrally control the entire camera system 1 .
  • the RAM is used as a working area for the CPU to process various data, and is used for temporarily storing data, programs, etc.
  • the ROM and flash memory (non-volatile memory) are used to store the OS (Operating System) with which the CPU controls each unit, as well as application programs for various operations, firmware, and various setting information.
  • Such a control unit 200 controls the operation of each necessary unit with respect to parameter control of various signal processes in the signal processing unit 202, imaging and recording operations in response to user operations, playback of recorded image files, imaging operations of the image sensor 300, camera operations such as zoom, focus, and exposure adjustment, user interface operations, etc.
  • control unit 200 transmits control signals to the image sensor 300 and the lens system driving unit 301.
  • control unit 200 transmits control signals such as the shutter speed and frame rate of the image sensor 300, a clock signal, or a control signal for the lens system driving unit 301 to the camera head unit 3.
  • the control unit 200 also generates metadata to be associated with the captured image data and performs processing for the association.
  • the control unit 200 is supplied with level information from the level 30 and the level 40.
  • the control unit 200 performs processing to associate the level information with the captured image data as one piece of metadata.
  • Examples of processing to associate with captured image data include recording or transmitting metadata in correspondence with frames of captured image data, and generating and recording or transmitting a metadata file as a file associated with an image file based on captured image data.
  • the recording unit 203 is, for example, a non-volatile memory, and functions as a storage area for storing image files (content files) of captured image data such as still images and moving images, metadata, thumbnail images, and the like.
  • the recording unit 203 may be a flash memory built into the main body 2, or may be a memory card (e.g., a portable flash memory) that can be attached to and detached from the main body 2 and a card recording and playback unit that performs recording and playback access to the memory card.
  • the recording unit 203 may be realized as a hard disk drive (HDD) or solid state drive (SSD) built into the main body 2.
  • the communication unit 204 performs data communication and network communication with external devices via wired or wireless means. For example, captured image data (still image files and video files), metadata, and the like are communicated between external display devices, recording devices, playback devices, and the like. Furthermore, the communication unit 204 may serve as a network communication unit, performing communication via various networks such as the Internet, a home network, and a LAN (Local Area Network), and may transmit and receive various data between servers, terminals, etc. on the network.
  • the power supply circuit 205 uses, for example, a battery 501 as a power source, generates the necessary power supply voltage V0, and supplies it to each unit within the main body 2.
  • the power supply circuit 205 also generates a power supply voltage V1 to be supplied to the camera head unit 3, and supplies it to the camera head unit 3.
  • the power supply circuit 205 is controlled by the control unit 200 to turn the power supply voltage on/off.
  • Operation information of an assignable button 302 provided on the camera head unit 3 is detected by the control unit 200 via connectors 3a and 2a.
  • the control unit 200 controls the operation assigned to the assignable button 302. For example, the control unit 200 controls the start of recording.
  • the above-mentioned components in the main body 2 and camera head 3 work together to capture, record, and communicate moving and still images.
  • FIG. 13 shows the extended state in which the main body 2, the base plate 50, the extension cable 20, and the camera head 3 are connected.
  • lines LN are provided as transmission paths for signal transmission between the main body 2 side and the camera head 3 side.
  • the following lines LN (excluding the monitor image line LN2) are formed even in the basic state of FIG. 12.
  • Each of the lines LN1 to LN8 is not necessarily a single transmission line; the lines are shown grouped functionally, and only representative transmission lines are shown. In reality, a larger number of lines LN are formed.
  • the lines LN are as follows:
  • the image data line LN1 is a line that transmits captured image data from the image sensor 300 to the main body 2.
  • the monitor image line LN2 is a line that connects the video input terminal 53 of the base plate 50 and the video output terminal 121 of the first connector portion 21, and transmits, for example, a monitor image signal. In the basic state of FIG. 12, this monitor image line LN2 is not formed. For this reason, for example, wiring is performed using pins (terminals) in the connectors 50a, 23a that correspond to the free pins of the connectors 2a, 3a, so that the monitor image line LN2 is formed within the extension cable 20.
  • the number of pins in connectors 50a, 23a may be greater than the number of pins in connectors 2a, 3a, in which case additional pins may be used.
  • control line LN3 indicates a plurality of signal lines used for transmitting control signals and clock signals from the control unit 200 to the camera head unit 3.
  • the control line LN4 indicates a plurality of signal lines used for transmitting signals from the camera head unit 3 to the control unit 200.
  • the control line LN4 is used for transmitting a state detection signal of the camera head unit 3, a response signal to a control signal, and the like.
  • level information detected by the level 40 is also transmitted to the control unit 200 via a control line LN4.
  • the ID line LN5 is a line for allowing the control unit 200 to detect the identification information generated by the ID generating unit 303.
  • the power supply line LN6 is a line that supplies the power supply voltage V1 from the power supply circuit 205 of the main body 2 to the camera head 3 side.
  • the assignable button line LN7 is a line that connects the assignable buttons 302 and 52 by wired-OR, enabling the control unit 200 to detect the operations of these buttons.
  • the external power supply line LN8 is a line that connects the external power supply input terminal 51 of the base plate 50 and the external power supply output terminal 120 of the first connector portion 21.
  • a ground line is also provided. This forms a common ground for the main body 2, base plate 50, extension cable 20, and camera head 3.
  • In the extension cable 20 in which the above-mentioned lines LN are formed, a pre-processing unit 24, a buffer amplifier 26, a DC/DC converter 28, and a level 40 are provided within the first connector unit 21.
  • In the second connector unit 23, a post/pre-processing unit 25, a buffer amplifier 27, a DC/DC converter 29, and an ID generating unit 400 are provided.
  • the pre-processing unit 24 is provided corresponding to the image data line LN1, and performs amplification processing (pre-emphasis processing) on the captured image data output from the image sensor 300.
  • This pre-emphasis processing is a process that takes into account signal attenuation during transmission through the extension cable 20 and boosts the attenuation in advance.
  • the post/pre-processing unit 25 is also provided corresponding to the image data line LN1.
  • the post/pre-processing unit 25 is capable of signal compensation processing as waveform shaping and amplification processing (pre-emphasis processing) similar to that of the pre-processing unit 24.
  • the waveform shaping (signal compensation) referred to here is an equalizing process for compensating for the frequency characteristics that have changed (deteriorated) due to transmission through the extension cable 20 .
  • the buffer amplifier 27 is provided corresponding to the control line LN3, and performs amplification processing on necessary signals among various control signals and clock signals transmitted from the control unit 200.
  • the buffer amplifier 26 is provided corresponding to the control line LN 4 , and performs amplification processing on necessary signals among various signals transmitted from the camera head unit 3 .
  • the amplification processing by these buffer amplifiers 26 and 27 also corresponds to the attenuation that occurs during cable transmission through the extension cable 20 .
  • Level information from the level 40 is also transmitted to the control unit 200 via the buffer amplifier 26 and the control line LN4. This allows the control unit 200 to input level information from the level 40 on the camera head unit 3 side at least when in the extended state. In the normal state, since the level 40 does not exist, the control unit 200 may recognize the terminal information to which the level information is input as invalid information.
  • the DC/DC converter 28 receives the DC power supply voltage V1 from the power supply circuit 205 supplied to the power supply line LN6, performs voltage conversion, and generates the required power supply voltage Vc within the first connector portion 21, which is supplied to the pre-processing portion 24 and the buffer amplifier 26.
  • the DC/DC converter 29 also receives the DC power supply voltage V1 from the power supply circuit 205 supplied to the power supply line LN6, performs voltage conversion, and generates the required power supply voltage Vc within the second connector portion 23, which is supplied to the post/pre-processing portion 25 and the buffer amplifier 27.
  • the ID generating unit 400 generates identification information for the extension cable 20.
  • the ID generating unit 400 can be configured as a terminal voltage setting unit corresponding to one or more connector terminals.
  • the ID generating unit 400 may be a memory or a processor that stores the identification information.
  • the identification information generated by the ID generating unit 400 is recognized by the control unit 200, for example, via the ID line LN5.
  • the identification information generated by the ID generating unit 400 is then used by the control unit 200 of the main body unit 2, for example, as information for distinguishing between an undetached state and a detached state of the camera head unit 3.
  • the ID generating unit 400 may be used as information for identifying the model of the extension cable 20.
  • the identification information of this ID generating unit 400 and that of the ID generating unit 303 of the camera head unit 3 described above may be input to the control unit 200 in a configuration in which only one of them is input, or in a configuration in which both are input.
  • the control unit 200 can determine whether the camera head unit 3 is in a non-separated state or a separated state from the identification information.
  • the identification information of both the ID generating unit 303 and the ID generating unit 400 may be input to the control unit 200, in which case the control unit 200 can determine whether the camera head unit 3 is in a non-separated state or a separated state depending on whether the identification information of the extension cable 20 is input. Furthermore, in this case, the control unit 200 can determine the model of the camera head unit 3 regardless of whether it is in the extended state or the normal state.
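As a rough illustration, the separation determination described above can be sketched as follows (Python; the function name and ID values are hypothetical, since the patent specifies behavior rather than an implementation):

```python
# Hypothetical sketch: the control unit 200 judges the connection state from
# the identification information of the ID generating units 303 (camera head)
# and 400 (extension cable). Names and values are illustrative only.

def classify_connection(head_id, cable_id):
    """Return (state, model).

    The presence of extension-cable identification information implies the
    separated (extended) state; the camera head ID identifies the model in
    either state.
    """
    state = "separated" if cable_id is not None else "non-separated"
    return state, head_id
```

For example, `classify_connection("HEAD-A", None)` would indicate the normal (non-separated) state while still identifying the camera head model.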
  • FIGS. 14 and 15 show an example of the configuration in which the spirit level 40 on the camera head unit 3 side is provided in the camera head unit 3.
  • the configuration other than the spirit level 40 is the same as in FIG. 12 and FIG. 13, so a duplicated explanation will be avoided.
  • a spirit level 40 is provided on the camera head unit 3 as shown in FIG. 14, and spirit level information from the spirit level 40 is input to the control unit 200.
  • In the extended state, level information from the level 40 in the camera head unit 3 is transmitted to the control unit 200 via the buffer amplifier 26 and the control line LN4. Therefore, the control unit 200 can input level information from the level 40 on the camera head unit 3 side whether in the normal state or the extended state.
  • As an example of the processing of the embodiment, the processing of the control unit 200 relating to the level information obtained from the levels 30 and 40 will be described.
  • FIG. 16 shows an example of processing in the first embodiment.
  • When the control unit 200 performs power-on processing in step S101 in response to a user operation or the like, it determines in step S102 whether the camera head unit 3 is in the separated state or the non-separated state.
  • The control unit 200 can determine whether the camera head unit 3 is separated or not based on the identification information of the ID generating units 303 and 400. Note that this determination can also be made by a mechanical switch or the like.
  • If the control unit 200 determines that the camera head unit 3 is in the non-separated state, the process proceeds to step S103, where the level 30 on the main body side is set to active. If the control unit 200 determines that the camera head unit 3 is in the separated, extended state, the process proceeds to step S104, where the level 40 on the camera head unit 3 side is set to active.
  • The active setting means a setting in which the level information input from the corresponding level is processed as valid information.
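The power-on flow of FIG. 16 (steps S101 to S104) can be sketched roughly as follows (Python; the class and attribute names are illustrative assumptions, not part of the patent):

```python
# Illustrative sketch of FIG. 16: after power-on (S101), the separation
# determination (S102) decides which level is set to active (S103 / S104).

class LevelSelector:
    def __init__(self):
        self.active = None  # which level's information is treated as valid

    def power_on(self, separated):
        # S102: branch on the separation determination
        if not separated:
            self.active = "level_30"  # S103: main-body-side level active
        else:
            self.active = "level_40"  # S104: camera-head-side level active
        return self.active
```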
  • the control unit 200 performs the process in FIG. 17 during the imaging period.
  • "Imaging" hereinafter refers to the processing operation of recording or transmitting the captured image signal obtained by the image sensor 300 as frames that make up still images or videos.
  • Although imaging by the image sensor 300 is still performed for the purpose of displaying a through image during recording standby for still images or videos, such cases are not included here.
  • When image capture is started by a user operation, for example an operation to start video recording, the control unit 200 proceeds from step S201 to S202, and thereafter repeats the processing of steps S202 and S203 until it is determined in step S204 that image capture has ended due to a user operation or some other trigger.
  • In step S202, the control unit 200 acquires the level information of the level that is set as active.
  • In step S203, the control unit 200 performs processing to associate the acquired level information as metadata corresponding to the current frame of the captured image data.
  • For example, the control unit 200 causes the recording unit 203 to record the metadata corresponding to the frame on a recording medium.
  • Alternatively, the control unit 200 causes the communication unit 204 to transmit the metadata corresponding to the frame to an external device.
  • the control unit 200 performs the processes of steps S202 and S203 for each frame timing.
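The per-frame loop of FIG. 17 (S201 to S204) might be sketched as follows (Python; `read_active_level` and the metadata keys are hypothetical names introduced only for illustration):

```python
# Illustrative sketch of FIG. 17: for each frame until imaging ends (S204),
# acquire the active level's information (S202) and associate it with the
# current frame as metadata (S203).

def imaging_loop(frames, read_active_level):
    metadata = []
    for frame in frames:
        info = read_active_level()                        # S202
        metadata.append({"frame": frame, "level": info})  # S203
    return metadata
```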
  • If the level 30 is set to active in step S103 in FIG. 16, then during image capture, the level information from the level 30 is made into metadata corresponding to each frame of the moving image to be recorded in steps S202 and S203 in FIG. 17. This is the case when image capture is performed in the normal state. Also, if the level 40 is set to active in step S104 in FIG. 16, then during image capture, level information from the level 40 is made into metadata corresponding to each frame of the moving image to be recorded in steps S202 and S203 in FIG. 17. This is the case when image capture is performed in the extended state.
  • In other words, in the non-separated state the level information of the level 30 is associated with the captured image data, and in the separated state the level information of the level 40 is associated with the captured image data. In either case, the level information associated with the captured image data is information on the angles in the roll and pitch directions based on the horizontal state of the image sensor 300.
  • A processing example of the second embodiment will be described using FIG. 18 and the above-mentioned FIG. 17. Note that from here on, the same step numbers are assigned to the same processes as in the previously described example.
  • FIG. 18 shows the processing of the control unit 200 when the power is turned on, and in step S110, the control unit 200 monitors user operations related to the spirit levels 30, 40.
  • the user can selectively set the spirit levels 30, 40 to active by, for example, operating a menu. If no such operation is detected, the control unit 200 exits from the processing of FIG. 18.
  • When such an operation is detected, the control unit 200 proceeds from step S110 to step S111 and checks the selection state set by the menu operation or the like. If the control unit 200 confirms that the level 30 on the main body 2 side has been selected by the user operation, it proceeds from step S112 to step S103 and sets the level 30 to active. If the control unit 200 confirms that the level 40 on the camera head unit 3 side has been selected by the user operation, it proceeds from step S112 to step S104 and sets the level 40 to active.
  • the active setting can be switched by the user's operation.
  • the control unit 200 then performs the above-described processing in Fig. 17 during the image capture period. Therefore, if the user has selected the level 30, during image capture, in steps S202 and S203 in Fig. 17, the level information from the level 30 is made into metadata corresponding to each frame of the moving image to be recorded. Furthermore, if the user has selected the level 40, during image capture, in steps S202 and S203 of FIG. 17, level information from the level 40 is made into metadata corresponding to each frame of the moving image to be recorded.
  • the level information of the level 30 or level 40 is associated with the captured image data in accordance with the user operation performed prior to capturing an image.
  • the user can select level 30 or level 40 depending on whether it is in the normal state or the extended state, for example, and associate the level information with the captured image data as metadata.
  • the process in FIG. 16 and the process in FIG. 18 can be combined.
  • the control unit 200 automatically sets one of the spirit levels to active depending on whether the camera head unit 3 is separated or not at that time. After that, when a user operation is detected, the control unit 200 switches the spirit level to be set to active depending on the operation.
  • In the processing of FIG. 17, one of the pieces of level information is then basically associated automatically depending on whether the camera head unit 3 is separated or not, and the user can associate either piece of level information at will depending on the circumstances.
  • FIG. 19 shows the processing of the control unit 200 when the power is turned on.
  • When the control unit 200 performs power-on processing in step S101 in response to a user operation or the like, it sets all of the spirit levels (the spirit level 30 and the spirit level 40) to active in step S120.
  • the control unit 200 performs the process of FIG. 17 during the imaging period.
  • Since both the spirit levels 30 and 40 are set to active, the spirit level information obtained by each of them during image capture is used in steps S202 and S203 of FIG. 17 as metadata corresponding to each frame of the video to be recorded.
  • In this case, the level information obtained by each of the levels 30 and 40 is used as metadata corresponding to each frame of the captured image data, so that at playback or editing time the user can select either piece of level information and perform processing.
  • For example, in a playback device or an image editing device, the process shown in FIG. 20 is carried out.
  • In step S301, the device selects and sets the level information to be used. For example, the user determines whether to use the level information from the level 30 or the level 40.
  • The device then executes a process using the selected level information, such as a playback process or an image editing process. For example, when performing playback, the device plays back images while performing horizontal correction using the selected level information. Also, for example, the device can use the selected level information to perform editing processing that adds horizontal correction to the image, or processing that cuts out horizontally corrected images from each frame.
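As an illustration of the playback-side use of the selected level information, a roll correction could be sketched as follows (Python; the metadata keys `level_30`/`level_40` and the sign convention are assumptions introduced for this sketch):

```python
# Illustrative sketch of S301/S302: the selected level information is used
# to compute a per-frame horizontal-correction (roll) angle for playback
# or editing. Rotating by the negative of the recorded roll levels the image.

def correction_angles(frames_meta, source):
    # source is the selection made in S301: "level_30" or "level_40"
    return [-m[source]["roll"] for m in frames_meta]
```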
  • As in FIG. 19, the control unit 200 performs power-on processing in step S101, and then sets all of the spirit levels (the spirit level 30 and the spirit level 40) to active in step S120.
  • The control unit 200 then performs the process of FIG. 21 during the imaging period.
  • When image capture is started, the control unit 200 proceeds from step S201 to step S210 to determine whether the camera head unit 3 is separated, that is, to confirm whether the camera head unit 3 is in the separated state or the non-separated state.
  • Thereafter, the control unit 200 repeats the processes of steps S202, S203, and S211 until it is determined in step S204 that imaging has ended due to a user operation or some other trigger.
  • In step S202, the control unit 200 acquires level information for the levels that are set as active, that is, for both of the levels 30 and 40 in this case.
  • In step S203, the control unit 200 performs processing to associate the acquired level information of each of the levels 30 and 40 as metadata corresponding to the current frame of the captured image data.
  • the recording unit 203 records the metadata corresponding to the frame on a recording medium.
  • the communication unit 204 transmits the metadata corresponding to the frame to an external device.
  • In step S211, the control unit 200 associates information on the result of the separation determination executed in step S210 with the captured image data.
  • the information on the separation determination result may be metadata corresponding to each frame together with the spirit level information, or may be metadata corresponding to the entire image file.
  • In this way, the level information obtained by each of the levels 30 and 40 during image capture is treated as metadata corresponding to each frame of the video being recorded, and information on the separation determination result is also associated with the captured image data.
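The recording flow of FIG. 21 could be sketched as follows (Python; the dictionary layout and reader callbacks are illustrative assumptions):

```python
# Illustrative sketch of FIG. 21: both levels are active, so each frame
# carries both pieces of level information (S202/S203), and the separation
# determination result of S210 is associated as well (S211).

def imaging_loop_both(frames, read_level_30, read_level_40, separated):
    per_frame = []
    for frame in frames:
        per_frame.append({
            "frame": frame,
            "level_30": read_level_30(),  # main-body-side level info
            "level_40": read_level_40(),  # camera-head-side level info
        })
    # S211: the separation determination result accompanies the image data
    return {"separated": separated, "frames": per_frame}
```

A later playback or editing device can then pick `level_30` or `level_40` automatically from the `separated` flag, as described above.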
  • the user can select the level information according to the information on the separation judgment result and perform processing.
  • the editing device or playback device can automatically select one of the level information using the information on the separation judgment result and use it to perform playback processing or editing processing.
  • the playback device or editing device can play back a horizontally corrected image or perform horizontal correction editing of an image.
  • the level information is treated as metadata corresponding to each frame of the video, but the timing of acquiring the level information does not necessarily have to be synchronized with the frames. In other words, it does not have to be aligned with the frame rate or frame phase. Therefore, acquiring level information at the timing of each frame as described above and using it as metadata for that frame is just one example.
  • the level information may be associated with the captured image data asynchronously with the frames, or may be associated with, for example, intermittent frames, because during playback or image editing, such metadata can be used to perform interpolation along the time axis to obtain level information synchronized with the frame timing.
  • spirit level information at a synchronized timing may be obtained by an interpolation process, and this may be used as metadata associated with the frame.
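The time-axis interpolation mentioned above could be sketched, for example, as simple linear interpolation of asynchronously acquired samples to a frame timestamp (Python; the `(timestamp, value)` sample format is an assumption of this sketch):

```python
# Illustrative sketch: level samples acquired asynchronously as
# (timestamp, value) pairs are interpolated to a frame timestamp t,
# clamping at the ends of the sample range.

def interpolate_level(samples, t):
    samples = sorted(samples)
    if t <= samples[0][0]:
        return samples[0][1]
    if t >= samples[-1][0]:
        return samples[-1][1]
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
```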
  • the camera system 1 of the embodiment includes a main body section 2 that processes captured image data, and a camera head section 3 that is detachable from the main body section 2 and can output captured image data generated by an image sensor 300 to the main body section 2, whether the camera head section 3 is in an undetached state attached to the main body section 2 or in a detached state separated from the main body section 2.
  • the camera system 1 also includes a first spirit level 30 provided on the main body section 2 side, a second spirit level 40 provided on the camera head section 3 side, and a control section 200 that performs processing to associate spirit level information from the spirit level 30 and the spirit level information from the spirit level 40 with the captured image data. This allows appropriate level information to be associated with the captured image data depending on the actual situation during image capture.
  • In the non-separated state, at least the level information (roll angle, pitch angle) from the level 30 on the main body 2 side can be associated with the captured image data, and in the separated state, at least the level information of the camera head 3 from the level 40 on the camera head 3 side can be associated with the captured image data.
  • As a result, appropriate level information depending on the posture at the time of image capture, which affects the image, can be associated with the captured image data.
  • control unit 200 performs an association process in which it automatically selects one of the level information of the level 30 and the level information of the level 40 and associates it with the captured image data (see Figures 16 and 17). For example, by selecting the appropriate level information for a captured image and associating it with the captured image data, it is possible to save appropriate level information according to the situation at the time of capturing the image.
  • An example has been given in which the control unit 200 performs the association process by selecting the spirit level information of the spirit level 30 when the camera head unit 3 is in the non-detached state and selecting the spirit level information of the spirit level 40 when the camera head unit 3 is in the detached state, and associating the selected information with the captured image data (see FIGS. 16 and 17).
  • When the camera head unit 3 is operated in the non-detached state, i.e., in the normal state, the level information provided by the level 30 on the main body side can be associated with the captured image data, thereby saving the level information that serves as the reference in the camera system 1.
  • In the detached state, the level information from the level 40 on the camera head unit 3 side can be associated with the captured image data, so that the level information according to the attitude of the camera head unit 3, i.e., the level information that affects the captured image data, can be saved.
  • control unit 200 selects level information from either the level 30 or 40 in response to user operation input and associates it with the captured image data (see Figures 18 and 17).
  • Either the level 30 on the main body side or the level 40 on the camera head unit 3 side is selected, and its level information is associated with the captured image data. This allows either piece of level information to be saved depending on the circumstances and situation at the time of shooting.
  • control unit 200 performs the association process of associating the level information of both the levels 30, 40 with the captured image data (see FIGS. 19, 17, 20, and 21).
  • an example of the association process is given in which the control unit 200 performs a process of associating the information on the determination result of whether the state is non-separated or separated, together with the level information of both the levels 30, 40, with the captured image data (see Figures 19 and 21).
  • the camera head unit 3 and the main body unit 2 are connected by the extension cable 20, and captured image data is transmitted from the camera head unit 3 to the main body unit 2.
  • By using the extension cable 20, flexible shooting using the camera head unit 3 becomes possible.
  • signal transmission between the camera head unit 3 and the main body unit 2 may be performed by wireless communication, rather than a wired connection using the extension cable 20 or the like.
  • In that case as well, the control unit 200 of the main body unit 2 can set the spirit level 40 on the camera head unit 3 side to active and perform the process of associating the spirit level information with the captured image data.
  • In the embodiment, the level 40 is provided on a mounting body that is attached to the camera head unit 3 in the separated state. In FIG. 13, for example, the spirit level 40 on the camera head 3 side is provided on the first connector 21. Since the first connector 21 is attached to the camera head 3 in the extended state, the spirit level 40 can output spirit level information according to the attitude of the camera head 3. In other words, the information correctly indicates whether the captured image is in a horizontal or non-horizontal state. Furthermore, by equipping the first connector unit 21 with the spirit level 40, it becomes possible to output spirit level information according to the attitude of the camera head unit 3 regardless of whether the camera head unit 3 itself has a spirit level. In other words, even when using a camera head unit 3 that does not have a spirit level, a spirit level can be provided on the camera head unit 3 side in the extended state.
  • In the embodiment, the attachment part that is attached to the camera head unit 3 in the separated state is the first connector unit 21 of the extension cable 20.
  • the first connector portion 21 is a connector for the extension cable 20. Therefore, the first connector portion 21 is a member that is always attached to the camera head portion 3 in an extended state, and is therefore suitable as a location for providing a spirit level 40 that detects the attitude of the camera head portion 3.
  • In the examples of FIGS. 14 and 15, the level 40 on the camera head unit 3 side is provided in the camera head unit 3 itself. This allows the level 40 to output level information according to the attitude of the camera head unit 3.
  • control unit 200 performs the association process by associating the level information of the level 30 or the level 40 with each frame of the captured image data. For example, by associating level information with each frame of captured image data, when the images are later played back or edited, processing can be performed using information on the roll angle and pitch angle with respect to the horizontal for each frame.
  • control unit 200 performs, as the association process, a process of recording the level information of the level 30 or the level 40 on the recording medium as metadata associated with the captured image data.
  • the spirit level information may be included as metadata in the image file of the captured image data, or it may be recorded as a file separate from the image file, with the two files associated with each other.
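The two recording layouts just described (level information embedded as metadata in the image file, or written as a separate file associated with it) could be sketched as follows; the JSON container and the `.level.json` sidecar naming are assumptions for illustration only, not the actual recording format of the camera system:

```python
import json
import tempfile
from pathlib import Path

def record_embedded(path: Path, image_bytes: bytes, level: dict) -> None:
    """Toy container: store the image payload and level metadata in one file."""
    path.write_text(json.dumps({"level_metadata": level, "image_hex": image_bytes.hex()}))

def record_sidecar(path: Path, image_bytes: bytes, level: dict) -> None:
    """Write the image file and a separate level file, associated by a shared stem."""
    path.write_bytes(image_bytes)
    path.with_suffix(".level.json").write_text(json.dumps(level))

base = Path(tempfile.mkdtemp())
level = {"roll_deg": 1.2, "pitch_deg": -0.4, "source": "level_40"}
record_embedded(base / "clip_a.json", b"\x00\x01", level)
record_sidecar(base / "clip_b.raw", b"\x00\x01", level)
print(json.loads((base / "clip_b.level.json").read_text())["roll_deg"])  # 1.2
```

Either layout keeps the level information retrievable together with the captured image data for later recording, playback, or editing.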
  • control unit 200 performs the association process by transmitting the level information of the level 30 or the level 40 to an external device as information associated with the captured image data. This makes it possible to provide the level information corresponding to the captured image data to an external device, and to use the level information for recording, playback, and editing in the external device.
  • the present technology can also be configured as follows.
  • A camera system comprising: a main body unit that performs processing on captured image data; a camera head unit that is detachable from the main body unit and configured to output the captured image data generated by an image sensor to the main body unit in either a non-separated state in which the camera head unit is attached to the main body unit or a separated state in which the camera head unit is separated from the main body unit; a first spirit level provided on the main body unit; a second spirit level provided integrally with a housing of the camera head unit; and a control unit that performs an association process of associating first level information from the first spirit level and second level information from the second spirit level with the captured image data.
  • The camera system according to (1) above, wherein the control unit performs, as the association process, a process of automatically selecting one of the first level information and the second level information and associating it with the captured image data.
  • The camera system described in (2) above, wherein the control unit performs, as the association process, a process of selecting the first level information when the camera head unit is in the non-separated state and selecting the second level information when the camera head unit is in the separated state, and associating the selected information with the captured image data.
  • The camera system according to (1) above, wherein the control unit performs, as the association process, a process of selecting one of the first level information and the second level information in response to a user operation input and associating the selected information with the captured image data.
  • the attachment body is a connector at the end of a cable that transmits the captured image data to the main body when the camera head is in the separated state.
  • the second spirit level is provided in the camera head portion.
  • The camera system according to any one of (1) to (10) above, wherein the control unit performs, as the association process, a process of associating the first level information or the second level information with each frame of the captured image data.
  • The camera system according to any one of (1) to (11) above, wherein the control unit performs, as the association process, a process of recording the first level information or the second level information on a recording medium as information associated with the captured image data.
  • The camera system according to any one of (1) to (12) above, wherein the control unit performs, as the association process, a process of transmitting the first level information or the second level information to an external device as information associated with the captured image data.
  • A processing method for a camera system including: a main body unit that performs processing on captured image data; a camera head unit that is detachable from the main body unit and capable of outputting the captured image data generated by an image sensor to the main body unit in either a non-separated state in which the camera head unit is attached to the main body unit or a separated state in which the camera head unit is separated from the main body unit; a first spirit level provided on the main body unit; and a second spirit level provided integrally with a housing of the camera head unit; the processing method comprising performing processing to associate first level information obtained by the first spirit level and second level information obtained by the second spirit level with the captured image data.
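A minimal sketch of the selection behavior in configurations (2) to (4) above, with automatic selection based on whether the camera head unit is separated and an optional user override; the function signature and the "first"/"second" choice strings are hypothetical, not part of the disclosure:

```python
def select_level_info(separated: bool, first_info, second_info, user_choice=None):
    """Return the level information to associate with the captured image data.

    user_choice: None selects automatically; "first" or "second" forces one,
    corresponding to selection in response to a user operation input.
    Automatic rule: the main-body (first) spirit level is used while the camera
    head unit is attached, and the camera-head (second) level while separated.
    """
    if user_choice == "first":
        return first_info
    if user_choice == "second":
        return second_info
    return second_info if separated else first_info

print(select_level_info(separated=False, first_info="L30", second_info="L40"))  # L30
print(select_level_info(separated=True, first_info="L30", second_info="L40"))   # L40
print(select_level_info(True, "L30", "L40", user_choice="first"))               # L30
```

The automatic branch mirrors the rule in (3): the first level information reflects the image sensor attitude only while the camera head unit is mounted on the main body unit, so the second level information takes over in the separated state.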


Abstract

This camera system comprises: a main body unit which processes captured image data; a camera head unit which is detachable from the main body unit and can output the captured image data generated by an image sensor to the main body unit whether the camera head unit is in an undetached state of being mounted on the main body unit or in a detached state of being away from the main body unit; a first level provided in the main body unit; a second level provided integrally with the housing of the camera head unit; and a control unit which associates the captured image data with first level information from the first level and second level information from the second level.

Description

Camera system and processing method
This technology relates to a camera system and its processing method, and to a camera system in which the camera body and camera head can be used separately.
Camera systems are known in which an image is captured with a camera head unit having an image sensor separated from a main body unit, as disclosed in Patent Documents 1 and 2 below.
Furthermore, Patent Documents 3 and 4 listed below disclose configurations equipped with a spirit level.
Patent Document 1: Republished Publication No. 2020/054266; Patent Document 2: JP 2005-236773 A; Patent Document 3: JP 2016-54414 A; Patent Document 4: JP 2011-199503 A
In a typical video camera, the levelness of the image sensor is aligned with peripheral equipment such as a tripod using the bottom surface of the main body as a reference during shooting, so a spirit level for obtaining the levelness of the image sensor is built into the main body.
However, in a camera system in which the main body and the camera head can be used separately, the image sensor is located away from the main body and the peripheral equipment during separated operation. This means that spirit level information appropriately indicating the levelness of the image sensor cannot be obtained.
This disclosure therefore proposes technology that enables appropriate spirit level information to be obtained in a camera system in which the camera head unit can be operated either attached to or separated from the main body unit.
The camera system of the present technology includes a main body unit that processes captured image data, a camera head unit that is detachable from the main body unit and configured to output the captured image data generated by an image sensor to the main body unit whether the camera head unit is in an undetached state attached to the main body unit or in a detached state separated from the main body unit, a first level provided on the main body unit, a second level provided integrally with a housing of the camera head unit, and a control unit that performs processing to associate first level information from the first level and second level information from the second level with the captured image data.
The camera system is configured with a spirit level on both the main body side and the camera head side, and the control unit selectively associates spirit level information from both spirit levels or both of them with captured image data.
FIG. 1 is a perspective view of the basic state of a camera system according to an embodiment of the present technology.
FIG. 2 is a perspective view of the embodiment with the camera head unit removed from the main body.
FIG. 3 is a perspective view of the camera head unit of the embodiment with the adapter removed.
FIG. 4 is an explanatory diagram of a case where an interchangeable lens and a finder unit are attached to the main body of the embodiment.
FIG. 5 is an explanatory diagram of a case where an interchangeable lens and a finder unit are attached to the main body of the embodiment.
FIG. 6 is a perspective view of the embodiment in the extended state.
FIG. 7 is a perspective view of the extension cable of the embodiment.
FIG. 8 is a perspective view of the embodiment with the second connector portion detached from the base plate.
FIGS. 9A and 9B are explanatory diagrams of the terminal surfaces of the camera head unit and the base plate of the embodiment.
FIG. 10 is an explanatory diagram of the terminal surfaces of the extension cable of the embodiment.
FIG. 11 is an explanatory diagram of the connection configuration in the basic state and the extended state of the embodiment.
FIG. 12 is an explanatory diagram of the connection configuration in the basic state of the embodiment.
FIG. 13 is an explanatory diagram of the connection configuration in the extended state of the embodiment.
FIG. 14 is an explanatory diagram of another connection configuration in the basic state of the embodiment.
FIG. 15 is an explanatory diagram of another connection configuration in the extended state of the embodiment.
FIG. 16 is a flowchart of the level selection process of the first embodiment.
FIG. 17 is a flowchart of the association process of the first and second embodiments.
FIG. 18 is a flowchart of the level selection process of the second embodiment.
FIG. 19 is a flowchart of the process of selecting all spirit levels of the third embodiment.
FIG. 20 is a flowchart of the process of selecting the level information to be used in the third and fourth embodiments.
FIG. 21 is a flowchart of the association process of the fourth embodiment.
The embodiments will be described below in the following order.
<1. Camera system structure>
<2. Extended use form>
<3. Connector connection>
<4. Electrical connection configuration in each state>
<5. Processing Examples of the First to Fourth Embodiments>
<6. Summary and Modifications>
In the present disclosure, the term "image" is used as a term including both moving images and still images. Although the embodiments will be described assuming mainly moving image capture, the technology of the present disclosure can be similarly applied to still image capture.
Moreover, the term "captured image data" refers to data of moving images or still images captured by an image sensor, but it also includes not only so-called RAW image data as captured, but also image data at various stages after various processes such as development, compression, encoding, etc., and image data converted into an image file, etc. Furthermore, captured image data also includes analog signals captured by an image sensor and before being converted into digital data.
<1. Camera system structure>
First, the structure of the camera system 1 according to the embodiment will be described with reference to FIGS. 1 to 5.
In the embodiment described below, the camera system 1 of the present technology is applied to a video camera.
Note that the scope of application of the present technology is not limited to the video camera given as an example. The present technology can be widely applied to a variety of imaging devices, such as video cameras, still cameras, cameras with special imaging functions such as infrared cameras and specific-wavelength cameras, and cameras for various uses such as professional, consumer, and surveillance applications.
In the following description, the directions of front, back, up, down, left and right are indicated as viewed from the cameraman when taking pictures with a video camera.
Therefore, the object side is the front, and the image plane side is the rear.
Note that the directions of front, back, up, down, left and right shown below are for convenience of explanation, and the implementation of the present technology is not limited to these directions.
The lens group mounted on the camera system may be composed of one or more lenses, or may include one or more lenses and other optical elements such as an aperture or iris.
The camera system 1 has a main body 2 and a camera head 3, and the camera head 3 is detachable from the main body 2 (see FIGS. 1 to 5).
The main body 2 may be provided as an external device for the camera head 3. In this case, the camera head 3 is made detachable from the external device. The camera head 3 may be provided as a standalone imaging device.
The main body 2 has an outer panel 4 and an outer housing 5, and the outer housing 5 is covered by the outer panel 4 (see FIG. 2).
The outer panel 4 has a base panel portion 6, an upper panel portion 7, and a rear panel portion 8. The base panel portion 6 has a bottom surface portion 9 facing in the vertical direction, and a pair of side surface portions 10 and 11 that are continuous with the left and right side edges of the bottom surface portion 9 and spaced apart from each other (see FIGS. 1 and 4).
Various operation sections 12, 12, ... are arranged on the side sections 10, 11. The operation sections 12, 12, ... include, for example, a power button, a shooting button, a zoom knob, a mode switching knob, and the like.
Display units 13, 13 such as liquid crystal panels are disposed on the side sections 10, 11, respectively.
Connection terminals 14, 14, ... are arranged vertically on one side of the rear panel portion 8 (see FIGS. 4 and 5). Cables (not shown) for supplying power and transmitting/receiving signals are connected to the connection terminals 14, 14, ....
One of the connection terminals 14 is a video output terminal 14a that outputs a monitor image signal. The monitor image signal is a signal of an image that displays an image of a subject being imaged by the camera system 1 in real time so that the photographer can check it, and is a signal of an image called a through image. By connecting the video output terminal 14a to a monitor device via a cable (not shown), the user can view the subject image during image capture or during image capture standby.
The upper panel portion 7 is formed in a plate shape facing the up-down direction, and both left and right ends are attached to the upper ends of the side portions 10, 11, respectively (see Figs. 1 and 2).
The rear panel portion 8 is formed in a plate shape facing the front-rear direction, and its outer periphery is attached to the rear end of the base panel portion 6 and the rear end of the upper panel portion 7 .
As described above, the upper panel portion 7 is attached to the side portions 10, 11, and the rear panel portion 8 is attached to the base panel portion 6 and the upper panel portion 7, thereby forming the outer panel 4, and the outer housing 5 is covered from above, below, left, right, and rear by the outer panel 4 (see Figure 2).
An adjustment base 15 is attached to the top surface of the upper panel portion 7 (see Figs. 1 and 2). The adjustment base 15 is formed in a vertically long rectangular shape, with a central portion in the left-right direction formed as a concave groove portion 16, and the left and right portions on both sides of the groove portion 16 are provided as adjustment portions 17, 17.
A handle 80 is detachably attached to the adjustment base 15 (see FIGS. 4 and 5).
In addition to the handle 80, a finder unit 85 is also detachably attached to the adjustment base 15. The finder unit 85 has a rotating arm 90 and a viewfinder main body 91.
One end of the viewfinder main body 91 is provided as a finder section 91a, and the user can see a monitor image, an operation screen, and the like through the finder section 91a.
A battery 501 is attached to the rear surface of the main body 2 (see FIGS. 4 and 5).
The battery 501 serves as a power source for supplying a power supply voltage to each part of the main body 2 and the camera head 3 .
A camera head 3 is attached to the front side of the main body 2 (see FIG. 1). The camera head 3 is detachable and can be removed from the main body 2 (see FIG. 2).
1 and 2, the camera head unit 3 is shown with an adapter 500 attached thereto. However, as shown in FIG. 3, the adapter 500 is detachable from the camera head unit 3.
The adapter 500 is used for mounting different interchangeable lenses.
For example, an interchangeable lens 503 shown in FIG. 5 can be attached to the camera head unit 3 without the adapter 500 being attached.
On the other hand, by mounting the adapter 500 on the camera head unit 3, it is possible to mount a different type of interchangeable lens 502 shown in FIG. 4.
The camera head unit 3 has a generally rectangular, plate-like housing 140, the front side of which serves as a mounting surface 141 that is adapted for receiving an adapter 500 and an interchangeable lens 503 (see FIG. 3).
A plurality of screw holes 142 are provided at required positions. These screw holes 142 are formed at positions corresponding to a plurality of screw holes 550 provided in the adapter 500, and the adapter 500 is attached to the camera head unit 3 by screwing in screws 551 as shown in FIG. 1.
The camera head unit 3 is equipped with an image sensor 300, optical elements such as an ND (neutral density) filter and an iris mechanism (not shown), and a circuit board carrying necessary circuits (see FIG. 3).
The camera head unit 3 is also provided with an assignable button 302 .
The assignable button 302 is an operator to which the user can assign any operation function. For example, in the main body 2 as well, some of the buttons of the operation unit 12 are provided as assignable buttons 12a.
To each of the assignable buttons 12a and 302, the user can assign any operation function according to his/her convenience, such as a recording start/stop operation, a playback operation, a menu operation, etc.
Since the camera head unit 3 is also provided with an assignable button 302, even when the camera head unit 3 is used away from the main body unit 2, the user on the camera head unit 3 side can perform the necessary operations using the assignable button 302, making it convenient to use.
As described above, the camera head unit 3 is detachable from the outer casing 5 of the main body unit 2 (see FIGS. 1 and 2).
The main body 2 is formed with a mounting portion 18, and the camera head 3 is attached to the mounting portion 18 by a predetermined mechanism (see FIG. 2).
Here, as a mechanism for attachment, corresponding screw holes may be provided in the housing 140 of the camera head unit 3 and in the mounting portion 18, and the two may be fastened with screws. Alternatively, for example, a locking mechanism and a release mechanism for releasing the lock may be provided to allow easy attachment and detachment without using screws. In particular, assuming the extended use mode described later, it is desirable that the camera head unit 3 and the main body unit 2 can be easily attached and detached without using screws.
FIG. 9A shows a terminal surface 3T on the rear side of the camera head unit 3 opposite the mounting surface 141, and this terminal surface 3T is provided with a connector 3a.
Correspondingly, the mounting portion 18 of the main body 2 is provided with a connector 2a.
When the camera head 3 is attached to the main body 2 , the connectors 2 a and 3 a are joined, thereby transmitting various signals between the main body 2 and the camera head 3 .
<2. Extended use form>
In the camera system 1 having the above structure, the camera head unit 3 is used in a state where it is attached to the main body unit 2 as shown in FIG. 1, but the camera head unit 3 can also be used in a state where it is mechanically separated from the main body unit 2.
The following describes a structure in which the camera head unit 3 is used by extending it from the main body unit 2. The terms "basic state" and "extended state" are used.
The "basic state" refers to a non-separated state in which the camera head unit 3 is attached to the main body unit 2. That is, this is the usage state shown in Figures 1, 4 and 5.
The "extended state" refers to a state in which the camera head unit 3 is mechanically separated from the main body unit 2 and is communicatively connected to the main body unit 2 via the extension cable 20 so as to be capable of transmitting signals.
When the terms "main body 2 side" and "camera head 3 side" are used, the terms also refer to the parts that are mechanically connected to the main body 2 and camera head 3. That is, the "main body 2 side" includes the base plate 50, which will be described later, in the extended state. The "camera head 3 side" includes the first connector 21, which will be described later, in the extended state.
The extension cable 20 described below is a cable used for electrical connection. The extension cable 20 is a cable that connects between the main body 2 and the camera head 3. Any extension cable that can connect the camera head 3 and the main body 2 can be used as the extension cable 20, and it can be a relatively soft cable covered with an insulator such as vinyl, or it can be a hard tube (cylindrical pipe).
First, the configuration of the extended state will be described with reference to FIGS. 6, 7, 8, 9A, 9B, and 10.
FIG. 6 shows a state in which the camera head unit 3 is detached from the main body unit 2 and connected via an extension cable 20.
The extension cable 20 is configured to include a first connector portion 21, a cable 22, and a second connector portion 23 (see FIGS. 6, 7, and 10).
The first connector portion 21 is a connector portion on one end side of a cable 22 that is connected to the camera head portion 3 .
The second connector portion 23 is a connector portion on the other end side of the cable 22 that is connected to the main body portion 2 side.
The cable 22 transmits signals between the first connector portion 21 and the second connector portion 23, and has a required number of transmission paths for electrical signals formed therein.
In the extended state shown in FIG. 6, the camera head unit 3 can be moved, for example, forward from the main body unit 2 by the length of the cable 22 to capture images.
The first connector portion 21 is adapted to be joined to a terminal surface 3T (see FIG. 9A) on the rear side of the camera head portion 3, as shown in FIGS.
The housing 21K of the first connector unit 21 has a terminal surface 21FR (see FIG. 10) that has a contour shape of approximately the same type and size as the terminal surface 3T of the housing 140 of the camera head unit 3. As a result, when the terminal surface 3T and the terminal surface 21FR are joined in a state where they face each other, the housing 140 of the camera head unit 3 and the housing 21K of the first connector unit 21 are integrated to form a single box, as shown in FIGS.
In particular, the housing 140 of the camera head unit 3 is a relatively thin plate, making it difficult for it to stand on its own. Also, the weight of the adapter 500 makes the balance poor. By attaching the first connector unit 21 to this, the camera head unit 3 can be made to stand on its own more easily.
Furthermore, the fact that the housing 140 of the camera head unit 3 is a relatively thin plate means that it is difficult for a user to stably hold the camera head unit 3 alone and point it toward a subject. The increased thickness due to the housing of the first connector unit 21 makes the camera head unit 3 easier to hold and handle when removed from the main body unit 2.
A cable end 22E1 of the cable 22 is fixedly attached to the rear surface 21BK side of the housing 21K of the first connector portion 21 (see FIG. 7).
Cable end 22E1 is attached so as to extend downward from its fixed portion relative to housing 21K along notch 126 provided in housing 21K.
By positioning the cable end 22E1 inside the cutout portion 126, external stress is less likely to be applied to the cable end 22E1, and the cable end 22E1 is protected.
Furthermore, by making the cable 22 extend downward from the cable end 22E1, the camera head unit 3 can be easily maintained in the upright position shown in FIG.
A connector 3a is disposed on a terminal surface 3T of the camera head unit 3 (see FIG. 9A). Also, a connector 21a is disposed on a terminal surface 21FR of the first connector unit 21 (see FIG. 10).
When the camera head 3 and the first connector 21 are joined as shown in FIG. 6, the connectors 3a and 21a are joined, and signals are transmitted between the extension cable 20 and the camera head 3.
The first connector portion 21 is formed with a video output terminal 121 to which, for example, a monitor device can be connected.
The first connector portion 21 is also formed with an external power output terminal 120, which makes it possible to supply a power supply voltage to a device that requires an external power supply.
The second connector portion 23 on the other end side of the cable 22 is formed of a housing 23K having a generally rectangular parallelepiped shape with curved top and bottom.
Recessed portions 127 are formed in the upper and lower surfaces of the housing 23K.
A pair of handles 23H are attached to the rear surface 23BK of the housing 23K. The handles 23H and the recessed portions 127 make the second connector portion 23 easy to handle.
The other end of the cable 22 is fixed in a state in which the cable end 22E2 projects perpendicularly from the rear surface 23BK.
In particular, by fixing the cable end 22E2 substantially at the center of the back surface 23BK and positioning the pair of handles 23H on both the left and right sides of the cable end 22E2, the pair of handles 23H makes it difficult for external stress to be applied to the joint portion of the cable end 22E2. In other words, the cable end 22E2 is protected by the handles 23H.
The second connector portion 23 is detachable from the main body portion 2. In particular, in this embodiment, the second connector portion 23 is detachable with the base plate 50 attached to the main body portion 2. In other words, the base plate 50 is provided as a structure on the main body portion 2 side, and allows the extension cable 20 to be connected to the main body portion 2.
FIG. 8 shows the front side (terminal surface 50Ta) of the base plate 50. FIG. 9B shows the back side (terminal surface 50Tb) of the base plate 50.
The base plate 50 has a recess 54 in the center of the terminal surface 50Ta for mounting the second connector portion 23 therein.
The recess 54 is sized to fit the housing 23K of the second connector portion 23 therein.
FIG. 6 shows a state in which the second connector portion 23 is fitted into the recess 54. As shown in the figure, the housing 23K of the second connector portion 23 fits in almost in close contact with the side surfaces of the recess 54, leaving only a slight protruding portion. In the fitted state, the recessed portions 127 of the housing 23K remain as gaps between the housing 23K and the recess 54. Because the recessed portions 127 are formed on the top and bottom of the housing 23K, attachment and detachment are facilitated without excessive friction between the housing 23K and the recess 54.
In addition, the user can easily attach and detach the second connector portion 23 to and from the base plate 50 by using the handle 23H.
Also, because most of the second connector portion 23 fits into the base plate 50, there is little protrusion when the second connector portion 23 is joined. In other words, although both the base plate 50 and the second connector portion 23 are attached to the main body portion 2, their combined thickness does not simply add to the protrusion from the main body portion 2, so the front-to-rear size on the main body portion 2 side in the extended state can be kept small.
A connector 50a is disposed on the innermost surface of the recess 54 of the base plate 50 (see FIG. 8). A connector 23a is disposed on the terminal surface 23FR of the second connector portion 23 (see FIG. 10).
When the second connector portion 23 is attached to the base plate 50 as shown in FIG. 6, the connectors 50a and 23a are joined, and signals are transmitted between the extension cable 20 and the base plate 50.
The surface of the base plate 50 facing the main body 2 is a terminal surface 50Tb (see FIG. 9B).
The base plate 50 is detachable from the mounting portion 18 of the main body 2. That is, the base plate 50 can be attached to the mounting portion 18 that is exposed when the camera head 3 is removed from the main body 2 (see FIG. 2).
For this reason, as shown in Figures 9A and 9B, the terminal surface 50Tb of the base plate 50 and the terminal surface 3T of the camera head unit 3 do not need to have exactly the same shape, but both are structured so that they can be attached to the mounting portion 18.
A connector 50b is provided on the terminal surface 50Tb of the base plate 50. This connector 50b can be joined to the connector 2a (see FIG. 2) of the mounting portion 18, and is formed at a position where it faces and joins the connector 2a when the base plate 50 is attached to the mounting portion 18. Therefore, when the base plate 50 is attached to the main body portion 2, signals are transmitted between the main body portion 2 and the base plate 50 via the connectors 50b and 2a.
In addition, the connector 50b is attached to one side of a board 55, and the above-mentioned connector 50a is attached to the other side of the board 55. The board 55 has wiring between the corresponding pins of the connectors 50a and 50b, thereby forming a transmission path through to the connector 50a side.
As shown in FIG. 6, an external power supply input terminal 51 is formed on the base plate 50. Correspondingly, an external power supply output terminal 120 is formed on the first connector portion 21 on the camera head portion 3 side.
The connectors 50a, 23a, the cable 22, and the connector 21a form a line for supplying external power.
This allows a power supply device (such as a power adapter) to be connected to the external power input terminal 51 on the main body 2 side to supply power, so that the camera head unit 3 side can obtain power supply voltage from the external power output terminal 120 and drive the necessary devices.
For example, some interchangeable lenses 502 and the like require an external power supply for lens driving and the like; in such cases, the power supply voltage can be secured using the external power supply output terminal 120 provided on the first connector portion 21. In other words, even when separated from the main body portion 2, an external power supply voltage can be used on the camera head portion 3 side without requiring separate power supply wiring.
In addition, a video input terminal 53 is provided on the base plate 50, and a video output terminal 121 is provided on the first connector portion 21 correspondingly.
A transmission line between the video input terminal 53 and the video output terminal 121 is formed as a path formed by the connectors 50a, 23a, the cable 22, and the connector 21a. By connecting a monitor device to the video output terminal 121, an image signal supplied from the main body 2 side can be supplied to the monitor device and displayed. For example, a user on the camera head 3 side can view a monitor image signal (through image) generated by the main body 2.
As described above, the camera head unit 3 is provided with the assignable button 302, and correspondingly, the base plate 50 is provided with the assignable button 52.
The assignable button 52 is an operator with the same function as the assignable button 302, but by being provided on the base plate 50, it becomes possible to perform the same operations on the main body 2 side as on the camera head 3 side.
As shown schematically in FIG. 6, a spirit level 30 is provided inside the main body 2. In addition, a spirit level 40 is provided on the camera head 3 side.
The spirit level 30 is disposed, for example, near the inner bottom surface of the main body 2, and serves as a spirit level that detects the horizontality of the bottom surface of the main body 2. Specifically, as spirit level information, angle information in the pitch direction and angle information in the roll direction based on a horizontal state are detected.
The level 30 need not necessarily be attached near the inner bottom surface of the main body 2, but may be attached at any location where the relative position and attitude with respect to a surface serving as a horizontal reference (for example, the bottom surface 9) does not change.
This spirit level 30 serves as a device for obtaining a horizontal reference for the camera system 1. For example, the horizontality of the main body 2, of the image sensor of the camera head 3 in the normal state, and of accessory parts attached to the main body 2 is detected by the spirit level 30.
Meanwhile, a spirit level 40 is also provided on the camera head unit 3 side. The spirit level 40 may be built in the camera head unit 3 or the first connector unit 21. Alternatively, the spirit level 40 may be attached to the outside of the housing of the camera head unit 3 or the first connector unit 21. The spirit level 40 may be attached to a location where the position and attitude relative to the image sensor 300 in the camera head unit 3 do not change at least when the camera head unit 3 is in a separated state.
That is, the level 40 detects, as level information, angle information in the pitch direction and angle information in the roll direction of the image sensor 300 based on the horizontal state.
Each of these spirit levels 30, 40 may be anything capable of detecting angle information in the pitch and roll directions, and may be configured, for example, by a two-axis angular velocity sensor.
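As a purely illustrative sketch (not part of the patent disclosure), the level information output by the spirit levels 30 and 40 can be represented as a pitch angle and a roll angle relative to the horizontal. The derivation below from a gravity (acceleration) reading is one common way such a sensor could be realized; the class and function names are hypothetical:

```python
import math
from dataclasses import dataclass

@dataclass
class LevelInfo:
    """Level information: pitch and roll angles in degrees, relative to horizontal."""
    pitch_deg: float
    roll_deg: float

    def is_level(self, tolerance_deg: float = 0.5) -> bool:
        # The device is considered level when both angles are within tolerance.
        return abs(self.pitch_deg) <= tolerance_deg and abs(self.roll_deg) <= tolerance_deg

def from_gravity_vector(ax: float, ay: float, az: float) -> LevelInfo:
    # Derive pitch/roll from a gravity reading (accelerometer-style sensor,
    # one possible implementation; the patent mentions a two-axis sensor).
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return LevelInfo(pitch_deg=pitch, roll_deg=roll)
```

For example, a sensor lying flat (gravity purely along its z axis) yields pitch and roll of zero and is reported as level.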
<3. Connector connection>
The connector connection in the above-mentioned basic state and extended state will be described.
The following connectors were mentioned above:
Connector 3a of camera head unit 3
Connector 21a of the first connector portion 21 of the extension cable 20
Connector 23a of the second connector portion 23 of the extension cable 20
Connector 50a on terminal surface 50Ta of base plate 50
Connector 50b on terminal surface 50Tb of base plate 50
Connector 2a of main body 2
These connectors correspond to at least the following:
The connector 3a of the camera head unit 3 can be joined to the connectors 2a and 21a.
A connector 21a of the first connector portion 21 of the extension cable 20 can be joined to the connector 3a.
The connector 23a of the second connector portion 23 of the extension cable 20 can be joined to the connector 50a.
The connector 50a on the terminal surface 50Ta of the base plate 50 can be joined to the connector 23a.
The connector 50b on the terminal surface 50Tb of the base plate 50 can be joined to the connector 2a.
The connector 2a of the main body 2 can be joined to the connectors 3a and 50b.
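The pairings above can be summarized as a small compatibility table. The sketch below (illustrative only; the names follow the reference numerals in the text) checks that a given pair of connectors can be joined, and lists the joints that make up the basic and extended states:

```python
# Connector compatibility as described above: each connector maps to the set
# of connectors it can be joined with.
COMPATIBLE = {
    "3a": {"2a", "21a"},    # camera head unit 3
    "21a": {"3a"},          # extension cable, first connector portion 21
    "23a": {"50a"},         # extension cable, second connector portion 23
    "50a": {"23a"},         # base plate 50, terminal surface 50Ta
    "50b": {"2a"},          # base plate 50, terminal surface 50Tb
    "2a": {"3a", "50b"},    # main body 2
}

def can_join(a: str, b: str) -> bool:
    # A joint is valid only if each side lists the other as compatible.
    return b in COMPATIBLE.get(a, set()) and a in COMPATIBLE.get(b, set())

# The two usage states are then chains of joinable connector pairs.
BASIC_STATE = [("2a", "3a")]
EXTENDED_STATE = [("2a", "50b"), ("50a", "23a"), ("21a", "3a")]
```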
The connector connections in each state are shown in FIG. 11. In the following figures, "CN" stands for "connector".
FIG. 11A shows the basic state.
The camera head 3 is attached to the main body 2, the connectors 2a and 3a are joined, and signal transmission is performed between the main body 2 and the camera head 3.
FIG. 11B shows the extended state.
A base plate 50 is attached to the main body 2. An extension cable 20 is attached between the camera head 3 and the base plate 50.
In this case, the connectors 2a and 50b are joined, the connectors 50a and 23a are joined, and the connectors 21a and 3a are joined. As a result, signals are transmitted between the main body 2 and the camera head 3 via the base plate 50 and the extension cable 20.
All or some of these connectors are attached to boards and arranged as board-to-board (BtoB) connectors.
The number of pins (terminals) of each connector is not particularly specified, but each connector may be provided with, for example, 100 or more pins.
<4. Electrical connection configuration in each state>
The electrical connection configurations in the normal state and the extended state will be explained below, but first the configuration of the internal signal processing systems of each section will be explained.
FIGS. 12 and 13 show an example in which the spirit level 40 is mounted in the first connector portion 21, and FIGS. 14 and 15 show an example in which the spirit level 40 is mounted in the camera head portion 3.
FIG. 12 shows the electrical connection state in the basic state, and the main configuration of the main body unit 2 and the camera head unit 3 will be explained with reference to FIG. 12.
The main body 2 includes a control unit 200, a signal processing unit 202, a recording unit 203, a communication unit 204, a power supply circuit 205, and the spirit level 30. In addition to these, the camera body includes other components such as a display control unit, a display unit, and an operation unit, but in FIGS. 12 to 15 some components are omitted from the illustration in order to clarify the correspondence between the units while avoiding cluttering the figures.
The camera head unit 3 includes an image sensor 300, a lens system drive unit 301, an assignable button 302, and an ID generation unit 303. The camera head unit 3 also has other components, but for the same reason some of them are not shown.
The image sensor 300 has an imaging element formed by arranging photoelectric conversion pixels in a matrix, such as a CCD (Charge Coupled Device) type, a CMOS (Complementary Metal Oxide Semiconductor) type, etc. Light from a subject is collected on the image sensor 300 by an optical system (not shown).
Here, the optical system refers to lenses such as a zoom lens and a focus lens, an aperture mechanism, an optical filter, etc., and these can be provided in the camera head unit 3, the adapter 500, or the interchangeable lenses 502 and 503.
The image sensor 300 performs, for example, CDS (Correlated Double Sampling) processing and AGC (Automatic Gain Control) processing on the electrical signal obtained by photoelectric conversion in the imaging element, and further performs A/D (Analog/Digital) conversion processing. The captured image data is then output as digital data to the main body 2 side. The image sensor 300 outputs an image signal as, for example, so-called RAW image data.
The lens system driving unit 301 drives the focus lens, zoom lens, aperture mechanism, optical filter mechanism, and the like in the above-described optical system based on the control of the control unit 200.
The ID generating unit 303 generates identification information for the camera head unit 3. In a simplified manner, the ID generating unit 303 can be configured as a terminal voltage setting unit corresponding to one or more connector terminals. Alternatively, the ID generating unit 303 may be a memory or a processor that stores the identification information.
The identification information by the ID generating unit 303 is used, for example, by the control unit 200 of the main body unit 2 as information for distinguishing between a non-separated state and a separated state of the camera head unit 3. Alternatively, the ID generating unit 303 may be used as information for identifying the model of the camera head unit 3.
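The patent does not specify how the identification information is encoded. As a hypothetical sketch, the control unit 200 could map a value read from the ID line to the connection state; the concrete codes below are illustrative only:

```python
# Hypothetical ID codes; the patent does not define concrete values.
ID_TABLE = {
    0x01: "non-separated",   # camera head attached directly to the main body
    0x02: "separated",       # camera head connected via the extension cable
}

def detect_state(id_value: int) -> str:
    # The control unit reads the ID line and classifies the connection state;
    # unknown values are reported as such rather than guessed.
    return ID_TABLE.get(id_value, "unknown")
```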
The signal processing unit 202 in the main body 2 is configured as an image processing processor by, for example, a DSP (Digital Signal Processor). This signal processing unit 202 performs various kinds of signal processing on the captured image data from the image sensor 300.
For example, the signal processing unit 202 performs processes on the captured image data, such as clamping processing to clamp the R, G, and B black levels to predetermined levels, correction processing between the R, G, and B color channels, color separation processing so that the image data for each pixel has all R, G, and B color components, and processing to generate (separate) a luminance (Y) signal and a chrominance (C) signal.
Furthermore, the signal processing unit 202 performs necessary resolution conversion processing, such as resolution conversion for recording, communication output, or monitor image, on the captured image data that has been subjected to various signal processes. The signal processing unit 202 also performs compression processing, encoding processing, etc. for recording or communication on the resolution-converted captured image data.
The signal processing unit 202 also performs processing to generate a monitor image signal for the captured-image monitor display (through-image display), and supplies the monitor image signal to the video output terminal 14a. This makes it possible to view the monitor image by connecting an external monitor device to the video output terminal 14a.
The control unit 200 is configured by a microcomputer (arithmetic processing device) equipped with a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a flash memory, and the like.
The CPU executes programs stored in the ROM, flash memory, or the like to centrally control the entire camera system 1 .
The RAM is used as a working area for the CPU to process various data, and is used for temporarily storing data, programs, etc.
ROM and flash memory (non-volatile memory) are used to store the OS (Operating System) that the CPU uses to control each part, content files such as image files, application programs for various operations, firmware, etc.
Such a control unit 200 controls the operation of each necessary unit with respect to parameter control of the various signal processes in the signal processing unit 202, imaging and recording operations in response to user operations, playback of recorded image files, imaging operations of the image sensor 300, camera operations such as zoom, focus, and exposure adjustment, user interface operations, and the like.
For this purpose, the control unit 200 transmits control signals to the image sensor 300 and the lens system driving unit 301. For example, the control unit 200 transmits, to the camera head unit 3 side, control signals for the shutter speed, frame rate, and the like of the image sensor 300, a clock signal, and control signals for the lens system driving unit 301.
The control unit 200 also generates metadata to be associated with the captured image data and performs processing for the association.
In the case of this embodiment, the control unit 200 is supplied with level information from the level 30 and the level 40. The control unit 200 performs processing to associate the level information with the captured image data as one piece of metadata.
Examples of the processing for association with the captured image data include recording or transmitting the metadata in correspondence with frames of the captured image data, and generating, then recording or transmitting, a metadata file associated with the image file of the captured image data.
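As one minimal sketch (not from the patent; the JSON field names are illustrative), per-frame association of the level information from both spirit levels could look like the following, where each frame of captured image data is paired with a metadata record:

```python
import json

def frame_metadata(frame_index: int, body_level, head_level) -> str:
    # body_level / head_level are (pitch_deg, roll_deg) tuples from the
    # spirit level 30 (main body) and spirit level 40 (camera head side).
    meta = {
        "frame": frame_index,
        "body_level": {"pitch_deg": body_level[0], "roll_deg": body_level[1]},
        "head_level": {"pitch_deg": head_level[0], "roll_deg": head_level[1]},
    }
    return json.dumps(meta)
```

A recorder could emit one such record per frame alongside the image file, or collect them into a separate metadata file associated with the image file.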
The recording unit 203 is, for example, a non-volatile memory, and functions as a storage area for storing image files (content files) of captured image data such as still images and moving images, metadata, thumbnail images, and the like.
There are various possible actual forms of the recording unit 203. For example, the recording unit 203 may be a flash memory built into the main body 2, or may be a memory card (e.g., a portable flash memory) that can be attached to and detached from the main body 2 and a card recording and playback unit that performs recording and playback access to the memory card. Also, the recording unit 203 may be realized as a hard disk drive (HDD) or solid state drive (SSD) built into the main body 2.
The communication unit 204 performs data communication and network communication with external devices via wired or wireless means.
For example, captured image data (still image files and video files), metadata, and the like are communicated between external display devices, recording devices, playback devices, and the like.
Furthermore, the communication unit 204 may serve as a network communication unit, performing communication via various networks such as the Internet, a home network, and a LAN (Local Area Network), and may transmit and receive various data between servers, terminals, etc. on the network.
The power supply circuit 205 uses, for example, a battery 501 as a power source, generates the necessary power supply voltage V0, and supplies it to each unit within the main body 2.
The power supply circuit 205 also generates a power supply voltage V1 to be supplied to the camera head unit 3 side, and supplies it to the camera head unit 3.
On/off of the power supply voltage supply by the power supply circuit 205 is controlled by the control unit 200.
Operation information of an assignable button 302 provided on the camera head unit 3 is detected by the control unit 200 via connectors 3a and 2a.
In response to detecting the operation of the assignable button 302, the control unit 200 controls the operation assigned to the assignable button 302. For example, the control unit 200 controls the start of recording.
In the basic state shown in FIG. 12, the above-described units of the main body 2 and the camera head 3 operate in cooperation, and capturing, recording, communication, and the like of moving images and still images are performed.
Next, the configuration of the extension cable 20 will be described with reference to FIG. 13.
FIG. 13 shows the extended state in which the main body 2, the base plate 50, the extension cable 20, and the camera head 3 are connected.
Here, lines LN (LN1 to LN8) are provided as transmission paths for signal transmission between the main body 2 side and the camera head 3 side.
Although not shown in FIG. 12, the following lines LN (excluding the monitor image line LN2) are formed even in the basic state of FIG. 12.
Each of the lines LN1 to LN8 is not necessarily a single transmission path; they are grouped functionally, and only representative transmission paths are shown. In practice, many more lines LN are formed.
The lines LN are as follows:
LN1: Image Data Line The image data line LN1 is a line that transmits captured image data from the image sensor 300 to the main body 2.
LN2: Monitor Image Line The monitor image line LN2 is a line that connects the video input terminal 53 of the base plate 50 and the video output terminal 121 of the first connector portion 21, and is a line that transmits, for example, a monitor image signal.
In the state of FIG. 12, this monitor image line LN2 is not formed. For this reason, wiring is performed using, for example, pins (terminals) of the connectors 50a and 23a that correspond to unused pins of the connectors 2a and 3a, so that the monitor image line LN2 is formed within the extension cable 20. Of course, the number of pins of the connectors 50a and 23a may be made larger than that of the connectors 2a and 3a, in which case the additional pins may be used.
LN3: Control Line The control line LN3 indicates a plurality of signal lines used for transmitting control signals and clock signals from the control unit 200 to the camera head unit 3.
LN4: Control Line The control line LN4 indicates a plurality of signal lines used for transmitting signals from the camera head unit 3 to the control unit 200. For example, the control line LN4 is used for transmitting a state detection signal of the camera head unit 3, a response signal to a control signal, and the like.
Furthermore, level information detected by the level 40 is also transmitted to the control unit 200 via a control line LN4.
LN5: ID Line This is a line for allowing the control unit 200 to detect the identification information generated by the ID generating unit 303.
LN6: Power Supply Line The power supply line LN6 is a line that supplies the power supply voltage V1 from the power supply circuit 205 of the main body 2 to the camera head 3 side.
LN7: Assignable Button Line The assignable button line LN7 is a line that connects the assignable buttons 302 and 52 by wired-OR, enabling the control unit 200 to detect the operations of these buttons.
LN8: External power supply line The external power supply line LN8 is a line that connects the external power supply input terminal 51 of the base plate 50 and the external power supply output terminal 120 of the first connector portion 21.
Although not shown to avoid complicating the diagram, a ground line is also provided. This forms a common ground for the main body 2, base plate 50, extension cable 20, and camera head 3.
In the extension cable 20 in which the above-mentioned line LN is formed, a pre-processing unit 24, a buffer amplifier 26, a DC/DC converter 28, and a level 40 are provided within the first connector unit 21.
In addition, in the second connector section 23, a post-processing section 25, a buffer amplifier 27, a DC/DC converter 29, and an ID generating section 400 are provided.
The pre-processing unit 24 is provided corresponding to the image data line LN1, and performs amplification processing (pre-emphasis processing) on the captured image data output from the image sensor 300. This pre-emphasis processing is a process that takes into account signal attenuation during transmission through the extension cable 20 and boosts the attenuation in advance.
The post/pre-processing unit 25 is also provided corresponding to the image data line LN1. The post/pre-processing unit 25 is capable of signal compensation processing as waveform shaping and amplification processing (pre-emphasis processing) similar to that of the pre-processing unit 24.
The waveform shaping (signal compensation) referred to here is an equalizing process for compensating for the frequency characteristics that have changed (deteriorated) due to transmission through the extension cable 20 .
The buffer amplifier 27 is provided corresponding to the control line LN3, and performs amplification processing on necessary signals among various control signals and clock signals transmitted from the control unit 200.
The buffer amplifier 26 is provided corresponding to the control line LN 4 , and performs amplification processing on necessary signals among various signals transmitted from the camera head unit 3 .
The amplification processing by these buffer amplifiers 26 and 27 also corresponds to the attenuation that occurs during cable transmission through the extension cable 20 .
Level information from the level 40 is also transmitted to the control unit 200 over the control line LN4 via the buffer amplifier 26. This allows the control unit 200 to receive level information from the level 40 on the camera head unit 3 side, at least when in the extended state.
In the normal state, the level 40 is not present, so the control unit 200 may treat the terminal at which the level information would be input as carrying invalid information.
The DC/DC converter 28 receives the DC power supply voltage V1 from the power supply circuit 205 supplied to the power supply line LN6, performs voltage conversion, and generates the required power supply voltage Vc within the first connector portion 21, which is supplied to the pre-processing portion 24 and the buffer amplifier 26.
The DC/DC converter 29 also receives the DC power supply voltage V1 from the power supply circuit 205 supplied to the power supply line LN6, performs voltage conversion, and generates the required power supply voltage Vc within the second connector portion 23, which is supplied to the post/pre-processing portion 25 and the buffer amplifier 27.
The ID generating unit 400 generates identification information for the extension cable 20. In a simpler form, the ID generating unit 400 can be configured as a terminal voltage setting unit corresponding to one or more connector terminals. Alternatively, the ID generating unit 400 may be a memory or a processor that stores the identification information.
The identification information generated by the ID generating unit 400 is recognized by the control unit 200, for example, via the ID line LN5. The identification information generated by the ID generating unit 400 is then used by the control unit 200 of the main body unit 2, for example, as information for distinguishing between an undetached state and a detached state of the camera head unit 3. Alternatively, the identification information may be used for identifying the model of the extension cable 20.
The identification information of this ID generating unit 400 and the ID generating unit 303 of the camera head unit 3 described above may be input to the control unit 200 in a configuration in which only one of them is input, or in which both are input.
In a normal state where the extension cable 20 is not used, only the identification information of the ID generating unit 303 is input to the control unit 200, but in an extended state, the identification information of the ID generating unit 303 is not transmitted through the extension cable 20, and only the ID generating unit 400 is input to the control unit 200. In this manner, the control unit 200 can determine whether the camera head unit 3 is in a non-separated state or a separated state from the identification information.
Alternatively, in the extended state, the identification information of both the ID generating unit 303 and the ID generating unit 400 may be input to the control unit 200, in which case the control unit 200 can determine whether the camera head unit 3 is in a non-separated state or a separated state depending on whether the identification information of the extension cable 20 is input. Furthermore, in this case, the control unit 200 can determine the model of the camera head unit 3 regardless of whether it is in the extended state or the normal state.
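The separation-state determination described above can be sketched as follows, assuming (hypothetically) that the control unit exposes the two ID inputs as optional values, with a missing ID reading as `None`. The function name and this convention are illustrative, not from the patent; the sketch follows the variant in which both IDs reach the control unit in the extended state.

```python
def determine_state(head_id, cable_id):
    """Return 'detached' (extended) or 'undetached' (normal).

    In the variant where both IDs can reach the control unit 200,
    the presence of the extension cable's ID (ID generating unit 400)
    alone decides the state.
    """
    if cable_id is not None:
        return "detached"    # extension cable 20 is inserted
    if head_id is not None:
        return "undetached"  # camera head 3 mounted directly on body 2
    raise ValueError("no camera head detected")
```

In the other variant described, where the camera head's ID is not transmitted through the cable, the same decision could be made from the head ID alone.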
FIGS. 14 and 15 show an example configuration in which the level 40 on the camera head unit 3 side is provided in the camera head unit 3 itself. The configuration other than the level 40 is the same as in FIGS. 12 and 13, so duplicated explanation is omitted.
In this case, a spirit level 40 is provided on the camera head unit 3 as shown in FIG. 14, and spirit level information from the spirit level 40 is input to the control unit 200.
In the extended state as shown in FIG. 15, level information from the level 40 in the camera head unit 3 is transmitted to the control unit 200 via the buffer amplifier 26 and the control line LN4.
Therefore, the control unit 200 can input level information from the level 40 on the camera head unit 3 side whether in the normal state or the extended state.
5. Processing Examples of the First to Fourth Embodiments
The processing of the control unit 200 will be described as processing examples of the embodiments. Described below are processing examples of the control unit 200 relating to the level information obtained from the levels 30 and 40.
FIG. 16 shows an example of processing in the first embodiment.
When the control unit 200 performs power-on processing in step S101 in response to a user operation or the like, it determines in step S102 whether the camera head unit 3 is separated or not. For example, the control unit 200 can make this determination based on the identification information from the ID generating units 303 and 400. Note that the determination of whether the camera head unit 3 is separated can also be made by a mechanical switch or the like.
If the control unit 200 determines that the camera head unit 3 is in a normal non-detached state, the process proceeds to step S103, where the level 30 on the main body side is set to active.
If the control unit 200 determines that the camera head unit 3 is in a separated extended state, the process proceeds to step S104, where the level 40 on the camera head unit 3 side is set to active.
The term "active setting" as used herein means a setting in which the level information input from the level is processed as valid information.
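As a rough sketch of the FIG. 16 power-on flow (steps S101 to S104), assuming the levels are modeled as simple objects with an `active` flag — the class, field, and function names are illustrative, not from the patent:

```python
class Level:
    """Minimal stand-in for a spirit level (level 30 or level 40)."""
    def __init__(self, name):
        self.name = name
        self.active = False  # whether its readings are treated as valid


def power_on_select(detached, body_level, head_level):
    """S102: branch on the separation state; S103/S104: activate one level."""
    body_level.active = not detached  # S103: normal state -> level 30
    head_level.active = detached      # S104: extended state -> level 40
    return head_level if detached else body_level
```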
For example, after making the above settings at power-on, the control unit 200 performs the process in FIG. 17 during the imaging period. Note that imaging hereinafter refers to the processing operation of recording or transmitting the captured video signal obtained by the image sensor 300 as frames constituting a still image or a moving image. For example, during recording standby for still images or moving images, imaging by the image sensor 300 is still performed for through-image display, but such cases are excluded.
When image capture is started by a user operation, for example an operation to start video recording, the control unit 200 proceeds from step S201 to S202, and thereafter repeats the processing of steps S202 and S203 until it is determined in step S204 that image capture has ended due to a user operation or some other trigger.
In step S202, the control unit 200 acquires level information of the level that is set as active.
In step S203, the control unit 200 performs processing to associate the acquired spirit level information as metadata corresponding to the current frame of the captured image data. For example, the control unit 200 causes the recording unit 203 to record the metadata corresponding to the frame on a recording medium. Alternatively, the control unit 200 causes the communication unit 204 to transmit the metadata corresponding to the frame to an external device.
The control unit 200 performs the processes of steps S202 and S203 for each frame timing.
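The per-frame loop of FIG. 17 (steps S201 to S204) might be sketched as below; the level reader and the recording side are reduced to plain callables, and the roll/pitch tuple is an assumed representation of the level information:

```python
def capture_loop(frames, read_active_level, record):
    """Sketch of FIG. 17: per frame, read the active level (S202)
    and attach the reading as frame metadata (S203)."""
    for frame in frames:  # loop runs until capture ends (S204)
        roll, pitch = read_active_level()               # S202
        record(frame, {"roll": roll, "pitch": pitch})   # S203
```

In practice `record` would correspond to the recording unit 203 writing metadata to the recording medium, or the communication unit 204 sending it to an external device.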
Therefore, if the level 30 is set to active in step S103 in Fig. 16, during image capture, the level information from the level 30 is made into metadata corresponding to each frame of the moving image to be recorded in steps S202 and S203 in Fig. 17. This is the case when image capture is performed in a normal state.
Also, if the level 40 was set to active in step S104 in Fig. 16, during image capture, level information from the level 40 is made into metadata corresponding to each frame of the moving image to be recorded in steps S202 and S203 in Fig. 17. This is the case when image capture is performed in the extended state.
16 and 17, when an image is captured in the normal state, the level information of the level 30 is associated with the captured image data. When an image is captured in the extended state, the level information of the level 40 is associated with the captured image data.
Therefore, the level information associated with the captured image data is information on the angles in the roll and pitch directions based on the horizontal state of the image sensor 300 .
A processing example of the second embodiment will be described using FIG. 18 and the above-mentioned FIG. 17. Note that from here on, the same processing as in the previously described examples will be assigned the same step numbers.
FIG. 18 shows the processing of the control unit 200 when the power is turned on, and in step S110, the control unit 200 monitors user operations related to the spirit levels 30, 40. In the camera system 1, the user can selectively set the spirit levels 30, 40 to active by, for example, operating a menu. If no such operation is detected, the control unit 200 exits from the processing of FIG. 18.
If such an operation is detected, the control unit 200 proceeds from step S110 to step S111, and checks the selection state by a menu operation or the like.
If the control unit 200 confirms that the level 30 on the main body 2 side has been selected by a user operation, the control unit 200 proceeds from step S112 to step S103, and sets the level 30 to active.
Furthermore, when the control unit 200 confirms that the level 40 on the camera head unit 3 side has been selected by a user operation, the control unit 200 proceeds from step S112 to step S104, and sets the level 40 to active.
As described above, the active setting can be switched by the user's operation.
The control unit 200 then performs the above-described processing in Fig. 17 during the image capture period. Therefore, if the user has selected the level 30, during image capture, in steps S202 and S203 in Fig. 17, the level information from the level 30 is made into metadata corresponding to each frame of the moving image to be recorded.
Furthermore, if the user has selected the level 40, during image capture, in steps S202 and S203 of FIG. 17, level information from the level 40 is made into metadata corresponding to each frame of the moving image to be recorded.
In this way, the level information of the level 30 or level 40 is associated with the captured image data in accordance with the user operation performed prior to capturing an image.
The user can select level 30 or level 40 depending on whether it is in the normal state or the extended state, for example, and associate the level information with the captured image data as metadata.
In addition, there are cases where the camera head unit 3 and the main body unit 2 are fixed in the extended state and imaging is performed. For example, there are also cases where the main body unit 2 and the camera head unit 3 are mounted on a common base after being temporarily extended and imaging is performed. In such cases, it is troublesome to go to the trouble of returning to the normal state.
In such a case, it would be convenient if the user could select which level information is to be valid by operation.
The process in FIG. 16 and the process in FIG. 18 can be combined.
For example, when the power is turned on, the control unit 200 automatically sets one of the spirit levels to active depending on whether the camera head unit 3 is separated or not at that time. After that, when a user operation is detected, the control unit 200 switches the spirit level to be set to active depending on the operation.
By doing this, the processing in FIG. 17 basically associates one of the level information automatically depending on whether the camera head unit 3 is separated or not, and the user can also associate either level information at will depending on the usage situation.
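The combined behavior described above — an automatic default at power-on (FIG. 16) that a later user operation can override (FIG. 18) — can be sketched as a small selector. The class and method names are assumptions, not from the patent:

```python
class LevelSelector:
    """Tracks which level's information is treated as valid."""

    def __init__(self, detached):
        # Power-on default (FIG. 16): head-side level 40 when detached,
        # body-side level 30 when not.
        self.selected = "head" if detached else "body"

    def on_user_operation(self, choice):
        # FIG. 18 (S110-S112): a menu operation overrides the default.
        if choice in ("body", "head"):
            self.selected = choice
```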
An example of processing in the third embodiment will be explained using FIGS. 19 and 20, and the above-mentioned FIG. 17.
FIG. 19 shows the processing of the control unit 200 when the power is turned on. When the control unit 200 performs power-on processing in step S101 in response to a user operation or the like, it sets all of the spirit levels (spirit level 30 and spirit level 40) to active in step S120.
Thereafter, the control unit 200 performs the process of FIG. 17 during the imaging period.
In this case, since the spirit levels 30, 40 are set to active, the spirit level information obtained by each of the spirit levels 30, 40 during image capture in steps S202, S203 of FIG. 17 is used as metadata corresponding to each frame of the video to be recorded.
For example, in either the normal state of FIG. 14 or the extended state of FIG. 15, both of the level information obtained by the levels 30 and 40 are used as metadata corresponding to each frame of the captured image data.
By doing so, for example, during later playback or editing, the user can select any level information and perform processing.
For example, in an apparatus for reproducing and editing an image file based on captured image data, the process shown in FIG. 20 is carried out.
In step S301, the device selects and sets the level information. For example, the user determines by an operation whether to use the level information from the level 30 or the level 40.
Furthermore, in step S302, the device executes a process using the selected level information, such as a playback process or an image editing process.
For example, when performing a playback process, the device plays back images while performing horizontal correction using the selected and set level information.
Also, for example, the device can use the selected and set spirit level information to perform editing processing to add horizontal correction to the image, or to perform processing to cut out horizontally corrected images from each frame.
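A hedged sketch of the FIG. 20 flow on the playback/editing side: the user chooses which recorded level stream to use (S301), and a per-frame roll correction is derived from it (S302). Rotating the image pixels themselves is outside this sketch, which only computes the correction angle for each frame; the metadata keys are illustrative:

```python
def select_and_correct(frames_meta, source):
    """frames_meta: per-frame metadata dicts, e.g.
    {'level30': roll_from_level30, 'level40': roll_from_level40}.
    source: 'level30' or 'level40', chosen by the user (S301).
    Returns the rotation angle that would level each frame (S302).
    """
    corrections = []
    for meta in frames_meta:
        roll = meta[source]
        corrections.append(-roll)  # rotate by -roll to make the image level
    return corrections
```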
An example of processing in the fourth embodiment will be described with reference to FIG. 21 and the above-mentioned FIG. 19.
As shown in FIG. 19, the control unit 200 performs power-on processing in step S101, and then sets all of the spirit levels (the spirit level 30 and the spirit level 40) to active in step S120.
Thereafter, the control unit 200 performs the process of FIG. 21 during the imaging period.
In response to the start of imaging, the control unit 200 proceeds from step S201 to step S210 to determine whether the camera head unit 3 is separated. That is, it is confirmed whether the camera head unit 3 is in a separated state or a non-separated state.
Then, the control unit 200 repeats the processes of steps S202, S203, and S211 until it is determined in step S204 that imaging has ended due to a user operation or some other trigger.
In step S202, the control unit 200 acquires level information for the level that is set as active, that is, for both of the level gauges 30 and 40 in this case.
In step S203, the control unit 200 performs processing to associate the acquired level information of each of the level gauges 30 and 40 as metadata corresponding to the current frame of the captured image data. For example, the recording unit 203 records the metadata corresponding to the frame on a recording medium. Alternatively, the communication unit 204 transmits the metadata corresponding to the frame to an external device.
Furthermore, in step S211, the control unit 200 associates information on the result of the separation determination executed in step S210 with the captured image data.
The information on the separation determination result may be metadata corresponding to each frame together with the spirit level information, or may be metadata corresponding to the entire image file.
As shown in FIG. 21 above, the level information obtained by each of the levels 30 and 40 during image capture is treated as metadata corresponding to each frame of the video being recorded, and is also associated with information on the separation determination results.
By doing this, for example, during later playback or editing, the user can select the level information according to the information on the separation judgment result and perform processing. Alternatively, the editing device or playback device can automatically select one of the level information using the information on the separation judgment result and use it to perform playback processing or editing processing. For example, the playback device or editing device can play back a horizontally corrected image or perform horizontal correction editing of an image.
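The automatic choice by a playback or editing device can be sketched as a one-line lookup, assuming per-frame metadata records of the illustrative form shown in the docstring (the keys are assumptions, not from the patent):

```python
def pick_level_info(meta):
    """meta: e.g. {'detached': True, 'level30': r30, 'level40': r40},
    where 'detached' is the recorded separation-determination result.
    Returns the level reading appropriate to the state at capture time."""
    return meta["level40"] if meta["detached"] else meta["level30"]
```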
17 and 21, the level information is treated as metadata corresponding to each frame of the video, but the timing of acquiring the level information does not necessarily have to be synchronized with the frames. In other words, it does not have to be aligned with the frame rate or frame phase. Therefore, acquiring level information at the timing of each frame as described above and using it as metadata for that frame is just one example.
The level information may be associated with the captured image data asynchronously with the frames, or may be associated with, for example, intermittent frames, because during playback or image editing, such metadata can be used to perform interpolation along the time axis to obtain level information synchronized with the frame timing.
Furthermore, if the spirit level information cannot be obtained in synchronization with the frame timing, in step S203, spirit level information at a synchronized timing may be obtained by an interpolation process, and this may be used as metadata associated with the frame.
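The time-axis interpolation mentioned above could, for example, be simple linear interpolation of asynchronously sampled readings onto a frame timestamp. The patent does not prescribe a particular method, so this is only an illustrative sketch, shown for the roll angle alone:

```python
def interpolate_level(samples, t):
    """samples: list of (timestamp, roll) pairs sorted by timestamp.
    Returns the roll angle linearly interpolated at frame time t,
    clamping to the nearest sample outside the sampled range."""
    if t <= samples[0][0]:
        return samples[0][1]
    for (t0, r0), (t1, r1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            return r0 + (r1 - r0) * (t - t0) / (t1 - t0)
    return samples[-1][1]
```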
6. Summary and Modifications
According to the above embodiment, the following effects can be obtained.
The camera system 1 of the embodiment includes a main body section 2 that processes captured image data, and a camera head section 3 that is detachable from the main body section 2 and can output captured image data generated by an image sensor 300 to the main body section 2, whether the camera head section 3 is in an undetached state attached to the main body section 2 or in a detached state separated from the main body section 2. The camera system 1 also includes a first spirit level 30 provided on the main body section 2 side, a second spirit level 40 provided on the camera head section 3 side, and a control section 200 that performs processing to associate spirit level information from the spirit level 30 and the spirit level information from the spirit level 40 with the captured image data.
This allows appropriate level information to be associated with the captured image data depending on the actual situation during image capture. For example, in the basic state, at least the level information (roll angle, pitch angle) from the level 30 on the main body 2 side can be associated with the captured image data, and in the extended state, at least the level information of the camera head 3 from the level 40 on the camera head 3 side can be associated with the captured image data. In other words, appropriate level information depending on the posture at the time of image capture that affects the image can be associated with the captured image data.
As the first and second embodiments, examples have been given in which the control unit 200 performs an association process in which it automatically selects one of the level information of the level 30 and the level information of the level 40 and associates it with the captured image data (see Figures 16 and 17).
For example, by selecting the appropriate level information for a captured image and associating it with the captured image data, it is possible to save appropriate level information according to the situation at the time of capturing the image.
In the first embodiment, an example was given of the control unit 200 performing the association process by selecting the spirit level information of the spirit level 30 when the camera head unit 3 is in a non-detached state, and selecting the spirit level information of the spirit level 40 when the camera head unit 3 is in a detached state, and performing the process of associating the information with the captured image data (see Figures 16 and 17).
When the camera head unit 3 is operated in a non-detached state, i.e., in a normal state, the level information provided by the level 30 on the main body side can be associated with the captured image data, thereby saving the level information that serves as the reference in the camera system 1.
When the camera head unit 3 is operated in a detached state, i.e., an extended state, the level information from the level 40 on the camera head unit 3 side can be associated with the captured image data, so that the level information according to the attitude of the camera head unit 3, i.e., the level information that affects the captured image data, can be saved.
In the second embodiment, an example of the association process is shown in which the control unit 200 selects level information from either the level 30 or 40 in response to user operation input and associates it with the captured image data (see Figures 18 and 17).
Depending on the user's operation, the level 30 on the main body side and the level 40 on the camera head unit 3 side are selected and the level information is associated with the captured image data. This allows either level information to be saved depending on the circumstances and situation at the time of shooting.
In the third and fourth embodiments, the control unit 200 performs the association process of associating the level information of both the levels 30, 40 with the captured image data (see FIGS. 19, 17, 20, and 21).
By associating both sets of spirit level information with the captured image data, it becomes possible to carry out processing using either of the spirit level information at a later time, for example, during image playback or image editing.
In the fourth embodiment, an example of the association process is given in which the control unit 200 performs a process of associating the information on the determination result of whether the state is non-separated or separated, together with the level information of both the levels 30, 40, with the captured image data (see Figures 19 and 21).
As described above, by associating both sets of spirit level information with the captured image data, processing using either set of spirit level information can be carried out at a later point in time. In this case, by also associating information that indicates whether the camera head unit 3 is in the non-separated state (normal state) or the separated state (extended state), it becomes possible to determine which spirit level information is appropriate to use.
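As a hypothetical sketch of the fourth embodiment (the record layout and function names are assumptions for illustration), both readings can be stored together with the attachment-state flag, and the flag can then be consulted later to pick the appropriate reading:

```python
def build_level_metadata(body_level, head_level, is_detached):
    # Fourth embodiment: keep both readings plus the attachment state,
    # so the appropriate one can be chosen at playback or edit time.
    return {
        "body_level": body_level,
        "head_level": head_level,
        "detached": is_detached,
    }


def effective_level(metadata):
    # At a later point, the stored state flag indicates which reading
    # reflects the image sensor's actual attitude.
    return metadata["head_level"] if metadata["detached"] else metadata["body_level"]
```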
In the embodiment, in the separated state, the camera head unit 3 and the main body unit 2 are connected by the extension cable 20, and captured image data is transmitted from the camera head unit 3 to the main body unit 2. By using the extension cable 20, flexible shooting using the camera head unit 3 becomes possible.
Note that signal transmission between the camera head unit 3 and the main body unit 2 may be performed by wireless communication, rather than a wired connection using the extension cable 20 or the like.
In addition, when signals are transmitted from the camera head unit 3 to the main body unit 2 via wireless communication, the control unit 200 of the main body unit 2 can set the spirit level 40 on the camera head unit 3 side to active and perform a process of associating its spirit level information with the captured image data.
In the embodiment, an example was given in which the level 40 is provided on a mounting body that is attached to the camera head unit 3 in the separated state.
As in the example of FIG. 13, the spirit level 40 on the camera head unit 3 side is provided on the first connector unit 21. Since the first connector unit 21 is attached to the camera head unit 3 in the extended state, the spirit level 40 can output spirit level information corresponding to the attitude of the camera head unit 3. In other words, this information correctly indicates whether the captured image is horizontal or non-horizontal.
Furthermore, by equipping the first connector unit 21 with the spirit level 40, it becomes possible to output spirit level information according to the attitude of the camera head unit 3, regardless of the presence or absence of a spirit level in the camera head unit 3. In other words, even when using a camera head unit 3 that does not have a spirit level, it becomes possible to provide a spirit level on the camera head unit 3 side in the extended state.
In the embodiment, the mounting body that is attached to the camera head unit 3 in the separated state is the first connector unit 21 of the extension cable 20.
The first connector unit 21 is the connector of the extension cable 20. Because the first connector unit 21 is always attached to the camera head unit 3 in the extended state, it is a suitable location for the spirit level 40 that detects the attitude of the camera head unit 3.
In the embodiment, an example in which the spirit level 40 is provided in the camera head unit 3 has been given.
As in the example of FIG. 15, the level 40 on the camera head unit 3 side is provided in the camera head unit 3 itself. This allows the level 40 to output level information according to the attitude of the camera head unit 3.
In the embodiment, the control unit 200 performs the association process by associating the level information of the level 30 or the level 40 with each frame of the captured image data.
For example, by associating level information with each frame of captured image data, when the images are later played back or edited, processing can be performed using information on the roll angle and pitch angle with respect to the horizontal for each frame.
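The per-frame association described above can be sketched as follows; this is an illustrative example only, and the pairing of frames with (roll, pitch) readings sampled at the frame rate is an assumption made for the sketch:

```python
def tag_frames(frames, level_readings):
    """Attach a roll/pitch reading to each frame of captured image data.

    `frames` and `level_readings` are assumed to be sampled at the same
    rate; each reading is a (roll_deg, pitch_deg) pair, so that playback
    or editing can later use per-frame attitude information.
    """
    tagged = []
    for frame, (roll, pitch) in zip(frames, level_readings):
        tagged.append({"frame": frame, "roll_deg": roll, "pitch_deg": pitch})
    return tagged
```

With such per-frame tags, an editor could, for example, apply a per-frame horizon correction by rotating each frame by the negative of its recorded roll angle.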
In the embodiment, the control unit 200 performs, as the association process, a process of recording the level information of the level 30 or the level 40 on the recording medium as metadata associated with the captured image data.
This makes it possible to record metadata including level information in association with captured image data.
The spirit level information may be included as metadata within the image file of the captured image data, or it may be recorded in a separate file that is linked to the image file.
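The separate-file variant can be sketched as follows; the sidecar naming scheme (sharing the image file's base name) and the JSON layout are assumptions for illustration, not part of the disclosure:

```python
import json


def write_sidecar(image_path, level_records):
    # Record level information in a file separate from the image file,
    # linked to it by a shared base name
    # (e.g. clip0001.mp4 -> clip0001.level.json).
    sidecar_path = image_path.rsplit(".", 1)[0] + ".level.json"
    with open(sidecar_path, "w") as f:
        json.dump({"image_file": image_path, "levels": level_records}, f)
    return sidecar_path
```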
In the embodiment, the control unit 200 performs the association process by transmitting the level information of the level 30 or the level 40 to an external device as information associated with the captured image data.
This makes it possible to provide the level information corresponding to the captured image data to an external device, and to use the level information for recording, playback, and editing in the external device.
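One possible wire format for transmitting per-frame level information to an external device is sketched below; the length-prefixed JSON framing is purely an assumption for the example, as the disclosure does not specify a transmission format:

```python
import json
import struct


def pack_level_packet(frame_index, roll_deg, pitch_deg):
    # Serialize one frame's level information as a small JSON payload
    # preceded by a 4-byte big-endian length, so the receiver can frame
    # the stream.
    payload = json.dumps(
        {"frame": frame_index, "roll_deg": roll_deg, "pitch_deg": pitch_deg}
    ).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload


def unpack_level_packet(packet):
    # Reverse of pack_level_packet: read the length prefix, then decode.
    (length,) = struct.unpack(">I", packet[:4])
    return json.loads(packet[4 : 4 + length].decode("utf-8"))
```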
Note that the effects described in this specification are merely examples and are not limiting, and other effects may also be present.
The present technology can also be configured as follows.
(1)
a main body section for performing processing on captured image data;
a camera head unit that is detachable from the main body unit and configured to output the captured image data generated by an image sensor to the main body unit in either a non-detached state in which the camera head is attached to the main body unit or a detached state in which the camera head is separated from the main body unit;
A first level provided on the main body portion side;
A second level provided on the camera head unit side;
a control unit that performs processing to associate first level information from the first level and second level information from the second level with the captured image data.
A camera system.
(2)
The camera system according to (1) above, wherein the control unit performs the association process by automatically selecting one of the first level information and the second level information and associating it with the captured image data.
(3)
The control unit performs the association process by
The camera system described in (2) above, wherein when the camera head unit is in the non-separated state, the first level information is selected, and when the camera head unit is in the separated state, the second level information is selected, and processing is performed to associate the information with the captured image data.
(4)
The control unit performs the association process by
The camera system according to (1) above, further comprising: a process of selecting one of the first level information and the second level information in response to a user operation input, and associating the selected one with the captured image data.
(5)
The camera system according to (1) above, wherein the control unit performs a process of associating both the first spirit level information and the second spirit level information with the captured image data as the association process.
(6)
The control unit performs, as the association process, a process of associating information indicating whether the camera head unit is in the non-separated state or the separated state, together with both the first level information and the second level information, with the captured image data. The camera system described in (1) above.
(7)
In the separated state, the camera head and the main body are connected by a cable, and the captured image data is transmitted from the camera head to the main body via the cable. The camera system described in any one of (1) to (6) above.
(8)
The camera system according to any one of (1) to (7), wherein the second level is provided on a mounting body that is mounted on the camera head in the separated state.
(9)
The camera system described in (8) above, wherein the attachment body is a connector at the end of a cable that transmits the captured image data to the main body when the camera head is in the separated state.
(10)
The camera system according to any one of (1) to (7), wherein the second spirit level is provided in the camera head portion.
(11)
The control unit performs the association process by
The camera system according to any one of (1) to (10) above, further comprising a process for associating the first spirit level information or the second spirit level information with each frame of the captured image data.
(12)
The control unit performs the association process by
The camera system according to any one of (1) to (11) above, further comprising a process for recording the first level information or the second level information on a recording medium as information associated with the captured image data.
(13)
The control unit performs the association process by
The camera system according to any one of (1) to (12) above, further comprising a process for transmitting the first level information or the second level information to an external device as information associated with the captured image data.
(14)
a main body section for performing processing on captured image data;
a camera head unit that is detachable from the main body unit and capable of outputting the captured image data generated by an image sensor to the main body unit in either a non-detached state in which the camera head is attached to the main body unit or a detached state in which the camera head is separated from the main body unit;
A first level provided on the main body portion;
a second level provided integrally with a housing of the camera head;
As a processing method for a camera system having
a processing method for performing processing for associating first level information obtained by the first level and second level information obtained by the second level with the captured image data.
Reference Signs List
1 Camera system
2 Main body
3 Camera head
9 Bottom
20 Extension cable
21 First connector
30 Main body level
40 Head level
200 Control unit
202 Signal processing unit
203 Recording unit
204 Communication unit
300 Image sensor

Claims (14)

  1.  A camera system comprising:
     a main body unit that performs processing on captured image data;
     a camera head unit that is detachable from the main body unit and configured to output the captured image data generated by an image sensor to the main body unit in either a non-separated state in which the camera head unit is attached to the main body unit or a separated state in which the camera head unit is separated from the main body unit;
     a first level provided on a main body unit side;
     a second level provided on a camera head unit side; and
     a control unit that performs processing to associate first level information from the first level and second level information from the second level with the captured image data.
  2.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of automatically selecting one of the first level information and the second level information and associating the selected information with the captured image data.
  3.  The camera system according to claim 2, wherein, as the association process, the control unit selects the first level information when the camera head unit is in the non-separated state, selects the second level information when the camera head unit is in the separated state, and associates the selected information with the captured image data.
  4.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of selecting one of the first level information and the second level information in response to a user operation input and associating the selected information with the captured image data.
  5.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of associating both the first level information and the second level information with the captured image data.
  6.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of associating information indicating whether the camera head unit is in the non-separated state or the separated state, together with both the first level information and the second level information, with the captured image data.
  7.  The camera system according to claim 1, wherein, in the separated state, the camera head unit and the main body unit are connected by a cable, and the captured image data is transmitted from the camera head unit to the main body unit via the cable.
  8.  The camera system according to claim 1, wherein the second level is provided on a mounting body that is attached to the camera head unit in the separated state.
  9.  The camera system according to claim 8, wherein the mounting body is a connector at an end, on a camera head unit side, of a cable that transmits the captured image data to the main body unit when the camera head unit is in the separated state.
  10.  The camera system according to claim 1, wherein the second level is provided in the camera head unit.
  11.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of associating the first level information or the second level information with each frame of the captured image data.
  12.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of recording the first level information or the second level information on a recording medium as information associated with the captured image data.
  13.  The camera system according to claim 1, wherein the control unit performs, as the association process, a process of transmitting the first level information or the second level information to an external device as information associated with the captured image data.
  14.  A processing method for a camera system that includes: a main body unit that performs processing on captured image data; a camera head unit that is detachable from the main body unit and capable of outputting the captured image data generated by an image sensor to the main body unit in either a non-separated state in which the camera head unit is attached to the main body unit or a separated state in which the camera head unit is separated from the main body unit; a first level provided on the main body unit; and a second level provided integrally with a housing of the camera head unit,
     the processing method comprising performing processing for associating first level information obtained by the first level and second level information obtained by the second level with the captured image data.
PCT/JP2023/038470 2022-11-11 2023-10-25 Camera system and processing method WO2024101153A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-181358 2022-11-11
JP2022181358 2022-11-11

Publications (1)

Publication Number Publication Date
WO2024101153A1 true WO2024101153A1 (en) 2024-05-16

Family

ID=91032656

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/038470 WO2024101153A1 (en) 2022-11-11 2023-10-25 Camera system and processing method

Country Status (1)

Country Link
WO (1) WO2024101153A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010109477A (en) * 2008-10-28 2010-05-13 Canon Inc Imaging apparatus, control method thereof and program
JP2011120205A (en) * 2009-11-09 2011-06-16 Ricoh Co Ltd Camera system
JP2015162897A (en) * 2014-02-28 2015-09-07 オリンパス株式会社 Imaging system, imaging apparatus, portable apparatus, communication method, and program
JP2017085497A (en) * 2015-10-30 2017-05-18 キヤノン株式会社 Communication device and control method therefor, program and storage medium
WO2020054266A1 (en) * 2018-09-13 2020-03-19 ソニー株式会社 Camera system and cables


Similar Documents

Publication Publication Date Title
US7042499B1 (en) Digital camera including power supply controller responsive to connection detection
JP2008244801A (en) Image-taking device
JP2012239135A (en) Electronic apparatus
JP2009077090A (en) Imaging apparatus, imaging system, and image reading system
WO2024101153A1 (en) Camera system and processing method
US20120268569A1 (en) Composite camera system
JP2003319232A (en) Electronic image pickup device and lens system
JP7306403B2 (en) camera system, cable
JP7287400B2 (en) camera system, cable
JP2006254088A (en) Imaging apparatus
JPH1132240A (en) Electronic image-pickup device
JP3372534B2 (en) Recording / playback display device
JP2005354419A (en) Video camera
JPS59178084A (en) Video recorder incorporated with camera
JP2012237937A (en) Electronic apparatus
JP2005229538A (en) Digital camera system
JP4264559B2 (en) Shooting system and interface box
JP3372535B2 (en) Video recording and playback device
JP3344719B2 (en) Video recording and playback device
JP2010198195A (en) Communication apparatus, method of controlling the same, and program
JP2017037209A (en) Imaging device
JPH01204575A (en) Equipment for still video
JP2006157398A (en) Camera and electronic equipment
JP2000236469A (en) Electronic camera
JP3601563B2 (en) Optical axis alignment method for infrared AV signal transmission and infrared AV signal transmission system