WO2014115414A1 - Imaging Apparatus - Google Patents
- Publication number: WO2014115414A1 (application PCT/JP2013/081433)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- image processing
- image
- communication
- state
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/51—Housings
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
- H04N7/185—Closed-circuit television [CCTV] systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
- G03B17/55—Details of cameras or camera bodies with provision for heating or cooling, e.g. in aircraft
Definitions
- the present invention relates to an imaging apparatus.
- the imaging apparatus described in Patent Document 1 does not take into account factors such as heat and magnetism that may affect the operation of the apparatus.
- the present invention has been made in view of the above-described problems, and an object thereof is to provide an imaging apparatus that takes into consideration factors that may affect the operation and the like.
- an imaging apparatus of the present invention includes: a first unit including a first communication unit that transmits image data captured by an image sensor, and a first image processing unit that performs image processing on the image data; a second unit including a second communication unit that receives the image data transmitted from the first communication unit, and a second image processing unit that performs image processing on the image data, the second unit being capable of transitioning between a state integrated with the first unit and a state separated from the first unit; and a selection unit that selects one of the first image processing unit and the second image processing unit in a state in which the first unit and the second unit are separated from each other.
- the selection unit may select one of the first image processing unit and the second image processing unit in accordance with the influence of heat generated by the image processing.
- the first unit includes a first housing having an attachment portion for attaching the second unit.
- the attachment portion may be provided with a first heat radiating section that radiates at least the heat generated in the first image processing section in a state where the first unit and the second unit are separated.
- the second unit may include a second housing having a portion attached to the first unit, and at least a part of the second housing other than the attached portion may be provided with a second heat radiating section that radiates heat generated in the second image processing section in both the integrated state and the separated state.
- the selection unit may hold information indicating which of the first and second units is less affected by the heat generated by performing image processing.
- the selection unit may cause the second image processing unit to execute the image processing in a state where the first unit and the second unit are integrated, and may cause the first image processing unit to execute the image processing in a state where the first unit and the second unit are separated.
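As an illustrative sketch only (the function and return values below are hypothetical, not part of the disclosure), the selection rule described above can be expressed as:

```python
def select_image_processor(integrated: bool) -> str:
    """Return which image processing unit should execute the image
    processing, following the claimed rule: while the first and second
    units are integrated, delegate to the second image processing unit;
    once separated, the first unit processes its own image data."""
    return "second" if integrated else "first"
```

In such a sketch the state flag would come from the connector-based integrated/separated detection described in the embodiments.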
- the imaging apparatus of the present invention may include a first casing for the first unit and a second casing for the second unit, and at least one of the first casing and the second casing may be provided with an opening for heat dissipation. The heat dissipation opening may be provided in a portion of the second casing that is exposed to the outside both when the first unit and the second unit are integrated and when they are separated.
- an imaging device of the present invention includes: a first unit having a first housing that stores an image sensor therein and has a metal part; and a second unit having a second housing that stores therein an electronic component that receives at least one of magnetism and radio waves, the second housing having, at least in part, a non-metallic portion that allows at least one of the magnetism and radio waves to pass through, the second unit functioning integrally with the first unit and also functioning independently in a state separated from the first unit.
- the metal part may dissipate at least the heat generated by the image sensor.
- the second unit may include a display unit.
- a conductive member that conducts at least the heat generated in the image sensor to the metal part may be provided.
- the first housing may have an attachment portion for attaching the second housing, and the conductive member may contact a location of the first housing different from the attachment portion and conduct at least the heat generated in the image sensor thereto.
- the electronic component may include an orientation sensor that detects geomagnetism and measures the orientation.
- the electronic component may include a GPS sensor that receives radio waves from the outside of the second unit and measures the position of the second unit.
- the electronic component may include a wireless power feeding mechanism.
- an image pickup apparatus of the present invention includes: a first unit having a first casing that stores an image pickup element therein and has a metal part that dissipates at least the heat generated by the image pickup element; and a second unit having a second housing that stores an electronic component therein, has a non-metal portion at least in part, functions integrally with the first unit, and functions independently even when separated from the first unit.
- the imaging apparatus of the present invention includes a first unit having a first communication unit that transmits image data generated by an imaging element, and a second unit having a second communication unit that receives the image data transmitted from the first communication unit. The imaging apparatus further includes a heat generation information acquisition unit that acquires information about heat generation in at least one of the first unit and the second unit, and a first control unit that controls transmission of the image data from the first communication unit to the second communication unit based on the information acquired by the heat generation information acquisition unit.
- the heat generation information acquisition unit may acquire, as the information about heat generation, the temperature of at least one of the first image processing unit, the first communication unit, and the second communication unit, and the first control unit may control transmission of the image data from the first communication unit to the second communication unit based on the temperature.
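A minimal sketch of the first control unit's temperature gate, assuming a single threshold (the limit value and the names are illustrative; the patent specifies no concrete numbers):

```python
TEMP_LIMIT_C = 70.0  # assumed upper limit; not given in the disclosure

def may_transmit(temperatures_c, limit_c=TEMP_LIMIT_C):
    """Permit image-data transmission from the first communication unit
    to the second only while every monitored temperature (e.g. the first
    image processing unit and both communication units) is below the
    limit."""
    return all(t < limit_c for t in temperatures_c)
```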
- the imaging device may include a processing amount acquisition unit that acquires the data processing amount of the first image processing unit, and a second control unit that controls transmission of the image data from the first communication unit to the second communication unit based on the data processing amount acquired by the processing amount acquisition unit.
- the first communication unit and the second communication unit may be capable of both wireless communication and wired communication. In a state where the first unit and the second unit are integrated, the first communication unit and the second communication unit may perform either the wired communication or the wireless communication, and in a state where the first unit and the second unit are separated, the first communication unit and the second communication unit may perform the wireless communication.
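The claimed link selection can be sketched as follows (illustrative names; a sketch, not the patented implementation):

```python
def choose_link(integrated: bool, prefer_wired: bool = True) -> str:
    """In the integrated state either link is permitted (wired via the
    mated connectors when preferred); in the separated state only
    wireless communication is available."""
    if not integrated:
        return "wireless"
    return "wired" if prefer_wired else "wireless"
```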
- the imaging apparatus may include a communication speed detection unit that detects the wireless communication speed, and a third control unit that controls transmission of the image data from the first communication unit to the second communication unit based on the wireless communication speed detected in a state where the first unit and the second unit are separated.
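One plausible policy for such a third control unit, with assumed speed thresholds and labels (the disclosure gives no concrete values):

```python
def transmission_policy(link_speed_mbps: float) -> str:
    """Map the measured wireless speed in the separated state to a
    transmission decision for the image data."""
    if link_speed_mbps >= 50.0:
        return "send-full"      # transmit full image data
    if link_speed_mbps >= 5.0:
        return "send-reduced"   # transmit compressed / reduced data
    return "defer"              # hold the data locally for now
```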
- the first unit may include a second image processing unit that performs image processing on the image data, and the second image processing unit may perform image processing on image data that is not transmitted from the first communication unit to the second communication unit.
- the first unit may include a storage unit that temporarily stores image data that is not transmitted from the first communication unit to the second communication unit.
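The temporary storage behaviour can be sketched with a simple FIFO buffer (hypothetical class and method names):

```python
from collections import deque

class TransmitBuffer:
    """Holds image data that could not yet be sent from the first
    communication unit to the second; frames are released in arrival
    order once transmission resumes."""

    def __init__(self):
        self._pending = deque()

    def hold(self, frame: bytes) -> None:
        """Store a frame that is not being transmitted right now."""
        self._pending.append(frame)

    def drain(self):
        """Yield and remove buffered frames, oldest first."""
        while self._pending:
            yield self._pending.popleft()
```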
- the imaging apparatus of the present invention has the effect of providing an imaging apparatus that takes into consideration factors that may affect its operation.
- FIG. 1A is a perspective view illustrating a state in which the imaging apparatus according to the first embodiment is viewed from the back side
- FIG. 1B is a perspective view illustrating a state in which the imaging apparatus is viewed from the front side.
- FIG. 2 is a diagram illustrating a state in which the imaging devices illustrated in FIGS. 1A and 1B are separated.
- FIG. 3 is a block diagram illustrating a configuration of the imaging apparatus (integral state) according to the first embodiment.
- FIG. 4 is a block diagram illustrating a configuration of the imaging apparatus (separated state) according to the first embodiment.
- FIG. 5 is a diagram illustrating a heat sink provided inside the imaging apparatus according to the first embodiment.
- FIG. 6 is a partial cross-sectional view of the imaging apparatus according to the first embodiment.
- FIG. 7 is a perspective view illustrating an imaging apparatus (separated state) according to the second embodiment.
- FIG. 8 is a block diagram illustrating a configuration of an imaging apparatus according to the second embodiment.
- FIG. 9 is a flowchart illustrating processing of the imaging apparatus according to the second embodiment.
- FIG. 10 is a block diagram illustrating a configuration of an imaging apparatus according to the third embodiment.
- FIG. 11 is a flowchart illustrating processing of the imaging apparatus according to the third embodiment.
- FIG. 12 is a block diagram illustrating a configuration of an imaging apparatus according to the fourth embodiment.
- FIG. 13 is a flowchart illustrating processing of the imaging apparatus according to the fourth embodiment.
- FIG. 14 is a flowchart showing a specific process of step S66 of FIG.
- FIG. 15A and FIG. 15B are diagrams for explaining a modified example (No. 1).
- FIG. 16A and FIG. 16B are diagrams for explaining a modified example (No. 2).
- FIG. 1A is a perspective view illustrating a state in which the imaging apparatus 1a according to the first embodiment is viewed from the back side, and FIG. 1B is a perspective view illustrating a state in which the imaging apparatus 1a is viewed from the front side.
- the imaging device 1a includes a first unit 10a and a second unit 100a.
- the first unit 10a and the second unit 100a can transition between an integrated state, as shown in FIGS. 1A and 1B, and a separated state, as shown in FIG. 2.
- the second unit 100a is attached to the attachment portion 25aa of the first unit 10a.
- FIG. 3 shows a block diagram of each unit 10a, 100a (when in an integrated state).
- FIG. 4 shows a block diagram of the units 10a and 100a (in the separated state).
- the first unit 10a includes a control unit 11, a photographing lens 20, a lens driving unit 21, an image sensor 12, an A/D conversion unit 13, an image processing unit 14, an operation unit 15, a RAM 17, a ROM 18, an angular velocity sensor 19, a battery 23, a power supply unit 22, a wireless control unit 16, an antenna 24, and a connector 33.
- the control unit 11 has a CPU, is connected to each component in the first unit 10a, and controls the operation of the entire first unit 10a.
- the controller 11 recognizes whether the first unit 10a and the second unit 100a are in an integrated state or in a separated state, and performs control according to each state.
- the photographing lens 20 is composed of a plurality of lens groups including, for example, a zoom lens and a focusing lens, and forms a subject image on the imaging surface of the image sensor 12.
- the taking lens 20 may be replaceable with respect to the first unit 10a.
- the taking lens 20 is driven by the lens driving unit 21 under the control of the control unit 11.
- the imaging device 12 includes a CMOS image sensor in which light receiving elements are two-dimensionally arranged on the imaging surface, and generates an analog image signal.
- the A / D converter 13 converts the analog image signal generated by the image sensor 12 into a digital image signal and inputs the digital image signal to the image processor 14.
- the image processing unit 14 performs various types of image processing (color interpolation processing, gradation conversion processing, contour enhancement processing, white balance adjustment processing, image compression processing, image expansion processing, etc.) on the digital image signal data input from the A/D conversion unit 13.
- the output of the image processing unit 14 is input to the control unit 11.
- the RAM 17 is connected to the control unit 11 and is used as a temporary storage area in processing by the control unit 11 and is also used as a buffer area when transferring data from the first unit 10a to the second unit 100a.
- the ROM 18 is, for example, a non-volatile semiconductor memory, and stores a control program and various parameters of the first unit 10a executed by the control unit 11. Further, the ROM 18 stores still image and moving image image data generated by the image processing unit 14.
- the operation unit 15 has a plurality of operation buttons and switches, and has a function of accepting various operations from the user.
- the operation unit 15 includes a release switch 15a, a menu button 15b, a cross key (multi-selector) 15c, and the like.
- the operation unit 15 may include a touch panel that receives an information input operation in response to a touch by the user.
- the angular velocity sensor 19 is a sensor that detects an angular velocity generated in the first unit 10a.
- the detection value (angular velocity) of the angular velocity sensor 19 is input to the control unit 11.
- camera shake correction is performed by moving (shifting) a part of the photographic lens 20 or the image sensor 12 based on the angular velocity detected by the angular velocity sensor 19 using a known camera shake correction technique.
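As a rough sketch of the geometry behind such shift-type stabilization (a small-angle calculation under assumed names, not the patented method), the angular velocity is integrated over the frame interval and converted to an image-plane displacement:

```python
import math

def shift_for_shake(omega_rad_s: float, dt_s: float,
                    focal_length_mm: float) -> float:
    """Displacement (mm) by which a lens group or the image sensor 12
    would be shifted to cancel a rotation of omega * dt radians, using
    the image-plane relation shift = f * tan(theta)."""
    theta = omega_rad_s * dt_s          # angle accumulated during dt
    return focal_length_mm * math.tan(theta)
```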
- the battery 23 is a secondary battery such as a lithium ion battery.
- the battery 23 is connected to the connector 33.
- the power supply unit 22 is connected to the battery 23, converts the voltage generated by the battery 23 into a voltage used in each unit such as the control unit 11, and supplies the voltage to each unit.
- the wireless control unit 16 is connected to the control unit 11 and controls wireless communication with an external device such as the second unit 100a via the antenna 24.
- the connector 33 is connected to the control unit 11, and is connected to the connector 133 of the second unit 100a when the first unit 10a and the second unit 100a are in an integrated state.
- the control unit 11 can recognize that the units 10a and 100a are integrated.
- data transmission / reception between the control unit 11 and the control unit 101 of the second unit 100a is possible.
- power can be exchanged between the battery 23 and the battery 113 of the second unit 100a.
- the first casing 25a is made of a metal such as magnesium, and a heat radiating plate 26 is provided in the first casing 25a as indicated by a broken line in FIG. 5.
- the heat sink 26 has a shape (substantially U-shaped) obtained by bending a rectangular plate member at two locations.
- the flat portions of the heat sink 26 are referred to as the first portion 26a, the second portion 26b, and the third portion 26c.
- as the material of the heat sink 26, a material having high thermal conductivity, for example an aluminum alloy, SUS (stainless steel), a copper alloy, a magnesium alloy, a zinc alloy, or a graphite sheet, can be used.
- as shown in FIG. 6, which is a partial cross-sectional view of FIG. 5, the heat sink 26 is in contact with the first housing 25a at the second portion 26b and the third portion 26c, at portions other than the mounting portion 25aa.
- the heat sink 26 holds the image sensor 12 in the first portion 26a and also holds the image processing unit 14 (the image processing unit 14 is not shown in FIG. 6).
- the imaging device 12 and the image processing unit 14 in the configuration of the imaging device 1a are components that generate a particularly large amount of heat.
- the heat generated by the imaging device 12 and the image processing unit 14 is conducted from the back side of the image pickup device 12 to the first portion 26a of the heat sink 26, is further conducted through the heat sink 26 to the second portion 26b and the third portion 26c, and is then conducted to the metal first housing 25a. Heat is then radiated from the entire surface of the first casing 25a (particularly, the portions other than the attachment portion 25aa).
- the heat generated in the imaging device 12 and the image processing unit 14 can be efficiently radiated from the entire surface of the first housing 25a.
- since the heat radiating plate 26 is in contact with the first housing 25a at portions other than the attachment portion 25aa, the thermal influence on the second unit 100a attached to the attachment portion 25aa (FIG. 1) can be reduced.
- the second unit 100a includes a control unit 101, a display unit 110, a display driving unit 109, a RAM 107, a ROM 108, an operation unit 105, a touch panel 115, a direction sensor 102, a GPS module 103, a power receiving coil 111, A battery 113, a power supply unit 112, a wireless control unit 106, an antenna 114, and a connector 133 are included.
- the control unit 101 has a CPU, is connected to each component in the second unit 100a, and controls the operation of the entire second unit 100a. In the first embodiment, the control unit 101 recognizes whether the units 10a and 100a are in an integrated state or a separated state, and performs control according to each state.
- the display unit 110 includes a liquid crystal panel, an organic EL panel, and the like, and displays images, operation menu screens, and the like.
- the display unit 110 is driven by the display driving unit 109 under the control of the control unit 101.
- the operation unit 105 receives various operations by the user, and includes a release switch 105a shown in FIG.
- the touch panel 115 is provided on the surface of the display unit 110 and accepts an information input operation in response to a touch by the user.
- the RAM 107 is connected to the control unit 101 and is used as a temporary storage area or the like in processing by the control unit 101.
- the ROM 108 is, for example, a nonvolatile semiconductor memory, and is connected to the control unit 101 and stores a control program and various parameters of the second unit 100a executed by the control unit 101. Further, the ROM 108 stores still image data, moving image data, and the like transferred from the first unit 10a.
- the orientation sensor 102 detects magnetism (geomagnetism) from the outside of the second unit 100a and obtains the orientation of the second unit 100a (the orientation indicated by the reference axis of the second unit 100a). Information on the orientation of the second unit 100 a obtained by the orientation sensor 102 is displayed on the display unit 110. Further, according to the operation (setting) of the operation unit 105 by the user, the direction information is stored in the ROM 18 or the ROM 108 together with the image data of the still image and the moving image.
- the GPS module 103 includes an antenna for receiving radio waves from a GPS (Global Positioning System) satellite, and detects position information (latitude, longitude, etc.) of the second unit 100a.
- the position information detected by the GPS module 103 is displayed on the display unit 110. Further, according to the operation (setting) of the operation unit 105 by the user, the position information is stored in the ROM 18 or the ROM 108 together with the image data of still images and moving images.
- the power receiving coil 111 generates an electromotive force by a magnetic flux from an external power transmission coil by a non-contact power feeding method (wireless power feeding method), and charges the battery 113 (electromagnetic induction method).
- a wireless power feeding method an electromagnetic field resonance method or a radio wave method may be adopted in addition to the electromagnetic induction method.
- the battery 113 is a secondary battery such as a lithium ion battery.
- the battery 113 is connected to the connector 133 and supplies power to the battery 23 via the connector 133 and the connector 33.
- the power supply unit 112 is connected to the battery 113, converts the voltage generated by the battery 113 into a voltage used in each unit such as the control unit 101, and supplies the voltage to each unit.
- the wireless control unit 106 is connected to the control unit 101 and controls wireless communication via the antenna 114 with an external device such as the first unit 10a.
- the connector 133 is connected to the control unit 101, and is connected to the connector 33 of the first unit 10a when the first unit 10a and the second unit 100a are integrated as described above.
- in the first embodiment, among the above-described components of the second unit 100a, the components other than the display unit 110, the operation unit 105, and the touch panel 115 are housed in the second casing 125a (see FIG. 2).
- the second casing 125a is formed of a non-metallic member such as resin.
- this makes it possible to avoid the problems that arise when these electronic components are housed in a highly conductive metal casing: a decrease in the radio wave reception sensitivity of the GPS module 103, a decrease in the sensitivity of the direction sensor 102 caused by the casing absorbing the magnetic flux in its vicinity, and a decrease in the electromotive force of the power receiving coil 111 caused by a resonance phenomenon.
- the entire second casing 125a may be formed of a non-metallic member such as resin, or only the vicinity of the electronic components may be formed of a non-metallic member such as resin.
- the first unit 10a and the second unit 100a are in the separated state (the state shown in FIGS. 2 and 4)
- the first unit 10a can perform the imaging process independently.
- the second unit 100a can independently perform a display process (a process for allowing the user to browse still image data and moving image data stored in the ROMs 18 and 108).
- the second unit 100a can be used as a remote controller for the first unit 10a for remote operation of imaging by the first unit 10a.
- control units 11 and 101 can determine whether the units 10a and 100a are in an integrated state or a separated state based on whether or not the connectors 33 and 133 are connected.
- the present invention is not limited to this, and it may be determined using a mechanical switch or a sensor (such as an IC tag reader) whether it is an integrated state or a separated state.
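The state determination can be sketched as follows; the connector cue is primary and a switch/sensor reading is the alternative the text mentions (names are illustrative):

```python
def detect_state(connector_mated: bool, switch_closed=None) -> str:
    """Report 'integrated' when the connectors 33 and 133 are mated, or
    when an optional mechanical switch / sensor reading says so;
    otherwise report 'separated'."""
    if connector_mated or bool(switch_closed):
        return "integrated"
    return "separated"
```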
- as described above, in the first embodiment, the first casing 25a of the first unit 10a is formed of a metal member that dissipates heat generated in the image sensor 12 and the image processing unit 14.
- the second housing 125a of the second unit 100a stores therein electronic components (102, 103, 111) that function by receiving magnetism or radio waves, and at least a part of the second housing 125a is formed of a non-metallic member through which the magnetism and radio waves pass.
- the second unit 100a functions in an integrated state with the first unit 10a and also functions independently in a state separated from the first unit 10a.
- thereby, the influence of the heat generated in the image sensor 12 and the image processing unit 14 of the first unit 10a can be reduced, the functions of the electronic components (102, 103, 111) can be effectively exhibited, and usability can be improved.
- since the second unit 100a includes the display unit 110, in the integrated state, an image captured by the image sensor or a through image (live view image) can be displayed on the display unit 110. In the separated state, the captured image can be displayed on the display unit 110, or an image for remotely operating the first unit 10a can be displayed.
- the first unit 10a includes a heat radiating plate 26 that conducts heat generated in the image sensor 12 and the image processing unit 14 to the first housing 25a (metal portion). Thereby, the heat generated in the image sensor 12 and the image processing unit 14 can be efficiently radiated from the metal portion of the first housing 25a. Further, in the present embodiment, the heat radiating plate 26 is in contact with a location different from the attachment portion 25aa of the first housing 25a and conducts the heat generated in the image sensor 12 and the image processing unit 14 to the first housing 25a (metal portion). Thereby, the second unit 100a can be prevented from being affected by the heat generated in the image sensor 12 and the image processing unit 14, and the heat can be dissipated efficiently even when the units 10a and 100a are in the integrated state.
- in the first embodiment, the imaging element 12 and the image processing unit 14 are provided on the heat dissipation plate 26.
- the present invention is not limited to this.
- only the image sensor 12 may be provided on the heat sink 26.
- components other than the image sensor 12 and the image processing unit 14 may be provided on the heat sink 26.
- although the heat sink 26 is provided in the first casing 25a in the first embodiment, the present invention is not limited to this.
- the image sensor 12 and the image processing unit 14 may be in direct contact with the first housing 25a (metal part).
- in the first embodiment, the azimuth sensor 102, the GPS module 103, and the power receiving coil 111 are employed as electronic components that function by receiving magnetism or radio waves, but other electronic components may be employed.
- the direction sensor 102, the GPS module 103, and the power receiving coil 111 described in the first embodiment may be provided in the first unit 10a.
- a part of the first housing 25a is formed of a nonmetal such as a resin.
- all of the first housing 25a may be formed of a metal member, or only part of the first housing 25a may be formed of a metal member. Further, all of the second casing 125a may be formed of a non-metallic member, or only a part thereof may be formed of a non-metallic member.
- the present invention is not limited to this; the battery 23 may be connected to the power receiving coil 111 so that the battery 23 is charged.
- FIG. 7 is a perspective view showing an imaging apparatus 1b (separated state) according to the second embodiment.
- FIG. 8 is a block diagram illustrating a configuration of the imaging apparatus 1b.
- the imaging apparatus 1b includes a first unit 10b and a second unit 100b.
- the second unit 100b has the image processing unit 104 (see the bold line portion).
- a plurality of slit-like heat radiation openings 40 are provided in the attachment portion 25ba of the first casing 25b of the first unit 10b.
- a plurality of slit-shaped heat radiation openings 140 are provided in the second casing 125b of the second unit 100b.
- Other configurations are the same as those in the first embodiment.
- the heat radiating plate 26 described in the first embodiment may be provided in the first housing 25b or may not be provided.
- the image processing unit 104 is a circuit that performs various types of image processing (color interpolation, gradation conversion, contour enhancement, white balance adjustment, image compression, image expansion, and the like).
- the image processing unit 104 is connected to the control unit 101.
- the heat radiation opening 40 efficiently radiates heat generated in the first housing 25b (heat generated by the image sensor 12 and the image processing unit 14) to the outside when the units 10b and 100b are separated. In the integrated state, the heat radiation efficiency is lower than in the separated state because the heat radiation opening 40 is blocked by the second unit 100b.
- the heat radiation opening 140 radiates heat generated in the second casing 125b (heat generated by the image processing unit 104) with the same efficiency whether the units 10b and 100b are integrated or separated.
- step S10 the control unit 11 and the control unit 101 stand by until there is an imaging instruction from the user.
- the imaging instruction from the user in this case includes an imaging instruction when the release switch 15a or the release switch 105a is pressed by the user and an imaging instruction from the user by operating the touch panel 115.
- when the control unit 101 receives an imaging instruction, it transmits the instruction to the control unit 11.
- when the units 10b and 100b are in the integrated state, the imaging instruction is transmitted via the connectors 33 and 133.
- when the units are in the separated state, the wireless control units 106 and 16 transmit the imaging instruction. If there is an imaging instruction from the user, the process proceeds to step S11.
- in step S11, the control unit 11 executes (starts) imaging using the photographing lens 20 and the image sensor 12, and converts the analog image signal produced by the image sensor 12 into a digital image signal.
- the digital image signal data acquired by the control unit 11 is raw data (a RAW file) on which image processing has not yet been performed.
- in step S12, the control unit 11 determines whether or not the units 10b and 100b are in the integrated state. If the determination here is affirmative, the process proceeds to step S14.
- step S14 the control unit 11 transmits the digital image signal data acquired in step S11 to the control unit 101 of the second unit 100b via the connector 33 and the connector 133. Then, the control unit 101 transmits digital image signal data to the image processing unit 104.
- the image processing unit 104 performs various types of image processing (color interpolation, gradation conversion, contour enhancement, white balance adjustment, image compression, image expansion, etc.) on the digital image signal data.
- image data that has been subjected to various types of image processing is stored in the ROM 108 via the control unit 101.
- in the second embodiment, in the integrated state, in which the heat radiation opening 40 of the first unit 10b is blocked, the image processing unit 104 of the second unit 100b performs the image processing. This is because the first unit 10b might be thermally affected if the image processing were performed by the image processing unit 14 of the first unit 10b.
- if the determination in step S12 is negative, that is, if the units 10b and 100b are in the separated state, the process proceeds to step S16, where the control unit 11 identifies the unit that is less affected by the heat generated by the image processing.
- the ROM 18 or the like stores in advance data indicating which of the first unit 10b and the second unit 100b is less affected by the heat generated by image processing.
- in step S16, the control unit 11 reads the data stored in the ROM 18 or the like. This data is assumed to be generated by experiments or simulations performed at the design or manufacturing stage of the imaging apparatus and stored in the ROM 18 or the like.
- the heat generated by the image processing includes not only the heat generated by the image processing itself in the image processing units 14 and 104 but also the heat generated by processing accompanying the image processing, such as wireless communication by the wireless control units 16 and 106.
- in step S18, it is determined whether or not the first unit 10b is less affected by the heat due to image processing in the separated state. If the determination is negative, the process proceeds to step S14, and the image processing unit 104 of the second unit 100b performs the image processing as described above. On the other hand, if the determination in step S18 is affirmative, that is, if the first unit 10b is less affected by the heat, the control unit 11 proceeds to step S20.
- in step S20, the control unit 11 uses the image processing unit 14 of the first unit 10b to perform various types of image processing (color interpolation, gradation conversion, contour enhancement, white balance adjustment, image compression, image expansion, etc.) on the digital image signal data. In this case, the control unit 11 stores the image data after image processing in the ROM 18.
- as described above, in the second embodiment, the control unit 11 of the first unit 10b selects the image processing unit to use from the image processing units 14 and 104 according to the influence of heat from the processing (S18).
- since the image processing can thus be performed by the image processing unit of the unit that is less affected by heat, adverse effects of the heat can be effectively suppressed.
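The selection of steps S12–S20 reduces to a small decision function. A minimal sketch (function and value names are illustrative, not from the patent; the second argument stands for the data read from the ROM 18 in step S16):

```python
def select_processing_unit(integrated: bool, first_unit_less_heat_affected: bool) -> str:
    """Choose which unit's image processing unit runs (cf. steps S12-S20).

    In the integrated state the heat radiation opening 40 of the first unit
    is blocked, so processing is done in the second unit (S14). In the
    separated state, the unit recorded as less heat-affected is used
    (S18/S20).
    """
    if integrated:
        return "second"   # S14: use image processing unit 104
    if first_unit_less_heat_affected:
        return "first"    # S20: use image processing unit 14
    return "second"       # S14: use image processing unit 104

print(select_processing_unit(True, True))    # second
print(select_processing_unit(False, True))   # first
```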
- as described above, in the second embodiment, a heat radiation opening 40 for radiating heat is provided in the attachment portion 25ba of the first housing 25b.
- the heat radiation opening 140 is provided in a part of the second housing 125b other than the portion attached to the first housing 25b. Thereby, the heat generated by the image processing unit 104 can be effectively radiated in both the integrated state and the separated state.
- the ROM 18 or the like retains information, determined for each model, on which of the first and second units is less affected by the heat generated by performing image processing. Therefore, the control unit 11 can appropriately determine which of the image processing units 14 and 104 to use in the separated state.
- the present invention is not limited to this.
- in the integrated state, the heat radiation opening 40 is closed by the second unit 100b, so image processing may be performed using the image processing unit 104 of the second unit 100b; in the separated state, the heat radiation opening 40 is not blocked by the second unit 100b, so image processing may be performed using the image processing unit 14 of the first unit 10b. In this way, the heat generated by the image processing can be efficiently radiated, and in the integrated state the image data can be transmitted by wire via the connectors 33 and 133, which is efficient in terms of transmission as well.
- the heat radiation opening 40 is provided in the attachment portion 25ba of the first housing 25b.
- the heat radiation opening 40 may be provided at a position other than the attachment portion 25ba.
- although the heat radiation opening 40 is provided to dissipate the heat generated in the first housing 25b, a heat radiation fin, a heat radiation plate, a Peltier element, or the like may be provided instead.
- the case where the second unit 100b includes the azimuth sensor 102, the GPS module 103, and the power receiving coil 111 has been described; however, at least some of these may be omitted.
- FIG. 10 is a block diagram illustrating a configuration of an imaging apparatus 1c according to the third embodiment.
- the imaging device 1c includes a first unit 10c and a second unit 100c.
- compared with the first unit 10a, the image processing unit 14 is omitted from the first unit 10c.
- the second unit 100c has an image processing unit 104 as can be seen by comparing FIG. 10 and FIG. Note that the image processing unit 104 is the same image processing unit as the image processing unit 104 included in the second unit 100b of the second embodiment.
- step S30 as in step S10 of FIG. 9, the control unit 11 and the control unit 101 stand by until there is an imaging instruction from the user.
- when the control unit 101 receives an imaging instruction, it transmits the instruction to the control unit 11. If there is an imaging instruction from the user, the process proceeds to step S31.
- control unit 11 executes (starts) imaging using the photographing lens 20 and the image sensor 12 as in step S11 of FIG.
- the control unit 11 acquires raw data (RAW file) of still images or moving images before image processing is performed as digital image signal data.
- in step S32, the control unit 11 determines whether the units 10c and 100c are in the integrated state, as in step S12 of FIG. 9. If the determination here is affirmative, the process proceeds to step S34.
- step S34 the control unit 11 transmits the digital image signal data acquired in step S31 to the control unit 101 of the second unit 100c via the connector 33 and the connector 133.
- in step S36, the control unit 101 executes various types of image processing using the image processing unit 104 of the second unit. Note that image data that has undergone the various types of image processing is stored in the ROM 108 by the control unit 101.
- if the determination in step S32 is negative, that is, if the units are in the separated state, the process proceeds to step S38, where the control unit 11 determines whether or not the wireless communication speed is equal to or higher than a specified value.
- here, the "specified value of the wireless communication speed" refers to a speed at which the digital image data can be communicated within a predetermined time. The predetermined time varies depending on the file size of the digital image data.
- if the determination in step S38 is affirmative, that is, if the wireless speed is sufficiently high, the process proceeds to step S40, and the control unit 11 transmits the digital image signal data acquired in step S31 to the control unit 101 of the second unit 100c by wireless communication, using the wireless control unit 16 and the antenna 24 on the first unit side and the wireless control unit 106 and the antenna 114 on the second unit side. Thereafter, step S36 is executed as described above. If the determination in step S38 is negative, the process proceeds to step S42.
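Read this way, the "specified value" is simply the minimum throughput that moves a file of a given size within its time budget. A hedged sketch (the example file size and time budget are assumptions, not values from the patent):

```python
def required_speed_bps(file_size_bytes: int, allowed_seconds: float) -> float:
    """Minimum link speed (bits/s) to transfer the file within the budget."""
    return file_size_bytes * 8 / allowed_seconds

def fast_enough(link_speed_bps: float, file_size_bytes: int, allowed_seconds: float) -> bool:
    """Step S38: is the measured wireless speed at or above the specified value?"""
    return link_speed_bps >= required_speed_bps(file_size_bytes, allowed_seconds)

# An illustrative 12 MB RAW file with a 4-second budget needs 24 Mbit/s.
print(required_speed_bps(12_000_000, 4.0))  # 24000000.0
```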
- step S42 the control unit 11 temporarily stores the digital image signal data acquired in step S31 in the RAM 17 or the ROM 18.
- when the process of step S42 is completed, the process returns to step S32.
- if, after returning to step S32, the units remain separated and the wireless communication speed has recovered, the control unit 11 transmits the data temporarily stored in the RAM 17 or the ROM 18 to the control unit 101 by wireless communication. If the separated state has changed to the integrated state, the process proceeds to step S34, and the control unit 11 transmits the temporarily stored data to the control unit 101 via the connectors 33 and 133.
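The routing of steps S32–S42, including the buffer-and-retry loop back to S32, can be sketched as follows (names are illustrative, not from the patent):

```python
def route_image_data(integrated: bool, wireless_fast_enough: bool) -> str:
    """Decide how RAW data reaches the second unit (cf. steps S32-S42)."""
    if integrated:
        return "wired"       # S34: send via connectors 33 and 133
    if wireless_fast_enough:
        return "wireless"    # S40: send via wireless control units 16/106
    return "buffer"          # S42: hold in RAM 17 / ROM 18 and retry from S32

# The flow loops back to S32, so buffered frames drain once either
# the link recovers or the units are reattached:
pending = []

def submit(frame: str, integrated: bool, fast: bool) -> list:
    route = route_image_data(integrated, fast)
    if route == "buffer":
        pending.append(frame)   # S42: temporarily stored
        return []
    sent = pending + [frame]    # stored data goes out with the new frame
    pending.clear()
    return sent
```

A sample run: a frame captured while separated with a slow link is buffered, and it is transmitted together with the next frame once the units are reattached.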
- as described above, in the third embodiment, the image data generated by the image sensor 12 is processed by the image processing unit 104 of the second unit 100c regardless of whether the first unit 10c and the second unit 100c are integrated or separated.
- the image pickup device 12 and the image processing unit 104 that generate a particularly large amount of heat in the entire image pickup apparatus 1c can be mounted in separate units, so that the heat generation source can be separated. Thereby, the heat generated in the image sensor 12 and the image processing unit 104 can be efficiently radiated.
- wired communication is performed via a connector in an integrated state, and wireless communication is performed in a separated state, so that communication efficiency can be improved.
- the present invention is not limited to this, and wireless communication may be performed even in an integrated state.
- in the third embodiment, since the control unit 11 detects the speed of the wireless communication and controls the transmission of image data based on that speed, appropriate and efficient communication becomes possible.
- in the third embodiment, the first unit 10c includes the RAM 17 and the ROM 18, which temporarily store image data that cannot be wirelessly communicated.
- by temporarily storing the image data, it becomes possible to transmit the image data to the control unit 101 once the wireless communication speed has recovered.
- the processing of FIG. 11 can be performed for both still images and moving images.
- for example, the processing may differ between a moving image with real-time requirements, such as a live view image, and a moving image without real-time requirements.
- for a moving image without real-time requirements, the same processing as in FIG. 11 is performed; for a live view image, the image data may be discarded rather than temporarily stored in the RAM 17 or the ROM 18.
- FIG. 12 is a block diagram illustrating a configuration of an imaging apparatus 1d according to the fourth embodiment.
- the imaging device 1d includes a first unit 10d and a second unit 100d.
- the fourth embodiment is different from the third embodiment in that the first unit 10d includes the temperature sensor 42 and the image processing unit 14. Further, the fourth embodiment is different from the third embodiment in that the second unit 100d includes a temperature sensor 116.
- the temperature sensor 42 is a sensor that is arranged around the image processing unit 14 and the wireless control unit 16 and measures the ambient temperature of the image processing unit 14 and the wireless control unit 16.
- the temperature sensor 116 is a sensor that is arranged around the image processing unit 104 and the wireless control unit 106 and measures the ambient temperature of the image processing unit 104 and the wireless control unit 106.
- the temperature sensor 42 is connected to the control unit 11, and the control unit 11 is connected to the control unit 101.
- FIG. 13 is a flowchart showing the process of the imaging device 1d
- FIG. 14 is a flowchart showing the specific process of step S66 of FIG.
- step S50 as in step S10 of FIG. 9 and step S30 of FIG. 11, the control unit 11 and the control unit 101 wait until there is an imaging instruction from the user.
- the control unit 101 receives an imaging instruction, the imaging instruction is transmitted to the control unit 11. If there is an imaging instruction from the user, the process proceeds to step S51.
- control unit 11 executes (starts) imaging using the photographing lens 20 and the image sensor 12 as in step S11 of FIG. 9 and step S31 of FIG. In this case, the control unit 11 acquires raw data (RAW file) before image processing is performed as data of a digital image signal.
- in step S52, the control unit 11 determines whether or not the units 10d and 100d are in the integrated state, similarly to step S12 in FIG. 9 and step S32 in FIG. 11. If the determination here is affirmative, the process proceeds to step S54.
- in step S54, the control unit 11 determines, via the control unit 101, whether the ambient temperature of the image processing unit 104 measured by the temperature sensor 116 of the second unit 100d is equal to or higher than a specified value. If the determination here is affirmative, the process proceeds to step S56, and the control unit 11 transmits a part of the digital image signal data input from the A/D conversion unit 13 to the control unit 101 of the second unit 100d via the connector 33 and the connector 133.
- when the digital image signal data is moving image data, the moving image data captured during a predetermined portion of the total imaging time is transmitted to the control unit 101.
- when the digital image signal data is data obtained by continuously shooting still images, a predetermined number of the continuously shot images are transmitted to the control unit 101.
- in step S58, the control unit 11 inputs, out of the digital image signal data input from the A/D conversion unit 13, the data that has not been transmitted to the control unit 101 to the image processing unit 14 of the first unit 10d.
- the image processing unit 14 performs various types of image processing on the input digital image signal data under the instruction of the control unit 11.
- the image processing unit 14 inputs the processed image data to the control unit 11.
- the control unit 11 stores the image data processed by the image processing unit 14 in the ROM 18.
- similarly, in step S58, the control unit 101 of the second unit 100d inputs the digital image signal data received from the control unit 11 to the image processing unit 104.
- the image processing unit 104 performs various types of image processing on the input digital image signal data in the same manner as the image processing unit 14.
- the image processing unit 104 inputs the processed image data to the control unit 101.
- the control unit 101 stores the image data processed by the image processing unit 104 in the ROM 108.
- in step S58, after the image processing by the image processing unit 14 and the image processing unit 104 is completed, the respective processed data are collected by the control unit 11 or the control unit 101 and combined into one.
- the collected image data is stored in the ROM 18 or the ROM 108.
- when the ambient temperature of the image processing unit 104 is equal to or higher than the specified value, if the image processing unit 104 were to perform image processing on all the data, its temperature would rise further and erroneous processing might occur.
- in the fourth embodiment, when the determination in step S54 is affirmative, the image processing unit 14 and the image processing unit 104 share the image processing, which makes it possible to reduce the load on the image processing unit 104 and to suppress erroneous processing caused by temperature rise.
- the amount of image processing performed by each of the image processing unit 14 and the image processing unit 104 can be determined according to the ambient temperature of the image processing unit 104, the ambient temperature of the image processing unit 14 measured by the temperature sensor 42 of the first unit 10d, and the like.
- if the determination in step S54 is negative, the process proceeds to step S60, where the control unit 11 determines whether or not the processing amount for the digital image signal data input from the A/D conversion unit 13 is equal to or greater than a specified value. If the determination here is affirmative, steps S56 and S58 are executed in the same manner as described above.
- in this way, when the image processing amount is large, the image processing is shared between the image processing unit 14 and the image processing unit 104, which speeds up the image processing (shortens the image processing time).
- the amounts of image processing performed by the image processing unit 14 and the image processing unit 104 can, as described above, be determined according to the ambient temperature of the image processing unit 104, the ambient temperature of the image processing unit 14 measured by the temperature sensor 42, and the like.
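One plausible way to apportion the shared work from the two measured ambient temperatures is inverse-headroom weighting. Both the weighting scheme and the safe-ceiling value below are illustrative assumptions; the patent only states that the amounts "can be determined according to" the temperatures:

```python
def share_fraction(temp_first_c: float, temp_second_c: float,
                   t_max_c: float = 80.0) -> float:
    """Fraction of the work given to the first unit's image processing unit 14.

    Each unit's capacity is taken as its remaining headroom below an assumed
    safe ceiling t_max_c, so the hotter unit receives less work.
    """
    headroom_first = max(t_max_c - temp_first_c, 0.0)
    headroom_second = max(t_max_c - temp_second_c, 0.0)
    total = headroom_first + headroom_second
    if total == 0.0:
        return 0.5  # both units saturated: fall back to an even split
    return headroom_first / total

# Unit 14 at 40 C and unit 104 at 60 C: unit 14 takes two thirds of the work.
print(share_fraction(40.0, 60.0))
```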
- if the determination in step S60 is negative, the process proceeds to step S62, and the control unit 11 transmits the digital image signal data input from the A/D conversion unit 13 to the control unit 101 of the second unit 100d via the connectors 33 and 133.
- in step S64, the control unit 101 inputs the digital image signal data received from the control unit 11 to the image processing unit 104.
- the image processing unit 104 performs various types of image processing on the input digital image signal data under the instruction of the control unit 11.
- the image processing unit 104 inputs the processed image data to the control unit 101. Note that the control unit 101 stores the image data processed by the image processing unit 104 in the ROM 108.
- as described above, when processing by the image processing unit 104 of the second unit 100d poses no problem in terms of temperature or image processing amount, image processing using the image processing unit 104 is performed (step S64).
- the heat generation source can be separated by performing image processing using the image processing unit 104 arranged in a unit (second unit 100d) different from the imaging element 12 that generates a lot of heat.
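The integrated-state branch just described (steps S54, S60, and their outcomes) amounts to a small decision function. A minimal sketch, with illustrative threshold values standing in for the patent's unspecified "specified values":

```python
def integrated_state_plan(temp_104_c: float, workload: int,
                          temp_limit_c: float = 70.0,
                          workload_limit: int = 100) -> str:
    """Integrated-state branch of FIG. 13 (S54/S60): share or delegate.

    Threshold values are illustrative assumptions, not from the patent.
    """
    if temp_104_c >= temp_limit_c:    # S54 affirmative: unit 104 is hot
        return "share"                # S56/S58: units 14 and 104 split the work
    if workload >= workload_limit:    # S60 affirmative: large processing amount
        return "share"                # S56/S58
    return "second_only"              # S62/S64: unit 104 processes everything

print(integrated_state_plan(75.0, 10))   # share
print(integrated_state_plan(30.0, 10))   # second_only
```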
- if the determination in step S52 is negative, the separated-state process of step S66 is executed. In this separated-state process, the process according to the flowchart of FIG. 14 is executed.
- in step S70, it is determined whether or not the wireless communication speed is equal to or higher than a specified value.
- the “specified value of the wireless communication speed” refers to a speed at which digital image data can be communicated within a predetermined time. The predetermined time varies depending on the file size of the digital image data.
- if the determination in step S70 is negative, the process proceeds to step S86, and the control unit 11 inputs the digital image signal data input from the A/D conversion unit 13 to the image processing unit 14.
- the image processing unit 14 performs various types of image processing on the input digital image signal data.
- the image processing unit 14 inputs the processed image data to the control unit 11.
- the control unit 11 stores the image data processed by the image processing unit 14 in the ROM 18. Thereafter, all the processes in FIGS. 14 and 13 are terminated.
- if the determination in step S70 is affirmative, in step S72 it is determined whether or not the temperatures of the wireless control units 16 and 106 of the first and second units 10d and 100d are below a specified value. If the determination here is negative, performing wireless communication could thermally burden the wireless control units 16 and 106, so the control unit 11 proceeds to step S86 and executes image processing using the image processing unit 14 as described above. Thereafter, all the processes in FIGS. 14 and 13 end.
- step S74 the same determination as in step S54 described above (determination as to whether the temperature of the image processing unit 104 of the second unit 100d is equal to or higher than a specified value) is performed.
- if the determination in step S74 is affirmative, the process proceeds to step S76, and the control unit 11 transmits a part of the digital image signal data input from the A/D conversion unit 13 to the control unit 101 of the second unit 100d by wireless communication.
- in step S78, processing similar to that in step S58 described above (sharing the image processing between the image processing units 14 and 104 of the first and second units 10d and 100d) is executed, and then all the processing in FIGS. 14 and 13 ends.
- as described above, when the determination in step S74 is affirmative, the image processing unit 14 and the image processing unit 104 share the image processing (S78), so the load on the image processing unit 104 can be reduced and erroneous processing due to temperature rise can be suppressed.
- if the determination in step S74 is negative, the process proceeds to step S80, where it is determined whether or not the image processing amount is equal to or greater than a specified value. If the determination here is affirmative, the processes of steps S76 and S78 are executed in the same manner as described above.
- in this case as well, when the image processing amount is large, the image processing is shared between the image processing unit 14 and the image processing unit 104, which speeds up the image processing (shortens the image processing time).
- if the determination in step S80 is negative, the process proceeds to step S82, and the control unit 11 transmits the digital image signal data input from the A/D conversion unit 13 to the control unit 101 of the second unit 100d by wireless communication.
- step S84 processing similar to that in step S64 described above (image processing by the image processing unit 104 of the second unit 100d) is executed, and all the processing in FIGS. 14 and 13 ends.
- the heat generation source can be separated by performing image processing using the image processing unit 104 disposed in a unit different from the image sensor 12 that generates a lot of heat.
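The separated-state process of FIG. 14 can likewise be sketched as a decision function. This is a reading of steps S70–S86; the direction of the S72 radio-temperature test is inferred (hot radios avoid wireless transfer), and all names and boolean inputs are illustrative:

```python
def separated_state_plan(wireless_fast: bool, radios_cool: bool,
                         temp_104_high: bool, workload_high: bool) -> str:
    """Separated-state flow of FIG. 14 (steps S70-S86)."""
    if not wireless_fast:    # S70 negative: link too slow
        return "first_only"  # S86: image processing unit 14 alone
    if not radios_cool:      # S72: wireless would thermally burden units 16/106
        return "first_only"  # S86
    if temp_104_high:        # S74 affirmative: unit 104 already hot
        return "share"       # S76/S78: split between units 14 and 104
    if workload_high:        # S80 affirmative: large processing amount
        return "share"       # S76/S78
    return "second_only"     # S82/S84: unit 104 alone

print(separated_state_plan(True, True, False, False))  # second_only
```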
- as described above, in the fourth embodiment, the temperature sensors 42 and 116 are provided in the first unit 10d and the second unit 100d, and the control unit 11 controls the transmission of image data from the first unit 10d to the second unit 100d based on the detection results of the temperature sensors 42 and 116. Accordingly, in the fourth embodiment, it is possible to decide which of the image processing units 14 and 104 performs the image processing according to the temperature (heat generation) of each unit 10d, 100d, so that the influence of heat on the image processing can be reduced.
- in the fourth embodiment, the case where a temperature sensor is provided in both the first and second units 10d and 100d has been described; however, the invention is not limited to this, and a temperature sensor need only be provided in at least one of the units 10d and 100d.
- a part of the processing of FIGS. 13 and 14 may be changed according to the way of providing the temperature sensor.
- the case where the temperature sensor 42 of the first unit 10d detects the temperature around the image processing unit 14 and the wireless control unit 16 has been described.
- however, the invention is not limited to this, and the temperature sensor 42 may detect the ambient temperature of other components.
- similarly, although the case where the temperature sensor 116 of the second unit 100d detects the temperature around the wireless control unit 106 and the image processing unit 104 has been described, the temperature sensor 116 may likewise detect the temperature of other components.
- a part of the processing in FIGS. 13 and 14 may be changed in accordance with the objects detected by the temperature sensors 42 and 116.
- for example, a sharing method may be employed in which the image processing unit 14 performs image processing of the through image (live view image) while the image processing unit 104 performs image processing of the actually captured image.
- imaging devices such as those shown in FIGS. 15A and 15B may also be employed.
- the imaging device 1e in FIG. 15A includes a first unit 10e and a second unit 100e.
- the second unit 100e can be separated by sliding it laterally from the first unit 10e.
- such a configuration is effective, for example, when the display unit 110' of the second unit 100e has a large area like a smartphone and the casing is thin.
- further, as shown in FIGS. 16A and 16B, an imaging apparatus 1f configured such that the second unit 100f slides upward with respect to the first unit 10f may be employed.
- in this case, since the two release switches are exposed to the outside in the integrated state as shown in FIG. 16A, each release switch may be given a different function.
- one release switch may have a function for instructing to capture a still image
- the other release switch may have a function for instructing to capture a moving image.
- the first unit may be provided with a display unit.
- the display unit may be provided on the attachment unit 25aa of the first unit.
- the first unit or the second unit may be provided with a projector that can project a captured image on a wall surface or a screen.
- the configuration described in the first embodiment (particularly the azimuth sensor 102, the GPS module 103, the power receiving coil 111, and the heat dissipation plate 26), the configuration described in the second embodiment (particularly the heat dissipation openings 40 and 140), the configuration described in the third embodiment (particularly the configuration in which the image processing unit 104 is provided only in the second unit 100c), and the configuration described in the fourth embodiment (particularly the temperature sensors 42 and 116) may be appropriately selected and combined, or part of the configuration of an embodiment may be omitted.
Description
Hereinafter, the imaging device according to the first embodiment will be described in detail with reference to FIGS. 1(a) to 6. FIG. 1(a) is a perspective view of the imaging device 1a according to the first embodiment as seen from the rear side, and FIG. 1(b) is a perspective view of the imaging device 1a as seen from the front side.
As shown in FIG. 3, the first unit 10a includes a control unit 11, a photographing lens 20, a lens drive unit 21, an image sensor 12, an A/D conversion unit 13, an image processing unit 14, an operation unit 15, a RAM 17, a ROM 18, an angular velocity sensor 19, a battery 23, a power supply unit 22, a wireless control unit 16, an antenna 24, and a connector 33.
As shown in FIG. 3, the second unit 100a includes a control unit 101, a display unit 110, a display drive unit 109, a RAM 107, a ROM 108, an operation unit 105, a touch panel 115, an azimuth sensor 102, a GPS module 103, a power receiving coil 111, a battery 113, a power supply unit 112, a wireless control unit 106, an antenna 114, and a connector 133.
Next, the imaging device according to the second embodiment will be described with reference to FIGS. 7 to 9. FIG. 7 is a perspective view showing the imaging device 1b (in the separated state) according to the second embodiment. FIG. 8 is a block diagram showing the configuration of the imaging device 1b.
Next, the third embodiment will be described in detail with reference to FIGS. 10 and 11. FIG. 10 is a block diagram showing the configuration of the imaging device 1c according to the third embodiment.
Next, the fourth embodiment will be described in detail with reference to FIGS. 12 to 14. FIG. 12 is a block diagram showing the configuration of the imaging device 1d according to the fourth embodiment. The imaging device 1d includes a first unit 10d and a second unit 100d.
Claims (25)
- 1. An imaging device comprising: a first unit having a first communication unit that transmits image data captured by an image sensor, and a first image processing unit that performs image processing on the image data; a second unit having a second communication unit that receives the image data transmitted from the first communication unit, and a second image processing unit that performs image processing on the image data, the second unit being capable of transitioning between a state integrated with the first unit and a state separated from the first unit; and a selection unit that selects one of the first image processing unit and the second image processing unit at least in the state where the first unit and the second unit are separated.
- 2. The imaging device according to claim 1, wherein the selection unit selects one of the first image processing unit and the second image processing unit according to the influence of heat caused by the image processing.
- 3. The imaging device according to claim 2, wherein the first unit has a first housing with an attachment portion for attaching the second unit, and the attachment portion is provided with a first heat radiation portion that radiates at least the heat generated by the first image processing unit in the state where the first unit and the second unit are separated.
- 4. The imaging device according to claim 2, wherein the second unit has a second housing with a portion attached to the first unit, and at least a part of the second housing other than the attached portion is provided with a second heat radiation portion that radiates at least the heat generated by the second image processing unit both in the state where the first unit and the second unit are integrated and in the state where they are separated.
- 5. The imaging device according to any one of claims 1 to 4, wherein the selection unit holds information on which of the first and second units is less affected by the heat generated by performing image processing.
- 6. The imaging device according to any one of claims 1 to 5, wherein the selection unit causes the second image processing unit to execute the image processing when the first unit and the second unit are integrated, and causes the first image processing unit to execute the image processing when the first unit and the second unit are separated.
- 7. The imaging device according to any one of claims 1 to 6, comprising a first housing for the first unit and a second housing for the second unit, wherein at least one of the first housing and the second housing is provided with a heat radiation opening.
- 8. The imaging device according to claim 7, wherein the heat radiation opening is provided in the second housing, in a portion that is exposed to the outside both when the first unit and the second unit are integrated and when they are separated.
- 9. An imaging device comprising: a first unit having a first housing that houses an image sensor and has a metal portion; and a second unit having a second housing that houses an electronic component receiving at least one of magnetism and radio waves and has, at least in part, a non-metal portion that passes at least one of the magnetism and the radio waves, the second unit functioning in a state integrated with the first unit and also functioning independently in a state separated from the first unit.
- 10. The imaging device according to claim 9, wherein the metal portion radiates at least the heat generated by the image sensor.
- 11. The imaging device according to claim 9, wherein the second unit has a display unit.
- 12. The imaging device according to claim 10, comprising a conduction member that conducts at least the heat generated by the image sensor to the metal portion.
- 13. The imaging device according to claim 12, wherein the first housing has an attachment portion for attaching the second housing, and the conduction member contacts a location of the first housing different from the attachment portion and conducts at least the heat generated by the image sensor.
- 14. The imaging device according to any one of claims 9 to 13, wherein the electronic component includes an azimuth sensor that detects geomagnetism to measure an azimuth.
- 15. The imaging device according to any one of claims 9 to 13, wherein the electronic component includes a GPS sensor that receives radio waves from outside the second unit to measure the position of the second unit.
- 16. The imaging device according to any one of claims 9 to 13, wherein the electronic component includes a wireless power supply mechanism.
- 17. An imaging device comprising: a first unit having a first housing that houses an image sensor and has a metal portion that radiates at least the heat generated by the image sensor; and a second unit having a second housing that houses an electronic component and has a non-metal portion at least in part, the second unit functioning in a state integrated with the first unit and also functioning independently in a state separated from the first unit.
- 18. An imaging device comprising: a first unit having a first communication unit that transmits image data generated by an image sensor; and a second unit having a second communication unit that receives the image data transmitted from the first communication unit, and a first image processing unit that performs image processing on the image data received by the second communication unit, the second unit being capable of transitioning between a state integrated with the first unit and a state separated from the first unit.
- 19. The imaging device according to claim 18, comprising: a heat-generation information acquisition unit that acquires information on heat generation in at least one of the first unit and the second unit; and a first control unit that controls the transmission of the image data from the first communication unit to the second communication unit based on the information acquired by the heat-generation information acquisition unit.
- 20. The imaging device according to claim 19, wherein the heat-generation information acquisition unit acquires, as the information on heat generation, the temperature of the first image processing unit or of at least one of the first communication unit and the second communication unit, and the first control unit controls the transmission of the image data from the first communication unit to the second communication unit based on the temperature.
- 21. The imaging device according to any one of claims 18 to 20, comprising: a processing-amount acquisition unit that acquires a data processing amount of the first image processing unit; and a second control unit that controls the transmission of the image data from the first communication unit to the second communication unit based on the data processing amount acquired by the processing-amount acquisition unit.
- 22. The imaging device according to any one of claims 18 to 20, wherein the first communication unit and the second communication unit are capable of wireless communication and wired communication; when the first unit and the second unit are integrated, the first communication unit and the second communication unit perform either the wired communication or the wireless communication; and when the first unit and the second unit are separated, the first communication unit and the second communication unit perform the wireless communication.
- 23. The imaging device according to claim 22, comprising: a communication speed detection unit that detects the speed of the wireless communication in the state where the first unit and the second unit are separated; and a third control unit that controls the transmission of the image data from the first communication unit to the second communication unit based on the speed of the wireless communication.
- 24. The imaging device according to any one of claims 18 to 20, wherein the first unit has a second image processing unit that performs image processing on the image data, and the second image processing unit performs image processing on image data that is not transmitted from the first communication unit to the second communication unit.
- 25. The imaging device according to any one of claims 18 to 20, wherein the first unit has a storage unit that temporarily stores image data that is not transmitted from the first communication unit to the second communication unit.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380075014.3A CN105052122A (zh) | 2013-01-24 | 2013-11-21 | Imaging device |
US14/762,357 US9667851B2 (en) | 2013-01-24 | 2013-11-21 | Camera with communication unit that communicates with external device |
EP13872539.5A EP2950517B1 (en) | 2013-01-24 | 2013-11-21 | Imaging device |
JP2014558448A JP6432349B2 (ja) | 2013-01-24 | 2013-11-21 | Electronic device, camera, and program |
US15/581,482 US10868949B2 (en) | 2013-01-24 | 2017-04-28 | Camera with communication unit that communicates with external device |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013010759 | 2013-01-24 | ||
JP2013010761 | 2013-01-24 | ||
JP2013-010760 | 2013-01-24 | ||
JP2013-010759 | 2013-01-24 | ||
JP2013010760 | 2013-01-24 | ||
JP2013-010761 | 2013-01-24 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/762,357 A-371-Of-International US9667851B2 (en) | 2013-01-24 | 2013-11-21 | Camera with communication unit that communicates with external device |
US15/581,482 Division US10868949B2 (en) | 2013-01-24 | 2017-04-28 | Camera with communication unit that communicates with external device |
US15/581,482 Continuation US10868949B2 (en) | 2013-01-24 | 2017-04-28 | Camera with communication unit that communicates with external device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014115414A1 true WO2014115414A1 (ja) | 2014-07-31 |
Family
ID=51227218
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/081433 WO2014115414A1 (ja) | 2013-01-24 | 2013-11-21 | Imaging device |
Country Status (5)
Country | Link |
---|---|
US (2) | US9667851B2 (ja) |
EP (1) | EP2950517B1 (ja) |
JP (6) | JP6432349B2 (ja) |
CN (1) | CN105052122A (ja) |
WO (1) | WO2014115414A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019219458A (ja) * | 2018-06-18 | 2019-12-26 | 株式会社シグマ | Imaging device |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6752640B2 (ja) | 2016-06-27 | 2020-09-09 | キヤノン株式会社 | Imaging device |
KR102489238B1 (ko) * | 2016-09-26 | 2023-01-16 | 한화테크윈 주식회사 | Camera assembly for combined indoor and outdoor use |
US10958845B2 (en) * | 2019-04-16 | 2021-03-23 | Microsoft Technology Licensing, Llc | Camera with rotatable sensor |
CN111953903A (zh) * | 2020-08-13 | 2020-11-17 | 北京达佳互联信息技术有限公司 | Photographing method and apparatus, electronic device, and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003172862A (ja) * | 2001-12-05 | 2003-06-20 | Chinontec Kk | Camera |
JP2006041952A (ja) * | 2004-07-27 | 2006-02-09 | Sony Corp | Information processing apparatus, information device, and method of controlling information device |
JP2007306224A (ja) * | 2006-05-10 | 2007-11-22 | Fujifilm Corp | Digital camera |
JP2007336527A (ja) | 2006-05-16 | 2007-12-27 | Canon Inc | Imaging apparatus with separable monitor and control method therefor |
JP2009004511A (ja) * | 2007-06-20 | 2009-01-08 | Panasonic Electric Works Co Ltd | Non-contact power supply device |
JP2011114390A (ja) * | 2009-11-24 | 2011-06-09 | Canon Inc | Imaging apparatus and control method therefor |
JP2012032704A (ja) * | 2010-08-02 | 2012-02-16 | Canon Inc | Imaging apparatus |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4039775B2 (ja) * | 1999-07-30 | 2008-01-30 | 富士フイルム株式会社 | Image communication system, digital camera constituting the system, and operation control method therefor |
JP2003023564A (ja) | 2001-07-09 | 2003-01-24 | Mazda Motor Corp | Vehicle imaging apparatus, control method therefor, and computer program |
JP3703414B2 (ja) * | 2001-09-06 | 2005-10-05 | キヤノン株式会社 | Camera with printer, communication apparatus, control method therefor, control program, and storage medium |
JP2003337374A (ja) | 2002-05-22 | 2003-11-28 | Canon Inc | Camera accessory, camera, flash device, and camera system |
JP2004088396A (ja) * | 2002-08-27 | 2004-03-18 | Nikon Corp | Electronic device system, electronic camera system, electronic camera body, detachable external display device, and detachable external camera device |
JP2004096165A (ja) | 2002-08-29 | 2004-03-25 | Nikon Corp | Electronic camera and electronic camera system |
JP4508596B2 (ja) | 2002-11-06 | 2010-07-21 | キヤノン株式会社 | Communication apparatus, image storage apparatus, and control methods therefor |
KR100598494B1 (ko) * | 2003-02-11 | 2006-07-11 | 정연경 | Wireless camera with built-in CDMA module and moving-image transmission system using the same |
KR100556861B1 (ko) * | 2003-06-24 | 2006-03-10 | 엘지전자 주식회사 | Method of selectively transmitting video during a video call on a mobile terminal |
KR100678226B1 (ko) * | 2004-11-11 | 2007-02-02 | 삼성전자주식회사 | Operating method of a mobile communication terminal having a wirelessly detachable display window |
JP4556178B2 (ja) * | 2005-03-15 | 2010-10-06 | セイコーエプソン株式会社 | Imaging system |
JP4612866B2 (ja) | 2005-05-20 | 2011-01-12 | キヤノン株式会社 | Imaging method and imaging system |
JP4504873B2 (ja) * | 2005-05-31 | 2010-07-14 | 富士フイルム株式会社 | Camera system, body adapter, and head adapter |
JP2007129645A (ja) * | 2005-11-07 | 2007-05-24 | Fujifilm Corp | Photographing apparatus |
US7616232B2 (en) * | 2005-12-02 | 2009-11-10 | Fujifilm Corporation | Remote shooting system and camera system |
JP2007158609A (ja) * | 2005-12-02 | 2007-06-21 | Fujifilm Corp | Camera system |
JP4621157B2 (ja) * | 2006-03-09 | 2011-01-26 | 富士フイルム株式会社 | Portable image communication system, transmission apparatus and reception apparatus constituting the system, and control methods therefor |
US7956921B2 (en) | 2006-05-16 | 2011-06-07 | Canon Kabushiki Kaisha | Imaging apparatus including a separable monitor and capable of wireless communication, and method for controlling the imaging apparatus |
US8036469B2 (en) | 2006-05-16 | 2011-10-11 | Canon Kabushiki Kaisha | Imaging apparatus including a separable monitor, and method for controlling the imaging apparatus |
JP4607818B2 (ja) * | 2006-05-22 | 2011-01-05 | 富士フイルム株式会社 | Camera |
JP2008193457A (ja) * | 2007-02-06 | 2008-08-21 | Fujifilm Corp | Digital camera and control method for digital camera |
JP2009027647A (ja) | 2007-07-23 | 2009-02-05 | Fujifilm Corp | Captured-image recording system, photographing apparatus, and captured-image recording method |
JP5401810B2 (ja) * | 2008-03-18 | 2014-01-29 | 株式会社ニコン | Electronic device and battery pack |
US8199251B2 (en) | 2008-07-07 | 2012-06-12 | Woodman Labs, Inc. | Camera housing with integrated expansion module |
JP5347144B2 (ja) * | 2009-02-03 | 2013-11-20 | リコーイメージング株式会社 | Camera capable of fixed-point shooting |
JP2011077654A (ja) * | 2009-09-29 | 2011-04-14 | Canon Inc | Imaging apparatus, control method therefor, and program |
JP5493729B2 (ja) * | 2009-11-06 | 2014-05-14 | 株式会社リコー | Imaging system, body unit, and external electronic device connected thereto |
JP2011259065A (ja) * | 2010-06-07 | 2011-12-22 | Ricoh Co Ltd | Imaging device |
JP2012027422A (ja) * | 2010-07-28 | 2012-02-09 | Jvc Kenwood Corp | Imaging device |
JP2012037791A (ja) * | 2010-08-10 | 2012-02-23 | Nikon Corp | Imaging device |
JP2012129807A (ja) | 2010-12-15 | 2012-07-05 | Sony Corp | Image recording apparatus, image recording method, and program |
JP5733614B2 (ja) * | 2011-03-29 | 2015-06-10 | リコーイメージング株式会社 | Photographing information management method and photographing information management apparatus |
US20130077932A1 (en) * | 2011-09-26 | 2013-03-28 | David James Cornell | Digital video camera system having two microphones |
JP2013093826A (ja) | 2011-10-05 | 2013-05-16 | Sanyo Electric Co Ltd | Electronic camera |
CN202385170U (zh) * | 2011-11-16 | 2012-08-15 | 天津三星光电子有限公司 | Digital camera with GPS positioning function |
2013
- 2013-11-21 EP EP13872539.5A patent/EP2950517B1/en active Active
- 2013-11-21 JP JP2014558448A patent/JP6432349B2/ja active Active
- 2013-11-21 CN CN201380075014.3A patent/CN105052122A/zh active Pending
- 2013-11-21 WO PCT/JP2013/081433 patent/WO2014115414A1/ja active Application Filing
- 2013-11-21 US US14/762,357 patent/US9667851B2/en active Active

2017
- 2017-04-28 US US15/581,482 patent/US10868949B2/en active Active

2018
- 2018-11-08 JP JP2018210858A patent/JP6645558B2/ja active Active
- 2018-11-08 JP JP2018210859A patent/JP6645559B2/ja active Active

2019
- 2019-12-25 JP JP2019235073A patent/JP6806229B2/ja active Active
- 2019-12-25 JP JP2019235072A patent/JP6795081B2/ja active Active

2020
- 2020-11-27 JP JP2020196453A patent/JP2021040339A/ja active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP2950517A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019219458A (ja) * | 2018-06-18 | 2019-12-26 | 株式会社シグマ | Imaging device |
JP7055369B2 (ja) | 2018-06-18 | 2022-04-18 | 株式会社シグマ | Imaging device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014115414A1 (ja) | 2017-01-26 |
EP2950517A4 (en) | 2016-07-13 |
US20170230564A1 (en) | 2017-08-10 |
US9667851B2 (en) | 2017-05-30 |
JP6806229B2 (ja) | 2021-01-06 |
US10868949B2 (en) | 2020-12-15 |
JP6645558B2 (ja) | 2020-02-14 |
JP2020061767A (ja) | 2020-04-16 |
EP2950517B1 (en) | 2020-12-23 |
CN105052122A (zh) | 2015-11-11 |
US20160014317A1 (en) | 2016-01-14 |
JP6795081B2 (ja) | 2020-12-02 |
EP2950517A1 (en) | 2015-12-02 |
JP6432349B2 (ja) | 2018-12-05 |
JP2021040339A (ja) | 2021-03-11 |
JP6645559B2 (ja) | 2020-02-14 |
JP2019057920A (ja) | 2019-04-11 |
JP2019054528A (ja) | 2019-04-04 |
JP2020061768A (ja) | 2020-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6806229B2 (ja) | Electronic device and program | |
JP5194650B2 (ja) | Electronic camera | |
US20150049233A1 (en) | 2015-02-19 | Photographing apparatus and method of controlling the same |
JP6700768B2 (ja) | Imaging device | |
US9049317B2 (en) | 2015-06-02 | Communication system, communication terminal, and method of controlling a communication system |
JP6553986B2 (ja) | Communication device, control method therefor, and program | |
US20160205386A1 (en) | 2016-07-14 | Digital photographing apparatus and control method thereof |
US8600226B2 (en) | 2013-12-03 | Focusing methods and apparatus, and recording media for recording the methods |
TW201442510A (zh) | 2014-11-01 | Imaging device, client device, imaging system, and methods of controlling the imaging device, the client device, and the imaging system |
US20120320255A1 (en) | 2012-12-20 | Digital photographing apparatus and method of controlling the same |
JP4896066B2 (ja) | 2012-03-14 | Electronic device and data communication method |
JP2008245109A (ja) | 2008-10-09 | Photographing apparatus |
KR101575261B1 (ko) | 2015-12-07 | Camera module package having a plurality of image sensors |
EP4379520A1 (en) | 2024-06-05 | Method for providing image, and electronic device supporting same |
JP2012216885A (ja) | 2012-11-08 | Imaging device and image sharing system |
JP2011081076A (ja) | 2011-04-21 | Device with built-in positioning function, and camera |
JP2009094890A (ja) | 2009-04-30 | Digital camera |
JP5223554B2 (ja) | 2013-06-26 | Electronic camera and electronic device |
JP2004253915A (ja) | 2004-09-09 | Photographing apparatus, method, program, and computer-readable storage medium |
JP5665434B2 (ja) | 2015-02-04 | Imaging apparatus, control method therefor, program, and recording medium |
JP2004254230A (ja) | 2004-09-09 | Photographing apparatus |
JP2016053665A (ja) | 2016-04-14 | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201380075014.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13872539 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014558448 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013872539 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14762357 Country of ref document: US |