CN117941367A - Imaging apparatus, image processing method, and program - Google Patents


Info

Publication number
CN117941367A
CN202180102019.5A
Authority
CN
China
Prior art keywords
exposure time
imaging
captured image
celestial body
maximum exposure
Prior art date
Legal status
Pending
Application number
CN202180102019.5A
Other languages
Chinese (zh)
Inventor
新井俊彦
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of CN117941367A


Classifications

    • G03B 15/00: Special procedures for taking photographs; apparatus therefor
    • G03B 17/561: Details of cameras or camera bodies; accessories therefor; support related camera accessories
    • H04N 23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

According to one embodiment, an imaging device includes a first acquisition module, a second acquisition module, a calculation module, and a display control module. The first acquisition module is configured to acquire a captured image of a celestial body photographed by the imaging unit. The second acquisition module is configured to acquire physical quantities, detected by a plurality of detectors, related to at least the position and the posture of the imaging device. The calculation module is configured to calculate an amount of motion and a rotation angle of a celestial body contained in the captured image based on the physical quantities of the imaging device, and to calculate a maximum exposure time for which the celestial body contained in the captured image can be tracked. The display control module is configured to cause the display unit to display the captured image with information relating to the maximum exposure time superimposed on it.

Description

Imaging apparatus, image processing method, and program
Technical Field
The invention relates to an imaging apparatus, an image processing method, and a program.
Background
When the sky (night sky) is photographed using an imaging device with a long exposure, the positions of the stars and other celestial bodies change during the exposure, since the diurnal motion of the celestial bodies follows the rotation of the earth. Therefore, to photograph the stars in the night sky as point images with a long exposure, an equatorial mount is generally used to compensate for the diurnal motion. By using the equatorial mount, it is possible to photograph a celestial body while tracking its diurnal motion. The equatorial mount, however, is special equipment that requires special setup work.
Accordingly, a method is disclosed that compensates for the diurnal motion of a celestial body with the image stabilization function of an imaging device and photographs the celestial body in the sky as a point image, without using specially installed equipment such as an equatorial mount.
[ Prior Art literature ]
[ Patent literature ]
[ Patent document 1]
Japanese laid-open patent application 2014-209795
Disclosure of Invention
[ Problem to be solved by the invention ]
However, since the image stabilization function has a limited movement range, diurnal motion compensation using the image stabilization function naturally limits the exposure time. In this case, even if the user sets an exposure time in advance, photographing cannot be performed when the specified exposure time exceeds the maximum exposure time determined by the limited movement range; the exposure is stopped at the maximum exposure time, so the photographing operation is performed with an exposure time the user did not intend.
The present invention has been made in view of the above-described problems, and an object of the present invention is to provide an imaging apparatus, an image processing method, and a program that enable the user to photograph a celestial body as a point image within the exposure time the user intends, by checking the maximum exposure time.
[ Solution to the problem ]
In order to solve the above-described problems and achieve the object, an imaging apparatus includes a first acquisition module, a second acquisition module, a calculation module, and a display control module. The first acquisition module is configured to acquire a captured image of a celestial body photographed by the imaging unit. The second acquisition module is configured to acquire physical quantities, detected by a plurality of detectors, related to at least the position and the posture of the imaging apparatus. The calculation module is configured to calculate an amount of motion and a rotation angle of a celestial body contained in the captured image based on the physical quantities of the imaging apparatus, and to calculate a maximum exposure time for which the celestial body contained in the captured image can be tracked. The display control module is configured to cause the display unit to display the captured image with information relating to the maximum exposure time superimposed on it.
[ Effect of the invention ]
According to one embodiment of the present invention, by checking the maximum exposure time, the celestial body can be photographed as a point image within the exposure time intended by the user.
Drawings
Fig. 1 is a schematic diagram showing an example of a hardware configuration of an imaging apparatus according to an embodiment;
Fig. 2 is a schematic diagram illustrating that the movement speed of a celestial body varies with its declination on the celestial sphere;
Fig. 3 is a schematic diagram illustrating celestial motion centered on the celestial north pole;
Fig. 4 is a schematic diagram showing that the maximum exposure time differs depending on the place where the night sky is framed;
Fig. 5 is a schematic diagram illustrating the change in appearance of a celestial body track as a function of elevation angle;
Fig. 6 is a schematic diagram illustrating the change in appearance of a celestial body track as a function of elevation angle;
Fig. 7 is a schematic diagram illustrating physical quantities detected by various sensors of the imaging apparatus according to the embodiment;
Fig. 8 is a schematic diagram illustrating a celestial sphere when the imaging apparatus according to the embodiment photographs a celestial body;
Fig. 9 is a schematic diagram illustrating a state in which the imaging apparatus according to the embodiment tracks a celestial body that apparently moves on an elliptical (circular) orbit;
Fig. 10 is a schematic diagram illustrating the relationship between an ellipse and a tangent line;
Fig. 11 is a schematic diagram showing an example of a functional block configuration of the imaging apparatus according to the embodiment;
Fig. 12 is a schematic diagram illustrating an operation of obtaining the maximum exposure time of other areas by interpolating the maximum exposure times of sampling points within a frame in the imaging apparatus according to the embodiment;
Fig. 13 is a schematic diagram showing an example of a display mode of the intra-frame maximum exposure time and a framing operation of the imaging apparatus according to the embodiment;
Fig. 14 is a flowchart showing an example of the flow of the maximum exposure time display processing and the tracking shooting processing of the imaging apparatus according to the embodiment;
Fig. 15 is a schematic diagram showing an example of an operation of displaying an area that cannot be compensated within a frame photographed by the imaging apparatus according to the embodiment; and
Fig. 16 is a schematic diagram showing an example of a display mode of the length of a photographable star track within a frame photographed by the imaging apparatus according to the embodiment.
Detailed Description
Hereinafter, embodiments of an imaging apparatus, an image processing method, and a program according to the present invention will be described in detail with reference to figs. 1 to 16. Furthermore, the present invention is not limited to the following embodiments, and the components in the following embodiments include components easily conceivable by those skilled in the art, substantially identical components, and so-called equivalents. Further, various omissions, substitutions, changes, and combinations of the components may be made without departing from the spirit of the embodiments below.
Hardware configuration of imaging device
Fig. 1 shows a schematic diagram of a hardware configuration of an imaging apparatus 1 according to an embodiment. The hardware configuration of the imaging apparatus 1 according to the present embodiment will be described with reference to fig. 1.
The imaging apparatus 1 shown in fig. 1 is an imaging apparatus having a camera function, such as a smartphone or a tablet terminal. It should be noted that, in the present embodiment, the imaging apparatus 1 will be described as a smartphone.
As shown in fig. 1, the imaging apparatus 1 includes a Central Processing Unit (CPU) 501, a Read Only Memory (ROM) 502, a Random Access Memory (RAM) 503, a flash memory 504, a touch panel 505, a display 506, an external device connection I/F (interface) 507, a communication circuit 508, and a positioning signal receiver 509.
The CPU 501 is an arithmetic device that controls the operation of the entire imaging apparatus 1. The ROM 502 is a nonvolatile memory that stores a program to be executed first by the CPU 501, such as an Initial Program Loader (IPL). The RAM 503 is a volatile memory used as a work area of the CPU 501. It should be noted that the CPU 501, the ROM 502, and the RAM 503 may be configured, for example, as a system on chip (SoC) mounted on one board.
The flash memory 504 is a nonvolatile memory that stores various programs, such as a web browser, and various data under the control of the CPU 501.
The touch panel 505 is an input device capable of operating the imaging device 1 by a touch operation of a user. For example, the touch panel 505 operates based on a capacitance sensing method, a resistive film method, or the like. It should be noted that the touch panel 505 corresponds to an "operation unit".
The display 506 is a liquid crystal display, an organic electroluminescence display, or the like, and the display 506 is integrated with the touch panel 505 to display an image of a subject, various icons, or the like. It should be noted that the display 506 corresponds to a "display unit".
The external device connection I/F507 is an interface based on a standard such as USB (universal serial bus), and performs data communication through connection with an external device.
The communication circuit 508 is a communication circuit that performs wireless communication with an external device via a network through an antenna 508a based on a communication method such as Wi-Fi (registered trademark), 4G (fourth generation mobile communication system), and 5G (fifth generation mobile communication system).
The positioning signal receiver 509 is a receiving device that receives positioning signals from positioning satellites based on a Global Navigation Satellite System (GNSS). The positioning signal receiver 509 detects the latitude of the position of the imaging device 1 from the received positioning signal.
As shown in fig. 1, the imaging apparatus 1 further includes a lens 511, a focal length detector 511a, an imaging sensor 512, a driving unit 513, an image processing unit 514, an imaging control unit 515, a geomagnetic sensor 516, an attitude sensor 517, and an acceleration sensor 518.
The lens 511 is an optical system, and condenses light from a subject to form an image with the imaging sensor 512. The focal length detector 511a is a device that detects the focal length of the lens 511.
The imaging sensor 512 is an image sensor configured by CMOS (complementary metal oxide semiconductor), CCD (charge coupled device), or the like, converts light incident on the lens 511 into image data of an electric signal, and outputs the image data. The optical axis OA of the lens 511 is orthogonal to the imaging surface 512a of the imaging sensor 512.
The driving unit 513 is a unit including a fixed stage, a movable stage that is movable relative to the fixed stage, and an electromagnetic circuit that moves the movable stage. The imaging sensor 512 is held on the movable stage, and the parallel motion of the imaging sensor is controlled at a desired motion speed in a direction orthogonal to the optical axis OA, and the rotational motion of the imaging sensor is controlled at a desired rotational speed about an axis parallel to the optical axis OA. The image stabilization function is realized by controlling the imaging sensor 512 using the driving unit 513.
The image processing unit 514 is a unit that receives image data captured by the imaging sensor 512, performs processing on the image data, and outputs the processed image data.
The imaging control unit 515 is a unit that controls the imaging operation of the imaging sensor 512 and the operation of the driving unit 513 according to an instruction from the CPU 501.
The geomagnetic sensor 516 is a sensor that detects the earth's magnetic field to detect the angle of the imaging direction of the imaging apparatus 1 as an azimuth angle.
The attitude sensor 517 is a sensor that detects an elevation angle and a rotation angle. The elevation angle is an angle between the imaging direction of the imaging apparatus 1 and the horizontal plane, and the rotation angle is an angle rotated about the optical axis OA from the reference position of the imaging sensor 512.
The acceleration sensor 518 is a sensor that detects acceleration of the imaging device 1 in three axis directions to detect inclination angle, parallel motion, motion speed, displacement, and the like of the imaging device 1.
The CPU 501, ROM 502, RAM 503, flash memory 504, touch panel 505, display 506, external device connection I/F507, communication circuit 508, positioning signal receiver 509, focal length detector 511a, image processing unit 514, imaging control unit 515, geomagnetic sensor 516, attitude sensor 517, and acceleration sensor 518 are connected by a bus 520 such as an address bus and a data bus so as to be able to communicate with each other.
The hardware configuration of the imaging apparatus 1 shown in fig. 1 is only an example. Thus, the imaging apparatus does not necessarily include all of the components described above, and may include other components. For example, the imaging apparatus 1 may include a GPU (graphics processing unit) that performs real-time image processing, a near field communication circuit that performs near field wireless communication based on communication standards such as NFC (near field communication) and Bluetooth (registered trademark), and the like.
Difference in movement speed of celestial body on celestial sphere
Fig. 2 is a schematic diagram illustrating that the movement speed of a celestial body varies with its declination on the celestial sphere. Fig. 3 is a schematic diagram illustrating celestial motion centered on the celestial north pole. Fig. 4 is a schematic diagram showing that the maximum exposure time differs depending on the place where the night sky is framed. The difference in movement speed depending on the position of the celestial body (star) on the celestial sphere will be described with reference to figs. 2 to 4.
As shown in fig. 2 (a), on the celestial sphere CS, a celestial body located at declination δ from the celestial equator CE moves while drawing a circular orbit OB around the celestial north pole NCP. In other words, as shown in fig. 3, the user (observer) observes that each celestial body (star) in the sky rotates counterclockwise along a concentric track OB around the celestial north pole NCP. Further, as shown in fig. 3, the farther a celestial body is from the celestial north pole NCP, i.e., the smaller its declination δ, the greater the distance it moves within a predetermined time, and hence the greater its movement speed. For example, as shown in fig. 2 (b), a celestial body with declination δ of about 90 degrees (such as Polaris), that is, a celestial body located in the vicinity of the celestial north pole NCP, has a movement speed of about zero, whereas a celestial body on the celestial equator CE has the maximum movement speed (movement speed D). In other words, the movement speed of a celestial body on the celestial sphere CS is determined by its declination δ. In this case, as shown in fig. 2 (b), the smaller the declination δ, the greater the movement speed, and the movement speed increases in the order of the movement speeds A, B, C, and D.
As described above, when the image stabilization function is realized by performing parallel motion control and rotational motion control on the imaging sensor 512 using the driving unit 513, this function can perform tracking shooting, exposing while tracking the celestial bodies contained in the frame (image capturing range) framed by the imaging apparatus 1. However, since the movable range of the imaging sensor 512 moved by the driving unit 513 has a mechanical limit, that is, a limit on the range of variation of the positional relationship between the imaging sensor 512 and the lens 511, the time during which exposure can be performed while tracking the celestial body is also limited. The time during which exposure can be performed while tracking the celestial body is referred to herein as the maximum exposure time.
As described above, since the movement speed of a celestial body is greater the farther the celestial body is from the celestial north pole NCP, the movement speed of the celestial bodies contained in the frame F1 framed in the vicinity of the celestial north pole is smaller than the movement speed of the celestial bodies contained in the frame F2 framed at a position distant from the celestial north pole, as shown in fig. 4. In other words, the maximum exposure time (for example, 30 seconds) when tracking shooting is performed on the celestial bodies within the frame F1 is greater than the maximum exposure time (for example, 10 seconds) when tracking shooting is performed on the celestial bodies within the frame F2. Here, among the celestial bodies included in a specific frame, for example, when the celestial body whose maximum exposure time is smallest has a maximum exposure time of 10 seconds and the celestial body whose maximum exposure time is largest has a maximum exposure time of 30 seconds, the maximum exposure time of the entire frame is 10 seconds. Therefore, the maximum exposure time differs depending on the celestial bodies contained in the frame determined by the user according to the imaging direction of the imaging apparatus 1.
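To put rough numbers on this (an illustrative sketch, not taken from this patent): the earth completes one sidereal rotation in about 86164 seconds, so a star on the celestial equator drifts at roughly 15 arcseconds per second, while a star at declination δ drifts more slowly by the factor cos δ. A minimal Python sketch of the resulting image-plane drift speed, using a small-angle approximation and an assumed focal length:

    import math

    SIDEREAL_DAY_S = 86164.1                  # earth's sidereal rotation period, s
    OMEGA = 2 * math.pi / SIDEREAL_DAY_S      # angular rate, rad/s (~15 arcsec/s)

    def drift_speed_mm_per_s(focal_length_mm: float, declination_deg: float) -> float:
        """Approximate image-plane drift speed of a star: f * omega * cos(delta).

        Small-angle approximation; the focal length and declination are
        assumed example values."""
        return focal_length_mm * OMEGA * math.cos(math.radians(declination_deg))

    # A star near the celestial equator drifts much faster than one near the pole:
    print(drift_speed_mm_per_s(24.0, 0.0))    # ~0.00175 mm/s at the equator
    print(drift_speed_mm_per_s(24.0, 80.0))   # ~0.00030 mm/s near the pole

With an assumed sensor-shift limit of ±0.5 mm, the equatorial star would exhaust the range in roughly 0.5 / 0.00175 ≈ 286 seconds, while the near-pole star could be tracked several times longer; this is the frame-dependent difference in maximum exposure time described above.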
Differences in celestial orbit appearance due to elevation angle
Figs. 5 and 6 are schematic diagrams illustrating the change in appearance of a celestial body track as a function of elevation angle. The difference in appearance of the celestial track according to the elevation angle of the imaging direction of the imaging apparatus 1 will be described with reference to figs. 5 and 6.
As shown in fig. 5, even though the orbit OB drawn by a celestial body on the celestial sphere CS is a circular orbit, its observed shape differs with the viewing direction: the case of viewing the circular orbit from directly below is (A1), the cases of viewing it at an angle are (A2) and (A3), and the case of viewing it from the side is (A4), and the corresponding observed shapes of the orbit OB are shown in (A1) to (A4) of fig. 6. In other words, when the user photographs the sky with the imaging apparatus 1, the elevation angle h of the imaging direction of the imaging apparatus 1 affects how the orbit OB of the celestial body is imaged. For example, when the imaging direction of the imaging apparatus 1 is at an elevation angle h as in the cases of (A2) to (A4) in fig. 5, the observed orbit OB of the celestial body is a part of a circular orbit, as shown in fig. 6.
Physical quantity detected by imaging device
Fig. 7 is a schematic diagram illustrating physical quantities detected by various sensors of the imaging apparatus according to the embodiment. Fig. 8 is a schematic diagram illustrating a celestial sphere when the imaging apparatus according to the embodiment photographs a celestial body. The physical quantities and the like detectable by the various sensors of the imaging apparatus 1 will be described with reference to figs. 7 and 8.
As shown in fig. 7 (a), the positioning signal receiver 509 of the imaging apparatus 1 can detect the latitude ε, which is the angle, measured toward the north pole NP, formed between the equator E and the shooting point O of the imaging apparatus 1 on the earth ER.
As shown in fig. 7 (b), the geomagnetic sensor 516 of the imaging apparatus 1 can detect the geomagnetism, i.e., the earth's magnetic field, to obtain the angle of the imaging direction of the imaging apparatus 1 (for example, the clockwise angle with respect to the north direction) as the azimuth A.
As shown in fig. 7 (c), the attitude sensor 517 of the imaging apparatus 1 can detect the elevation angle h, which is the angle between the horizontal plane of the imaging apparatus 1 and the imaging direction (i.e., the optical axis OA). Further, as shown in fig. 7 (d), the attitude sensor 517 can detect the rotation angle ζ of rotation from the horizontal plane about the center C of the imaging sensor 512 (the optical axis OA of the lens 511).
As shown in fig. 8, it is assumed that, on the celestial sphere CS, the zenith is Z, the point at which the horizontal plane intersects the celestial sphere CS in the north direction is N, and the celestial body in the imaging direction of the imaging apparatus 1 is S. In this case, the angle between the horizontal plane and the direction from the shooting point O to the celestial body S is the elevation angle h. Further, the angle between the direction from the shooting point O to the point N and the direction in which the celestial body S exists is the azimuth A. The azimuth corresponds to the angle formed by the shortest curve connecting the zenith Z and the celestial north pole NCP and the shortest curve connecting the zenith Z and the celestial body S. Further, the latitude ε detected by the positioning signal receiver 509 corresponds to the angle formed by the point N, the shooting point O, and the celestial north pole NCP.
Further, the angle formed by the shortest curve connecting the zenith Z and the celestial north pole NCP and the shortest curve connecting the celestial body S and the celestial north pole NCP corresponds to the hour angle H. Here, the angle formed by the shortest curve connecting the celestial north pole NCP and the celestial body S and the shortest curve connecting the zenith Z and the celestial body S is γ.
Further, assuming that the declination of the celestial body S is δ, the angle θ formed by the celestial body S, the shooting point O, and the celestial north pole NCP corresponds to 90° - δ.
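The declination δ of the celestial body in the imaging direction, and hence θ = 90° - δ, is not measured directly; it follows from the latitude ε, the azimuth A, and the elevation angle h by the standard spherical-astronomy relation sin δ = sin ε × sin h + cos ε × cos h × cos A (a well-known formula that the document does not spell out). A minimal sketch:

    import math

    def declination_deg(latitude_deg: float, azimuth_deg: float, elevation_deg: float) -> float:
        """Declination of the point on the celestial sphere seen at azimuth A
        (measured clockwise from north) and elevation h, from latitude epsilon."""
        eps = math.radians(latitude_deg)
        a = math.radians(azimuth_deg)
        h = math.radians(elevation_deg)
        sin_delta = math.sin(eps) * math.sin(h) + math.cos(eps) * math.cos(h) * math.cos(a)
        sin_delta = max(-1.0, min(1.0, sin_delta))  # guard against rounding
        return math.degrees(math.asin(sin_delta))

    # Looking due north (A = 0) at an elevation equal to the latitude gives the
    # celestial north pole, i.e. declination 90 degrees:
    print(declination_deg(35.0, 0.0, 35.0))  # ~90.0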
Motion amount per unit time of a celestial body
Fig. 9 is a schematic diagram illustrating a state in which the imaging apparatus according to the embodiment tracks a celestial body that apparently moves on an elliptical (circular) orbit. Fig. 10 is a schematic diagram illustrating the relationship between an ellipse and a tangent line. The amount of motion per unit time of a celestial body on the imaging surface 512a of the imaging sensor 512 will be described with reference to figs. 9 and 10. Since the user observes from a location away from the north pole (i.e., away from the earth's axis) on the spinning earth, the apparent orbit of a celestial body may have a slightly elliptical shape; the following calculation of the amount of celestial motion is therefore derived on the premise of an ellipse. However, since the apparent orbit can actually be regarded as a perfectly circular orbit, and an ellipse conceptually includes a perfect circle, the elliptical motion amount calculation expression is also applicable to a perfectly circular orbit.
The amounts of movement of the celestial body S in the X direction (horizontal direction) and the Y direction (vertical direction) per unit time on the celestial sphere CS can be obtained as the amount of movement and the rotation angle per unit time on the imaging surface 512a of the imaging sensor 512 of the imaging apparatus 1. In this case, it is assumed that, when a celestial body present at the point S1 (center C) moves along the track OB (an elliptical orbit, as shown in fig. 9) to the point S1' on the imaging surface 512a in a unit time, the amount of movement in the X direction is Δx, the amount of movement in the Y direction is Δy, and the rotation angle is α. The rotation angle α is the inclination angle of the tangent L at the point S1' on the ellipse of the celestial track OB, i.e., the angle between the ellipse tangent at the point S1 and the ellipse tangent at the point S1'. It should be noted that, even though the orbit OB of the celestial body is described as an ellipse, the ellipse conceptually includes a perfect circle, as described above.
Here, as shown in fig. 9, when the celestial body at the point S1 moves along the track OB (elliptical track as shown in fig. 9), if the movement from the point S1 to the point S1' is tracked assuming that the point S1 is the center C of the imaging surface 512a, the driving unit 513 may move the imaging sensor 512 such that the center C moves by the amounts Δx and Δy. Further, when another celestial body exists at a point S2 around the point S1, the other celestial body moves from the point S2 to the point S2'. To track the celestial body at point S2, the drive unit 513 may rotate the imaging sensor 512 about the center C by a rotation angle α.
Here, assuming that the focal length detected by the focal length detector 511a is f and the rotation angle of the celestial body around the celestial north pole NCP is Φ, the amounts of motion x and y of the celestial body are represented by expressions (1) and (2), as disclosed in Japanese laid-open patent application 2014-209795 (the known document).
x = f×sinθ×sinΦ (1)
y = f×sinθ×cosθ×(1-cosΦ) (2)
In the ellipse of the XY coordinate system shown in fig. 10, the equation of the tangent L to the ellipse at a point K on the ellipse is represented by expression (3), as disclosed in the above-mentioned known document.
x0×(x/a²) + y0×(y/b²) = 1 (3)
When the equation of the tangent line L is solved for y, it is expressed by expression (4).
y = -(b²×x0)/(a²×y0)×x + b²/y0 (4)
The angle between the tangent line L of the ellipse and the X axis is the rotation angle α of the image with the center C of the imaging sensor 512 as the rotation center. In fig. 9, assuming that the coordinates of the point S1' are (x0, y0), since the slope of the tangent line L at the point S1' of the elliptical orbit OB is -(b²×x0)/(a²×y0), the rotation angle α to be obtained is expressed by expression (5).
α = arctan{-(b²×x0)/(a²×y0)} (5)
Further, the angle γ in fig. 8 is represented by expression (6) based on the tangent theorem or the like, as disclosed in the above-mentioned known document.
γ=arctan[cos(ε)×sin(A)/{sin(ε)×cos(h)-cos(ε)×sin(h)×cos(A)}] (6)
The amounts of motion x and y of the celestial body are converted into amounts of motion Δx and Δy in coordinates on the imaging surface 512a by using the angle γ obtained by the expression (6), and the amounts of motion Δx and Δy are represented by expressions (7) and (8) as disclosed in the above-mentioned known document.
Δx=x×cos(γ)+y×sin(γ) (7)
Δy=x×sin(γ)+y×cos(γ) (8)
Further, as shown in fig. 7 (d), when the imaging sensor 512 of the imaging apparatus 1 is tilted (rotated) from the horizontal plane by the rotation angle ζ about the optical axis OA of the lens 511, the amounts of movement Δx and Δy represented by expressions (7) and (8) can be compensated by expressions (9) and (10), respectively.
Δx = x×cos(γ+ζ) + y×sin(γ+ζ) (9)
Δy = x×sin(γ+ζ) + y×cos(γ+ζ) (10)
As described above, the imaging apparatus 1 can calculate the amounts of motion Δx and Δy of the celestial body and the rotation angle α by using the latitude ε detected by the positioning signal receiver 509, the azimuth A detected by the geomagnetic sensor 516, the elevation angle h and the rotation angle ζ detected by the attitude sensor 517, and the focal length f detected by the focal length detector 511a. Then, the imaging apparatus 1 performs exposure while performing parallel motion control and rotational motion control on the imaging sensor 512 using the driving unit 513, based on the calculated amounts of motion Δx and Δy and the rotation angle α, in accordance with the celestial motion, thereby realizing tracking shooting of the celestial body. In this case, the posture of the imaging apparatus 1 can remain fixed.
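Collecting expressions (5), (6), (9), and (10), the per-unit-time control quantities can be computed from the detected physical quantities roughly as follows. This is a minimal sketch: the function names are ours, and it assumes the apparent-orbit semi-axes a and b and the point (x0, y0) on the imaging surface have already been determined (the document derives the motion amounts from θ, Φ, and f; how a, b, and (x0, y0) are obtained is not spelled out here).

    import math

    def rotation_angle_alpha(a: float, b: float, x0: float, y0: float) -> float:
        """Expression (5): angle between the ellipse tangent at (x0, y0) and
        the X axis. Assumes y0 != 0; a and b are the semi-axes of the
        apparent elliptical orbit."""
        return math.atan(-(b ** 2 * x0) / (a ** 2 * y0))

    def gamma_angle(eps: float, A: float, h: float) -> float:
        """Expression (6); all angles in radians. atan2 is used here to keep
        the correct quadrant, whereas the text writes arctan."""
        num = math.cos(eps) * math.sin(A)
        den = math.sin(eps) * math.cos(h) - math.cos(eps) * math.sin(h) * math.cos(A)
        return math.atan2(num, den)

    def sensor_motion(x: float, y: float, eps: float, A: float, h: float, zeta: float):
        """Expressions (9) and (10): convert the celestial motion amounts
        (x, y) into the motion amounts (dx, dy) on the imaging surface,
        compensated by the rotation angle zeta of the imaging sensor from
        the horizontal plane."""
        g = gamma_angle(eps, A, h) + zeta
        dx = x * math.cos(g) + y * math.sin(g)
        dy = x * math.sin(g) + y * math.cos(g)
        return dx, dy

The drive control module would then command the drive unit with (dx, dy) and α per unit time; the actual hardware interface is device-specific and not described in the document.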
On the other hand, as described above, the movable range of the imaging sensor 512 realized by the driving unit 513 has mechanical limits, and because of these limits, the exposure time is also limited. Assuming that the mechanical limit value in the X direction is Lx, the mechanical limit value in the Y direction is Ly, and the mechanical limit value of rotation is Lα, the time Tlimit at which each of these limit values is reached can be calculated by using the mechanical limit values Lx, Ly, and Lα together with expressions (5), (9), and (10). Assuming that the times at which the amounts of movement Δx and Δy and the rotation angle α reach their respective limits are Tlimit(Δx), Tlimit(Δy), and Tlimit(α), the minimum value among the three times Tlimit(Δx), Tlimit(Δy), and Tlimit(α) is the maximum exposure time of the target celestial body.
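A minimal sketch of the limit calculation, under the simplifying assumption that the per-second rates of Δx, Δy, and α are approximately constant over the exposure (the document's Tlimit via expressions (5), (9), and (10) need not be linear in time); all numeric values are assumed examples:

    def max_exposure_time(rate_dx: float, rate_dy: float, rate_alpha: float,
                          limit_x: float, limit_y: float, limit_alpha: float) -> float:
        """Minimum of Tlimit(dx), Tlimit(dy), and Tlimit(alpha).

        rate_* are per-second magnitudes; limit_* are the mechanical limit
        values Lx, Ly, and L-alpha of the movable stage."""
        times = []
        for rate, limit in ((rate_dx, limit_x), (rate_dy, limit_y),
                            (rate_alpha, limit_alpha)):
            if rate != 0:
                times.append(limit / abs(rate))
        return min(times) if times else float("inf")

    # Assumed example: 1.75 um/s and 0.9 um/s drift, 0.0002 rad/s rotation,
    # +-0.5 mm stage travel and 0.05 rad rotation range:
    print(max_exposure_time(0.00175, 0.0009, 0.0002, 0.5, 0.5, 0.05))  # -> 250.0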
Configuration and operation of imaging device functional blocks
Fig. 11 is a schematic diagram showing an example of a functional block configuration of the imaging apparatus according to the embodiment. Fig. 12 is a schematic diagram illustrating an operation of obtaining the maximum exposure time of other areas by interpolating the maximum exposure times of sampling points within a frame in the imaging apparatus according to the embodiment. Fig. 13 is a schematic diagram showing an example of a display mode of the intra-frame maximum exposure time and a framing operation of the imaging apparatus according to the embodiment. The configuration and operation of the functional blocks of the imaging apparatus 1 according to the present embodiment will be described with reference to figs. 11 to 13.
As shown in fig. 11, the imaging apparatus 1 includes a first acquisition module 101, a second acquisition module 102, a motion amount calculation module 103, an exposure time calculation module 104, a display control module 105, a drive control module 106, a setting module 107, and a storage unit 108.
The first acquisition module 101 is a processing module configured to acquire a captured image captured by the imaging sensor 512 and image-processed by the image processing unit 514. The first acquisition module 101 outputs the acquired captured image to the display control module 105.
The second acquisition module 102 is a processing module configured to acquire physical quantities related to at least the position and posture of the imaging device 1, which are detected by various sensors (detection units). Specifically, the second acquisition module 102 acquires the latitude ε detected by the positioning signal receiver 509, the azimuth A detected by the geomagnetic sensor 516, the elevation h and the rotation angle ζ detected by the attitude sensor 517, and the focal length f detected by the focal length detector 511 a. The second acquisition module 102 outputs the acquired physical quantity to the motion amount calculation module 103.
The motion amount calculation module 103 is a processing module configured to calculate the amounts of motion Δx and Δy and the rotation angle α of each point (each sampling point SP shown in fig. 12, described later) included in the frame captured by the imaging sensor 512 by the above-described expressions (5), (9), and (10), using the latitude ε, the azimuth A, the elevation angle h, the rotation angle ζ, and the focal length f acquired by the second acquisition module 102. The motion amount calculation module 103 outputs the calculated amounts of motion Δx and Δy and the rotation angle α to the exposure time calculation module 104 and the drive control module 106. It should be noted that the motion amount calculation module 103 corresponds to a "first calculation module".
The exposure time calculation module 104 is a processing module configured to calculate, by using the amounts of motion Δx and Δy and the rotation angle α calculated by the motion amount calculation module 103, the maximum exposure time of each point (each celestial body) included in the frame photographed by the imaging sensor 512, based on the mechanical limits of the movable range of the imaging sensor 512. Specifically, as shown in fig. 12, the exposure time calculation module 104 first calculates the maximum exposure time of each sampling point SP in the frame F3 photographed by the imaging sensor 512 by using the amounts of motion Δx and Δy and the rotation angle α corresponding to the respective sampling points SP arranged in a grid pattern. Then, the exposure time calculation module 104 performs interpolation processing by using the maximum exposure times of the sampling points SP, thereby calculating the maximum exposure times of points other than the sampling points SP in the frame F3. In other words, the exposure time calculation module 104 does not obtain the maximum exposure times of all points included in the frame F3 from the amounts of motion Δx and Δy and the rotation angle α corresponding to all points. Instead, the exposure time calculation module 104 obtains the maximum exposure time (first maximum exposure time) of each sampling point SP based only on the amounts of motion Δx and Δy and the rotation angle α of the corresponding sampling point SP, and calculates the maximum exposure time (second maximum exposure time) corresponding to each other point by interpolation processing. Therefore, the calculation loads of the motion amount calculation module 103 and the exposure time calculation module 104 are reduced. The exposure time calculation module 104 outputs the calculated maximum exposure times to the display control module 105 and the drive control module 106. It should be noted that the exposure time calculation module 104 corresponds to a "second calculation module". Further, the sampling points SP correspond to "predetermined plural points". Further, the motion amount calculation module 103 and the exposure time calculation module 104 correspond to "calculation modules".
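A sketch of the sampling-and-interpolation step; the grid pitch and the bilinear scheme are assumptions, since the document only states that interpolation over the sampling points SP is used:

    import numpy as np

    def interpolate_max_exposure(grid_times: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
        """Bilinearly upsample the coarse grid of per-sampling-point maximum
        exposure times (first maximum exposure times) to a dense per-pixel
        map (second maximum exposure times)."""
        gh, gw = grid_times.shape
        ys = np.linspace(0.0, gh - 1.0, out_h)
        xs = np.linspace(0.0, gw - 1.0, out_w)
        y0 = np.clip(np.floor(ys).astype(int), 0, gh - 2)
        x0 = np.clip(np.floor(xs).astype(int), 0, gw - 2)
        wy = (ys - y0)[:, None]   # fractional offsets, shape (out_h, 1)
        wx = (xs - x0)[None, :]   # fractional offsets, shape (1, out_w)
        g = grid_times
        top = g[y0][:, x0] * (1 - wx) + g[y0][:, x0 + 1] * wx
        bot = g[y0 + 1][:, x0] * (1 - wx) + g[y0 + 1][:, x0 + 1] * wx
        return top * (1 - wy) + bot * wy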
The display control module 105 is a processing module configured to control the display operation of the display 506. When the user performs framing by changing the imaging direction of the imaging apparatus 1, the display control module 105 causes the display 506 to display the captured image acquired by the first acquisition module 101 with the information on the maximum exposure time calculated by the exposure time calculation module 104 superimposed on the captured image. For example, as shown in fig. 13 (a), assuming that there are a region with a maximum exposure time of "10 seconds", a region with a maximum exposure time of "20 seconds", and a region with a maximum exposure time of "30 seconds", the display control module 105 displays the boundaries between the regions corresponding to predetermined ranges of the maximum exposure time and displays the maximum exposure time representing each range (for example, "10 seconds", "20 seconds", and "30 seconds" in fig. 13 (a)), so that the distribution of the maximum exposure times in the frame (captured image) can be recognized. In this case, for example, the "10 seconds" region may indicate a region with a maximum exposure time of not more than 10 seconds, the "20 seconds" region a region with a maximum exposure time of more than 10 seconds and less than 20 seconds, and the "30 seconds" region a region with a maximum exposure time of more than 20 seconds and less than 30 seconds. Thus, the user can recognize the distribution of the maximum exposure times of the points (celestial bodies) in the frame determined by the imaging direction of the imaging apparatus 1. In this case, when the exposure time (specified exposure time) set by the setting module 107 is, for example, 15 seconds, tracking shooting with an exposure of the specified exposure time cannot be performed in the "10 seconds" region; the user can therefore remove the "10 seconds" region from the frame by framing toward the arrow shown in fig. 13 (b) (changing the imaging direction of the imaging apparatus 1), thereby determining a frame in which tracking shooting can be performed with an exposure of the specified exposure time.
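The region display in fig. 13 can be produced by quantizing the per-pixel map into bands and tracing the band boundaries; a sketch using the 10/20/30-second bands of the example above (the band edges come from the figure, the rest is an assumption):

    import numpy as np

    def exposure_regions(max_exposure_map: np.ndarray) -> np.ndarray:
        """Label each pixel with its representative band: 10, 20, or 30 seconds.

        Boundary handling and the clipping of values above 30 s into the
        '30 seconds' band are simplifications for illustration."""
        bands = np.array([10.0, 20.0, 30.0])
        idx = np.searchsorted(bands, max_exposure_map, side="left")
        return bands[np.clip(idx, 0, len(bands) - 1)]

    def region_boundaries(labels: np.ndarray) -> np.ndarray:
        """True where a pixel's band differs from its right or lower neighbour,
        i.e. where a boundary line between regions should be drawn."""
        edge = np.zeros(labels.shape, dtype=bool)
        edge[:, :-1] |= labels[:, :-1] != labels[:, 1:]
        edge[:-1, :] |= labels[:-1, :] != labels[1:, :]
        return edge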
It should be noted that the information about the maximum exposure time superimposed on the captured image is not limited to the display shown in fig. 13. For example, the display control module 105 may display the maximum exposure time corresponding to a boundary between regions superimposed on that boundary. In this case, for example, when the maximum exposure time superimposed on the displayed boundary is "20 seconds", it can be understood that one side of the boundary is a region where the maximum exposure time is less than 20 seconds, and the other side is a region where it is more than 20 seconds. Alternatively, the display control module 105 may give transparency to a grayscale image, represented by the tone of a specific color according to the magnitude of the maximum exposure time, and superimpose the grayscale image on the captured image. In this case, the display control module 105 may switch the image displayed on the display 506 between the captured image and the grayscale image according to an operation on the touch panel 505. Therefore, the distribution of the maximum exposure time can be intuitively grasped from the tone of the specific color, so that appropriate framing can easily be performed.
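The grayscale-overlay variant can be realized with simple alpha blending of a tone-mapped exposure map over the captured image; a sketch in which the hue choice (blue) and the transparency value are assumptions:

    import numpy as np

    def overlay_exposure_map(image: np.ndarray, max_exposure_map: np.ndarray,
                             alpha: float = 0.4) -> np.ndarray:
        """Blend a single-hue tone map of the maximum exposure time over an
        RGB uint8 image; a longer maximum exposure gives a stronger tone.
        Assumes max_exposure_map has at least one positive value."""
        t = max_exposure_map / max_exposure_map.max()   # normalize to [0, 1]
        tint = np.zeros_like(image, dtype=np.float32)
        tint[..., 2] = 255.0 * t                        # blue channel tone
        out = (1 - alpha) * image.astype(np.float32) + alpha * tint
        return out.astype(np.uint8)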
The drive control module 106 is a processing module configured to control the drive unit 513 via the imaging control unit 515 so as to perform, on the imaging sensor 512 mounted in the drive unit 513, parallel motion control in a direction orthogonal to the optical axis OA and rotational motion control about an axis parallel to the optical axis OA. Further, the drive control module 106 controls the imaging operation of the imaging sensor 512 via the image processing unit 514. Further, when a photographing operation related to tracking shooting is started, the drive control module 106 performs parallel motion control and rotational motion control on the imaging sensor 512 by using the drive unit 513 in accordance with the celestial motion, based on the amounts of motion Δx and Δy and the rotation angle α calculated by the motion amount calculation module 103, and performs exposure for the specified exposure time set by the setting module 107. In other words, the drive control module 106 performs control to change the positional relationship between the imaging sensor 512 and the lens 511, based on the amounts of movement Δx and Δy and the rotation angle α, in accordance with the movement of the celestial body, so that the position on the imaging surface 512a of the celestial body, i.e., the subject contained in the captured image, is unchanged.
The setting module 107 is a processing module configured to set the operation mode of the imaging apparatus 1 and various types of information according to operations on the touch panel 505. For example, the setting module 107 sets the exposure time used by the imaging sensor 512, and causes the storage unit 108 to store the set exposure time as the specified exposure time. Further, the setting module 107 sets the tracking shooting mode or the normal shooting mode as the operation mode. The tracking shooting mode is a mode in which shooting is performed by exposure using the set specified exposure time while the imaging sensor 512 is moved to track the movement of the celestial body as described above. The normal shooting mode is a mode in which shooting is performed by exposure using the specified exposure time with the imaging sensor 512 fixed. The setting module 107 causes the storage unit 108 to store the set operation mode. Further, the setting module 107 outputs the set specified exposure time to the drive control module 106.
The first acquisition module 101, the second acquisition module 102, the motion amount calculation module 103, the exposure time calculation module 104, the display control module 105, the drive control module 106, and the setting module 107 of the imaging apparatus 1 shown in fig. 11 are realized by the CPU 501 shown in fig. 1 executing a program. It should be noted that at least one of the first acquisition module 101, the second acquisition module 102, the motion amount calculation module 103, the exposure time calculation module 104, the display control module 105, the drive control module 106, and the setting module 107 may be implemented by hardware such as an integrated circuit.
The storage unit 108 is a unit configured to store programs of the imaging apparatus 1, various types of setting information set by the setting module 107, and the like. For example, the storage unit 108 is implemented by the RAM 503 or the flash memory 504 shown in fig. 1.
It should be noted that the processing modules of the imaging apparatus 1 shown in fig. 11 conceptually indicate functions, and the configuration is not limited to the one shown. For example, a plurality of processing modules that are independent of each other in fig. 11 may be configured as one processing module. Conversely, the function of one processing module shown in fig. 11 may be divided into two or more, and configured as a plurality of processing modules.
Fig. 14 is a flowchart showing an example of the flow of the maximum exposure time display processing and the tracking shooting processing of the imaging apparatus according to the present embodiment. Fig. 15 is a schematic diagram showing an example of an operation of displaying an area that cannot be compensated within a frame photographed by the imaging apparatus according to the embodiment. The flow of the maximum exposure time display processing and the tracking shooting processing of the imaging apparatus 1 according to the present embodiment will be described with reference to figs. 14 and 15. It is assumed that the operation mode has been set in advance to the tracking shooting mode by the setting module 107.
Operation of step S11
The drive control module 106 of the imaging apparatus 1 performs initialization processing on the drive unit 513 via the imaging control unit 515. For example, the drive control module 106 returns the position of the imaging sensor 512 to its original position by performing the initialization processing on the drive unit 513. Then, the process proceeds to step S12.
Operation of step S12
The first acquisition module 101 of the imaging apparatus 1 acquires a captured image that is photographed by the imaging sensor 512 and on which image processing is performed by the image processing unit 514. In this case, the captured image taken by the imaging sensor 512 may be stored in the storage unit 108, and the first acquisition module 101 may acquire the captured image stored in the storage unit 108. The first acquisition module 101 outputs the acquired captured image to the display control module 105. Then, the process proceeds to step S13.
Operation of step S13
The second acquisition module 102 of the imaging apparatus 1 acquires the latitude epsilon detected by the positioning signal receiver 509, the azimuth a detected by the geomagnetic sensor 516, the elevation h and the rotation angle ζ detected by the attitude sensor 517, and the focal length f detected by the focal length detector 511 a. The second acquisition module 102 outputs the acquired physical quantity to the motion amount calculation module 103. Then, the process proceeds to step S14.
Operation of step S14
The movement amount calculation module 103 of the imaging apparatus 1 calculates the movement amounts Δx and Δy and the rotation angle α of each sampling point SP included in the frame photographed by the imaging sensor 512 by the above-described expressions (5), (9) and (10) using the latitude epsilon, the azimuth angle a, the elevation angle h, the rotation angle ζ and the focal length f acquired by the second acquisition module 102. The motion amount calculation module 103 outputs the calculated motion amounts Δx and Δy and the rotation angle α to the exposure time calculation module 104.
Next, the exposure time calculation module 104 of the imaging apparatus 1 calculates the maximum exposure time of each sampling point SP by using the amounts of motion Δx and Δy and the rotation angle α corresponding to the respective sampling points SP arranged in a grid pattern in the frame photographed by the imaging sensor 512. Then, the exposure time calculation module 104 calculates the maximum exposure time of points other than the sampling point SP in the frame by interpolation processing using the maximum exposure time of the sampling point SP. The exposure time calculation module 104 outputs the calculated maximum exposure time to the display control module 105. Then, the process proceeds to step S15.
Operation of step S15
The display control module 105 of the imaging device 1 causes the display 506 to display the captured image acquired by the first acquisition module 101 and the information on the maximum exposure time calculated by the exposure time calculation module 104 to superimpose the information on the captured image. Thus, the user can recognize the distribution of the maximum exposure time of the points (celestial bodies) in the frame determined by the imaging direction of the imaging device 1. Then, the process proceeds to step S16.
Operation of step S16
The user checks the captured image displayed on the display 506 and the information about the maximum exposure time superimposed on it, and changes the imaging direction of the imaging apparatus 1 to adjust the framing to the desired distribution of the maximum exposure time. Then, when shooting in the tracking shooting mode is started in response to an operation on the touch panel 505 (step S16: Yes), the drive control module 106 sets the imaging sensor 512 to the exposure state, and the process proceeds to step S17. On the other hand, when the user continues to adjust the framing by changing the imaging direction of the imaging apparatus 1 (step S16: No), the process returns to step S12.
Operation of step S17
The motion amount calculation module 103 calculates the motion amounts Δx and Δy and the rotation angle α of each sampling point SP included in the frame photographed by the imaging sensor 512 by the above-described expressions (5), (9), and (10) by using the latitude epsilon, the azimuth a, the elevation h, the rotation angle ζ, and the focal length f acquired by the second acquisition module 102. The motion amount calculation module 103 outputs the calculated motion amounts Δx and Δy and the rotation angle α to the exposure time calculation module 104 and the drive control module 106. Then, the process proceeds to step S18.
Operation of step S18
The drive control module 106 of the imaging apparatus 1 performs parallel motion control and rotational motion control on the imaging sensor 512 based on the amounts of motion Δx and Δy and the rotation angle α calculated by the motion amount calculation module 103 by the drive unit 513 in accordance with the motion of the celestial body. Then, the process proceeds to step S19.
Operation of step S19
When the tracking shooting is started and the specified exposure time has elapsed (step S19: Yes), the process proceeds to step S20. When the specified exposure time has not elapsed (step S19: No), the process returns to step S16. It should be noted that the drive control module 106 may terminate the tracking shooting when the amount of motion of the imaging sensor 512 reaches a mechanical limit before the specified exposure time has elapsed.
Operation of step S20
The drive control module 106 terminates the tracking shooting. Then, the process proceeds to step S21.
Operation of step S21
The first acquisition module 101 acquires a captured image taken by the imaging sensor 512 in a state where the imaging sensor is exposed for a specified exposure time. Then, the process proceeds to step S22.
Operation of step S22
The first acquisition module 101 stores the acquired captured image in the storage unit 108, and outputs the captured image to the display control module 105. The display control module 105 causes the display 506 to display the captured image acquired by the first acquisition module 101. Thus, the user can check, on the display 506, an image in which the celestial bodies (stars) in the sky are captured as point images in the tracking shooting mode.
The maximum exposure time display processing and the tracking shooting processing of the imaging apparatus 1 are performed by the flow of steps S11 to S22.
Further, in step S15, as shown in fig. 15, when the display control module 105 superimposes the information on the maximum exposure time on the captured image of the frame F4 on the display 506, if the frame F4 contains an area in which tracking shooting with the specified exposure time set by the setting module 107 cannot be performed, the display control module 105 may display a transparent specific color over that area to highlight it. For example, when the setting module 107 sets 11 seconds as the specified exposure time, tracking shooting for 11 seconds cannot be performed in the "10 seconds" area in the frame F4, so the display control module 105 displays a transparent specific color over the "10 seconds" area, as shown in fig. 15. In this case, as shown in fig. 15, the user can, for example, tap the "10 seconds" display portion in the "10 seconds" area, whereupon the setting module 107 sets the specified exposure time to the minimum value of the maximum exposure times of the "10 seconds" area. Accordingly, the user can not only perform framing so that tracking shooting can be performed within the set specified exposure time, but also set the specified exposure time so that tracking shooting can be performed within a specific frame.
In the tracking shooting operation described above, the imaging sensor 512 is physically translated and rotated under the drive control of the drive unit 513, but the embodiment is not limited thereto. For example, a combination of an image blur compensation apparatus, which includes an image blur compensation lens in the lens 511 that moves the position of the subject on the imaging sensor 512, and the driving unit 513, which rotates the imaging sensor 512, is also capable of photographing a celestial body as a point image by tracking shooting. Even in this case, the drive control module 106 performs control to change the positional relationship between the imaging sensor 512 and the lens 511, based on the amounts of movement Δx and Δy and the rotation angle α, in accordance with the movement of the celestial body, so that the position on the imaging surface 512a of the celestial body, i.e., the subject contained in the captured image, is unchanged.
Operation in normal shooting mode
Fig. 16 is a schematic diagram showing an example of a display mode of a length of a photographable star track within a frame photographed by the imaging device according to the embodiment. The operation of the imaging apparatus 1 of the present embodiment in the normal shooting mode will be described with reference to fig. 16.
The first acquisition module 101 acquires a captured image that is photographed by the imaging sensor 512 and on which image processing is performed by the image processing unit 514. The first acquisition module 101 outputs the acquired captured image to the display control module 105.
The second acquisition module 102 acquires the latitude epsilon detected by the positioning signal receiver 509, the azimuth a detected by the geomagnetic sensor 516, the elevation angle h and the rotation angle ζ detected by the attitude sensor 517, and the focal length f detected by the focal length detector 511 a. The second acquisition module 102 outputs the acquired physical quantity to the motion amount calculation module 103.
The motion amount calculation module 103 calculates the amounts of motion Δx and Δy and the rotation angle α of the central celestial body contained in the frame captured by the imaging sensor 512 by the above-described expressions (5), (9), and (10), using the latitude ε, the azimuth A, the elevation angle h, the rotation angle ζ, and the focal length f acquired by the second acquisition module 102. Further, based on the calculated amounts of motion Δx and Δy and the rotation angle α, the motion amount calculation module 103 calculates the star track length that will be drawn when exposure is performed for the specified exposure time set by the setting module 107. It should be noted that the amounts of motion Δx and Δy and the rotation angle α calculated by the motion amount calculation module 103 are not limited to those of the central celestial body in the frame (captured image). The motion amount calculation module 103 outputs the calculated star track length to the display control module 105.
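Assuming the per-second motion amounts are approximately constant over the specified exposure time (reasonable for the exposure times considered here), the star track length is simply the image-plane speed multiplied by the exposure time; a minimal sketch with assumed values:

    import math

    def star_track_length(dx_per_s: float, dy_per_s: float, exposure_s: float) -> float:
        """Approximate track length drawn on the imaging surface by the central
        celestial body during the specified exposure time (same unit as dx/dy)."""
        return math.hypot(dx_per_s, dy_per_s) * exposure_s

    # Assumed rates: a frame nearer the celestial north pole (cf. frame F6 in
    # fig. 16) yields a shorter track than one farther away (frame F5):
    print(star_track_length(0.00175, 0.0005, 30.0))  # ~0.055 mm
    print(star_track_length(0.0004, 0.0001, 30.0))   # ~0.012 mm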
The display control module 105 causes the display 506 to display the captured image acquired by the first acquisition module 101, and displays the star trail length calculated by the motion amount calculation module 103 in a predetermined area of the captured image. For example, when the frame F5 is captured by the imaging sensor 512, as shown in fig. 16, the display control module 105 causes the display 506 to display the captured image of the frame F5 and displays, in the display area AR5, the star trail length of the central celestial body in the frame F5 calculated by the motion amount calculation module 103. Further, as shown in fig. 16, when the imaging sensor 512 captures a frame F6 framed in an area closer to the celestial north pole than the frame F5, the display control module 105 causes the display 506 to display the captured image of the frame F6 and displays, in the display area AR6, the star trail length of the central celestial body in the frame F6 (shorter than the star trail length of the frame F5) calculated by the motion amount calculation module 103.
In this way, when capturing the trail of a celestial body (star) by exposing for the specified exposure time in the normal shooting mode, the user can check in advance how long the star trail to be captured will be in the imaging direction in which framing is performed.
As described above, in the imaging apparatus 1 according to the present embodiment: the first acquisition module 101 acquires a captured image of a celestial body photographed by the imaging sensor 512; the second acquisition module 102 acquires physical quantities, detected by the plurality of sensors, relating to at least the position and posture of the imaging apparatus 1; the motion amount calculation module 103 calculates, based on the physical quantities acquired by the second acquisition module 102, the amounts of motion Δx and Δy and the rotation angle α, on the imaging surface 512a of the imaging sensor 512, of the celestial body contained in the captured image; the exposure time calculation module 104 calculates, based on the amounts of motion Δx and Δy, the rotation angle α, and the limitation on the variation range of the positional relationship between the imaging sensor 512 and the lens 511, the maximum exposure time for which the celestial body contained in the captured image can be tracked by changing that positional relationship while the imaging sensor 512 performs exposure; and the display control module 105 causes the display 506 to display the captured image with the information about the maximum exposure time superimposed on it. In this case, for example, the display control module 105 displays, as the information about the maximum exposure time, the boundaries between areas corresponding to predetermined ranges of the maximum exposure time, together with a maximum exposure time representative of each range. Accordingly, the user can check the maximum exposure time, perform framing that obtains a desired maximum exposure time, and photograph the celestial body as a point image within the user's intended exposure time.
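The overall flow summarized above can be pictured as follows. The module objects and method names in this sketch are assumptions chosen for readability, not the actual interfaces of the apparatus.

    def update_preview(first_acq, second_acq, motion_calc, exposure_calc, display_ctrl):
        image = first_acq.acquire_captured_image()           # from imaging sensor 512
        physical = second_acq.acquire_physical_quantities()  # latitude, azimuth, elevation, ...
        # Motion of the celestial body on the imaging surface 512a.
        dx, dy, alpha = motion_calc.calculate(physical)
        # Maximum exposure time, limited by how far the positional relationship
        # between the imaging sensor 512 and the lens 511 may vary.
        max_exposure = exposure_calc.calculate(dx, dy, alpha)
        display_ctrl.show(image, overlay=max_exposure)       # superimposed on display 506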
The drive control module 106 changes the positional relationship between the imaging sensor 512 and the lens 511 based on the amounts of motion Δx and Δy and the rotation angle α so that the position on the imaging surface 512a of each celestial body contained in the captured image is unchanged. For example, based on the amounts of motion Δx and Δy and the rotation angle α, the drive control module 106 performs parallel motion control on the imaging sensor 512 in a direction orthogonal to the optical axis OA of the lens 511, and performs rotational motion control on the imaging sensor 512 about an axis parallel to the optical axis OA. Accordingly, the diurnal motion of the celestial body can be compensated for, and the celestial body can be photographed as a point image.
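A minimal sketch of this compensation control follows, assuming a hypothetical sensor actuator API; the signs of the commanded motions depend on the coordinate conventions of the expressions given earlier.

    def compensate(sensor_actuator, delta_x, delta_y, alpha):
        # Parallel motion control: translate the imaging sensor 512 in the
        # plane orthogonal to the optical axis OA by the motion amounts.
        sensor_actuator.translate(delta_x, delta_y)
        # Rotational motion control: rotate the sensor about an axis parallel
        # to OA so that the field rotation alpha is followed and each star
        # stays at a fixed position on the imaging surface 512a.
        sensor_actuator.rotate(alpha)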
Further, among the maximum exposure times in the captured image, the exposure time calculation module 104 calculates the first maximum exposure times corresponding to the predetermined sampling points SP in the captured image, and calculates the second maximum exposure time of areas other than the sampling points SP in the captured image by interpolating the first maximum exposure times corresponding to the sampling points SP. The calculation load can therefore be reduced.
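One natural realization of this interpolation, assuming the sampling points SP lie on a regular grid, is bilinear interpolation. The grid layout itself is an assumption; the disclosure only states that values between sampling points are obtained by interpolation processing.

    import math

    def interpolate_max_exposure(grid, x, y):
        # grid maps integer sampling-point coordinates (i, j) to the first
        # maximum exposure times; (x, y) is an arbitrary image position.
        i, j = math.floor(x), math.floor(y)
        fx, fy = x - i, y - j
        t00 = grid[(i, j)]
        t10 = grid[(i + 1, j)]
        t01 = grid[(i, j + 1)]
        t11 = grid[(i + 1, j + 1)]
        # Blend horizontally, then vertically, to obtain the second
        # maximum exposure time at (x, y).
        top = t00 * (1 - fx) + t10 * fx
        bottom = t01 * (1 - fx) + t11 * fx
        return top * (1 - fy) + bottom * fy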
It should be noted that each of the functions of the above-described embodiments may be implemented by one or more processing circuits. Here, the term "processing circuit" includes a processor programmed to perform each function by software, such as a processor implemented by an electronic circuit, as well as devices designed to perform the above-described functions, such as ASICs (application-specific integrated circuits), DSPs (digital signal processors), FPGAs (field-programmable gate arrays), and conventional circuit modules.
The program executed by the imaging apparatus 1 according to the above-described embodiment may be configured to be provided by being incorporated in advance in a ROM or the like.
Further, the program executed by the imaging apparatus 1 according to the above-described embodiment may be configured to be provided as a computer program product recorded, in an installable or executable file format, on a computer-readable recording medium such as a CD-ROM (compact disc read-only memory), a flexible disk (FD), a CD-R (compact disc recordable), or a DVD (digital versatile disc).
Further, the program executed by the imaging apparatus 1 according to the above-described embodiment may be configured to be provided by being stored on a computer connected to a network (such as the internet) and downloaded through the network. Further, the program executed by the imaging apparatus 1 according to the above-described embodiment may be configured to be provided or distributed through a network (such as the internet).
Further, the program executed by the imaging apparatus 1 according to the above-described embodiment has a module configuration including the above-described functional units. In terms of actual hardware, the CPU (processor) reads the program from the ROM and executes it, whereby the above-described functional units are loaded onto and generated in the main memory.
[Reference Signs List]
1 Imaging apparatus
101 First acquisition module
102 Second acquisition module
103 Motion amount calculation module
104 Exposure time calculation module
105 Display control module
106 Drive control module
107 Setting module
108 Storage unit
501 CPU
502 ROM
503 RAM
504 Flash memory
505 Touch panel
506 Display
507 External device connection I/F
508 Communication circuit
508A Antenna
509 Positioning signal receiver
511 Lens
511a Focal length detector
512 Imaging sensor
512a Imaging surface
513 Driving unit
514 Image processing unit
515 Imaging control unit
516 Geomagnetic sensor
517 Attitude sensor
518 Acceleration sensor
520 Bus line
A Azimuth angle
AR5, AR6 Display area
C Center
CE Celestial equator
CS Celestial sphere
E Equator
ER Earth
F1-F6 Frame
h Elevation angle
H Hour angle
N Point
NCP Celestial north pole
NP North pole
O Shooting point
OA Optical axis
OB Track
S Celestial body
SP Sampling point
Z Zenith
γ Angle
δ Declination
ε Latitude
θ Angle
ζ Rotation angle

Claims (15)

1. An imaging apparatus comprising:
a first acquisition module configured to acquire a captured image of a celestial body photographed by an imaging unit;
a second acquisition module configured to acquire physical quantities, detected by a plurality of detectors, relating to at least a position and an attitude of the imaging apparatus;
a calculation module configured to calculate, based on the physical quantities of the imaging apparatus, an amount of motion and a rotation angle of a celestial body contained in the captured image, and to calculate a maximum exposure time for which the celestial body contained in the captured image can be tracked; and
a display control module configured to cause a display unit to display the captured image on which information about the maximum exposure time is superimposed.
2. The imaging apparatus according to claim 1, wherein the display control module is configured to display, as the information, boundaries between areas corresponding to predetermined ranges of the maximum exposure time, and a maximum exposure time representative of each of the predetermined ranges.
3. The imaging apparatus according to claim 1, further comprising:
a drive control module configured to change a positional relationship, based on the amount of motion and the rotation angle, such that a position on an imaging surface of each celestial body contained in the captured image is unchanged.
4. The imaging apparatus according to claim 3, wherein the drive control module is configured to perform parallel motion control on the imaging unit in a direction orthogonal to an optical axis of the optical system based on the amount of motion and the rotation angle, and perform rotational motion control on the imaging unit about an axis parallel to the optical axis to change the positional relationship.
5. The imaging apparatus according to claim 1, wherein the physical quantities include any one of: a latitude at which the imaging apparatus is located, an azimuth of an imaging direction of the imaging apparatus, an elevation angle between the imaging direction and a horizontal plane, a rotation angle about an optical axis of the imaging unit, and a focal length of an optical system.
6. The imaging device of claim 1, wherein the display control module is configured to highlight an area including the celestial body contained in the captured image.
7. The imaging apparatus according to claim 6, further comprising:
a setting module configured to set, as a specified exposure time, a minimum value of the maximum exposure time of the celestial body contained in the highlighted area, in response to an operation on the highlighted area via an operation unit.
8. The imaging device of claim 1, wherein the computing module is configured to: calculate, among the maximum exposure times in the captured image, first maximum exposure times corresponding to a predetermined plurality of points in the captured image; and calculate a second maximum exposure time of an area other than the plurality of points in the captured image by performing interpolation processing on the first maximum exposure times corresponding to the plurality of points.
9. The imaging apparatus according to claim 1, wherein,
the calculation module is configured to calculate, based on the calculated amount of motion and rotation angle, a star trail length to be drawn by the celestial body when the imaging unit performs exposure for a preset specified exposure time, and
the display control module is configured to cause the display unit to display the captured image and to display the star trail length in a predetermined area on the captured image.
10. The imaging apparatus according to claim 9, further comprising:
a setting module configured to set the specified exposure time in response to an operation via an operation unit.
11. The imaging apparatus according to claim 1, wherein the calculation module is configured to calculate the maximum exposure time based on a limitation of a range of variation of a positional relationship between the imaging unit and an optical system that condenses light on an imaging surface of the imaging unit.
12. An image processing method for an imaging apparatus, the method comprising:
acquiring a captured image of a celestial body photographed by an imaging unit;
acquiring physical quantities, detected by a plurality of detectors, relating to at least a position and an attitude of the imaging apparatus;
calculating, based on the physical quantities of the imaging apparatus, an amount of motion and a rotation angle of a celestial body contained in the captured image, and calculating a maximum exposure time for which the celestial body contained in the captured image can be tracked; and
causing a display unit to display the captured image on which information about the maximum exposure time is superimposed.
13. The method of claim 12, wherein the calculating comprises: the maximum exposure time is calculated based on a limit of a variation range of a positional relationship between the imaging unit and an optical system that condenses light on an imaging surface of the imaging unit.
14. A program that causes a computer to execute:
acquiring a captured image of a celestial body photographed by an imaging unit;
acquiring physical quantities, detected by a plurality of detectors, relating to at least a position and an attitude of the imaging apparatus;
calculating, based on the physical quantities of the imaging apparatus, an amount of motion and a rotation angle of a celestial body contained in the captured image, and calculating a maximum exposure time for which the celestial body contained in the captured image can be tracked; and
causing a display unit to display the captured image on which information about the maximum exposure time is superimposed.
15. The program of claim 14, wherein the calculating comprises: the maximum exposure time is calculated based on a limit of a variation range of a positional relationship between the imaging unit and an optical system that condenses light on an imaging surface of the imaging unit.
CN202180102019.5A 2021-09-03 2021-09-03 Imaging apparatus, image processing method, and program Pending CN117941367A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/116446 WO2023028989A1 (en) 2021-09-03 2021-09-03 Imaging device, image processing method, and program

Publications (1)

Publication Number Publication Date
CN117941367A (en) 2024-04-26

Family

ID=85411870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180102019.5A Pending CN117941367A (en) 2021-09-03 2021-09-03 Imaging apparatus, image processing method, and program

Country Status (2)

Country Link
CN (1) CN117941367A (en)
WO (1) WO2023028989A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5458802B2 (en) * 2008-10-23 2014-04-02 リコーイメージング株式会社 Digital camera
EP2566149A4 (en) * 2010-04-28 2014-03-12 Pentax Ricoh Imaging Co Ltd Automatic celestial-body tracking / image-capturing method and automatic celestial-body tracking / image-capturing camera
JP5751014B2 (en) * 2010-05-28 2015-07-22 リコーイメージング株式会社 Astronomical auto tracking imaging method and astronomical auto tracking imaging device
JP5751040B2 (en) * 2011-06-17 2015-07-22 リコーイメージング株式会社 Astronomical auto tracking imaging method and astronomical auto tracking imaging device
JP6080487B2 (en) * 2012-10-19 2017-02-15 キヤノン株式会社 Moving object detection apparatus and control method thereof
WO2015071174A1 (en) * 2013-11-18 2015-05-21 Tamiola Kamil Controlled long-exposure imaging of a celestial object
CN105407294A (en) * 2015-10-27 2016-03-16 努比亚技术有限公司 Image exposure method and mobile terminal

Also Published As

Publication number Publication date
WO2023028989A1 (en) 2023-03-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination