Embodiments
Embodiments of the present invention (hereinafter referred to as "embodiments") are described below with reference to the accompanying drawings.
(Embodiment 1)
Fig. 1 is a diagram of the configuration of the side facing the subject (front side) of an imaging apparatus, which is the display apparatus of Embodiment 1 of the present invention. Fig. 2 is a diagram of the configuration of the side facing the user (rear side) of the imaging apparatus of Embodiment 1. Fig. 3 is a block diagram of the configuration of the imaging apparatus of Embodiment 1.
As shown in Figs. 1 to 3, imaging apparatus 1 includes an imaging unit 2, a light emitting unit 3, a posture detection unit 4, a timer 5, an orientation detection unit 6, an operation input unit 7, a GPS signal receiving unit 8, a display unit 9, a touch panel 10, a storage unit 11, and a control unit 12.
Imaging unit 2 includes a lens unit 21, a lens driving unit 22, an aperture 23, an aperture driving unit 24, a shutter 25, a shutter driving unit 26, an imaging element 27, an imaging driving unit 28, and a signal processing unit 29.
Lens unit 21 is composed of a plurality of lens groups for focusing and zooming, and condenses light from a predetermined field of view.
Lens driving unit 22 is composed of a stepping motor or a DC motor, and changes the focus position, focal length, and the like of lens unit 21 by moving its lens groups along optical axis L1.
Aperture 23 adjusts the exposure by limiting the amount of incident light condensed by lens unit 21. Aperture driving unit 24 is composed of a stepping motor or the like, and drives aperture 23.
Shutter 25 sets imaging element 27 to an exposure state or a light-shielded state.
Shutter driving unit 26 is composed of a stepping motor or the like, and drives shutter 25 in accordance with a release signal input from operation input unit 7.
Imaging element 27 is composed of a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like. Imaging element 27 receives the light condensed by lens unit 21 and converts it into an electric signal (analog signal) by photoelectric conversion.
Imaging driving unit 28 generates the timing pulses that drive imaging element 27, and causes imaging element 27 to output the photoelectrically converted electric signal to signal processing unit 29.
Signal processing unit 29 is composed of an analog amplifier, an A/D converter, and the like. Signal processing unit 29 applies signal processing such as amplification (gain adjustment) to the electric signal output from imaging element 27, converts it into digital image data by A/D conversion, and outputs the result to control unit 12.
Light emitting unit 3 irradiates the field of view photographed by imaging apparatus 1 with auxiliary light. Light emitting unit 3 is composed of a xenon lamp, an LED (Light Emitting Diode), or the like.
Posture detection unit 4 is composed of an acceleration sensor. By detecting the acceleration of imaging apparatus 1, posture detection unit 4 detects the posture (tilt angle) of imaging apparatus 1 with respect to the horizontal plane.
Timer 5 has a timekeeping function and a function of determining the shooting date and time.
Orientation detection unit 6 is composed of a geomagnetic sensor, and detects the azimuth toward which the optical axis of lens unit 21 is directed.
Operation input unit 7 accepts input signals from the user outside imaging apparatus 1. Operation input unit 7 includes: a power switch 71 that switches the power state of imaging apparatus 1 on or off; a release switch 72 that accepts input of a release signal giving a shooting instruction; a mode changeover switch 73 that switches among the various operation modes that imaging apparatus 1 can set; an operation switch 74 that switches the various settings of imaging apparatus 1; a menu switch 75 that causes display unit 9 to display the various settings of imaging apparatus 1; and a zoom switch 76 that accepts input of an instruction signal for changing the angle of view of imaging apparatus 1.
GPS signal receiving unit 8 receives GPS signals transmitted by a plurality of GPS satellites 100. GPS signal receiving unit 8 includes: an antenna capable of receiving GPS signals (radio waves) containing the orbit information of GPS satellites 100, the time information of the atomic clocks carried on GPS satellites 100, and the like; and a signal processing circuit that processes the GPS signals received by the antenna.
Display unit 9 is realized by a display panel formed of liquid crystal, organic EL (Electro Luminescence), or the like. Display unit 9 displays images corresponding to the image data generated by imaging unit 2, and appropriately displays operation information of imaging apparatus 1 and information related to shooting.
Touch panel 10 is provided so as to overlap the display screen of display unit 9 (see Fig. 2). Touch panel 10 detects the position touched by the user based on the information displayed on display unit 9, and accepts input of an instruction signal corresponding to the touched position. Touch panel 10 may be of any commonly known type, such as resistive film, electrostatic capacitance, or optical.
Storage unit 11 is realized by semiconductor memories such as a flash memory and a RAM (Random Access Memory) fixedly provided inside imaging apparatus 1. Storage unit 11 includes: an image data storage unit 111 that stores image data captured by imaging unit 2; a program storage unit 112 that stores the various programs executed by imaging apparatus 1 and the display program of Embodiment 1; a map information storage unit 113 that stores map information including the latitudes and longitudes of various points; and a landmark information storage unit 114 that stores landmark information including the positions (latitude, longitude) and names (landmark names) of a plurality of landmarks. Landmarks here include, for example, natural features such as mountains, seas, and bays, and artificial structures such as buildings. Storage unit 11 may also include a computer-readable recording medium such as an externally attached memory card.
Control unit 12 is composed of a CPU (Central Processing Unit) or the like. In accordance with instruction signals and changeover signals from operation input unit 7 and touch panel 10, control unit 12 transfers instructions and data to the respective units constituting imaging apparatus 1, and controls the operation of imaging apparatus 1 as a whole.
The detailed configuration of control unit 12 is described. Control unit 12 includes an image processing unit 121, a positioning unit 122, a position calculation unit 123, a posture determination unit 124, an imaging control unit 125, a search unit 126, a direction determination unit 127, an image composition unit 128, and a display control unit 129.
Image processing unit 121 applies image processing including edge enhancement, white balance, and gamma correction to the image data output by signal processing unit 29 of imaging unit 2. Image processing unit 121 also compresses and decompresses image data in accordance with the JPEG (Joint Photographic Experts Group) compression format or the like.
Positioning unit 122 periodically measures the position and altitude of imaging apparatus 1 based on the GPS signals received by GPS signal receiving unit 8. In doing so, positioning unit 122 uses GPS signals received from four or more GPS satellites.
Position calculation unit 123 uses the positions of imaging apparatus 1 measured by positioning unit 122 at two different times, the time information of timer 5, and the like, to calculate the position of imaging apparatus 1 and the position of the subject photographed by imaging apparatus 1.
Posture determination unit 124 determines the posture of imaging apparatus 1 based on the detection result of posture detection unit 4. Specifically, posture determination unit 124 determines whether imaging apparatus 1 is in a posture for performing a shooting operation (shooting posture) or in a posture for displaying the position of imaging apparatus 1 on a map stored in map information storage unit 113 (position display posture). Here, the position display posture means a posture in which the surface of the display screen of display unit 9 is approximately horizontal (forming an angle of 10° or less with the horizontal plane). The shooting posture means any posture other than the position display posture.
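The posture determination described here can be sketched as follows. This is an illustrative sketch only, assuming a 3-axis acceleration sensor whose z-axis is normal to the display screen; the 10° threshold follows the text, while the function and variable names are not from the source.

```python
import math

def classify_posture(ax, ay, az, threshold_deg=10.0):
    """Classify the device posture from a 3-axis accelerometer reading.

    (ax, ay, az) is the measured acceleration in g; az is assumed to be
    the axis normal to the display screen. When the display is held
    horizontally, gravity lies almost entirely along az, so the angle
    between the display plane and the horizontal plane is small.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no acceleration measured")
    # Angle between the display normal and the vertical (= tilt of the
    # display plane relative to the horizontal plane).
    tilt = math.degrees(math.acos(min(1.0, abs(az) / g)))
    return "position-display" if tilt <= threshold_deg else "shooting"
```

For example, a reading dominated by the z-axis (display roughly horizontal) yields the position display posture, while a reading dominated by the x-axis (display held upright) yields the shooting posture.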
When release switch 72 accepts input of a shooting instruction signal (release signal), imaging control unit 125 performs control to start the shooting operation in imaging apparatus 1. Here, the shooting operation in imaging apparatus 1 means that signal processing unit 29 and image processing unit 121 apply predetermined processing to the image data output by imaging element 27 through the driving of shutter driving unit 26 and imaging driving unit 28. Image data thus processed is associated by control unit 12 with the shooting date and time information and the like, and stored in image data storage unit 111.
Search unit 126 retrieves from landmark information storage unit 114 a landmark corresponding to the position of the subject being photographed.
Direction determination unit 127 determines the moving direction of imaging apparatus 1 based on the change over time of the position of imaging apparatus 1 calculated by position calculation unit 123.
Image composition unit 128 composites images. For example, image composition unit 128 composites, into an image captured by imaging unit 2, information related to the subject contained in that image. The information related to the subject is, for example, the position (latitude, longitude) and name of the subject.
Display control unit 129 controls the display on display unit 9. For example, display control unit 129 causes display unit 9 to display images corresponding to the plurality of image data stored in image data storage unit 111. Display control unit 129 also causes display unit 9 to display reduced images (thumbnail images) obtained by reducing the respective images.
In imaging apparatus 1 configured as above, the operation mode can be set to a shooting mode for capturing images or a playback mode for reproducing images. In addition to the normal shooting mode, an in-flight shooting mode corresponding to shooting inside an aircraft can also be set. The in-flight shooting mode is a shooting mode that assumes shooting, from inside an aircraft in flight, the landscape observed through a window of the aircraft.
Fig. 4 is a flowchart outlining the processing performed by imaging apparatus 1. First, when the power of imaging apparatus 1 is on (step S1: Yes), the processing proceeds to step S2. On the other hand, when the power of imaging apparatus 1 is not on (step S1: No), imaging apparatus 1 ends the processing.
The case where imaging apparatus 1 is set to the shooting mode (step S2: shooting mode) is described. In this case, when the in-flight shooting mode is set (step S3: Yes), control unit 12 performs control so that posture detection unit 4 and orientation detection unit 6 start state detection of imaging apparatus 1, and GPS signal receiving unit 8 starts receiving GPS signals (step S4). The in-flight shooting mode is set on the assumption that a photographer aboard an aircraft photographs the wide landscape visible through the aircraft's window. Fig. 5 is a schematic diagram of the situation in which imaging apparatus 1 set to the in-flight shooting mode performs shooting. In Fig. 5, a photographer 300 aboard an aircraft 200 is about to photograph the landscape around a subject 400.
Then, display control unit 129 displays a live view image on display unit 9 (step S5). The live view image here means an image corresponding to the image data generated by imaging unit 2 while release switch 72 is not pressed.
Then, when positioning unit 122 has measured the positions of imaging apparatus 1 at two different times based on GPS signals (step S6: Yes), position calculation unit 123 calculates the position of imaging apparatus 1 at the time of shooting (hereinafter, "current position") using the two measured positions (positioning points) (step S7). When positioning unit 122 has measured the positions of imaging apparatus 1 at three or more different times, position calculation unit 123 calculates the current position of imaging apparatus 1 using the two most recent positioning points.
Here, the concrete position calculation method performed by position calculation unit 123 is described. Fig. 6 is an explanatory diagram of the method of calculating the positions of imaging apparatus 1 and the subject. Fig. 6 shows the case of calculating the position of subject 400 shown in Fig. 5. In Fig. 6, the two positions most recently measured by positioning unit 122 are denoted P1 and P2. Position P1 was measured at time T1; its latitude is I1 and its longitude is K1. Position P2 was measured at time T2 (> T1); its latitude is I2 and its longitude is K2. Also in Fig. 6, the current position is denoted P0. The time of current position P0 is T0 (> T2), its latitude is I0, and its longitude is K0. Further, in Fig. 6, the position PL of subject 400 has latitude IL and longitude KL. Here, latitude is positive for north latitude and negative for south latitude; that is, latitude is expressed in the range −90° (90° south) to +90° (90° north). Longitude is positive for east longitude and negative for west longitude; that is, longitude is expressed in the range −180° (180° west) to +180° (180° east).
First, position calculation unit 123 uses the latitude/longitude information and time information of positions P1 and P2 to calculate the speed BI of aircraft 200 in the latitude direction and its speed BK in the longitude direction:

BI = (I2 − I1)/(T2 − T1) ...(1)
BK = (K2 − K1)/(T2 − T1) ...(2)
Then, position calculation unit 123 uses the results of formulas (1) and (2) and the time information of current position P0 to calculate latitude I0 and longitude K0 of current position P0:

I0 = I2 + BI·(T0 − T2)
   = I2 + (I2 − I1)·(T0 − T2)/(T2 − T1) ...(3)
K0 = K2 + BK·(T0 − T2)
   = K2 + (K2 − K1)·(T0 − T2)/(T2 − T1) ...(4)
Then, posture determination unit 124 determines the posture of imaging apparatus 1 (step S8). First, the case where the posture determination result is that imaging apparatus 1 is in the shooting posture (step S8: Yes) is described. Fig. 7 is a diagram of the situation in which imaging apparatus 1 is in the shooting posture. As shown in Fig. 7, photographer 300 holds imaging apparatus 1 and photographs the landscape observed through window 201 of aircraft 200.
When imaging apparatus 1 is in the shooting posture (step S8: Yes), posture determination unit 124 determines the depression angle of imaging apparatus 1 (step S9). The depression angle here means the angle formed between the direction of optical axis L1 of lens unit 21 and the horizontal plane passing through the center of lens unit 21.
Then, position calculation unit 123 calculates the position of the subject at the time of shooting (step S10). In Figs. 5 and 6, let θ be the tilt angle formed between the direction of optical axis L1 of lens unit 21 and the vertical passing through the center of lens unit 21, let H be the altitude of aircraft 200, and let φ be the azimuth detected by orientation detection unit 6 as indicating the direction of the subject, referenced to true north (with counterclockwise rotation with respect to north taken as positive). Then latitude IL and longitude KL of position PL of subject 400 can be calculated by formulas (5) and (6), respectively. Here, position calculation unit 123 substitutes the results of formulas (3) and (4) for I0 on the right side of formula (5) and for K0 on the right side of formula (6), respectively. When the depression angle determined in step S9 is α, then, in degrees (°), the depression angle α and the tilt angle θ satisfy α + θ = 90°.
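Formulas (5) and (6) themselves are not reproduced here; what the description fixes is that the horizontal ground distance from the point directly below the camera to the subject is H·tanθ in azimuth φ. The following sketch converts that ground offset into latitude/longitude with a small-offset approximation; the Earth-radius constant and the per-degree conversion are assumptions of this sketch, not taken from the embodiment:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; an assumed constant

def subject_position(i0, k0, height_m, theta_deg, phi_deg):
    """Estimate the subject's latitude/longitude (cf. formulas (5), (6)).

    (i0, k0): camera latitude/longitude in signed degrees.
    height_m: altitude H of the aircraft above the subject.
    theta_deg: tilt angle between the optical axis and the vertical
               (theta = 90 deg - depression angle alpha).
    phi_deg: azimuth of the subject from true north, counterclockwise
             positive, as in the text.
    """
    ground = height_m * math.tan(math.radians(theta_deg))  # H * tan(theta)
    d_north = ground * math.cos(math.radians(phi_deg))
    # Counterclockwise-positive azimuth: positive phi points west of north.
    d_east = -ground * math.sin(math.radians(phi_deg))
    # Small-offset conversion: metres per degree of latitude/longitude.
    lat_deg_m = math.radians(1) * EARTH_RADIUS_M
    lon_deg_m = lat_deg_m * math.cos(math.radians(i0))
    return i0 + d_north / lat_deg_m, k0 + d_east / lon_deg_m
```

For instance, with H = 1000 m, θ = 45°, and φ = 0 (subject due north), the subject lies about 1 km north of the point directly below the camera.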
Then, search unit 126 retrieves from landmark information storage unit 114 a landmark corresponding to the position of the subject (step S11). Specifically, in the case shown in Figs. 5 and 6, search unit 126 retrieves from landmark information storage unit 114 the landmarks contained in the region D = {(I, K) | IL − ΔI ≤ I ≤ IL + ΔI, KL − ΔK ≤ K ≤ KL + ΔK} centered on the position PL(IL, KL) of subject 400. The parameters ΔI and ΔK, which determine the size of region D, can be set to desired values by the user through input from operation input unit 7. When a plurality of landmarks exist in region D, search unit 126 selects the landmark nearest to the position PL(IL, KL) of subject 400.
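The region search and nearest-landmark selection of step S11 can be sketched as follows (the landmark tuple layout and the function name are illustrative, not from the source):

```python
def find_landmark(landmarks, i_l, k_l, d_i, d_k):
    """Return the landmark nearest to the subject inside region D.

    landmarks: iterable of (name, latitude, longitude) tuples.
    Region D is the rectangle [i_l - d_i, i_l + d_i] x [k_l - d_k, k_l + d_k]
    centered on the subject position (i_l, k_l).
    Returns None when no landmark lies in the region.
    """
    in_region = [
        (name, i, k) for name, i, k in landmarks
        if i_l - d_i <= i <= i_l + d_i and k_l - d_k <= k <= k_l + d_k
    ]
    if not in_region:
        return None
    # When several landmarks fall in D, choose the one nearest to (i_l, k_l).
    return min(in_region, key=lambda lm: (lm[1] - i_l) ** 2 + (lm[2] - k_l) ** 2)
```

The squared-degree distance used for the nearest-landmark choice is a simplification; any monotone distance measure would give the same selection within a small region.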
When the retrieval by search unit 126 has yielded a landmark corresponding to the position of the subject (step S12: Yes), image composition unit 128 generates a composite live view image in which the landmark name is composited into the live view image. Then, display control unit 129 causes display unit 9 to display this composite live view image (step S13), and the processing proceeds to step S14 described later. Fig. 8 is a diagram of a display example of the composite live view image on display unit 9. The composite live view image 500 shown in this figure is composed of the image of subject 400 and a landmark name display image 501 showing the landmark name corresponding to subject 400 ("Fuji" in Fig. 8).
When search unit 126 cannot obtain a landmark corresponding to the position of the subject in step S12 (step S12: No), imaging apparatus 1 proceeds to step S14 described later.
In step S14, when release switch 72 is pressed and a shooting instruction signal is input (step S14: Yes), imaging control unit 125 performs control to start the shooting operation (step S15).
Then, display control unit 129 causes display unit 9 to display an image (record review image) corresponding to the image data generated by imaging unit 2 in response to the input of the shooting instruction signal (step S16). Imaging control unit 125 also causes image data storage unit 111 to store this image data (step S17). Steps S16 and S17 may be executed in reverse order, or in parallel.
Then, when a mode changeover instruction has been input through mode changeover switch 73 or touch panel 10 (step S18: Yes), imaging apparatus 1 changes the operation mode (step S19) and returns to step S1. On the other hand, when no mode changeover instruction has been input through mode changeover switch 73 or touch panel 10 (step S18: No), imaging apparatus 1 returns to step S1.
When release switch 72 is not pressed in step S14 (step S14: No), imaging apparatus 1 proceeds to step S18.
Next, the case where the result of the posture determination in step S8 is that imaging apparatus 1 is not in the shooting posture, that is, that imaging apparatus 1 is in the position display posture (step S8: No), is described. Fig. 9 is a schematic diagram of the situation in which imaging apparatus 1 is in the position display posture. In Fig. 9, photographer 300 holds imaging apparatus 1 with the display screen of display unit 9 approximately horizontal. When imaging apparatus 1 is in the position display posture, direction determination unit 127 determines the heading of aircraft 200 (the moving direction of imaging apparatus 1) from the two positions calculated by position calculation unit 123 (step S20). Specifically, direction determination unit 127 determines the heading of aircraft 200 using the above-described speed BI in the latitude direction and speed BK in the longitude direction.
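One way to turn the latitude-direction speed BI and longitude-direction speed BK into a heading is the two-argument arctangent. This is an illustrative choice, not stated in the source, and it neglects the shrinking of longitude degrees away from the equator:

```python
import math

def heading_from_speeds(bi, bk):
    """Estimate the aircraft heading from latitude/longitude speeds.

    bi: speed BI in the latitude direction (degrees per unit time,
        positive toward north).
    bk: speed BK in the longitude direction (positive toward east).
    Returns a compass heading in degrees clockwise from north.
    """
    if bi == 0 and bk == 0:
        raise ValueError("aircraft not moving; heading undefined")
    return math.degrees(math.atan2(bk, bi)) % 360.0
```

A refinement would scale bk by cos(latitude) before the arctangent, so that both arguments are in comparable ground-distance units.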
Then, based on the current position (I0, K0) obtained in step S7 and the heading determined in step S20, image composition unit 128 generates a current position display image in which an aircraft image is composited into a map image including subject 400. Then, display control unit 129 causes display unit 9 to display this current position display image (step S21). Fig. 10 is a diagram of a display example of the current position display image on display unit 9. In the current position display image 600 shown in this figure, an aircraft image 601 is displayed superimposed on the map. In aircraft image 601, the direction of travel of the aircraft is indicated by an arrow. Then, imaging apparatus 1 proceeds to step S18.
Next, the case where positioning unit 122 has not finished measuring the positions of imaging apparatus 1 at two different times in step S6 (step S6: No) is described. This case also includes the case where only one position has been measured. When positioning unit 122 has not finished measuring the positions of imaging apparatus 1 at two different times, image composition unit 128 generates a warning live view image in which a warning is superimposed on the live view image, and display control unit 129 causes display unit 9 to display the warning live view image generated by image composition unit 128 (step S22). The warning displayed here is, for example, "Acquiring current position. Please do not turn off the power." An audio output unit may also be provided in imaging apparatus 1 so that a warning sound is output by the audio output unit. Then, imaging apparatus 1 proceeds to step S14.
Next, the case where imaging apparatus 1 is not set to the in-flight shooting mode in step S3, that is, where imaging apparatus 1 is set to the normal shooting mode (step S3: No), is described. In this case, display control unit 129 causes display unit 9 to display a live view image (step S23). Then, imaging apparatus 1 proceeds to step S14.
Next, the case where imaging apparatus 1 is set to the playback mode in step S2 (step S2: playback mode) is described. In this case, display control unit 129 causes display unit 9 to display an image selection screen (step S24). On this image selection screen, a plurality of playback candidate images are displayed as a list.
Then, when an image has been selected through input from operation input unit 7 or touch panel 10 (step S25: Yes), display control unit 129 causes display unit 9 to display the selected image full-screen (step S26). On the other hand, when no image has been selected (step S25: No), imaging apparatus 1 proceeds to step S28 described later.
Then, when an image changeover instruction has been input through operation input unit 7 or touch panel 10 (step S27: Yes), imaging apparatus 1 returns to step S26 and switches the display on display unit 9 to the full-screen display of the newly selected image (step S26). On the other hand, when no image changeover instruction has been input (step S27: No), imaging apparatus 1 proceeds to step S28 described later.
In step S28, when an instruction to end image playback has been input through operation input unit 7 or touch panel 10 (step S28: Yes), imaging apparatus 1 proceeds to step S18. When no instruction to end image playback has been input (step S28: No), imaging apparatus 1 returns to step S25.
According to Embodiment 1 of the present invention described above, the position of the display apparatus at a predetermined time is calculated using its positions measured at two different times, and information corresponding to the calculated position is composited into the image corresponding to that information and displayed. Therefore, while displaying an image corresponding to its moving state, the apparatus can extract and display information related to the image to be displayed even while moving.
Furthermore, according to Embodiment 1, information corresponding to the position of the subject, calculated using the positions of the display apparatus measured at two different times, is composited into the image of the subject and displayed. Therefore, information on the subject can be displayed even when the photographer shoots while moving.
Furthermore, according to Embodiment 1, when the display apparatus is in the position display posture, a map including the position of the display apparatus is generated, and a current position display image in which the position of the display apparatus is composited into the generated map is created. Therefore, the user can grasp the current position at any desired time.
In the above description, the case where the landmark name is composited into the image of the subject and displayed has been described; however, in addition to the landmark name, a brief description of the landmark may also be displayed. In that case, descriptions of landmarks may be stored in advance in landmark information storage unit 114.
(Embodiment 2)
Embodiment 2 of the present invention is characterized in that the landmark name is displayed superimposed on the record review image after shooting, or superimposed on the reproduced image during playback. The display apparatus of Embodiment 2 is an imaging apparatus whose configuration is the same as that of imaging apparatus 1 described in Embodiment 1.
Fig. 11 is a flowchart outlining the processing performed by imaging apparatus 1. First, when the power of imaging apparatus 1 is on (step S31: Yes), the processing proceeds to step S32. On the other hand, when the power of imaging apparatus 1 is not on (step S31: No), imaging apparatus 1 ends the processing.
When imaging apparatus 1 is set to the shooting mode (step S32: shooting mode) and the in-flight shooting mode is set (step S33: Yes), control unit 12 performs control so that posture detection unit 4 and orientation detection unit 6 start the state detection operation of imaging apparatus 1, and GPS signal receiving unit 8 starts receiving GPS signals (step S34).
Then, display control unit 129 causes display unit 9 to display a live view image (step S35).
Then, when release switch 72 is pressed and a shooting instruction signal is input (step S36: Yes), imaging control unit 125 performs control to start the shooting operation (step S37). Then, control unit 12 causes image data storage unit 111 to store the captured image data and the shooting information (step S38). The shooting information here is, for example, the depression angle of imaging apparatus 1.
In the subsequent step S39, when positioning unit 122 has measured the positions of imaging apparatus 1 at two different times (step S39: Yes), position calculation unit 123 calculates the current position of imaging apparatus 1 using the two positioning points (step S40). The concrete calculation method for the current position in step S40 is the same as in Embodiment 1 (see formulas (3) and (4)).
Then, imaging apparatus 1 calculates the position of the subject at the time of shooting (step S41). The concrete calculation method for the position of the subject is also the same as in Embodiment 1 (see formulas (5) and (6)).
Then, search unit 126 retrieves from landmark information storage unit 114 a landmark corresponding to the position of the subject (step S42).
When the retrieval by search unit 126 has yielded a landmark corresponding to the position of the subject (step S43: Yes), image composition unit 128 generates a composite record review image in which the landmark name is composited into the record review image. Then, display control unit 129 causes display unit 9 to display this composite record review image (step S44). The composite record review image is displayed in the same manner as composite live view image 500 shown in Fig. 8.
Then, control unit 12 associates the captured image data with the landmark information corresponding to that image data, and stores them in image data storage unit 111 (step S45).
Then, when a mode changeover instruction has been input through mode changeover switch 73 or touch panel 10 (step S46: Yes), imaging apparatus 1 changes the operation mode (step S47) and returns to step S31. On the other hand, when no mode changeover instruction has been input through mode changeover switch 73 or touch panel 10 (step S46: No), imaging apparatus 1 returns to step S31.
When the retrieval by search unit 126 in step S42 has not yielded a landmark corresponding to the position of the subject (step S43: No), imaging apparatus 1 proceeds to step S46 described above.
Next, the case where positioning unit 122 has not finished measuring the positions of imaging apparatus 1 at two different times in step S39 (step S39: No) is described. Here, "the case where the positions at two different times have not been measured" also includes the case where only one position has been measured. When positioning unit 122 has not finished measuring the positions of imaging apparatus 1 at two different times, image composition unit 128 generates a warning live view image in which a warning is composited into the live view image, and display control unit 129 causes display unit 9 to display the warning live view image generated by image composition unit 128 (step S48). The warning displayed here is, for example, "Acquiring current position. Please do not turn off the power."
Then, when positioning unit 122 finishes measuring two positions within a predetermined time after the warning is displayed (step S49: Yes), position calculation unit 123 calculates the position PL of subject 400 at the time of shooting using the two positioning points (step S50). When positioning unit 122 has measured only one position in step S39, it suffices in step S49 for positioning unit 122 to measure at least one further position within the predetermined time after the warning is displayed. The predetermined time here is about one minute.
Here, the concrete position calculating method in step S50 will be described. Figure 12 is an explanatory diagram of the method of calculating the positions of the camera head 1 and the subject. In Figure 12, the two positions most recently measured by the location portion 122 are denoted P3 and P4. The position P3 was measured at time T3; its latitude is I3 and its longitude is K3. The position P4 was measured at time T4 (> T3); its latitude is I4 and its longitude is K4. Further, in Figure 12, the position of the camera head 1 at the time of photography (the camera position) is denoted P0'. The time at the camera position P0' is T0' (< T3), its latitude is I0', and its longitude is K0'. In addition, in Figure 12, the position of the subject 400 shown in Figure 5 is denoted PL', its latitude is IL', and its longitude is KL'. The notation for the latitude I and the longitude K is the same as in embodiment 1.
First, the position calculation portion 123 uses the latitude/longitude information and the time information of the positions P3 and P4 to calculate the latitudinal speed BI' and the longitudinal speed BK' of the aircraft 200:

BI' = (I4 - I3)/(T4 - T3) ... (7)
BK' = (K4 - K3)/(T4 - T3) ... (8)
Next, using the calculation results of equations (7) and (8) and the time information of the camera position P0', the position calculation portion 123 calculates the latitude I0' and the longitude K0' of the camera position P0':

I0' = I3 - BI'·(T3 - T0')
    = I3 - (I4 - I3)·(T3 - T0')/(T4 - T3) ... (9)
K0' = K3 - BK'·(T3 - T0')
    = K3 - (K4 - K3)·(T3 - T0')/(T4 - T3) ... (10)
Finally, the position calculation portion 123 calculates the latitude IL' and the longitude KL' of the subject, using the tilt angle θ formed between the direction of the optical axis L1 of the camera lens part 21 and the vertical passing through the center of the camera lens part 21, the height H of the aircraft 200, and the azimuth angle (counterclockwise positive with respect to north) that represents, with north as the reference, the orientation of the subject 400 detected by the orientation detection portion 6. Here, the position calculation portion 123 substitutes the calculation results of equations (9) and (10) for I0' on the right side of equation (11) and for K0' on the right side of equation (12), respectively.
Figure 12 and equations (9)-(12) represent the case where the location portion 122 has measured two different positions after the photography of step S37; alternatively, two different positions measured by the location portion 122 before and after the photography of step S37 may be used. Equations (9)-(12) also hold in that case.
After step S50, the camera head 1 proceeds to the above-described step S42.
Next, a description will be given of the case in step S33 where the camera head 1 is not set to the in-flight photograph mode, that is, the case where the camera head 1 is set to the normal photograph mode (step S33: No). In this case, the display control unit 129 causes the display part 9 to display the live view image (step S51).
Next, when the shutter release switch 72 is pressed and a photography instruction signal is input (step S52: Yes), the imaging control part 125 performs control to start the photography operation (step S53). The display control unit 129 then causes the display part 9 to display a record browse image corresponding to the image data generated by the image pickup part 2 in response to the input of the photography instruction signal (step S54). The imaging control part 125 causes the image data storage portion 111 to store this image data (step S55). The order of steps S54 and S55 may be reversed, or steps S54 and S55 may be executed in parallel. The camera head 1 then proceeds to step S46.
When the shutter release switch 72 is not pressed in step S52 (step S52: No), the camera head 1 proceeds to step S46.
Next, a description will be given of the case in step S2 where the camera head 1 is set to the reproduction mode (step S2: reproduction mode). In this case, the display control unit 129 causes the display part 9 to display an image selection screen (step S56). On this image selection screen, a plurality of reproduction candidate images are displayed as a list.
Next, when an image has been selected via the operation inputting part 7 or the touch panel 10 (step S57: Yes) and landmark information associated with the selected image exists (step S58: Yes), the display control unit 129 causes the display part 9 to display, on the full screen, a synthetic reproduced image in which the landmark name is synthesized into the selected image (step S59). This synthetic reproduced image is displayed in the same manner as the synthetic live view image shown in Figure 8.
Next, when an image switching instruction has been input (step S60: Yes), the camera head 1 returns to step S58 and judges whether landmark information associated with the newly selected image exists (step S58). On the other hand, when no image switching instruction has been input (step S60: No) and an instruction to end the image reproduction has been input via the operation inputting part 7 or the touch panel 10 (step S61: Yes), the camera head 1 proceeds to step S46. When no end instruction has been input (step S61: No), the camera head 1 returns to step S57.
When no image has been selected via the operation inputting part 7 or the touch panel 10 in step S57 (step S57: No), the camera head 1 proceeds to step S61.
Next, a description will be given of the case in step S58 where there is no landmark information associated with the selected image (step S58: No). In this case, if the time elapsed since the photography time of the selected image is within a predetermined time (step S62: Yes), the control part 12 performs control to cause the posture detecting part 4 and the orientation detection portion 6 to start the state detection operation of the camera head 1, and to cause the gps signal acceptance division 8 to start the reception of gps signals (step S63). The predetermined time here is preferably set so as to reduce the delay at the time of reproduction as much as possible, and is at most about one minute.
Next, when the location portion 122 has finished measuring two positions within a predetermined time after the image was selected (step S64: Yes), the position calculation portion 123 uses the two measured positions to calculate the position of the subject at the time of photography (step S65). The method of calculating the position of the subject here is the same as that described for step S50. The predetermined time in step S65 is preferably as short as possible, for example about one minute.
Next, the search part 126 searches the landmark information storage part 114 for a landmark corresponding to the positional information of the subject (step S66). When the search by the search part 126 has yielded a landmark corresponding to the position of the subject (step S67: Yes), the image synthesizing portion 128 generates a synthetic reproduced image in which the landmark name is synthesized into the reproduced image. The display control unit 129 then causes the display part 9 to display this synthetic reproduced image on the full screen (step S68).
Next, the control part 12 stores the landmark information in the image data storage portion 111 in association with the image data (step S69). After step S69, the camera head 1 proceeds to step S60.
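The retrieval in step S66 and the association in step S69 can be illustrated with a simple in-memory store. The nearest-neighbour search within a fixed radius is an assumption for this sketch, since the document does not specify how the landmark information storage part 114 matches a position to a landmark; the table contents and all names here are illustrative.

```python
import math

# Illustrative landmark table: (name, latitude, longitude).
LANDMARKS = [
    ("Mt. Fuji", 35.3606, 138.7274),
    ("Tokyo Tower", 35.6586, 139.7454),
]

def find_landmark(lat, lon, max_deg=0.05):
    """Return the landmark nearest to (lat, lon) within max_deg degrees,
    or None if there is none (cf. steps S66/S67). The radius criterion
    is an assumption made for this sketch."""
    best, best_d = None, max_deg
    for name, la, lo in LANDMARKS:
        d = math.hypot(la - lat, lo - lon)
        if d < best_d:
            best, best_d = name, d
    return best

def associate(image_record, lat, lon):
    """Attach the retrieved landmark name to an image record (cf. step S69)."""
    name = find_landmark(lat, lon)
    if name is not None:
        image_record["landmark"] = name
    return image_record

record = associate({"file": "IMG_0001"}, 35.36, 138.73)
print(record)  # {'file': 'IMG_0001', 'landmark': 'Mt. Fuji'}
```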
When the time elapsed since the photography time is not within the predetermined time in step S62 (step S62: No), when the location portion 122 has not finished measuring the two positions within the predetermined time in step S64 (step S64: No), or when the search part 126 cannot obtain landmark information in step S67 (step S67: No), the display control unit 129 causes the display part 9 to display the selected image on the full screen (step S70). The camera head 1 then proceeds to step S60.
According to embodiment 2 of the present invention described above, the display unit calculates its position at a predetermined time using positions measured at two different times, synthesizes information corresponding to the calculated position into the image corresponding to that information, and displays the result. Therefore, information related to an image to be displayed can be extracted and displayed together with that image even while the display unit is moving.
Further, according to embodiment 2, information corresponding to the position of the subject, which is calculated using positions of the display unit measured at two different times, is synthesized into the image of the subject and displayed. The information on the subject can thus be displayed even when the photographer photographs while moving.
Further, according to embodiment 2, even when the position of the camera head cannot be calculated at the time of photography, the position at the time of photography can be calculated from positions measured afterward. The name of the subject can therefore be identified even after photography. In this way, according to embodiment 2, by separating the photographing behavior, which is performed in a short time, from the landmark search processing, which takes a comparatively long time, the photographer can not only concentrate on photographing but can also obtain information related to the subject after photographing, deepening the understanding of the subject.
Two embodiments have been described as forms of the present invention, but the present invention is not limited to the above-described embodiments 1 and 2. For example, the location portion may measure only the latitude and longitude, while the height is detected with a semiconductor pressure sensor or a barometric sensor, or is set to the average altitude of the aircraft. According to a display unit with such a configuration, gps signals from three gps satellites suffice, so that the processing time of the location portion can be shortened.
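How a barometric sensor could supply the height H in place of a gps altitude fix can be sketched with the standard international barometric formula; this is a generic illustration, not the configuration described by the document, and the sea-level reference pressure is an assumption.

```python
def barometric_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude (m) from static pressure using the standard
    international barometric formula. A sketch of deriving the height H
    from a barometric sensor instead of gps altitude."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(barometric_altitude(1013.25)))  # 0
```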
Further, the display unit of the present invention may be provided with a function of communicating via a communication network such as the internet, and may acquire map information and landmark information from the outside via the communication network.
As described above, the present invention also includes various embodiments not described herein.