CN102761703A - Display device and display program - Google Patents

Display device and display program

Info

Publication number
CN102761703A
CN102761703A (application CN2012100465122A)
Authority
CN
China
Prior art keywords
display device
image
display
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012100465122A
Other languages
Chinese (zh)
Other versions
CN102761703B (en)
Inventor
桥本公郎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aozhixin Digital Technology Co ltd
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp filed Critical Olympus Imaging Corp
Publication of CN102761703A
Application granted
Publication of CN102761703B
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)
  • Camera Data Copying Or Recording (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention provides a display device and a display program. Even when an image corresponding to the moving state of the device is displayed while the device is moving, information related to the image to be displayed can be extracted and displayed. The display device comprises a positioning unit (122) that periodically measures the position of an imaging apparatus (1); a position calculation unit (123) that calculates the position of the imaging apparatus (1) at a predetermined time from the positions measured by the positioning unit (122) at two different times; an image synthesis unit (128) that synthesizes information corresponding to the position calculated by the position calculation unit (123) into an image corresponding to that information; and a display control unit (129) that causes a display unit (9) to display the composite image produced by the image synthesis unit (128).

Description

Display device and display program
Technical Field
The present invention relates to a display device capable of displaying images and to a display program.
Background Art
Conventionally, cameras are known that store a captured image together with a place name or the like corresponding to positional information obtained by positioning. As such a camera, a technique has been disclosed in which GPS (Global Positioning System) positioning is performed in parallel with shooting, a name such as the place name corresponding to the position of the subject is extracted, and the extracted name is displayed superimposed on the subject (see, for example, Patent Document 1). In this technique, the camera obtains its own position by GPS positioning, measures the distance and azimuth to the subject, and calculates the position of the subject from these quantities. The camera then searches a database with the calculated subject position, extracts the name corresponding to the zoom level at the time of shooting, and displays it together with the photographed image.
[Patent Document 1] Japanese Unexamined Patent Application Publication No. 2001-169164
Summary of the Invention
However, the above conventional art presupposes that the user shoots while stationary. Consequently, when the user shoots while moving on a vehicle such as an aircraft or a ship, information related to the image, such as information about the subject, cannot be displayed.
More generally, with the conventional technology, when an image corresponding to the moving state is displayed while the device is moving, information related to the image to be displayed cannot be extracted and displayed.
The present invention has been made in view of the above problems, and its object is to provide a display device and a display program that, when an image corresponding to the moving state is displayed while the device is moving, can extract and display information related to the image to be displayed.
To solve the above problems and achieve the object, a display device of the present invention includes a display unit capable of displaying images, and is characterized by comprising: a positioning unit that periodically measures the position of the display device; a position calculation unit that calculates the position of the display device at a predetermined time using positions measured by the positioning unit at two different times; an image synthesis unit that synthesizes information corresponding to the position calculated by the position calculation unit into an image corresponding to that information; and a display control unit that causes the display unit to display the composite image produced by the image synthesis unit.
In the display device of the present invention, in the above invention, the display device further comprises a GPS signal receiving unit that receives GPS signals transmitted from a plurality of GPS satellites, and the positioning unit measures the position of the display device using the GPS signals received by the GPS signal receiving unit.
In the display device of the present invention, in the above invention, the display device further comprises an imaging unit that photographs a subject and generates image data of the subject; the position calculation unit calculates the position of the display device at the time of shooting using the positions measured by the positioning unit at two different times, and calculates the position of the subject using that calculation result; and the image synthesis unit synthesizes information corresponding to the subject position calculated by the position calculation unit into the image of the subject.
In the display device of the present invention, in the above invention, the display device further comprises: a landmark information storage unit that stores information on the positions and names of a plurality of landmarks; and a retrieval unit that retrieves, from the landmark information storage unit, the landmark corresponding to the subject position calculated by the position calculation unit; and the image synthesis unit synthesizes the name information of the landmark retrieved by the retrieval unit into the image of the subject.
In the display device of the present invention, in the above invention, the subject image used by the image synthesis unit for synthesis is a live view image displayed on the display unit in real time in correspondence with the image data generated by the imaging unit.
In the display device of the present invention, in the above invention, the display device comprises an operation input unit capable of receiving a photographing instruction signal, and the subject image used by the image synthesis unit for synthesis is a record review image corresponding to the image data generated by the imaging unit in response to the input of the photographing instruction signal from the operation input unit.
In the display device of the present invention, in the above invention, the display device comprises a storage unit that stores information including the image data generated by the imaging unit, and the subject image used by the image synthesis unit for synthesis is an image corresponding to the image data stored in the storage unit.
In the display device of the present invention, the display device comprises: a map information storage unit that stores map information including maps; a posture detection unit that detects the posture of the display device; and a posture determination unit that determines, based on the detection result of the posture detection unit, at least whether the display device has assumed a position display posture for displaying the position of the display device on a map. When the posture determination unit determines that the display device has assumed the position display posture, the image synthesis unit synthesizes, based on the map information stored in the map information storage unit, a map of the area including the position of the display device with an image representing the position of the display device on that map.
A display program of the present invention causes a display device capable of displaying images to execute: a positioning step of periodically measuring the position of the display device; a position calculation step of calculating the position of the display device at a predetermined time using the positions measured in the positioning step at two different times; an image synthesis step of synthesizing information corresponding to the position calculated in the position calculation step into a predetermined image; and a display step of displaying the composite image produced in the image synthesis step.
According to the present invention, the position of the display device at a predetermined time is calculated using positions measured at two different times, and information corresponding to the calculated position is synthesized into the image corresponding to that information and displayed. Therefore, even while the device is moving, when an image corresponding to its moving state is displayed, information related to the image to be displayed can be extracted and displayed.
Brief Description of the Drawings
Fig. 1 is a diagram showing the configuration, on the subject side (front side), of an imaging apparatus that is a display device according to Embodiment 1 of the present invention.
Fig. 2 is a diagram showing the configuration, on the user side (rear side), of the imaging apparatus that is the display device according to Embodiment 1 of the present invention.
Fig. 3 is a block diagram of the configuration of the imaging apparatus that is the display device according to Embodiment 1 of the present invention.
Fig. 4 is a flowchart outlining the processing performed by the imaging apparatus that is the display device according to Embodiment 1 of the present invention.
Fig. 5 is a diagram showing a situation in which the imaging apparatus that is the display device according to Embodiment 1 of the present invention shoots with the in-flight shooting mode set.
Fig. 6 is an explanatory diagram of the method by which the imaging apparatus that is the display device according to Embodiment 1 of the present invention calculates the positions of the apparatus and of the subject.
Fig. 7 is a diagram showing a situation in which the imaging apparatus that is the display device according to Embodiment 1 of the present invention assumes the shooting posture.
Fig. 8 is a diagram showing a display example of a composite image on the display unit of the imaging apparatus that is the display device according to Embodiment 1 of the present invention.
Fig. 9 is a schematic diagram showing a situation in which the imaging apparatus that is the display device according to Embodiment 1 of the present invention assumes the position display posture.
Fig. 10 is a diagram showing a display example of the current location display image on the display unit of the imaging apparatus that is the display device according to Embodiment 1 of the present invention.
Fig. 11 is a flowchart outlining the processing performed by the imaging apparatus that is the display device according to Embodiment 2 of the present invention.
Fig. 12 is an explanatory diagram of the method by which the imaging apparatus that is the display device according to Embodiment 2 of the present invention calculates the positions of the apparatus and of the subject.
Description of Reference Numerals
1 imaging apparatus (display device)
2 imaging unit
3 light emitting unit
4 posture detection unit
5 timer
6 orientation detection unit
7 operation input unit
8 GPS signal receiving unit
9 display unit
10 touch panel
11 storage unit
12 control unit
21 lens unit
22 lens driving unit
23 aperture
24 aperture driving unit
25 shutter
26 shutter driving unit
27 imaging element
28 imaging driving unit
29 signal processing unit
100 GPS satellite
111 image data storage unit
112 program storage unit
113 map information storage unit
114 landmark information storage unit
121 image processing unit
122 positioning unit
123 position calculation unit
124 posture determination unit
125 imaging control unit
126 retrieval unit
127 direction determination unit
128 image synthesis unit
129 display control unit
200 aircraft
300 photographer
400 subject
500 composite live view image
501 landmark name display image
600 current location display image
601 aircraft image
Embodiments
Hereinafter, modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described with reference to the drawings.
(Embodiment 1)
Fig. 1 is a diagram showing the configuration, on the subject side (front side), of an imaging apparatus that is the display device according to Embodiment 1 of the present invention. Fig. 2 is a diagram showing the configuration, on the user side (rear side), of the imaging apparatus that is the display device according to Embodiment 1. Fig. 3 is a block diagram of the configuration of the imaging apparatus that is the display device according to Embodiment 1.
As shown in Figs. 1 to 3, the imaging apparatus 1 comprises an imaging unit 2, a light emitting unit 3, a posture detection unit 4, a timer 5, an orientation detection unit 6, an operation input unit 7, a GPS signal receiving unit 8, a display unit 9, a touch panel 10, a storage unit 11, and a control unit 12.
The imaging unit 2 comprises a lens unit 21, a lens driving unit 22, an aperture 23, an aperture driving unit 24, a shutter 25, a shutter driving unit 26, an imaging element 27, an imaging driving unit 28, and a signal processing unit 29.
The lens unit 21 is composed of a plurality of lens groups for focusing and zooming, and collects light from a predetermined field of view.
The lens driving unit 22 is composed of a stepping motor or a DC motor, and changes the focus position, focal length, and the like of the lens unit 21 by moving its lens groups along the optical axis L1.
The aperture 23 adjusts the exposure by limiting the amount of incident light collected by the lens unit 21. The aperture driving unit 24 is composed of a stepping motor or the like and drives the aperture 23.
The shutter 25 sets the state of the imaging element 27 to an exposed state or a light-shielded state.
The shutter driving unit 26 is composed of a stepping motor or the like and drives the shutter 25 in accordance with a release signal input from the operation input unit 7.
The imaging element 27 is composed of a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like. The imaging element 27 receives the light collected by the lens unit 21 and photoelectrically converts it into an electric signal (analog signal).
The imaging driving unit 28 generates timing pulses that drive the imaging element 27, and causes the imaging element 27 to output the photoelectrically converted electric signal to the signal processing unit 29.
The signal processing unit 29 is composed of an analog amplifier, an A/D converter, and the like. The signal processing unit 29 applies signal processing such as amplification (gain adjustment) to the electric signal output from the imaging element 27, converts it into digital image data by A/D conversion, and outputs the data to the control unit 12.
The light emitting unit 3 irradiates the field of view photographed by the imaging apparatus 1 with auxiliary light. The light emitting unit 3 is composed of a xenon lamp, an LED (Light Emitting Diode), or the like.
The posture detection unit 4 is composed of an acceleration sensor. The posture detection unit 4 detects the acceleration of the imaging apparatus 1 and thereby detects the posture (inclination angle) of the imaging apparatus 1 with the horizontal plane as a reference.
The timer 5 has a timing function and a shooting date/time determination function.
The orientation detection unit 6 is composed of a geomagnetic sensor and detects the azimuth toward which the optical axis of the lens unit 21 is directed.
The operation input unit 7 accepts operation input signals from outside the imaging apparatus 1. The operation input unit 7 comprises: a power switch 71 that switches the power state of the imaging apparatus 1 to the on state or the off state; a release switch 72 that accepts input of a release signal giving a photographing instruction; a mode changeover switch 73 that switches among the operation modes that can be set on the imaging apparatus 1; an operation switch 74 that switches the various settings of the imaging apparatus 1; a menu switch 75 that displays the various settings of the imaging apparatus 1 on the display unit 9; and a zoom switch 76 that accepts input of an instruction signal for changing the angle of view of the imaging apparatus 1.
The GPS signal receiving unit 8 receives GPS signals transmitted from a plurality of GPS satellites 100. The GPS signal receiving unit 8 comprises: an antenna capable of receiving GPS signals (radio waves) containing the orbit information of the GPS satellites 100 and the time information of the atomic clocks mounted on the GPS satellites 100; and a signal processing circuit that performs signal processing on the GPS signals received by the antenna.
The display unit 9 is realized by a display panel formed of liquid crystal, organic EL (Electro Luminescence), or the like. The display unit 9 displays images corresponding to the image data generated by the imaging unit 2, and also appropriately displays operation information of the imaging apparatus 1 and information related to photographing.
The touch panel 10 is provided so as to overlap the display screen of the display unit 9 (see Fig. 2). The touch panel 10 detects the position touched by the user based on the information displayed on the display unit 9, and accepts input of an instruction signal corresponding to that touch position. As the touch panel 10, a touch panel of any generally known type, such as a resistive film type, an electrostatic capacitance type, or an optical type, can be adopted.
The storage unit 11 is realized by semiconductor memories, such as a flash memory and a RAM (Random Access Memory), fixedly provided inside the imaging apparatus 1. The storage unit 11 comprises: an image data storage unit 111 that stores image data captured by the imaging unit 2; a program storage unit 112 that stores the various programs executed by the imaging apparatus 1 and the display program of Embodiment 1; a map information storage unit 113 that stores map information including the latitude and longitude of various points; and a landmark information storage unit 114 that stores landmark information including the positions (latitude, longitude) and names of a plurality of landmarks. The landmarks referred to here include, for example, natural features such as mountains, seas, and bays, and artificial structures such as buildings. The storage unit 11 may also include a computer-readable recording medium such as a memory card mounted from outside.
The control unit 12 is composed of a CPU (Central Processing Unit) or the like. In accordance with instruction signals and changeover signals from the operation input unit 7 and the touch panel 10, the control unit 12 transfers instructions and data to the respective parts constituting the imaging apparatus 1, and thereby centrally controls the operation of the imaging apparatus 1.
The detailed configuration of the control unit 12 will now be described. The control unit 12 comprises an image processing unit 121, a positioning unit 122, a position calculation unit 123, a posture determination unit 124, an imaging control unit 125, a retrieval unit 126, a direction determination unit 127, an image synthesis unit 128, and a display control unit 129.
The image processing unit 121 applies image processing, including edge enhancement, white balance, and gamma correction, to the image data output from the signal processing unit 29 of the imaging unit 2. The image processing unit 121 also compresses and decompresses image data in accordance with the JPEG (Joint Photographic Experts Group) compression scheme or the like.
The positioning unit 122 periodically measures the position and altitude of the imaging apparatus 1 based on the GPS signals received by the GPS signal receiving unit 8. In doing so, the positioning unit 122 uses GPS signals received from four or more GPS satellites.
The position calculation unit 123 calculates the positions of the imaging apparatus 1 and of the subject photographed by the imaging apparatus 1, using the positions of the imaging apparatus 1 measured by the positioning unit 122 at two different times together with the time information from the timer 5 and the like.
The posture determination unit 124 determines the posture of the imaging apparatus 1 based on the detection result of the posture detection unit 4. Specifically, the posture determination unit 124 determines whether the imaging apparatus 1 is in a posture for performing an imaging operation (shooting posture) or a posture for performing an operation of displaying the position of the imaging apparatus 1 on a map stored in the map information storage unit 113 (position display posture). Here, the position display posture means a posture in which the surface of the display screen of the display unit 9 is in an approximately horizontal state (forming an angle of 10° or less with the horizontal plane), and the shooting posture means any posture other than the position display posture.
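As an illustration only (not part of the patent text; the sensor interface and axis convention are assumptions), the distinction between the two postures could be made from the accelerometer output of the posture detection unit 4 as in the following Python sketch, treating the display surface as approximately horizontal when the gravity vector is within 10 degrees of the screen normal:

import math

# Hypothetical sketch: classify the posture from the gravity vector (ax, ay, az)
# reported by the acceleration sensor, assuming the z-axis is normal to the
# display screen. "Position display posture" is defined above as the display
# surface forming an angle of 10 degrees or less with the horizontal plane.
def classify_posture(ax, ay, az, threshold_deg=10.0):
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return "shooting"                      # no usable reading; treat as shooting posture
    tilt_deg = math.degrees(math.acos(min(1.0, abs(az) / g)))
    return "position_display" if tilt_deg <= threshold_deg else "shooting"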
When the release switch 72 accepts input of a photographing instruction signal (release signal), the imaging control unit 125 performs control to start the photographing operation in the imaging apparatus 1. The photographing operation in the imaging apparatus 1 means that predetermined processing is applied by the signal processing unit 29 and the image processing unit 121 to the image data output from the imaging element 27 through the driving of the shutter driving unit 26 and the imaging driving unit 28. The image data that has undergone such processing is associated with the shooting date/time information and the like by the control unit 12 and stored in the image data storage unit 111.
The retrieval unit 126 retrieves, from the landmark information storage unit 114, the landmark corresponding to the position of the subject appearing in the captured image.
The direction determination unit 127 determines the moving direction of the imaging apparatus 1 from the temporal change of the position of the imaging apparatus 1 calculated by the position calculation unit 123.
The image synthesis unit 128 synthesizes images. For example, the image synthesis unit 128 synthesizes information related to the subject contained in an image captured by the imaging unit 2 into that image. The information related to the subject is, for example, the position (latitude, longitude) or the name of the subject.
The display control unit 129 controls the display on the display unit 9. For example, the display control unit 129 causes the display unit 9 to display the images respectively corresponding to the plurality of image data stored in the image data storage unit 111. The display control unit 129 also causes the display unit 9 to display reduced images (thumbnail images) obtained by reducing the plurality of images.
In the imaging apparatus 1 configured as described above, the operation mode can be set to a shooting mode for capturing images or a reproduction mode for reproducing images. In addition to the normal shooting mode, the shooting mode can also be set to an in-flight shooting mode corresponding to shooting inside the cabin of an aircraft. The in-flight shooting mode is a shooting mode that assumes that the landscape seen through an aircraft window is photographed from inside the cabin during flight.
Fig. 4 is a flowchart outlining the processing performed by the imaging apparatus 1. First, when the power of the imaging apparatus 1 is turned on (step S1: Yes), the process proceeds to step S2. When the power of the imaging apparatus 1 is not turned on (step S1: No), the imaging apparatus 1 ends the processing.
The case in which the imaging apparatus 1 is set to the shooting mode (step S2: shooting mode) will be described. In this case, when the in-flight shooting mode is set (step S3: Yes), the control unit 12 performs control so that the posture detection unit 4 and the orientation detection unit 6 start detecting the state of the imaging apparatus 1 and the GPS signal receiving unit 8 starts receiving GPS signals (step S4). The in-flight shooting mode is set on the assumption that a photographer aboard an aircraft photographs the landscape spreading out beyond the aircraft window. Fig. 5 is a schematic diagram of a situation in which the imaging apparatus 1 set to the in-flight shooting mode performs shooting. In Fig. 5, a photographer 300 aboard an aircraft 200 is about to photograph the landscape around a subject 400.
Next, the display control unit 129 displays a live view image on the display unit 9 (step S5). The live view image here means an image corresponding to the image data generated by the imaging unit 2 while the release switch 72 is not pressed.
Next, when the positioning unit 122 has measured the positions of the imaging apparatus 1 at two different times based on the GPS signals (step S6: Yes), the position calculation unit 123 calculates the position of the imaging apparatus 1 at the time of shooting (hereinafter, the "current location") from the two measured positions (fixes) (step S7). When the positioning unit 122 has measured the positions of the imaging apparatus 1 at three or more different times, the position calculation unit 123 calculates the current location of the imaging apparatus 1 from the two most recent fixes.
Here, the specific position calculation method performed by the position calculation unit 123 will be described. Fig. 6 is an explanatory diagram of the method of calculating the positions of the imaging apparatus 1 and the subject, showing the case in which the position of the subject 400 shown in Fig. 5 is calculated. In Fig. 6, the two positions most recently measured by the positioning unit 122 are denoted P1 and P2. Position P1 was measured at time T1, with latitude I1 and longitude K1. Position P2 was measured at time T2 (> T1), with latitude I2 and longitude K2. The current location is denoted P0; its time is T0 (> T2), its latitude is I0, and its longitude is K0. The latitude of the position PL of the subject 400 is denoted IL, and its longitude KL. Here, positive latitude represents north latitude and negative latitude represents south latitude; that is, latitude is expressed in the range from -90 degrees (90 degrees south) to +90 degrees (90 degrees north). Likewise, positive longitude represents east longitude and negative longitude represents west longitude; that is, longitude is expressed in the range from -180 degrees (180 degrees west) to +180 degrees (180 degrees east).
First, the position calculation unit 123 uses the latitude/longitude and time information of positions P1 and P2 to calculate the latitudinal speed BI and the longitudinal speed BK of the aircraft 200:
BI = (I2 - I1) / (T2 - T1) ... (1)
BK = (K2 - K1) / (T2 - T1) ... (2)
Next, the position calculation unit 123 uses the results of formulas (1) and (2) and the time information of the current location P0 to calculate the latitude I0 and the longitude K0 of the current location P0:
I0 = I2 + BI * (T0 - T2) = I2 + (I2 - I1) * (T0 - T2) / (T2 - T1) ... (3)
K0 = K2 + BK * (T0 - T2) = K2 + (K2 - K1) * (T0 - T2) / (T2 - T1) ... (4)
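As a minimal illustration of formulas (1) to (4) (not part of the patent text; the function name and units are hypothetical), the current location can be obtained by linear extrapolation from the two most recent fixes:

def extrapolate_position(t1, i1, k1, t2, i2, k2, t0):
    # Linearly extrapolate latitude/longitude to time t0 from two fixes
    # (t1, i1, k1) and (t2, i2, k2), with t2 > t1; angles in decimal degrees.
    bi = (i2 - i1) / (t2 - t1)   # latitudinal speed, formula (1)
    bk = (k2 - k1) / (t2 - t1)   # longitudinal speed, formula (2)
    i0 = i2 + bi * (t0 - t2)     # formula (3)
    k0 = k2 + bk * (t0 - t2)     # formula (4)
    return i0, k0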
Next, the posture determination unit 124 determines the posture of the imaging apparatus 1 (step S8). First, the case will be described in which the posture determination unit 124 determines that the imaging apparatus 1 is in the shooting posture (step S8: Yes). Fig. 7 is a diagram showing a situation in which the imaging apparatus 1 is in the shooting posture. As shown in Fig. 7, the photographer 300 holds the imaging apparatus 1 and photographs the landscape seen from the window 201 of the aircraft 200.
When the imaging apparatus 1 is in the shooting posture (step S8: Yes), the posture determination unit 124 determines the depression angle of the imaging apparatus 1 (step S9). The depression angle here is the angle formed between the direction of the optical axis L1 of the lens unit 21 and the horizontal plane passing through the center of the lens unit 21.
Next, the position calculation unit 123 calculates the position of the subject at the time of photographing (step S10). In Figs. 5 and 6, the inclination angle formed between the direction of the optical axis L1 of the lens unit 21 and the vertical line passing through the center of the lens unit 21 is denoted θ, the altitude of the aircraft 200 is denoted H, and the azimuth detected by the orientation detection unit 6 as representing the direction of the subject, measured with respect to north along the meridian (counterclockwise rotation being positive), is denoted φ. The latitude IL and longitude KL of the position PL of the subject 400 are then calculated by formulas (5) and (6), respectively, from I0, K0, H, θ, and φ (formulas (5) and (6) are given as equation images in the original publication).
Here, the position calculation unit 123 substitutes the calculation results of formulas (3) and (4) for I0 on the right side of formula (5) and for K0 on the right side of formula (6), respectively. When the depression angle determined by the posture determination unit 124 in step S9 is α, the depression angle α and the inclination angle θ, in degrees, satisfy α + θ = 90°.
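As an illustration only (the patent's own formulas (5) and (6) are not reproduced in this text), a subject position consistent with the quantities defined above could be computed under a locally flat-Earth, small-offset approximation; the conversion constant and the counterclockwise-positive azimuth convention below are assumptions, not the patent's formulas:

import math

# Hypothetical sketch: estimate the subject's latitude/longitude from the camera
# position (i0, k0), the altitude h in meters, the inclination angle theta in
# degrees (optical axis vs. vertical), and the azimuth phi in degrees
# (counterclockwise-positive from north, as defined above). Flat-Earth
# approximation; NOT the patent's formulas (5) and (6).
def subject_position(i0, k0, h, theta_deg, phi_deg):
    d = h * math.tan(math.radians(theta_deg))      # horizontal distance to the subject
    m_per_deg_lat = 111320.0                       # assumed meters per degree of latitude
    north = d * math.cos(math.radians(phi_deg))    # displacement toward north
    east = -d * math.sin(math.radians(phi_deg))    # counterclockwise-positive azimuth points west
    i_l = i0 + north / m_per_deg_lat
    k_l = k0 + east / (m_per_deg_lat * math.cos(math.radians(i0)))
    return i_l, k_l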
Next, the retrieval unit 126 retrieves, from the landmark information storage unit 114, the landmark corresponding to the position of the subject (step S11). Specifically, in the case shown in Figs. 5 and 6, the retrieval unit 126 retrieves from the landmark information storage unit 114 the landmarks contained in the region D = {(I, K) | IL - ΔI ≤ I ≤ IL + ΔI, KL - ΔK ≤ K ≤ KL + ΔK} centered on the position PL (IL, KL) of the subject 400, as sketched below. The parameters ΔI and ΔK, which determine the size of the region D, can be set to desired values by the user through input from the operation input unit 7. When a plurality of landmarks exist in the region D, the retrieval unit 126 selects the landmark closest to the position PL (IL, KL) of the subject 400.
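The retrieval in step S11 could be sketched as follows (illustrative only; the record layout of the landmark information storage unit 114 is an assumption):

def find_landmark(landmarks, i_l, k_l, delta_i, delta_k):
    # landmarks: iterable of (name, latitude, longitude) records.
    # Returns the landmark nearest to the subject position (i_l, k_l) inside the
    # region D = {(I, K) | |I - i_l| <= delta_i and |K - k_l| <= delta_k},
    # or None if the region contains no landmark.
    in_region = [(name, lat, lon) for (name, lat, lon) in landmarks
                 if abs(lat - i_l) <= delta_i and abs(lon - k_l) <= delta_k]
    if not in_region:
        return None
    return min(in_region,
               key=lambda m: (m[1] - i_l) ** 2 + (m[2] - k_l) ** 2)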
When the retrieval by the retrieval unit 126 yields a landmark corresponding to the subject position (step S12: Yes), the image synthesis unit 128 generates a composite live view image in which the landmark name is synthesized into the live view image. The display control unit 129 then causes the display unit 9 to display this composite live view image (step S13), and the process proceeds to step S14 described later. Fig. 8 is a diagram of a display example of the composite live view image on the display unit 9. The composite live view image 500 shown in this figure is a composite of the image of the subject 400 and a landmark name display image 501 showing the landmark name corresponding to the subject 400 ("Fuji" in Fig. 8).
In step S12, when the retrieval unit 126 cannot obtain a landmark corresponding to the subject position (step S12: No), the imaging apparatus 1 proceeds to step S14 described later.
In step S14, when the release switch 72 is pressed and a photographing instruction signal is input (step S14: Yes), the imaging control unit 125 performs control to start the photographing operation (step S15).
Next, the display control unit 129 causes the display unit 9 to display the image (record review image) corresponding to the image data generated by the imaging unit 2 in response to the input of the photographing instruction signal (step S16). The imaging control unit 125 also causes the image data storage unit 111 to store this image data (step S17). The order of steps S16 and S17 may be reversed, and steps S16 and S17 may also be executed in parallel.
Next, when a mode changeover instruction is input via the mode changeover switch 73 or the touch panel 10 (step S18: Yes), the imaging apparatus 1 changes the operation mode (step S19) and returns to step S1. On the other hand, when no mode changeover instruction is input via the mode changeover switch 73 or the touch panel 10 (step S18: No), the imaging apparatus 1 returns to step S1.
When the release switch 72 is not pressed in step S14 (step S14: No), the imaging apparatus 1 proceeds to step S18.
Next, the case will be described in which the result of the posture determination in step S8 is that the imaging apparatus 1 is not in the shooting posture, that is, the imaging apparatus 1 is in the position display posture (step S8: No). Fig. 9 is a schematic diagram of a situation in which the imaging apparatus 1 is in the position display posture. In Fig. 9, the photographer 300 holds the imaging apparatus 1 with the display screen of the display unit 9 in an approximately horizontal state. When the imaging apparatus 1 is in the position display posture, the direction determination unit 127 determines the heading of the aircraft 200 (the moving direction of the imaging apparatus 1) from the two positions calculated by the position calculation unit 123 (step S20). Specifically, the direction determination unit 127 determines the heading of the aircraft 200 using the latitudinal speed BI and the longitudinal speed BK described above.
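As an illustration (not part of the patent text; the clockwise-from-north convention is an assumption), the heading could be derived from the latitudinal and longitudinal speeds BI and BK of formulas (1) and (2):

import math

def heading_from_speeds(bi, bk):
    # Heading in degrees, measured clockwise from north, computed from the
    # latitudinal speed BI and the longitudinal speed BK (degrees per unit time).
    # The longitude scale factor cos(latitude) is ignored for simplicity.
    return math.degrees(math.atan2(bk, bi)) % 360.0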
Next, the image synthesis unit 128 generates a current location display image in which an aircraft image is synthesized into a map image including the subject 400, based on the current location (I0, K0) obtained in step S7 and the heading determined in step S20. The display control unit 129 then causes the display unit 9 to display this current location display image (step S21). Fig. 10 is a diagram of a display example of the current location display image on the display unit 9. In the current location display image 600 shown in this figure, an aircraft image 601 is displayed superimposed on the map, and the direction of travel of the aircraft is indicated by an arrow in the aircraft image 601. The imaging apparatus 1 then proceeds to step S18.
The case in step S6 in which the positioning unit 122 has not finished measuring the positions of the imaging apparatus 1 at two different times (step S6: No) will now be described. This case also includes the case in which only one position has been measured. When the positioning unit 122 has not finished measuring the positions at two different times, the image synthesis unit 128 generates a warning live view image in which a warning is superimposed on the live view image, and the display control unit 129 causes the display unit 9 to display the warning live view image generated by the image synthesis unit 128 (step S22). The warning displayed here is, for example, "The current location is being acquired. Please do not turn off the power." An audio output unit may also be provided in the imaging apparatus 1 so that a warning sound is output from it. The imaging apparatus 1 then proceeds to step S14.
Next, the case in step S3 in which the imaging apparatus 1 is not set to the in-flight shooting mode, that is, the imaging apparatus 1 is set to the normal shooting mode (step S3: No), will be described. In this case, the display control unit 129 causes the display unit 9 to display a live view image (step S23). The imaging apparatus 1 then proceeds to step S14.
Next, the case in step S2 in which the imaging apparatus 1 is set to the reproduction mode (step S2: reproduction mode) will be described. In this case, the display control unit 129 causes the display unit 9 to display an image selection screen (step S24). On this image selection screen, a plurality of reproduction candidate images are displayed as a list.
Next, when an image is selected through input from the operation input unit 7 or the touch panel 10 (step S25: Yes), the display control unit 129 causes the display unit 9 to display the selected image in full screen (step S26). On the other hand, when no image is selected (step S25: No), the imaging apparatus 1 proceeds to step S28 described later.
Next, when an instruction to switch images is input through the operation input unit 7 or the touch panel 10 (step S27: Yes), the imaging apparatus 1 returns to step S26 and switches the display on the display unit 9 to the newly selected image, displayed in full screen (step S26). On the other hand, when no instruction to switch images is input (step S27: No), the imaging apparatus 1 proceeds to step S28 described later.
In step S28, when an instruction to end image reproduction is input through the operation input unit 7 or the touch panel 10 (step S28: Yes), the imaging apparatus 1 proceeds to step S18. When no instruction to end image reproduction is input (step S28: No), the imaging apparatus 1 returns to step S25.
According to Embodiment 1 of the present invention described above, the position of the display device at a predetermined time is calculated using positions measured at two different times, and information corresponding to the calculated position is synthesized into the image corresponding to that information and displayed. Therefore, even while the device is moving, when an image corresponding to its moving state is displayed, information related to the image to be displayed can be extracted and displayed.
Further, according to Embodiment 1, information corresponding to the position of the subject, calculated using the positions of the display device measured at two different times, is synthesized into the image of the subject and displayed, so that information on the subject can be displayed even when the photographer shoots while moving.
Further, according to Embodiment 1, when the display device assumes the position display posture, a map including the position of the display device is generated, and a current location display image in which the position of the display device is synthesized into the generated map is produced, so the user can grasp the current location at any desired time.
In the above description, the landmark name is synthesized into the image of the subject and displayed; however, in addition to the landmark name, a brief explanation of the landmark may also be displayed. In this case, explanations of the landmarks may be stored in advance in the landmark information storage unit 114.
(Embodiment 2)
Embodiment 2 of the present invention is characterized in that the landmark name is superimposed on the record review image displayed after shooting, or on the reproduced image during reproduction. The configuration of the imaging apparatus serving as the display device of Embodiment 2 is the same as that of the imaging apparatus 1 described in Embodiment 1.
Fig. 11 is a flowchart outlining the processing performed by the imaging apparatus 1. First, when the power of the imaging apparatus 1 is turned on (step S31: Yes), the process proceeds to step S32. When the power of the imaging apparatus 1 is not turned on (step S31: No), the imaging apparatus 1 ends the processing.
When the imaging apparatus 1 is set to the shooting mode (step S32: shooting mode) and the in-flight shooting mode is set (step S33: Yes), the control unit 12 performs control so that the posture detection unit 4 and the orientation detection unit 6 start detecting the state of the imaging apparatus 1 and the GPS signal receiving unit 8 starts receiving GPS signals (step S34).
Next, the display control unit 129 causes the display unit 9 to display a live view image (step S35).
Next, when the release switch 72 is pressed and a photographing instruction signal is input (step S36: Yes), the imaging control unit 125 performs control to start the photographing operation (step S37). The control unit 12 then causes the image data storage unit 111 to store the captured image data and the photographing information (step S38). The photographing information here is, for example, the depression angle of the imaging apparatus 1.
In the following step S39, when the positioning unit 122 has measured the positions of the imaging apparatus 1 at two different times (step S39: Yes), the position calculation unit 123 calculates the current location of the imaging apparatus 1 from the two fixes (step S40). The specific calculation method of the current location in step S40 is the same as in Embodiment 1 (see formulas (3) and (4)).
Next, the imaging apparatus 1 calculates the position of the subject at the time of photographing (step S41). The specific calculation method of the subject position here is also the same as in Embodiment 1 (see formulas (5) and (6)).
Next, the retrieval unit 126 retrieves, from the landmark information storage unit 114, the landmark corresponding to the position of the subject (step S42).
When the retrieval by the retrieval unit 126 yields a landmark corresponding to the subject position (step S43: Yes), the image synthesis unit 128 generates a composite record review image in which the landmark name is synthesized into the record review image. The display control unit 129 then causes the display unit 9 to display this composite record review image (step S44). The composite record review image is displayed in the same manner as the composite live view image 500 shown in Fig. 8.
Next, the control unit 12 associates the captured image data with the landmark information corresponding to that image data and stores them in the image data storage unit 111 (step S45).
Next, when a mode changeover instruction is input via the mode changeover switch 73 or the touch panel 10 (step S46: Yes), the imaging apparatus 1 changes the operation mode (step S47) and returns to step S31. On the other hand, when no mode changeover instruction is input via the mode changeover switch 73 or the touch panel 10 (step S46: No), the imaging apparatus 1 returns to step S31.
In step S42, when the retrieval by the retrieval unit 126 does not yield a landmark corresponding to the subject position (step S43: No), the imaging apparatus 1 proceeds to step S46 described above.
Next, the case in step S39 in which the positioning unit 122 has not finished measuring the positions of the imaging apparatus 1 at two different times (step S39: No) will be described. Here, "the case in which the positions at two different times have not been measured" also includes the case in which only one position has been measured. In this case, the image synthesis unit 128 generates a warning live view image in which a warning is synthesized into the live view image, and the display control unit 129 causes the display unit 9 to display the warning live view image generated by the image synthesis unit 128 (step S48). The warning displayed here is, for example, "The current location is being acquired. Please do not turn off the power."
Next, when the positioning unit 122 completes the measurement of two positions within a predetermined time after the warning is displayed (step S49: Yes), the position calculation unit 123 calculates the position PL of the subject 400 at the time of photographing from the two fixes (step S50). When the positioning unit 122 has calculated only one position in step S39, it is sufficient in step S49 for the positioning unit 122 to measure at least one position within the predetermined time after the warning is displayed. The predetermined time here is about one minute.
Here, the specific position calculation method in step S50 will be described. Fig. 12 is an explanatory diagram of the method of calculating the positions of the imaging apparatus 1 and the subject. In Fig. 12, the two positions most recently measured by the positioning unit 122 are denoted P3 and P4. Position P3 was measured at time T3, with latitude I3 and longitude K3. Position P4 was measured at time T4 (> T3), with latitude I4 and longitude K4. The position of the imaging apparatus 1 at the time of shooting (shooting position) is denoted P0'. The time of the shooting position P0' is T0' (< T3), its latitude is I0', and its longitude is K0'. The position of the subject 400 shown in Fig. 5 is denoted PL', with latitude IL' and longitude KL'. The latitude I and longitude K are expressed in the same manner as in Embodiment 1.
First, the position calculation unit 123 uses the latitude/longitude and time information of positions P3 and P4 to calculate the latitudinal speed BI' and the longitudinal speed BK' of the aircraft 200:
BI' = (I4 - I3) / (T4 - T3) ... (7)
BK' = (K4 - K3) / (T4 - T3) ... (8)
Next, the position calculation unit 123 uses the results of formulas (7) and (8) and the time information of the shooting position P0' to calculate the latitude I0' and the longitude K0' of the shooting position P0':
I0' = I3 - BI' * (T3 - T0') = I3 - (I4 - I3) * (T3 - T0') / (T4 - T3) ... (9)
K0' = K3 - BK' * (T3 - T0') = K3 - (K4 - K3) * (T3 - T0') / (T4 - T3) ... (10)
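Formulas (7) to (10) are the same linear extrapolation as formulas (1) to (4), applied backward in time; a small numeric sketch with made-up values (not from the patent) is:

# Two fixes measured after the shot, and the shutter time T0' before them
# (times in seconds, coordinates in decimal degrees; the values are illustrative).
t3, i3, k3 = 120.0, 35.10, 139.20     # fix P3
t4, i4, k4 = 130.0, 35.12, 139.25     # fix P4
t0p = 100.0                           # shooting time T0'
bi = (i4 - i3) / (t4 - t3)            # formula (7)
bk = (k4 - k3) / (t4 - t3)            # formula (8)
i0p = i3 - bi * (t3 - t0p)            # formula (9): camera latitude at T0'
k0p = k3 - bk * (t3 - t0p)            # formula (10): camera longitude at T0'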
Finally, using the inclination angle θ formed between the direction of the optical axis L1 of the lens unit 21 and the vertical line passing through the center of the lens unit 21, the altitude H of the aircraft 200, and the azimuth φ detected by the orientation detection unit 6 as representing the direction of the subject 400, measured with respect to north along the meridian (counterclockwise rotation being positive), the position calculation unit 123 calculates the latitude IL' and longitude KL' of the subject by formulas (11) and (12), respectively (like formulas (5) and (6), these are given as equation images in the original publication).
Here, the position calculation unit 123 substitutes the calculation results of formulas (9) and (10) for I0' on the right side of formula (11) and for K0' on the right side of formula (12), respectively.
Fig. 12 and formulas (9) to (12) assume the case in which the positioning unit 122 measures the two positions after the shooting in step S37; however, two positions measured by the positioning unit 122 before and after the shooting in step S37 may also be used. In that case, formulas (9) to (12) still hold.
After step S50, the imaging apparatus 1 proceeds to step S42 described above.
Next, the case in step S33 in which the imaging apparatus 1 is not set to the in-flight shooting mode, that is, the imaging apparatus 1 is set to the normal shooting mode (step S33: No), will be described. In this case, the display control unit 129 causes the display unit 9 to display a live view image (step S51).
Next, when the release switch 72 is pressed and a photographing instruction signal is input (step S52: Yes), the imaging control unit 125 performs control to start the photographing operation (step S53). The display control unit 129 then causes the display unit 9 to display the record review image corresponding to the image data generated by the imaging unit 2 in response to the input of the photographing instruction signal (step S54). The imaging control unit 125 causes the image data storage unit 111 to store this image data (step S55). The order of steps S54 and S55 may be reversed, and steps S54 and S55 may also be executed in parallel. The imaging apparatus 1 then proceeds to step S46.
When the release switch 72 is not pressed in step S52 (step S52: No), the imaging apparatus 1 proceeds to step S46.
Next, the case in step S32 in which the imaging apparatus 1 is set to the reproduction mode (step S32: reproduction mode) will be described. In this case, the display control unit 129 causes the display unit 9 to display an image selection screen (step S56). On this image selection screen, a plurality of reproduction candidate images are displayed as a list.
Next, when an image is selected through the operation input unit 7 or the touch panel 10 (step S57: Yes) and landmark information associated with the selected image exists (step S58: Yes), the display control unit 129 causes the display unit 9 to display, in full screen, a composite reproduced image in which the landmark name is synthesized into the selected image (step S59). This composite reproduced image is displayed in the same manner as the composite live view image shown in Fig. 8.
Next, when an instruction to switch images is input (step S60: Yes), the imaging apparatus 1 returns to step S58 and determines whether landmark information associated with the newly selected image exists (step S58). On the other hand, when no instruction to switch images is input (step S60: No) and an instruction to end image reproduction is input through the operation input unit 7 or the touch panel 10 (step S61: Yes), the imaging apparatus 1 proceeds to step S46. When no instruction to end image reproduction is input (step S61: No), the imaging apparatus 1 returns to step S57.
When no image is selected through the operation input unit 7 or the touch panel 10 in step S57 (step S57: No), the imaging apparatus 1 proceeds to step S61.
Next, the case in step S58 in which no landmark information associated with the selected image exists (step S58: No) will be described. In this case, if the time elapsed since the shooting time of the selected image is within a predetermined time (step S62: Yes), the control unit 12 performs control so that the posture detection unit 4 and the orientation detection unit 6 start detecting the state of the imaging apparatus 1 and the GPS signal receiving unit 8 starts receiving GPS signals (step S63). The predetermined time here is preferably set so as to minimize the delay during reproduction, and is at most about one minute.
Next, when the positioning unit 122 completes the measurement of two positions within a predetermined time after the image was selected (step S64: Yes), the position calculation unit 123 calculates the position of the subject at the time of photographing from the two fixes (step S65). The method of calculating the subject position here is the same as that described for step S50. The predetermined time here is preferably as short as possible, for example about one minute.
Next, the retrieval unit 126 retrieves, from the landmark information storage unit 114, the landmark corresponding to the positional information of the subject (step S66). When the retrieval by the retrieval unit 126 yields a landmark corresponding to the subject position (step S67: Yes), the image synthesis unit 128 generates a composite reproduced image in which the landmark name is synthesized into the reproduced image, and the display control unit 129 causes the display unit 9 to display this composite reproduced image in full screen (step S68).
Next, the control unit 12 associates the landmark information with the image data and stores them in the image data storage unit 111 (step S69). After step S69, the imaging apparatus 1 proceeds to step S60.
When the time elapsed since the shooting time is not within the predetermined time in step S62 (step S62: No), when the positioning unit 122 does not complete the measurement of two positions within the predetermined time in step S64 (step S64: No), or when the retrieval unit 126 cannot obtain landmark information in step S67 (step S67: No), the display control unit 129 causes the display unit 9 to display the selected image in full screen (step S70). The imaging apparatus 1 then proceeds to step S60.
According to Embodiment 2 of the present invention described above, the position of the display device at a predetermined time is calculated using positions measured at two different times, and information corresponding to the calculated position is synthesized into the image corresponding to that information and displayed. Therefore, even while the device is moving, when an image corresponding to its moving state is displayed, information related to the image to be displayed can be extracted and displayed.
Further, according to Embodiment 2, information corresponding to the position of the subject, calculated using the positions of the display device measured at two different times, is synthesized into the image of the subject and displayed, so that information on the subject can be displayed even when the photographer shoots while moving.
Further, according to Embodiment 2, even when the position of the imaging apparatus cannot be calculated at the time of photographing, the position at the time of photographing can be calculated from positions obtained afterward. The name of the subject can therefore be identified even after photographing. In this way, according to Embodiment 2, the act of photographing, which is performed in a short time, is separated from the landmark retrieval processing, which takes a relatively long time; the photographer can thus concentrate on photographing and can still obtain information related to the subject after photographing, deepening his or her understanding of the subject.
As the form of embodiment of the present invention, 2 execution modes have been described, but have been the invention is not restricted to above-mentioned execution mode 1,2.For example, location portion also can only locate, and height detects or be made as the average height of aircraft with semiconductor pressure sensor or baroceptor.According to display unit with such formation, adopt gps signal to get final product from 3 gps satellites, therefore can shorten the processing time of location portion.
In addition, the display device of the present invention may be provided with a function of communicating via a communication network such as the Internet, and may obtain map information and landmark information from the outside via the communication network.
Thus, the present invention also includes various embodiments and the like not described here.

Claims (9)

1. A display device comprising a display part capable of displaying images, characterized in that the display device comprises:
a position measuring part that periodically measures the position of the display device;
a position calculation part that calculates the position of the display device at a predetermined moment using the positions measured by the position measuring part at two different times;
an image synthesis part that synthesizes information corresponding to the position calculated by the position calculation part into the image corresponding to that information; and
a display control part that causes the display part to display the composite image synthesized by the image synthesis part.
2. The display device according to claim 1, characterized in that
the display device further comprises a GPS signal receiving part that receives GPS signals transmitted from a plurality of GPS satellites, and
the position measuring part measures the position of the display device using the GPS signals received by the GPS signal receiving part.
3. The display device according to claim 1 or 2, characterized in that
the display device further comprises an imaging part that photographs a subject and generates image data of the subject,
the position calculation part calculates the position of the display device at the time of photographing using the positions measured by the position measuring part at two different times, and calculates the position of the subject using this calculation result, and
the image synthesis part synthesizes information corresponding to the position of the subject calculated by the position calculation part into the image of the subject.
4. The display device according to claim 3, characterized in that the display device further comprises:
a landmark information storage part that stores information on the positions and names of a plurality of landmarks; and
a search part that retrieves, from the landmark information storage part, the landmark corresponding to the position of the subject calculated by the position calculation part,
wherein the image synthesis part synthesizes information on the name of the landmark retrieved by the search part into the image of the subject.
5. The display device according to claim 3 or 4, characterized in that
the image of the subject used by the image synthesis part in the synthesis is a live view image displayed in real time on the display part in correspondence with the image data generated by the imaging part.
6. The display device according to claim 3 or 4, characterized in that
the display device comprises an operation input part capable of inputting a photographing instruction signal, and
the image of the subject used by the image synthesis part in the synthesis is a record-review image corresponding to the image data generated by the imaging part in response to the input of the photographing instruction signal from the operation input part.
7. The display device according to claim 3 or 4, characterized in that
the display device comprises a storage part that stores information including the image data generated by the imaging part, and
the image of the subject used by the image synthesis part in the synthesis is an image corresponding to the image data stored in the storage part.
8. The display device according to any one of claims 1 to 7, characterized in that the display device comprises:
a map information storage part that stores map information including maps;
a posture detecting part that detects the posture of the display device; and
a posture determination part that determines, based on the detection result of the posture detecting part, at least whether the display device has taken a position display posture for displaying the position of the display device on the map,
wherein, when the posture determination part determines that the display device has taken the position display posture, the image synthesis part synthesizes, based on the map information stored in the map information storage part, a map of a region including the position of the display device with an image representing the position of the display device on that map.
9. A display program, characterized in that it causes a display device capable of displaying images to execute:
a positioning step of periodically measuring the position of the display device;
a position calculation step of calculating the position of the display device at a predetermined moment using the positions measured in the positioning step at two different times;
an image synthesis step of synthesizing information corresponding to the position calculated in the position calculation step into a predetermined image; and
a display step of displaying the composite image synthesized in the image synthesis step.
CN201210046512.2A 2011-04-28 2012-02-24 Display device and display method Expired - Fee Related CN102761703B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011102004A JP5750697B2 (en) 2011-04-28 2011-04-28 Display device and display program
JP2011-102004 2011-04-28

Publications (2)

Publication Number Publication Date
CN102761703A true CN102761703A (en) 2012-10-31
CN102761703B CN102761703B (en) 2016-02-03

Family

ID=47056015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210046512.2A Expired - Fee Related CN102761703B (en) Display device and display method

Country Status (2)

Country Link
JP (1) JP5750697B2 (en)
CN (1) CN102761703B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4274656B2 (en) * 1999-12-08 2009-06-10 カシオ計算機株式会社 Camera device and subject name display method
JP2006186542A (en) * 2004-12-27 2006-07-13 Konica Minolta Photo Imaging Inc Imaging apparatus and navigation system
JP4264099B2 (en) * 2006-09-26 2009-05-13 Sky株式会社 Mobile device with camera
JP2009141644A (en) * 2007-12-06 2009-06-25 Casio Comput Co Ltd Image data management apparatus

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1087605A2 (en) * 1999-09-17 2001-03-28 Sony Corporation Information processing apparatus and method , and program storage medium
JP4292682B2 (en) * 2000-04-19 2009-07-08 ソニー株式会社 GPS receiver, GPS positioning method, and storage medium
CN101184135A (en) * 2006-11-16 2008-05-21 三星Techwin株式会社 System and method for inserting position information into image
CN101335816A (en) * 2007-06-25 2008-12-31 三星Techwin株式会社 System and method for inputting position information in captured image
US20090115862A1 (en) * 2007-11-05 2009-05-07 Sony Ericsson Mobile Communications Ab Geo-tagging of moving pictures
CN101753807A (en) * 2009-12-16 2010-06-23 惠州Tcl移动通信有限公司 Image pick-up device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111295879A (en) * 2017-11-15 2020-06-16 村田机械株式会社 Management server, management system, management method, and program
CN111295879B (en) * 2017-11-15 2021-08-03 村田机械株式会社 Management server, management system, management method, and recording medium

Also Published As

Publication number Publication date
CN102761703B (en) 2016-02-03
JP2012235285A (en) 2012-11-29
JP5750697B2 (en) 2015-07-22

Similar Documents

Publication Publication Date Title
US20220270197A1 (en) Image capturing method and display method for recognizing a relationship among a plurality of images displayed on a display screen
CN100452834C (en) Photographing device and method for obtaining photographic image having image vibration correction
US8564710B2 (en) Photographing apparatus and photographing method for displaying information related to a subject
CN1965578B (en) Digital 3D/360 degree camera system
CN105594191B (en) Imaging device, image processing device, and image processing method
JP4178009B2 (en) Shooting system
US8339477B2 (en) Digital camera capable of detecting name of captured landmark and method thereof
CN101910936A (en) Guided photography based on image capturing device rendered user recommendations
CN103067658A (en) Imaging device, image display device, and electronic camera
JP2006526945A (en) Apparatus and method for recording "pass-enhanced" multimedia
CN101909152A (en) Camera head
KR101218220B1 (en) Apparatus for drawing digital map
CN103067656B (en) Camera head and image capture method
CN103907341A (en) Image generation device, and image generation method
CN103491331A (en) Display control apparatus, display control method, and program
KR101030066B1 (en) The camera control system for producing the panorama of a map information
CN102959938A (en) Image processing method and apparatus
US9215340B2 (en) Display apparatus, display method, and computer-readable recording medium
CN106254754A (en) Filming apparatus, image processing apparatus, the control method of filming apparatus
JP2943263B2 (en) Image search system
CN102761703A (en) Display device and display program
JP6039471B2 (en) Imaging apparatus and control method thereof
CN102843512B (en) Camera head, image capture method
JPH0888825A (en) Method for editing construction video image
JP2003296329A (en) Information providing device, information providing method and information processing program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20151208

Address after: Tokyo, Japan

Applicant after: OLYMPUS Corp.

Address before: Tokyo, Japan

Applicant before: Olympus Imaging Corp.

C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20211210

Address after: Tokyo, Japan

Patentee after: Aozhixin Digital Technology Co.,Ltd.

Address before: Tokyo, Japan

Patentee before: OLYMPUS Corp.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160203