JP5231119B2 - Display device - Google Patents

Display device

Info

Publication number
JP5231119B2
JP5231119B2 JP2008198133A
Authority
JP
Japan
Prior art keywords
image
unit
subject
camera
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2008198133A
Other languages
Japanese (ja)
Other versions
JP2010041076A5 (en)
JP2010041076A (en)
Inventor
Mitsuyoshi Suwaki
Osamu Nonaka
Original Assignee
Olympus Corporation
Olympus Imaging Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation and Olympus Imaging Corp.
Priority to JP2008198133A
Publication of JP2010041076A
Publication of JP2010041076A5
Application granted
Publication of JP5231119B2
Status: Expired - Fee Related
Anticipated expiration

Description

The present invention relates to a display device capable of displaying a subject so that the subject can be observed from a plurality of angles.
Patent Document 1 discloses a mechanism for photographing a subject from a plurality of angles. In this mechanism, a camera can be fixed to a cylindrical housing. With such a configuration, three-dimensional information of the subject can be obtained by sequentially capturing images with the camera fixed to the housing while moving the housing along a circumference surrounding the subject. The subject can then be viewed from a plurality of angles, for example, by rotating the subject displayed on a display unit.
JP 2002-374454 A
  However, Patent Document 1 requires a housing for photographing the subject from multiple directions at an equal distance. For this reason, shooting cannot always be performed promptly in accordance with the shooting situation. On the other hand, if the user manually moves the camera to shoot the subject from a plurality of angles, the user can shoot immediately in accordance with the shooting situation. However, in this case, accurate three-dimensional information may not be obtained because the subject cannot be photographed from the correct angle due to camera shake or the like.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a display device that can obtain accurate three-dimensional information of a subject without requiring a special device.
In order to achieve the above object, a display device according to a first aspect of the present invention includes: a motion detection unit that detects movement of the display device during shooting; a position detection unit that detects a positional relationship between the subject and the shooting position; an image correction unit that corrects a plurality of images obtained at a plurality of different shooting positions surrounding the subject, based on the movement of the display device detected by the motion detection unit, so that the images satisfying an allowable condition have an equivalent positional relationship; a warning unit that issues a warning according to the positional relationship; and a display unit that displays the corrected images. The image of the subject is enlarged or reduced according to the detection result of the position detection unit, and an image of the subject is deleted according to the detection result of the motion detection unit.
According to the present invention, it is possible to provide a display device that obtains accurate three-dimensional information of a subject without requiring special equipment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIG. 1 is a diagram showing the configuration of a camera according to an embodiment of the present invention. The camera 100 illustrated in FIG. 1 includes a control unit 101, an operation unit 102, a photographing unit 103, an image processing unit 104, a display unit 105, a recording unit 106, a motion detection unit 107, and a communication unit 108. The camera 100 is connected to a personal computer (PC) 200 through the communication unit 108 so as to be able to communicate.
  The control unit 101 controls the operation of each unit of the camera 100. The control unit 101 receives various operations from the user via the operation unit 102 and controls various sequences according to the operation content. In addition, during shooting for obtaining images necessary for three-dimensional (3D) display (hereinafter referred to as 3D shooting), which will be described in detail later, the control unit 101 also instructs the image processing unit 104, serving as an image correction unit, to correct images.
  The operation unit 102 comprises various operation members for the user to operate the camera 100. The operation unit 102 includes, for example, an operation member for the user to instruct the camera 100 to execute 3D shooting, an operation member for the user to perform various operations for 3D display, and the like.
  The photographing unit 103 includes a photographing lens, an image sensor, an analog/digital (A/D) conversion unit, and the like. The photographing unit 103 forms the light incident through the photographing lens into a subject image on the image sensor, and converts the formed subject image into an electrical signal (image signal) by photoelectric conversion. The photographing unit 103 then obtains an image by digitizing the image signal with the A/D conversion unit.
  The image processing unit 104 performs various image processing on the images obtained by the photographing unit 103, including white balance correction processing and gradation correction processing. The image processing unit 104 also functions as an image correction unit: based on the images obtained by the photographing unit 103 and the movement of the camera 100, it corrects the images so that appropriate 3D display can be performed. Furthermore, the image processing unit 104 performs subject detection for detecting the movement of the subject in the image at the time of 3D shooting, described later. The subject in the image can be detected, for example, by detecting a high-contrast portion of the image.
  The display unit 105 displays the image processed by the image processing unit 104 and the image recorded in the recording unit 106 under the control of the control unit 101. The display unit 105 is composed of a liquid crystal display, for example. The recording unit 106 records the image processed by the image processing unit 104. The recording unit 106 is a memory configured to be detachable from the camera 100, for example.
  The motion detection unit 107, which functions as a position detection unit together with the image processing unit 104, detects the movement of the camera 100 in order to detect the positional relationship between the photographing unit 103 and the subject during 3D shooting. The configuration of the motion detection unit 107 is not particularly limited, and various types can be applied; an example is described here. In the present embodiment, the movement of the camera 100 is detected as a translational movement of the camera 100, indicated by arrow A in FIG. 2, and a rotation of the camera 100, indicated by arrow B in FIG. 2. Note that FIG. 2 shows only translation in one direction and rotation about that direction; in practice, translation along three axes (the axis along arrow A and the two axes orthogonal to it) and rotation about each of those axes are detected.
  FIG. 3A is a diagram illustrating an example of a configuration for detecting movement of the camera 100 in the translation direction. The translational movement detection unit shown in FIG. 3A includes two electrodes 12 fixed to the camera 100 and an electrode 11 that is bridged across the electrodes 12 and can move along with the translational movement of the camera 100.
  In FIG. 3A, when the camera 100 moves in the direction of arrow A, acceleration acts on the electrode 11, which also moves in the direction of arrow A. The acceleration of the electrode 11, shown in FIG. 3B, can be detected as a change in capacitance between the electrode 11 and the electrodes 12. Integrating the acceleration of FIG. 3B yields the moving speed of the electrode 11 shown in FIG. 3C, and integrating that moving speed yields the movement amount of the electrode 11 shown in FIG. 3D, that is, the movement amount of the camera 100 in the translation direction. This makes it possible to detect the shooting position of each image at the time of 3D shooting, described later.
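  As an aside, this double integration can be sketched in a few lines of Python. This is a minimal illustration assuming evenly spaced accelerometer samples; the function names and sample values are ours, not the patent's.

def integrate(samples, dt):
    # Cumulative trapezoidal integral of an evenly sampled signal.
    total, out = 0.0, [0.0]
    for prev, cur in zip(samples, samples[1:]):
        total += 0.5 * (prev + cur) * dt
        out.append(total)
    return out

accel = [0.0, 0.2, 0.5, 0.3, 0.0, -0.4, -0.3, 0.0]  # m/s^2 (cf. FIG. 3B)
dt = 0.01                                           # sampling interval (s)
velocity = integrate(accel, dt)                     # moving speed (FIG. 3C)
displacement = integrate(velocity, dt)              # movement amount (FIG. 3D)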
  FIG. 4A is a diagram illustrating an example of a configuration for detecting the rotation of the camera 100. The rotation detection unit shown in FIG. 4A includes a pair of piezoelectric elements 13 that vibrate when a voltage is applied.
  When an angular velocity indicated by the arrow in the figure is generated in the camera 100, a Coriolis force acts on the vibrating piezoelectric element pair 13 (a force directed 90 degrees to the right of the direction of motion for clockwise rotation, and 90 degrees to the left for counterclockwise rotation), deforming it. The angular velocity can be detected by sensing the voltage change caused by this deformation, and integrating the angular velocity yields the rotation amount (rotation angle). Accordingly, the tilt of the camera 100 at the time of 3D shooting, described later, can be detected.
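  The single integration for rotation can be sketched similarly; again a minimal Python illustration with assumed names and sample values.

def rotation_angle(angular_velocities, dt):
    # Integrate angular velocity samples (deg/s) into a rotation angle (deg).
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt
    return angle

gyro = [0.0, 1.5, 3.0, 2.0, 0.5]   # deg/s readings from the element pair
tilt = rotation_angle(gyro, 0.01)  # accumulated tilt of the camera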
  Returning to FIG. 1, the communication unit 108 is an interface circuit that mediates communication between the camera 100 and the PC 200. Note that the communication method between the camera 100 and the PC 200 is not particularly limited, and may be wired communication using a USB cable or the like, or wireless communication using a wireless LAN or the like.
  Software for displaying and editing images taken by the camera 100 is installed on the PC 200. When 3D shooting is performed by the camera 100, 3D display can be performed on the PC 200 by transmitting the image group obtained by the 3D shooting to the PC 200.
  Hereinafter, 3D shooting will be described. In 3D shooting in the present embodiment, the user manually moves the camera 100 and performs shooting at a plurality of different positions (angles) surrounding the subject. FIG. 5 shows an outline of 3D shooting. As shown in FIG. 5, the user moves the camera 100 along the periphery of the subject 300. During this movement, the control unit 101 of the camera 100 controls the photographing unit 103 to shoot continuously at equal time intervals, so that the subject 300 is photographed from a plurality of shooting positions (shooting angles). By displaying the images captured at the plurality of shooting angles on the display unit 105 in accordance with user operations, the subject 300 can be rotated on the display unit 105 and observed from a plurality of angles. In the 3D shooting shown in FIG. 5, the subject 300 displayed on the display unit 105 can be observed while being rotated in the horizontal direction of the screen.
  Here, in order to perform appropriate 3D display, the images used for 3D display must have an equivalent positional relationship. This equivalent positional relationship means that (1) the shooting positions of the respective images are equally spaced, (2) the distance from each shooting position to the subject 300 is equal, and (3) at each shooting position, the camera 100 is correctly oriented in the direction intended by the user (that is, toward the subject 300).
  First, regarding (1): if the user could move the camera 100 at a constant speed, the shooting positions of the respective images would be equally spaced at a predetermined interval such as every 2°. In practice, however, it is difficult to keep the moving speed of the user's hand constant, so there is a high possibility that some shooting intervals become 1° or 3° instead. Such changes in the moving speed of the camera 100 can be detected by the translational movement detection unit described above. Since the product of the moving speed and the continuous shooting time interval gives the shooting interval, more images than are actually needed for 3D display are acquired, and from this larger set only a group of images whose shooting intervals are equal is adopted for 3D display.
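  A minimal Python sketch of such a selection, assuming each captured frame carries a shooting angle derived from the motion detection unit (the names and the nearest-match strategy are our assumptions):

def select_equally_spaced(frames, angles, step_deg):
    # For each target angle (0, step, 2*step, ...), pick the frame whose
    # recorded shooting angle is closest to it. `angles` is sorted ascending.
    selected = []
    n_targets = int(angles[-1] // step_deg) + 1
    for k in range(n_targets):
        target = k * step_deg
        i = min(range(len(angles)), key=lambda j: abs(angles[j] - target))
        selected.append(frames[i])
    return selected

Shooting somewhat more densely than the target spacing makes it likely that a frame close to every target angle exists.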
  Next, regarding (2): if the user could move the camera 100 exactly on the circumference surrounding the subject 300, the subject would be photographed from the same distance at each shooting position. With manual movement, however, it is difficult to move the camera 100 along a true circle. For example, when the camera 100 is moved from position A to position B in FIG. 2, there is a high possibility that a deviation 401 from the circular locus 400 centered on the subject 300 will occur.
  FIG. 6A illustrates an example of the deviation from the circular locus 400 that may occur when the camera 100 moves. If the camera 100 could be moved exactly along the circular locus 400 during 3D shooting, the distance difference would be zero. Since this is difficult, the distance between the camera 100 and the subject 300 usually varies with the shooting position, as shown in FIG. 6A. When such a distance difference occurs, the size of the subject in the captured image changes: when the distance between the subject 300 and the camera 100 is short, the subject in the image becomes large, and when the distance is long, the subject becomes small. Such a change in distance from the subject 300 cannot easily be detected by the motion detection unit 107 described above. Therefore, in the present embodiment, a change in distance from the subject 300 is detected from the size of the subject in the captured image, and when a change in subject size is detected, the image of the subject portion is corrected by enlargement or reduction processing.
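  A minimal sketch of such size-based detection in Python, assuming a grayscale frame represented as a 2-D list of intensities and locating the subject as the high-contrast region (as the image processing unit does); the threshold value and names are our assumptions:

def subject_height(frame, threshold=30):
    # Rows whose horizontal intensity variation exceeds the threshold are
    # treated as containing the subject; return the height of that band.
    rows = [r for r, line in enumerate(frame)
            if max(line) - min(line) > threshold]
    return (rows[-1] - rows[0] + 1) if rows else 0

def size_change_ratio(frame, reference_frame):
    # A ratio above 1 means the subject appears larger (camera moved closer).
    return subject_height(frame) / subject_height(reference_frame)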
  Furthermore, regarding (3): changes in the angle of the camera 100 can be detected by the rotation detection unit. FIG. 6B illustrates an example of a change in the angle of the camera 100 that may occur while the camera 100 is moved. If the camera 100 is shaken while being moved, the angle at which the camera 100 views the subject 300 changes. If such an image were used for 3D display, the subject could not be rotated smoothly. Therefore, such an image is not adopted for 3D display; instead, a corrected image is generated by synthesizing at least two images obtained at the shooting positions nearest to that image.
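  The description does not specify the synthesis method; a simple 50/50 pixel average of the two nearest accepted frames is one plausible reading, sketched below (frames again as 2-D intensity lists):

def synthesize(prev_frame, next_frame):
    # Blend the neighboring frames pixel by pixel to stand in for the
    # rejected frame. A real implementation might warp or interpolate.
    return [[(a + b) // 2 for a, b in zip(row_p, row_n)]
            for row_p, row_n in zip(prev_frame, next_frame)]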
  Note that, depending on the user's hand shake and changes in moving speed, the subject may appear not at the center of the image but shifted up, down, left, or right. In such a case, it is preferable to shift the image of the subject portion so that the subject comes to the center of the image.
  FIG. 7 shows a specific example of image correction after 3D shooting, using five images selected so that their shooting positions are equally spaced.
  First, among the five images 501 to 505 shown in FIG. 7A, the images 501 and 505 show the subject at the same size as the other images (that is, with no change in distance) and were obtained without any change in angle. These images are adopted as 3D display images without correction.
  Next, among the five images 501 to 505 shown in FIG. 7A, the subject portion of the image 503 is smaller than in the other images. This means that when the image 503 was captured, the camera 100 was farther from the subject 300 than the locus 400. For such an image 503, the subject portion is enlarged, and the resulting corrected image 503a is adopted as an image for 3D display.
  Also, among the five images 501 to 505 shown in FIG. 7A, the images 502 and 504 are images in which an angle change was detected. In this case, a corrected image 502a is generated from the images 501 and 503a without using the image 502, and a corrected image 504a is generated from the images 503a and 505 without using the image 504; these corrected images 502a and 504a are adopted as images for 3D display.
  By performing the correction as described above, it is possible to generate an image capable of performing smooth 3D display as shown in FIG. 7B.
  Control of the camera capable of 3D shooting as described above will be described below. FIG. 8 is a flowchart showing the main operation of the camera according to the embodiment of the present invention. Here, the camera of the present embodiment is assumed to be a camera that can also perform general photographing, but in FIG. 8, illustration of control related to general photographing is omitted.
  When the camera 100 is activated, the control unit 101 determines whether execution of 3D shooting has been instructed by a user operation on the operation unit 102 (step S211). If execution of 3D shooting is instructed in step S211, the control unit 101 controls the photographing unit 103 to photograph the subject (step S212). In synchronization with the shooting, the control unit 101 detects the movement of the camera 100 and of the subject in the image (step S213): the movement of the camera 100 is detected by the motion detection unit 107, and the movement of the subject in the image by the image processing unit 104. The motion detection result of step S213 is associated with the image obtained by the photographing unit 103.
  After the motion detection, the control unit 101 determines whether an angle change of the camera 100 (rotation of the camera 100) has been detected and whether the subject in the image has deviated from the center beyond the allowable range (step S214). If, in step S214, a change in the angle of the camera 100 is detected or the subject in the image has deviated from the center beyond the allowable range, the control unit 101 warns the user by sound, light emission, or the like (step S215). This warning prompts the user to pay attention to the tilt and moving speed of the camera 100 during 3D shooting. On the other hand, if it is determined in step S214 that the camera 100 has not tilted and the subject in the image has not deviated from the center beyond the allowable range, the control unit 101 skips step S215.
  Next, the control unit 101 determines whether the size of the subject in the image has changed beyond the allowable range (step S216). If so, the control unit 101 warns the user by sound or light emission (step S217). This warning prompts the user to pay attention to the distance between the camera 100 and the subject during 3D shooting. If it is determined in step S216 that the size of the subject in the image has not changed beyond the allowable range, the control unit 101 skips step S217.
  Next, the control unit 101 determines whether the end of 3D shooting has been instructed by a user operation on the operation unit 102 (step S218). If the end of 3D shooting is not instructed in step S218, the process returns to step S212 and the control unit 101 executes the next shooting; that is, shooting is repeated at equal time intervals until the end of 3D shooting is instructed. When the end of 3D shooting is instructed in step S218, the control unit 101 performs image correction processing so that correct 3D display can be performed using the images obtained by the 3D shooting (step S219). Details of the image correction processing will be described later. After the image correction processing, the control unit 101 collects the resulting series of images into one folder, records it in the recording unit 106 (step S220), and then ends the process shown in FIG. 8.
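  The shooting loop of FIG. 8 (steps S212 to S218) can be summarized in Python as below. The camera interface (capture, read_motion, warn) and the threshold values are hypothetical placeholders for this sketch, not APIs or numbers from the patent.

import time

ANGLE_LIMIT = 3.0    # deg; allowable tilt change (assumed value)
SIZE_LIMIT = 0.1     # allowable relative subject-size change (assumed value)

def shoot_3d(camera, interval_s, stop_requested):
    frames = []
    while not stop_requested():                        # step S218
        frame = camera.capture()                       # step S212
        frame.motion = camera.read_motion()            # step S213
        if abs(frame.motion.angle_change) > ANGLE_LIMIT or frame.off_center:
            camera.warn("watch tilt and speed")        # steps S214-S215
        if abs(frame.size_change) > SIZE_LIMIT:
            camera.warn("keep the distance constant")  # steps S216-S217
        frames.append(frame)
        time.sleep(interval_s)                         # equal time intervals
    return frames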
  If execution of 3D shooting is not instructed in step S211, the control unit 101 determines whether 3D display has been instructed by a user operation on the operation unit 102 (step S221). If it is determined in step S221 that 3D display has not been instructed, the process returns to step S211. If 3D display is instructed in step S221, the control unit 101 executes the 3D display process (step S222), which will be described later. After the 3D display process, the control unit 101 determines whether the end of 3D display has been instructed by a user operation on the operation unit 102 (step S223). If the end of 3D display is not instructed in step S223, the process returns to step S222 and the control unit 101 continues the 3D display process. When the end of 3D display is instructed in step S223, the control unit 101 ends the process of FIG. 8.
  Next, the image correction processing will be described. FIG. 9 is a flowchart showing the image correction processing. In FIG. 9, the control unit 101 first selects a reference image; here, for example, the first image captured in the series obtained by the 3D shooting of FIG. 8 is set as the reference image (step S301). The reference image serves as the reference for deciding which images are necessary for 3D display and which are not: images are corrected so that the distance between the subject 300 and the camera 100 matches that of the reference image, and images are adopted so that the shooting positions are equally spaced with respect to the reference image.
  Next, the control unit 101 selects an image at a shooting position that is separated from the reference image by a predetermined shooting interval (step S302). Then, the control unit 101 determines whether or not determination has been performed on images at all shooting positions necessary for 3D display (step S303).
  If the determination has not yet been made for the images at all shooting positions necessary for 3D display, the control unit 101 determines whether the camera 100 was facing the shooting direction intended by the user when the image selected in step S302 was shot, based on whether the camera 100 underwent an angle change exceeding the allowable range (step S304). If it is determined in step S304 that there was an angle change exceeding the allowable range, the control unit 101 rejects the image selected in step S302 (step S305). The process then returns to step S302, and the control unit 101 selects the next image.
  On the other hand, if there was no angle change exceeding the allowable range in step S304, the control unit 101 determines whether the amount of change in the size of the subject in the image selected in step S302, relative to the size of the subject in the reference image, is greater than or equal to the allowable range (step S306). If it is determined in step S306 that the amount of change in subject size is greater than or equal to the allowable range, the control unit 101 performs processing to correct the size of the subject in the image selected in step S302 (step S307).
  Here, the correction processing of step S307 will be described. FIG. 10 is a flowchart illustrating the subject size correction processing. First, the control unit 101 detects the height H1 of the subject in the correction target image (step S321). Next, the control unit 101 detects the height H0 of the subject in the reference image (step S322); H0 may also be detected in advance. From H1 and H0, the amount of change H1/H0 in the height of the subject in the image can be calculated. The control unit 101 therefore uses the image processing unit 104 to correct the image by multiplying the height and width of the central portion of the correction target image (a square area including the subject) by H0/H1 (step S323).
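  A minimal sketch of step S323 in Python, using nearest-neighbor resampling to scale the subject region by H0/H1 (a real implementation would use a proper resampling library; the region representation is our assumption):

def scale_region(region, factor):
    # region: 2-D list of pixels; return it resized by `factor` using
    # nearest-neighbor sampling.
    src_h, src_w = len(region), len(region[0])
    dst_h, dst_w = int(src_h * factor), int(src_w * factor)
    return [[region[min(int(y / factor), src_h - 1)]
                   [min(int(x / factor), src_w - 1)]
             for x in range(dst_w)]
            for y in range(dst_h)]

# Usage: if the subject is H1 pixels tall here but H0 in the reference,
# corrected = scale_region(subject_region, H0 / H1)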
  Returning to FIG. 9: when the determination has been made for the images at all shooting positions necessary for 3D display in step S303, the control unit 101 checks whether any of the selected images was rejected (step S308). If there is a rejected image, it is removed, and at least two images, including those at the shooting positions immediately before and after the rejected image, are synthesized to generate a corrected image corresponding to the rejected shooting position (step S309). Thereafter, the control unit 101 ends the process of FIG. 9.
  Next, 3D display will be described. FIG. 11 is a flowchart illustrating 3D display processing. As described above, 3D display can be performed by either the camera 100 or the PC 200. Here, an example in which 3D display is performed on the display unit 105 of the camera 100 will be described, but the processing of the flowchart shown in FIG. 11 can also be applied to 3D display on the PC 200.
  At the time of 3D display, the control unit 101 first causes the display unit 105 to display the reference image (the image captured first during 3D shooting) of the image group collected in the folder for 3D display (step S401). The control unit 101 then determines whether an instruction to rotate the subject 300 displayed on the display unit 105 in the left-right direction has been given by a user operation on the operation unit 102 (step S402). If it is determined in step S402 that no rotation instruction has been given, the control unit 101 exits the process of FIG. 11 and returns to the process of FIG. 8. If a rotation instruction is given in step S402, the control unit 101 switches the image displayed on the display unit 105 so that the subject rotates rightward or leftward according to the direction of the user's operation, as shown in FIG. 12 (step S403). Thereafter, the control unit 101 exits the process of FIG. 11 and returns to the process of FIG. 8.
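  A minimal sketch of this switching in Python: the 3D "rotation" simply steps through the equally spaced image group, wrapping around the circle (the class and method names are illustrative):

class Viewer3D:
    def __init__(self, images):
        self.images = images   # equally spaced around the subject
        self.index = 0         # start at the reference image (step S401)

    def rotate(self, direction):
        # direction: +1 for a rightward operation, -1 for leftward.
        self.index = (self.index + direction) % len(self.images)
        return self.images[self.index]   # image to display (step S403)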
  As described above, according to the present embodiment, correct 3D display can be performed using images obtained by manually moving the camera 100 and shooting at a plurality of positions surrounding the subject 300. That is, by correcting the images in consideration of the changes in the angle and speed of the camera 100, and in the relative distance to the subject 300, that may occur when the user moves the camera 100 by hand, correct 3D information of the subject is obtained, making appropriate 3D display possible.
  In the example described above, the camera 100 corrects the images obtained by 3D shooting. However, as long as the correction processing shown in FIG. 9 can be executed, it may instead be performed by, for example, a server provided on a network. In that case, information such as the angle change and speed change of the camera 100 at each shooting position during 3D shooting must be recorded in association with the images and transmitted together with them.
  In the example described above, during 3D shooting the camera 100 is moved around the subject 300 only in a direction parallel to the ground. The camera 100 may instead be moved around the subject 300 in a direction perpendicular to the ground, in which case the subject 300 can be rotated in the vertical direction of the screen during 3D display. Also, in the example described above, when the angle of the camera 100 changes during 3D shooting, the image obtained at that time is not adopted for 3D display. Alternatively, as shown in FIG. 13, an image 503b corresponding to a vertically shifted viewpoint may be generated by synthesizing the images 502 and 504 obtained when the angle change occurred, so that images from directions other than the horizontal can also be obtained.
  Although the present invention has been described above based on the embodiments, the present invention is not limited to the above-described embodiments, and various modifications and applications are naturally possible within the scope of the gist of the present invention.
  Further, the above-described embodiments include inventions at various stages, and various inventions can be extracted by appropriately combining the disclosed constituent elements. For example, even if some constituent elements are deleted from all the constituent elements shown in the embodiments, the configuration from which those constituent elements are deleted can still be extracted as an invention, as long as the above-described problem can be solved and the above-described effects can be obtained.
FIG. 1 is a diagram showing the configuration of a camera according to an embodiment of the present invention.
FIG. 2 is a diagram showing the translation and rotation of the camera that can be detected by the motion detection unit.
FIG. 3 is a diagram for explaining the translational movement detection unit that detects movement of the camera in the translation direction.
FIG. 4 is a diagram for explaining the rotation detection unit that detects rotation of the camera.
FIG. 5 is a diagram showing an outline of 3D shooting.
FIG. 6A is a diagram showing an example of a deviation from a circular locus that can occur when the camera moves, and FIG. 6B is a diagram showing an example of a change in camera angle that can occur when the camera moves.
FIG. 7 is a diagram showing a specific example of image correction after 3D shooting.
FIG. 8 is a flowchart showing the main operation of the camera according to the embodiment of the present invention.
FIG. 9 is a flowchart showing the image correction processing.
FIG. 10 is a flowchart showing the subject size correction processing.
FIG. 11 is a flowchart showing the 3D display processing.
FIG. 12 is a diagram showing rotation of the image during 3D display.
FIG. 13 is a diagram showing a modification of the embodiment of the present invention.
Explanation of symbols
  100: camera; 101: control unit; 102: operation unit; 103: photographing unit; 104: image processing unit; 105: display unit; 106: recording unit; 107: motion detection unit; 108: communication unit; 200: personal computer (PC)

Claims (2)

  1. A display device having a photographing unit that photographs a subject so that the subject can be observed from a plurality of angles, thereby obtaining images related to the subject, the display device comprising: a motion detection unit that detects movement of the display device at the time of photographing; a position detection unit that detects a positional relationship between the subject and the photographing unit; an image correction unit that, based on the positional relationship detected by the position detection unit, corrects a plurality of images obtained by the photographing unit at a plurality of different positions surrounding the subject so that the images satisfying an allowable condition have an equivalent positional relationship; a warning unit that issues a warning according to the positional relationship; and a display unit that displays the corrected images, wherein the image related to the subject is enlarged or reduced according to a detection result of the position detection unit, and the image related to the subject is deleted according to a detection result of the motion detection unit.
  2. The display device according to claim 1, wherein the image correction unit selects, from the plurality of images obtained by the photographing unit, a group of images whose shooting positions are equally spaced, based on the positional relationship detected by the position detection unit.
JP2008198133A 2008-07-31 2008-07-31 Display device Expired - Fee Related JP5231119B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008198133A JP5231119B2 (en) 2008-07-31 2008-07-31 Display device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2008198133A JP5231119B2 (en) 2008-07-31 2008-07-31 Display device
CN200910161802XA CN101640811B (en) 2008-07-31 2009-07-30 Camera
CN201110344425.0A CN102438103B (en) 2008-07-31 2009-07-30 The method for imaging of camera and camera

Publications (3)

Publication Number Publication Date
JP2010041076A JP2010041076A (en) 2010-02-18
JP2010041076A5 JP2010041076A5 (en) 2011-08-18
JP5231119B2 true JP5231119B2 (en) 2013-07-10

Family

ID=41615553

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008198133A Expired - Fee Related JP5231119B2 (en) 2008-07-31 2008-07-31 Display device

Country Status (2)

Country Link
JP (1) JP5231119B2 (en)
CN (2) CN101640811B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012019292A (en) * 2010-07-06 2012-01-26 Sharp Corp Imaging inspection device of 3d camera module and imaging inspection method thereof, imaging inspection control program of 3d camera module, imaging correction method of 3d camera module, imaging correction control program of 3d camera module, readable recording medium, 3d camera module and electronic information apparatus
JP5892060B2 (en) * 2012-12-25 2016-03-23 カシオ計算機株式会社 Display control apparatus, display control system, display control method, and program
WO2014109125A1 (en) 2013-01-09 2014-07-17 ソニー株式会社 Image processing device, image processing method and program

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09187037A (en) * 1995-12-27 1997-07-15 Canon Inc Image processor
JP2000305207A (en) * 1999-04-21 2000-11-02 Canon Inc Electronic still camera
JP2001324305A (en) * 2000-05-17 2001-11-22 Minolta Co Ltd Image correspondent position detector and range finder equipped with the same
JP2002230585A (en) * 2001-02-06 2002-08-16 Canon Inc Method for displaying three-dimensional image and recording medium
JP4566908B2 (en) * 2003-02-18 2010-10-20 パナソニック株式会社 Imaging system
JP4172352B2 (en) * 2003-07-11 2008-10-29 ソニー株式会社 Imaging apparatus and method, imaging system, and program
JP4130641B2 (en) * 2004-03-31 2008-08-06 富士フイルム株式会社 Digital still camera and control method thereof
WO2006004043A1 (en) * 2004-07-07 2006-01-12 Nec Corporation Wide field-of-view image input method and device
JP4479396B2 (en) * 2004-07-22 2010-06-09 セイコーエプソン株式会社 Image processing apparatus, image processing method, and image processing program
JP5029012B2 (en) * 2004-09-21 2012-09-19 株式会社ニコン Electronics
JP2006113807A (en) * 2004-10-14 2006-04-27 Canon Inc Image processor and image processing program for multi-eye-point image
JP4517813B2 (en) * 2004-10-15 2010-08-04 株式会社ニコン Panning camera and video editing program
JP2006217478A (en) * 2005-02-07 2006-08-17 Sony Ericsson Mobilecommunications Japan Inc Apparatus and method for photographing image
JP4285422B2 (en) * 2005-03-04 2009-06-24 日本電信電話株式会社 Moving image generation system, moving image generation apparatus, moving image generation method, program, and recording medium
JP2007214887A (en) * 2006-02-09 2007-08-23 Fujifilm Corp Digital still camera and image composition method
JP4757085B2 (en) * 2006-04-14 2011-08-24 キヤノン株式会社 IMAGING DEVICE AND ITS CONTROL METHOD, IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM
JP4315971B2 (en) * 2006-11-09 2009-08-19 三洋電機株式会社 Imaging device

Also Published As

Publication number Publication date
CN102438103A (en) 2012-05-02
CN101640811B (en) 2012-01-11
CN101640811A (en) 2010-02-03
CN102438103B (en) 2015-10-14
JP2010041076A (en) 2010-02-18

Similar Documents

Publication Publication Date Title
CN107026973B (en) Image processing device, image processing method and photographic auxiliary equipment
KR101270893B1 (en) Image processing device, image processing method, program thereof, recording medium containing the program, and imaging device
US6304284B1 (en) Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera
US7697025B2 (en) Camera surveillance system and method for displaying multiple zoom levels of an image on different portions of a display
KR101357425B1 (en) Jiggle measuring system and jiggle measuring method
KR101339193B1 (en) Camera platform system
JP5063749B2 (en) Imaging control system, imaging apparatus control apparatus, control method therefor, and program
JP2005086279A (en) Imaging apparatus and vehicle provided with imaging apparatus
KR100585822B1 (en) Monitor system use panorama image and control method the system
KR101452342B1 (en) Surveillance Camera Unit And Method of Operating The Same
US8692879B2 (en) Image capturing system, image capturing device, information processing device, and image capturing method
JP5231119B2 (en) Display device
KR20120065997A (en) Electronic device, control method, program, and image capturing system
JP2004088558A (en) Monitoring system, method, program, and recording medium
JP2007089042A (en) Imaging apparatus
JP2005175852A (en) Photographing apparatus and method of controlling photographing apparatus
JP2019186635A (en) Imaging system, information processing apparatus, control method of information processing apparatus, and program
JP2019054369A (en) Imaging device, control method of imaging device, and program
JP2005210507A (en) Imaging apparatus and imaging method
JP2013145949A (en) Projection system, and alignment adjusting method for superposed image
US20200396385A1 (en) Imaging device, method for controlling imaging device, and recording medium
JP2010278738A (en) Imaging device and imaging method
WO2021020358A1 (en) Image processing device, image processing method, program, and storage medium
JP2021027584A (en) Image processing device, image processing method, and program
US20210021759A1 (en) Imaging apparatus and non-transitory storage medium

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110704

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110704

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120613

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120619

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120803

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20121030

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130117

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20130117

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20130125

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130305

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130321

R150 Certificate of patent or registration of utility model

Ref document number: 5231119

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20160329

Year of fee payment: 3

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313115

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

LAPS Cancellation because of no payment of annual fees