EP2718905A1 - Image processing apparatus, image processing method, and program

Image processing apparatus, image processing method, and program

Info

Publication number
EP2718905A1
Authority
EP
European Patent Office
Prior art keywords
image
images
plane
processing apparatus
plane images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12796143.1A
Other languages
German (de)
French (fr)
Inventor
Daisuke Kurosaki
Yasumasa Oda
Hanae Higuchi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011126792A (published as JP2012094111A)
Application filed by Sony Corp filed Critical Sony Corp
Publication of EP2718905A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/10 - Geometric effects
    • G06T 15/20 - Perspective computation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 - Image reproducers
    • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H04N 13/395 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume, with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/60 - Editing figures and text; Combining figures or text

Definitions

  • The present disclosure relates to an image processing apparatus, an image processing method, and a program, and particularly to a technology for generating a three-dimensional image.
  • In recent years, image display apparatuses that display an image perceived by a user as a three-dimensional image (a so-called 3D image) have been released and are beginning to spread (for example, Patent Literature 1).
  • Apparatuses that can display three-dimensional images are not limited to televisions and other image display apparatuses; some personal computers can also display three-dimensional images.
  • There are also applications that can generate content containing three-dimensional images. If content is generated by such an application and the user views it in a predetermined manner, the user can perceive the images included in the content as three-dimensional images.
  • An image processing apparatus of the present disclosure comprises an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively.
  • An image processing apparatus of the present disclosure comprises a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit.
  • An image processing apparatus of the present disclosure comprises a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit.
  • An image processing apparatus of the present disclosure comprises an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively.
  • An image processing apparatus of the present disclosure comprises an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
  • An image processing method of the present disclosure comprises generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where spatial positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
  • A program of the present disclosure causes a computer to execute an image process, the program causing the computer to execute: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where the spatial positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
  • According to the present disclosure, a plurality of plane images are appropriately displayed so that a user can generate and display a desired three-dimensional image through a simple operation.
  • The present disclosure provides an image processing apparatus, an image processing method, and a computer program which are novel or improved and which enable content containing three-dimensional images to be generated easily.
  • Fig. 1 is a block diagram illustrating an example of the configuration of an image processing apparatus according to an embodiment of the present disclosure.
  • Fig. 2 is a diagram illustrating an example of an editing screen (example in which all layers are displayed) according to the embodiment of the present disclosure.
  • Fig. 3 is an exemplary diagram illustrating an example of an image of each layer according to the embodiment of the present disclosure.
  • Fig. 4 is an exemplary diagram illustrating an example of a depth display screen according to the embodiment of the present disclosure.
  • Fig. 5 is an exemplary diagram illustrating a concept of converting images of a plurality of layers into three-dimensional images.
  • Fig. 6 is an exemplary diagram illustrating a state in which images of a plurality of layers are converted into three-dimensional images.
  • Fig. 7 is an exemplary diagram illustrating an example in which images of a plurality of layers are converted into three-dimensional left and right channel images.
  • Fig. 8 is an exemplary diagram illustrating an example of three-dimensional images that are generated from images of a plurality of layers.
  • Fig. 9 is an exemplary diagram illustrating a converted state of the case in which there is a layer in which the front side of a virtual display surface is set to a virtual position.
  • Fig. 10 is an exemplary diagram illustrating an example of a three-dimensional image of the case where there is a layer in which the front side of a virtual display surface is set to a virtual position.
  • Fig. 11 is a flowchart illustrating an example of the flow of an editing screen display process according to the embodiment of the present disclosure.
  • Fig. 12 is an exemplary diagram illustrating an example of an editing screen (example in which a third layer is displayed) according to the embodiment of the present disclosure.
  • Fig. 13 is an exemplary diagram illustrating an example of an editing screen (example in which a second layer is displayed) according to the embodiment of the present disclosure.
  • Fig. 14 is an exemplary diagram illustrating an example of an editing screen (example in which a first layer is displayed) according to the embodiment of the present disclosure.
  • Fig. 15 is a flowchart illustrating an example of the flow of a depth screen display process according to the embodiment of the present disclosure.
  • Fig. 16 is an exemplary diagram illustrating an example (first example) of displaying a depth screen according to the embodiment of the present disclosure.
  • Fig. 17 is an exemplary diagram illustrating an example (second example) of displaying a depth screen according to the embodiment of the present disclosure.
  • Fig. 18 is an exemplary diagram illustrating an example (third example) of displaying a depth screen according to the embodiment of the present disclosure.
  • Fig. 19 is an exemplary diagram illustrating an example (fourth example) of displaying a depth screen according to the embodiment of the present disclosure.
  • Fig. 20 is a flowchart illustrating an example of the flow of a horizontal line setting process according to the embodiment of the present disclosure.
  • Fig. 21 is an exemplary diagram illustrating an example of a horizontal line setting screen according to the embodiment of the present disclosure.
  • Fig. 22 is an exemplary diagram illustrating an example (first example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
  • Fig. 23 is an exemplary diagram illustrating an example (second example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
  • Fig. 24 is an exemplary diagram illustrating an example (third example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
  • Fig. 25 is an exemplary diagram illustrating an example (fourth example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure.
  • Fig. 26 is an exemplary diagram illustrating an example of a display screen at the time of capturing a camera image according to the embodiment of the present disclosure.
  • Fig. 27 is an exemplary diagram illustrating a display example of a three-dimensional image where camera images are synthesized according to the embodiment of the present disclosure.
  • Fig. 28 is an exemplary diagram illustrating an example of a display screen at the time of importing an image file according to the embodiment of the present disclosure.
  • Fig. 29 is an exemplary diagram illustrating an example of an importing range setting at the time of importing an image file according to the embodiment of the present disclosure.
  • Fig. 30 is an exemplary diagram illustrating a display example of a three-dimensional image in which images imported from an image file are synthesized according to the embodiment of the present disclosure.
  • Fig. 31 is an exemplary diagram illustrating an example of a list display screen of productions according to the embodiment of the present disclosure.
  • Fig. 32 is a block diagram illustrating an example of the detailed hardware configuration of an image processing apparatus according to the embodiment of the present disclosure.
  • <1. Embodiment of the present disclosure>
  • <1-1. Configuration of an image processing apparatus (Fig. 1)>
  • <1-2. Outline of a three-dimensional image (Figs. 3 to 10)>
  • <1-3. Display example of an editing screen (Fig. 2 and Figs. 11 to 14)>
  • <1-4. Display example of a depth adjustment screen (Figs. 15 to 19)>
  • Fig. 1 is a diagram illustrating the configuration of an image processing apparatus 100 according to this example.
  • An image processing apparatus 100 is configured to generate an image by a user operation and display and store the generated image.
  • the image processing apparatus 100 includes an image generating/processing unit 110, an image storage unit 120, an input unit 130, and an image display unit 140.
  • the image generating/processing unit 110 provides the user with an image generation screen through the image display unit 140 or generates a three-dimensional image from the image generated by the user.
  • the image generating/processing unit 110 that is included in the image processing apparatus 100 according to this example includes an image generating unit 112, a three-dimensional image converting unit 114, a three-dimensional image generating unit 116, and an editing screen generating unit 118.
  • the editing screen generating unit 118 also functions as an image control unit that controls image generation in each unit.
  • the image generating/processing unit 110 may include an image control unit that controls image generation in each unit, separately from the editing screen generating unit 118.
  • the image generating/processing unit 110 generates a plurality of plane images (for example, three plane images) on the basis of a user operation in a state in which an editing screen for image generation is displayed on the image display unit 140 and generates a three-dimensional image from the plurality of generated plane images (two-dimensional images).
  • The editing screen for image generation is the screen illustrated in Fig. 2; it displays an intermediate image in generation processing at the center and operation buttons and tabs around the image.
  • the editing screen illustrated in Fig. 2 will be described below in detail.
  • Information of the depth from the reference position (for example, virtual display surface) is set to each of the plurality of generated plane images and the three-dimensional image is generated on the basis of the depth.
  • The image generating/processing unit 110 supplies image data of the generated three-dimensional image to the image display unit 140, and the image display unit 140 displays the three-dimensional image.
  • The user views the three-dimensional image using a predetermined method (for example, wearing time-division driven shutter glasses) and perceives the image displayed on the image display unit 140 as a three-dimensional image.
  • The image generating unit 112 displays the editing screen for image generation on the image display unit 140 and generates an image according to user operations. When images including a plurality of layers are generated using this screen, they are converted into a three-dimensional image by the three-dimensional image converting unit 114 and the three-dimensional image generating unit 116. The images including the plurality of layers generated by the image generating unit 112 are stored in the image storage unit 120 according to the user operation.
  • the three-dimensional image converting unit 114 executes a conversion process for displaying the images including the plurality of layers transmitted from the image generating unit 112 as the three-dimensional image on the image display unit 140.
  • The image processing apparatus 100 assumes in advance the distance between the eyes of the user and the distance between the user and the display surface, and executes a conversion process for displaying the image as the three-dimensional image on the image display unit 140, on the basis of the virtual distance between the layers (the depth information of each layer of the images).
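  • The specification gives no conversion formula; under the stated assumptions a similar-triangles sketch of the geometry yields the projected positions and the left-right disparity of a point (the symbols e, V, D, and x_0 are our own notation, not the patent's: e is the distance between the eyes, V the distance from the viewer to the display surface, D the virtual depth of a layer behind the surface, and x_0 the horizontal position of a point in the layer):

```latex
% Eyes at z = -V, separated by e; virtual display surface at z = 0;
% a point of a layer at horizontal position x_0 and depth z = D.
x_{R,L} = \frac{V\,x_0}{V+D} \pm \frac{e}{2}\cdot\frac{D}{V+D},
\qquad
d = x_R - x_L = \frac{e\,D}{V+D}.
```

  • For a layer behind the virtual display surface (D > 0) the disparity d is positive; for a layer on its front side (D < 0) it changes sign, which corresponds to the left-right reversal described below in connection with Figs. 9 and 10.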
  • the three-dimensional image converting unit 114 executes a coordinate conversion process with respect to the images including the plurality of layers.
  • When the set depth of a layer is changed, the three-dimensional image converting unit 114 executes the conversion process in real time according to the change. Thereby, the user can adjust the depth of each layer of the images and confirm the adjusted three-dimensional image in real time through the display on the editing screen.
  • An example of a process for adjusting the depth of each layer of the images will be described in detail below.
  • the image processing apparatus 100 executes preview display of the three-dimensional image.
  • By the preview display of the three-dimensional image, the user can grasp in advance how the images will be seen three-dimensionally before storing the generated image as the three-dimensional image.
  • the three-dimensional image generating unit 116 generates the three-dimensional image from the images including the plurality of layers, on the basis of the conversion process executed by the three-dimensional image converting unit 114.
  • the three-dimensional image generated by the three-dimensional image generating unit 116 is displayed on the image display unit 140 and is stored in the image storage unit 120 according to the operation of the input unit 130 from the user.
  • the editing screen generating unit 118 generates display data of the editing screen, on the basis of a reception state of an input operation in the input unit 130.
  • the editing screen generating unit 118 supplies the display data generated by the editing screen generating unit 118 to the image display unit 140 and displays the editing screen.
  • the image storage unit 120 stores the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers.
  • the images stored in the image storage unit 120 are read from the image storage unit 120 according to the operation of the input unit 130 by the user, are processed by the image generating/processing unit 110, and are displayed by the image display unit 140.
  • The input unit 130 includes various input devices with which the user performs input operations on the image processing apparatus 100, for example, a keyboard, a mouse, a graphic tablet, and a touch panel.
  • the user can generate the images including the plurality of layers by operating the input unit 130 and adjust the depth of each layer of the images when the images are converted into the three-dimensional image.
  • the image display unit 140 is a display that displays an image.
  • the image display unit 140 displays the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers.
  • the image display unit 140 displays a screen to allow the images to be generated by the user of the image processing apparatus 100.
  • An example of the display screen will be described below.
  • a touch panel may be disposed on an image display surface of the image display unit 140 and the user may directly operate buttons in a displayed image.
  • the touch panel that is included in the image display unit 140 functions as a part of the input unit 130.
  • the image display unit 140 may be a device that is separated from the image processing apparatus 100.
  • the image display unit 140 may be configured using a display device that can display the three-dimensional image.
  • a method that displays the three-dimensional image is not limited to a specific display method.
  • For example, a method that displays an image for a right eye and an image for a left eye by switching them at high speed is known.
  • As methods for transmitting the three-dimensional image to the image display unit 140, a frame sequential method, a side-by-side method, a top-and-bottom method, and the like are known.
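  • As a concrete illustration of these transmission formats, the following sketch (Python with NumPy; the frame shapes and names are assumptions for illustration, not taken from the specification) packs a left/right pair for the side-by-side and top-and-bottom methods and alternates frames for the frame sequential method:

```python
import numpy as np

def pack_side_by_side(left, right):
    """Squeeze each view to half width and place them side by side."""
    return np.concatenate([left[:, ::2], right[:, ::2]], axis=1)

def pack_top_and_bottom(left, right):
    """Squeeze each view to half height and stack them vertically."""
    return np.concatenate([left[::2, :], right[::2, :]], axis=0)

def frame_sequential(left, right):
    """Alternate full-resolution left and right frames in time."""
    while True:
        yield left
        yield right

left = np.zeros((1080, 1920, 3), dtype=np.uint8)   # image for the left eye
right = np.zeros((1080, 1920, 3), dtype=np.uint8)  # image for the right eye
packed = pack_side_by_side(left, right)            # one 1080 x 1920 frame
```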
  • The images generated by the image generating/processing unit 110 may also be output to a television receiver or other display devices that are connected to the image processing apparatus 100 and can display the three-dimensional image.
  • In this example, the image processing apparatus 100 has plane images 251, 252, and 253 constituting three layers (a long-distance view, a middle-distance view, and a short-distance view) generated by the user.
  • The image processing apparatus 100 synthesizes the plane images 251, 252, and 253 of the three layers, and the three-dimensional image converting unit 114 converts them into a three-dimensional image.
  • the user can generate the three-dimensional image without executing a complicated imaging process.
  • Objects a, b, c, and g, respectively corresponding to a mountain, the sun, a horizontal line, and a road, are drawn in the plane image 251 of the long-distance view
  • an object d of a tree is drawn in the plane image 252 of the middle-distance view
  • objects e and f each corresponding to a dog and an insect are drawn in the plane image 253 of the short-distance view.
  • The number of layers (three) is exemplary; the number of layers may be any plural number.
  • Fig. 4 is a diagram illustrating a setting situation of the depth.
  • The setting state illustrated in Fig. 4 is reduced and displayed on the 3D state thumbnail tab 202 of the editing screen described below.
  • When the 3D state thumbnail tab 202 is selected, the setting state illustrated in Fig. 4 is enlarged and displayed on the editing screen.
  • a depth axis 301 is set to a direction orthogonal to a display screen and the depth positions of the layer images 311, 312, and 313 are set on the depth axis 301.
  • the depth axis 301 is illustrated as an oblique axis that has a predetermined angle.
  • An image frame that is illustrated by a broken line in Fig. 4 illustrates a virtual display surface 304 at the position of the image display surface of the display.
  • the virtual display surface 304 exists at the depth position 301a on the depth axis 301.
  • the position of the most front side of the depth axis 301 is illustrated as a front edge portion 302.
  • The first layer image 311 of the long-distance view exists at the innermost depth position 301b on the depth axis 301, and the second layer image 312 of the middle-distance view exists at the depth position 301c near the center of the depth axis 301.
  • the third layer image 313 of the short-distance view exists at the depth position 301d of the front side of the virtual display surface 304 on the depth axis 301.
  • the layer image 313 of the short-distance view exists at the depth position of the front side of the virtual display surface 304.
  • However, the layer image 313 of the short-distance view may instead be set deeper than the virtual display surface 304 (closer to the long-distance view).
  • the layer image may be automatically set to the predetermined depth position suitable for the long-distance view as the first layer image 311, in an initial state.
  • the layer image may be automatically set to the predetermined depth position suitable for the middle-distance view as the second layer image 312 and the layer image may be automatically set to the predetermined depth position suitable for the short-distance view as the third layer image 313.
  • the three-dimensional image is generated such that the objects a to g in the layer images 311, 312, and 313 are disposed at the depth positions of the layer images in a virtual three-dimensional space.
  • The image portion below the horizontal line position 321 of the layer image 311 of the long-distance view is disposed so as to be inclined in the three-dimensional space and to protrude toward the front side.
  • the image portion of the lower side of the horizontal line c becomes an inclined surface 303 that gradually protrudes to the front side, from the depth position 301b of the layer image 311 to the front edge portion 302, as the image portion progresses to the lower side.
  • the object g of the road that is drawn on the lower side of the horizontal line position 321 in the layer image 311 is disposed on the inclined surface 303.
  • the inclined surface 303 protrudes to the front edge portion 302 of the front side of the third layer image 313.
  • the inclined surface 303 may protrude to the virtual display surface 304.
  • The layer on the most front side (the third layer image 313) protrudes further to the front than the terminating point of the front side of the inclined surface 303.
  • each object is automatically appropriately disposed on the inclined surface 303.
  • a process for erasing inappropriate portions of the objects is executed.
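  • The inclined surface described above can be parameterized as a linear depth ramp; a minimal sketch (all names are illustrative, and the linear interpolation is an assumption consistent with the description of Fig. 4):

```python
def inclined_surface_depth(row, horizon_row, bottom_row,
                           layer_depth, front_edge_depth=0.0):
    """Virtual depth of a pixel row of the long-distance layer.

    row              -- pixel row (0 = top of the image)
    horizon_row      -- row of the horizontal line position 321
    bottom_row       -- last row of the image
    layer_depth      -- depth position 301b of the long-distance layer
    front_edge_depth -- depth of the front edge portion 302
    """
    if row <= horizon_row:
        return layer_depth              # above the horizon: flat backdrop
    t = (row - horizon_row) / (bottom_row - horizon_row)
    # depth interpolates linearly toward the front edge as rows go down
    return layer_depth + t * (front_edge_depth - layer_depth)
```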
  • a specific example of the setting of the horizontal line and the process associated with the horizontal line will be described below.
  • Fig. 5 is a diagram illustrating a concept of converting normal two-dimensional images including a plurality of layers into a three-dimensional image.
  • Fig. 5 illustrates an aspect where two-dimensional images including a plurality of layers (plane images 251, 252, and 253) are converted into an image 250R for a right eye which the user views using the right eye and an image 250L for a left eye which the user views using the left eye.
  • the process of the horizontal line described above is not illustrated.
  • the three-dimensional image converting unit 114 calculates the drawing positions of the image for the right eye and the image for the left eye to generate the image 250R for the right eye and the image 250L for the left eye from the two-dimensional images.
  • Figs. 6 to 8 are exemplary diagrams illustrating an example where normal two-dimensional images including a plurality of layers are converted into a three-dimensional image.
  • Fig. 6 illustrates the coordinate conversion performed when an image for a right eye and an image for a left eye are generated from the two-dimensional images including three layers, as illustrated in Fig. 7, and the coordinate conversion for the case where the three layers are deeper than the display surface, as illustrated in Fig. 8.
  • Fig. 6 schematically illustrates a state in which each layer and the display surface are viewed from the top side.
  • the three-dimensional image converting unit 114 executes a projection coordinate conversion for the image for the right eye and a projection coordinate conversion for the image for the left eye, using the layer depths D1, D2, and D3 between the virtual display surface 259 and the layers. With the projection coordinate conversion, the pixel positions of the objects of each layer image in the virtual display surface 259 are calculated.
  • the three-dimensional image converting unit 114 executes the projection coordinate conversion with respect to the virtual display surface 259 and the image processing apparatus 100 can convert the normal two-dimensional images including the plurality of layers into the three-dimensional image.
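  • The patent publishes no source code for this conversion; the sketch below shows one common realization, applying to each layer only the constant parallax shift (e/2)·D/(V+D) from the formula given earlier (the per-layer scale factor V/(V+D) is omitted for simplicity) and compositing the layers back to front. The use of NumPy, the RGBA layout, and all names are assumptions:

```python
import numpy as np

E = 6.5     # assumed interocular distance e (cm)
V = 100.0   # assumed viewing distance V to the display surface (cm)

def parallax_px(depth_cm, eye_offset_cm, px_per_cm=40.0):
    """Horizontal pixel shift of a layer for one eye. The sign follows the
    depth, so a layer in front of the surface (depth < 0) shifts the
    opposite way, giving the left-right reversal of Figs. 9 and 10."""
    return int(round(eye_offset_cm * depth_cm / (V + depth_cm) * px_per_cm))

def render_view(layers, depths, eye_offset_cm):
    """Composite RGBA layers (given deepest first) into one eye's view."""
    canvas = np.zeros(layers[0].shape, dtype=np.float64)
    for layer, depth in zip(layers, depths):
        shifted = np.roll(layer, parallax_px(depth, eye_offset_cm), axis=1)
        alpha = shifted[..., 3:4] / 255.0
        canvas = canvas * (1.0 - alpha) + shifted * alpha
    return canvas.astype(np.uint8)

def to_stereo(layers, depths):
    """Return the image for the left eye and the image for the right eye."""
    return (render_view(layers, depths, -E / 2),
            render_view(layers, depths, +E / 2))
```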
  • Figs. 6 to 8 are examples where each layer is set to the depth deeper than the depth of the virtual display surface 259. However, each layer may be set to the depth of the front side of the virtual display surface 259.
  • Figs. 9 and 10 illustrate an example where two-dimensional images are converted into a three-dimensional image when an image of the layer of the short-distance view is disposed on the front side of a virtual display surface 259'.
  • Fig. 9 schematically illustrates, as viewed from above, the coordinate conversion performed when an image for a right eye and an image for a left eye are generated from the two-dimensional images including three layers illustrated in Fig. 10.
  • For objects in the layer on the front side of the virtual display surface 259', the pixel positions at which the image for the right eye and the image for the left eye are projected onto the virtual display surface 259' are reversed left and right, compared with the pixel positions for layers deeper than the virtual display surface 259'.
  • As a result, a three-dimensional image is obtained in which the objects e and f disposed on the layer of the short-distance view appear to protrude to the front side of the display surface.
  • the editing screen is the screen illustrated in Fig. 2 that is displayed on the image display unit 140, on the basis of the process in the editing screen generating unit 118.
  • a flowchart of Fig. 11 illustrates a flow of the process in the editing screen generating unit 118. The description will be given with reference to Fig. 11.
  • In step S11, it is determined whether the input unit 130 has received a user operation to display the editing screen. When the user operation is not received, a waiting state is maintained; when the user operation to display the editing screen is received, the process for displaying the 3D editing screen illustrated in Fig. 2 is executed (step S12). In this state, all of the layers are overlapped and displayed on the 3D editing screen as the intermediate image in generation processing. In addition, multi-layer thumbnails corresponding to the number of layers are displayed.
  • In step S13, it is determined whether an operation to select any one of the displayed multi-layer thumbnails has been received by the input unit 130.
  • When no selection is received, a waiting state is maintained and the image of the editing screen is not changed.
  • When a layer thumbnail is selected, the image of the layer that corresponds to the selected layer thumbnail is displayed on the editing screen (step S14).
  • At this time, the images of the other layers are displayed simultaneously with decreased display brightness, so that the entire three-dimensional image can still be recognized.
  • In step S15, it is determined whether an operation to change to another layer by selecting its layer thumbnail exists.
  • When such an operation exists, a change process of the layer displayed on the editing screen is executed (step S16), the process returns to step S14, and the image of the layer after the change is displayed.
  • In step S17, it is determined whether an operation to change the display to the overlapped display of all of the layers exists. When it is determined that such an operation exists, the display is changed to the overlapped display of all of the layers and the process returns to the 3D editing screen display in step S12.
  • When it is determined in step S17 that the operation to change the display to the overlapped display of all of the layers does not exist, the screen display of the selected layer in step S14 is continued.
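  • The flow of Fig. 11 can be summarized as a simple event loop; the sketch below illustrates only the control flow, and the events and screen interfaces are hypothetical rather than part of the disclosed apparatus:

```python
def editing_screen_loop(events, screen):
    while events.wait() != "open_editor":              # S11: wait for request
        pass
    screen.show_all_layers_overlapped()                # S12: 3D editing screen
    while True:
        ev = events.wait()
        if ev.startswith("select_layer:"):             # S13 / S15
            index = int(ev.split(":", 1)[1])
            screen.show_layer(index, dim_others=True)  # S14 / S16
        elif ev == "show_all_layers":                  # S17
            screen.show_all_layers_overlapped()        # back to the S12 state
```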
  • Fig. 2 illustrates an example of an editing screen in a state in which all layers of intermediate images in generation processing are overlapped and displayed.
  • The intermediate image display 203 in editing processing is shown as a relatively large image at the center portion.
  • In the editing image display 203, an image in which the images 251 to 253 of the three layers illustrated in Fig. 3 are overlapped is displayed.
  • Edge portions 203a to 203c of the three images are illustrated overlapping at the upper edge portion of the editing image display 203.
  • the edge portion (for example, edge portion 203a) of the image that corresponds to the selected tab is displayed on the front side and the edge portions of the other images are displayed on the inner side.
  • Plural tabs 201, 202, and 211 to 213 for selecting a display image are disposed on the upper side of the intermediate image display 203 in editing processing in the editing screen.
  • When a user operation to select the display place of one of the tabs 201, 202, and 211 to 213 exists (by a touch panel operation or the like), the image allocated to the selected tab is displayed.
  • the layer thumbnail tabs 211, 212, and 213 are tabs to individually display the layers.
  • the first layer thumbnail tab 211 is a tab that displays a first layer image
  • the second layer thumbnail tab 212 is a tab that displays a second layer image
  • the third layer thumbnail tab 213 is a tab that displays a third layer image.
  • On the layer thumbnail tabs, the images of the layers are reduced and displayed as thumbnail images. Therefore, when the image of a layer is modified, its thumbnail image is also modified in the same way.
  • the tab 201 that is displayed adjacent to the layer thumbnail tabs 211, 212, and 213 is a tab that adds the layer image. That is, when the tab 201 is selected, a layer image is newly added and an editing screen of the newly added layer image is displayed.
  • The 3D state thumbnail tab 202, disposed toward the right of the upper side of the intermediate image display 203 in editing processing, is a tab that displays the depth state of the image of each layer.
  • When the 3D state thumbnail tab 202 is selected, the display screen of the depth state illustrated in Fig. 4 is enlarged and displayed in place of the editing image display 203.
  • On the tab itself, the display screen of the depth state illustrated in Fig. 4 is reduced and displayed.
  • When the depth state is adjusted, the display content changes according to the adjustment.
  • buttons 221 to 225 are disposed on the left end of the editing image display 203.
  • A file importing button 221, a camera capturing button 222, a stamp button 223, a character input button 224, and a depth operation button 225 are prepared.
  • If a user operation to select the file importing button 221 exists, a process for importing a file image prepared in the image storage unit 120 or an external memory starts. If a user operation to select the camera capturing button 222 exists, a process for capturing image data from a camera device connected to the apparatus starts. For the stamp button 223 and the character input button 224, a process for inputting prepared figures or characters starts by a user operation of each button. If a user operation to select the depth operation button 225 exists, a depth adjustment screen is displayed on the editing image display 203. A display process of the depth adjustment screen will be described below.
  • a generation start button 241, a save button 242, and a three-dimensional display button 243 are displayed on the right end of the editing image display 203.
  • the generation start button 241 is a button to instruct to generate a new three-dimensional image.
  • the save button 242 is a button to instruct to store an intermediate image in generation processing in the image storage unit 120.
  • the three-dimensional display button 243 is a button to switch three-dimensional display and normal display (2D display) of the image displayed on the editing image display 203.
  • a pen tool 290 is displayed on the lower side of the right end of the editing image display 203.
  • The pen tool 290 of this example displays a first pen 291, a second pen 292, a third pen 293, and an eraser 294.
  • When one of the pens is selected, a line can be drawn with the color or line type allocated to that pen. If a user operation to select the display place of the eraser 294 exists, drawn lines are erased.
  • the pen tool 290 may have a function of supporting other drawing or erasure.
  • object display units 231 to 237 are provided on a lower end of the editing image display 203.
  • In the plurality of object display units 231 to 237, for example, the seven objects most recently used by the user are displayed.
  • each object of the intermediate image in generation processing is illustrated.
  • Fig. 2 illustrates an example where images of the individual layers are overlapped and displayed in the editing image display 203. Even when the other images are displayed in the editing image display 203, the same display is performed around the editing image display 203.
  • A tab that selects the overlapped display in which the image of each layer illustrated in Fig. 2 is overlapped is not particularly provided.
  • the tab that selects the overlapped display may be prepared separately from the tab for the image of each layer.
  • In that case, a thumbnail image in which the overlapped image of the layers is reduced is displayed on the tab. Therefore, when the image of a layer is modified, the thumbnail image of the overlapped image is modified according to the change.
  • Fig. 12 illustrates an example where the image of the third layer (image 253 of the short-distance view of Fig. 3) is displayed in the editing image display 203.
  • the image display of the third layer is performed by selecting the third layer thumbnail tab 213 by a user operation.
  • the objects e and f in the image of the third layer are displayed with set colors or brightness.
  • the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness. That is, only the image of the third layer is highlighted and displayed and the images of the other layers are grayed out and displayed.
  • In this state, the image of the third layer is generated or edited.
  • Fig. 13 illustrates an example where the image of the second layer (image 252 of the middle-distance view of Fig. 3) is displayed in the editing image display 203.
  • the image display of the second layer is performed by selecting the second layer thumbnail tab 212 by a user operation.
  • the object d in the image of the second layer is displayed with set colors or brightness.
  • the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness.
  • Fig. 14 illustrates an example where the image of the first layer (image 251 of the long-distance view of Fig. 3) is displayed in the editing image display 203.
  • the image display of the first layer is performed by selecting the first layer thumbnail tab 211 by a user operation.
  • the objects a, c, and g in the image of the first layer are displayed with set colors or brightness.
  • the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness.
  • the image of the selected layer is highlighted and displayed and the images of the other layers are grayed out and displayed. Meanwhile, as the editing image, only the objects of the image of the selected layer may be displayed, and the objects of the images of the other layers may not be displayed.
  • a depth adjustment process is started by a user operation to select a depth operation button 225 of the editing screen illustrated in Fig. 2, for example. That is, as illustrated in Fig. 15, the editing screen generating unit 118 determines whether an operation to select the depth operation button 225 exists in a state in which the editing screen is displayed (step S21). When the operation to select the depth operation button 225 does not exist, a waiting state is maintained until the operation exists.
  • When it is determined in step S21 that the operation to select the depth operation button 225 exists, overlapped display of the images of all of the layers is performed as the intermediate image display 203 in editing processing, and a depth bar is displayed on the upper side of the intermediate image display 203 (step S22).
  • The depth bar is a scale that illustrates the depth of each layer. As in this example, when the image is configured using three layers, the depth positions of the three layers are displayed on the depth bar. On the lower side of the editing image display 203, depth adjustment buttons are displayed.
  • a specific display example will be described below. However, in this example, for each layer, an adjustment button to move the depth of the image to the inner side and an adjustment button to move the depth of the image to the front side are prepared and displayed.
  • The editing screen generating unit 118 determines whether a user operation of any adjustment button exists (step S23). When it is determined that no operation of an adjustment button exists, the display of step S22 is continued. When it is determined that an operation of an adjustment button exists, the image of the layer corresponding to the operated adjustment button is displayed as the intermediate image display 203 in editing processing (step S24). The depth position set to the image of the corresponding layer is changed according to the operation of the adjustment button, and the depth position on the depth bar display is changed accordingly (step S25). After the setting or the display is changed in step S25, the display returns to the display of step S22. However, the intermediate image display 203 in editing processing may display only the operated layer until a next operation exists.
  • Fig. 16 is a diagram illustrating a display example of the depth bar.
  • The overlapped display of the images of all of the layers is performed as the intermediate image display 203 in editing processing, and the depth bar 401 is displayed on the upper side of the editing image display 203.
  • the depth positions of the images of the three layers are illustrated in one depth bar 401. That is, in the depth bar 401, the depth position 401a of the image of the first layer, the depth position 401b of the image of the second layer, and the depth position 401c of the image of the third layer are illustrated by changing the display colors.
  • "0" indicates the depth position of the virtual display surface, the inner side is indicated by a minus value, and the front side is indicated by a plus value (plus display is not illustrated).
  • buttons that adjust the depth position are displayed for the image of each layer.
  • a depth adjustment button 411 that moves the depth to the front side and a depth adjustment button 412 that moves the depth to the inner side are displayed as adjustment buttons for the image of the first layer.
  • a depth adjustment button 421 that moves the depth to the front side and a depth adjustment button 422 that moves the depth to the inner side are displayed as adjustment buttons for the image of the second layer.
  • a depth adjustment button 431 that moves the depth to the front side and a depth adjustment button 432 that moves the depth to the inner side are displayed as adjustment buttons for the image of the third layer.
  • the setting of the depth of the image of each layer is changed.
  • the depth position of the image of the first layer is changed by the operations of the depth adjustment buttons 411 and 412 and the display of the depth position 401a of the image of the first layer in the depth bar 401 is moved as illustrated by an arrow La.
  • the depth position of the image of the second layer is changed by the operations of the depth adjustment buttons 421 and 422 and the display of the depth position 401b of the image of the second layer in the depth bar 401 is moved as illustrated by an arrow Lb.
  • the depth position of the image of the third layer is changed by the operations of the depth adjustment buttons 431 and 432 and the display of the depth position 401c of the image of the third layer in the depth bar 401 is moved as illustrated by an arrow Lc.
  • The movement by the depth adjustment buttons 411 to 432 is limited so that a layer cannot pass the depth positions of the images of the adjacent layers. For example, the range of the movement La of the image of the first layer is from the deepest position to the position adjacent to the depth position of the image of the adjacent second layer.
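  • This adjacency limit can be expressed as a clamp on the new depth value; a small sketch with an assumed data layout (depths ordered from deepest to frontmost, negative values on the inner side and 0 at the virtual display surface, as on the depth bar of Fig. 16):

```python
def adjust_depth(depths, layer, delta, min_depth=-100, max_depth=0, gap=1):
    """Move depths[layer] by delta, clamped so that it cannot pass the
    depth positions of the adjacent layers."""
    lower = depths[layer - 1] + gap if layer > 0 else min_depth
    upper = depths[layer + 1] - gap if layer < len(depths) - 1 else max_depth
    depths[layer] = max(lower, min(upper, depths[layer] + delta))
    return depths

depths = [-80, -40, -10]        # first, second, and third layer
adjust_depth(depths, 0, +60)    # first layer stops next to the second: -41
```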
  • Fig. 17 illustrates another display example of the depth bar.
  • In this example, the image of the selected layer (here, the image of the third layer) is displayed with emphasis as the intermediate image display 203 in editing processing.
  • In the depth bar 501, only the depth position 502 of the layer displayed with emphasis is illustrated.
  • In the depth bar 501, the entire depth range set by the image processing apparatus 100 is illustrated, together with a position display 503 of the virtual display surface.
  • Only the depth adjustment buttons 511 and 512 for the image of the third layer, which is the selected layer, are displayed.
  • the depth position of the image of the layer of the emphasis display is adjusted by the operations of the depth adjustment buttons 511 and 512 and the display position of the depth position 502 of the depth bar 501 is changed as illustrated by an arrow Ld.
  • Fig. 18 illustrates another display example of the depth bar.
  • In this example, the image of the selected layer (here, the image of the second layer) is displayed with emphasis as the intermediate image display 203 in editing processing.
  • In the depth bar 601, only the range in which the depth of the image of the selected layer can be adjusted is displayed with scales.
  • The depth position 602 is displayed in the depth bar 601. That is, for the depth of the image of the second layer, the inner side is limited by the depth position of the image of the first layer and the front side is limited by the depth position of the image of the third layer.
  • The range within which the depth can be adjusted under these limits becomes the scale display range of the depth bar 601. Therefore, when the depth position is adjusted by the operations of the depth adjustment buttons 611 and 612, the depth position 602 moves within the range of the displayed depth bar 601, as illustrated by the arrow Le.
  • Fig. 19 illustrates another display example of the depth bar.
  • The overlapped display of the images of all of the layers is performed in the intermediate image display 203 in editing processing, and the depth position 702 of the image of the layer whose depth is being adjusted is illustrated in the depth bar 701.
  • a position 703 of the virtual display surface is indicated.
  • depth adjustment buttons 711, 712, 721, 722, 731, and 732 are provided for each layer and the depth position 702 is changed by adjusting the depth by the button operation, as illustrated by an arrow Lf.
  • The position of the image of the layer that is the target of the depth adjustment is indicated in the editing image display 203 by an image frame 704 shown at its four corners.
  • A numerical value display 705 (in this example, "-25") indicating the set depth position is also performed. In this way, display from which the depth setting can be recognized in the display image may be performed.
  • the ground surface setting process starts when the user operates a button (not illustrated in the drawings) to instruct setting of the ground surface, in the editing screen illustrated by Fig. 2. That is, as illustrated in Fig. 20, the editing screen generating unit 118 determines whether an operation to instruct setting of the ground surface exists, in a state in which the editing screen is displayed (step S31). When the operation of the setting of the ground surface does not exist, a waiting state is maintained until the corresponding operation exists.
  • When it is determined in step S31 that the operation for setting the ground surface exists, the overlapped display of the images of all of the layers is performed as the intermediate image display 203 in editing processing, and a slider bar for horizontal line adjustment is displayed vertically on one end of the intermediate image display 203 (step S32).
  • a slider handle that indicates the position of the horizontal line is displayed on the slider bar for the horizontal line adjustment, and it is determined whether a drag operation of the slider handle exists (step S33).
  • When a drag operation exists, a change process of the position of the horizontal line is executed according to the operation (step S34).
  • the lower side of the horizontal line of the image of the innermost layer (first layer) is set to the inclined surface on the three-dimensional space, according to the change of the position of the horizontal line.
  • the setting of the inclined surface is the process already described in Fig. 4 and the inclined surface corresponds to the ground surface.
  • In step S35, it is determined whether a mode to erase the portions below the ground surface in the images of the layers other than the first layer is set.
  • When the mode to erase the portions below the ground surface is set, the portions of objects located below the inclined surface (ground surface) in the images of the layers other than the first layer are erased (step S36).
  • Next, it is determined whether an object set to be disposed on the ground surface exists among the objects of the images of the individual layers (step S37). When it is determined that such an object exists, the position of the lower end of the corresponding object is adjusted to the position where the image of the layer in which the object exists crosses the inclined surface (step S38).
  • When it is determined in step S33 that a drag operation does not exist, after the processes of steps S36 and S38 are executed, or when it is determined in step S37 that no object set to be disposed on the ground surface exists, the process returns to the horizontal line slider bar display of step S32.
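  • Steps S36 and S38 both reduce to finding the image row at which a layer's depth plane crosses the inclined ground surface; the sketch below (names and conventions assumed, using the same linear ramp as the earlier inclined-surface sketch) shows that computation and its two uses:

```python
def ground_crossing_row(layer_depth, horizon_row, bottom_row,
                        back_depth, front_edge_depth=0.0):
    """Row at which the plane at layer_depth meets the linear ground ramp
    running from back_depth at horizon_row down to front_edge_depth at
    bottom_row."""
    t = (layer_depth - back_depth) / (front_edge_depth - back_depth)
    return round(horizon_row + t * (bottom_row - horizon_row))

def erase_below_ground(layer_rgba, crossing_row):
    """Step S36: make everything below the crossing row transparent."""
    layer_rgba[crossing_row:, :, 3] = 0
    return layer_rgba

def snap_to_ground(object_height, crossing_row):
    """Step S38: top row that places the object's lower end on the ground."""
    return crossing_row - object_height
```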
  • Fig. 21 illustrates an example in which the horizontal line adjustment bar 261, which is the horizontal line slider bar, is displayed with the intermediate image display 203 in editing processing in the editing screen.
  • On the horizontal line adjustment bar 261, the ground surface setting position display 262 is indicated.
  • The ground surface setting position display 262 is matched with the horizontal line c drawn in the image of the first layer; the matching is executed by a user operation.
  • By setting the horizontal line, the ground surface below the horizontal line of the image of the first layer is set as the inclined surface illustrated in Fig. 4.
  • the position of the ground surface may be displayed in the images of the layers other than the first layer (images of the second layer and the third layer).
  • A ground surface position display 271, at which the image 252 of the second layer crosses the ground surface (inclined surface), is indicated by a broken line in the intermediate image display 203 in editing processing in the editing screen.
  • With this display, it can be determined from the ground surface position whether the arrangement position of the object in the image 252 of the layer (in this example, the object d of a tree) is appropriate. That is, when the lower end of the object d of the tree is almost matched with the ground surface position display 271 illustrated by the broken line, an appropriate three-dimensional image is obtained. Meanwhile, when the lower end of the object d of the tree is above the ground surface position display 271 illustrated by the broken line, the tree appears to float.
  • The ground surface position display 271 is performed to effectively prevent the image from becoming an unnatural three-dimensional image.
  • Fig. 23 illustrates another display example of the position of the ground surface.
  • In this example, the image of the second layer is displayed, and the area below the ground surface is displayed as a non-display portion 272 (black display portion).
  • the image of the third layer of the front side of the second layer is displayed, for example, after decreasing the brightness.
  • the objects e and f in the image of the third layer may not be displayed.
  • Fig. 24 illustrates an example in which the portions of objects below the ground surface (the inclined surface) in the layers other than the first layer (the long-distance view) are erased, with the result displayed as the intermediate image display 203 in editing processing in the editing screen.
  • The process of Fig. 24 corresponds to the process in step S36 of the flowchart of Fig. 20.
  • In this example, the lower part of the object e of a dog in the third layer, the short-distance view, falls below the ground surface.
  • The object e is therefore displayed in a state in which its part below the ground surface is erased.
  • The partial erasure of the object illustrated in Fig. 24 may be executed only in the three-dimensional image display, and the corresponding object may be displayed completely when the image of each layer is displayed individually.
  • Fig. 25 illustrates an example of an operation screen for matching the lower end of an object in the image of a layer with the ground surface (the inclined surface), displayed as the intermediate image display 203 in editing processing in the editing screen.
  • a process of Fig. 25 corresponds to the process in step S38 of the flowchart of Fig. 20.
  • an operation screen with respect to the object e of the dog of the third layer that is the short-distance view is illustrated.
  • position movement buttons 281 and 282, a returning button 283, an erasure button 284, and a ground surface adjustment button 285 are displayed around the object e.
  • When the user performs an operation to select one of these buttons, the position of the object e is modified accordingly.
  • When the ground surface adjustment button 285 is selected, the editing screen generating unit 118 executes a process for automatically matching the position of the lower end of the object e with the line where the layer crosses the ground surface. The object e is thus automatically disposed on the ground surface, and an appropriate three-dimensional image can be generated.
  • Figs. 26 and 27 illustrate an example in which a camera image is taken into an intermediate image in generation processing.
  • When the camera capturing button 222 is selected, a camera capturing operation screen 810 is displayed on the editing screen.
  • a process for reading a camera image from an external camera device (or storage device where the camera image is stored) connected to the image processing apparatus 100 is executed by an operation using the camera capturing operation screen 810.
  • The camera capturing image 811 is displayed as a result of the reading.
  • An extraction image 812 where a background is removed from the camera capturing image 811 is obtained by an operation in the camera capturing operation screen 810.
  • By disposing the extraction image 812 on the image of any layer, it can be disposed as one of the objects in the intermediate image in generation processing, as illustrated in Fig. 27.
  • the depth position of the layer where the extraction image is disposed is selected by the user using the camera capturing operation screen 810.
  • the camera capturing image may be automatically disposed on the layer of the most front side (short-distance view).
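  • The specification does not state how the background is removed from the camera capturing image 811; a simple color-keying sketch (one possible method, with assumed parameters and names) would be:

```python
import numpy as np

def extract_foreground(image, bg_color, tolerance=30):
    """Return an RGBA image whose background pixels are transparent.

    image    -- HxWx3 uint8 camera capturing image
    bg_color -- assumed (r, g, b) color of the background to remove
    """
    diff = np.abs(image.astype(int) - np.array(bg_color)).sum(axis=2)
    alpha = np.where(diff > tolerance, 255, 0).astype(np.uint8)
    return np.dstack([image, alpha])    # extraction image with alpha mask
```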
  • Figs. 28 to 30 illustrate an example in which a file image is imported into an intermediate image in generation processing.
  • When an operation to select the file importing button 221 in the editing screen illustrated in Fig. 2 exists, an image file importing operation screen 820 is displayed in the editing screen, as illustrated in Fig. 28.
  • a process for reading selected image data from the image file stored in the designated place is executed by an operation using the image file importing operation screen 820.
  • The imported image 821 is displayed as a result of the reading.
  • an extraction image 822 that is partially extracted from the imported image 821 is obtained by an operation in the image file importing operation screen 820.
  • by disposing the extraction image 822 on the image of any layer, as illustrated in Fig. 30, the extraction image 822 can be disposed as one of the objects in the generated image. In this case, the depth position of the layer where the extraction image is disposed is selected by the user using the image file importing operation screen 820.
  • the three-dimensional image that is generated using the editing screen in the process described above is stored in the image storage unit 120 of the image processing apparatus 100.
  • a list of data of the stored three-dimensional images can be displayed on one screen.
  • Fig. 31 illustrates an example where a list of generated images is displayed. In this example, generated images 11, 12, 13, ... are reduced and displayed. Each image may be displayed as a two-dimensional image or as a three-dimensional image. In the columns of the generated image display, layer count displays 11a, 12a, 13a, ..., which indicate the number of layers by a figure, are shown.
  • for example, a figure where three images are overlapped is displayed when the number of layers is three.
  • thereby, the generated images can be easily selected.
  • the selected images may be displayed in the editing screen illustrated in Fig. 2 and editing work may be performed.
  • Fig. 32 illustrates an example where the image processing apparatus 100 is configured as an information processing device such as a computer device.
  • the image processing apparatus 100 mainly includes a CPU 901, a ROM 903, a RAM 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, an image capturing device 918, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • the CPU 901 functions as an operation processing device and a control device and controls all or a part of operations in the image processing apparatus 100, according to various programs stored in the ROM 903, the RAM 905, the storage device 919, and a removable recording medium 927.
  • the ROM 903 stores programs or operation parameters that are used by the CPU 901.
  • the RAM 905 temporarily stores programs used in execution by the CPU 901 and parameters that change as appropriate during the execution. These devices are connected to each other by the host bus 907, which is configured using an internal bus such as a CPU bus.
  • the host bus 907 is connected to an external bus 911 such as a peripheral component interconnect/interface (PCI) bus through the bridge 909.
  • the input device 915 is an operation unit such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever operated by the user.
  • the input device 915 may be a remote control unit (a so-called remote controller) using infrared rays or other radio waves, or an external connection apparatus 929 such as a mobile phone or a PDA that is compatible with the operation of the image processing apparatus 100.
  • the input device 915 is configured using an input control circuit that generates an input signal on the basis of information input by the user using the operation unit and outputs the input signal to the CPU 901.
  • the user of the image processing apparatus 100 can operate the input device 915 to input various data to the image processing apparatus 100 or to instruct it to execute processing operations.
  • the output device 917 is configured using a display device such as a liquid crystal display device, a plasma display device, an EL display device, and a lamp, a sound output device such as a speaker and a headphone, or a device such as a printer device, a mobile phone, and a facsimile that can visually or audibly notify the user of acquired information.
  • the output device 917 outputs the result that is obtained by various processes executed by the image processing apparatus 100. Specifically, the display device displays the result obtained by the various processes executed by the image processing apparatus 100 with a text or an image. Meanwhile, the sound output device converts an audio signal configured using reproduced sound data or acoustic data into an analog signal and outputs the analog signal.
  • the image capturing device 918 is provided on the display device and the image processing apparatus 100 can capture a still image or a moving image of the user with the image capturing device 918.
  • the image capturing device 918 includes a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor and converts light condensed by a lens into an electric signal and can capture a still image or a moving image.
  • the storage device 919 is a data storage device that is configured as an example of a storage unit of the image processing apparatus 100.
  • the storage device 919 is configured using a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores programs executed by the CPU 901, various data, and acoustic signal data or image signal data acquired from the outside.
  • the drive 921 is a reader/writer for a storage medium and is incorporated in the image processing apparatus 100 or is attached to the outside of the image processing apparatus 100.
  • the drive 921 reads information that is recorded in the mounted removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory and outputs the information to the RAM 905.
  • the drive 921 can also record information on the mounted removable recording medium 927 such as the magnetic disk, the optical disk, the magneto-optical disk, or the semiconductor memory.
  • the removable recording medium 927 is a DVD medium, a Blu-ray medium, a compact flash (registered trademark) (CompactFlash: CF), a memory stick, a secure digital (SD) memory card, or the like.
  • the removable recording medium 927 may be an integrated circuit (IC) card or an electronic apparatus where an IC chip of a non-contact type is mounted.
  • the connection port 923 is a port for directly connecting an external apparatus to the image processing apparatus 100, such as a universal serial bus (USB) port, an IEEE 1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port.
  • the communication device 925 is a communication interface that is configured by a communication device for connection with a communication network 931.
  • the communication device 925 is, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth, or wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various types of communication.
  • the communication device 925 can transmit and receive signals to and from the Internet or other communication apparatuses, based on a predetermined protocol such as TCP/IP.
  • the communication network 931 that is connected to the communication device 925 is configured by a network connected by wire or wireless.
  • the communication network 931 may be the Internet, a home LAN, infrared communication, radio wave communication, or satellite communications.
  • the above is an example of the hardware configuration that can realize the functions of the image processing apparatus 100 according to this example.
  • the various components may be configured using general-purpose members or may be configured by hardware specialized for the functions of the individual components. Therefore, the hardware configuration used may be changed as appropriate according to the technological level at the time of carrying out this embodiment.
  • a program (software) that executes each process step executed by the image processing apparatus 100 according to this example may be generated, deployed in a general-purpose computer device, and used to execute the same processes.
  • the program may be stored in various media or may be downloaded from the server side to the computer device through the Internet.
  • An image processing apparatus comprising: an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively; a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit; a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit; an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively; and an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
  • An image processing apparatus comprising: an image control unit that controls an image displayed on a display unit; a three-dimensional image generating unit that generates, from a plurality of plane images having virtual distances in a depth direction, respectively, a three-dimensional image where the space positions of objects of the plurality of plane images are set; and an input unit that receives an operation from a user, wherein the image control unit displays thumbnail images in which the plurality of plane images, and overlapped images in which the plane images are displayed in an overlapped manner at a predetermined angle in the depth direction, are reduced, and, when the input unit receives a command selecting a thumbnail image, the image control unit displays the plane image or the overlapped image corresponding to the selected thumbnail image in an editable state alongside the thumbnail images of the plurality of plane images and the overlapped images.
  • An image processing method comprising: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where space positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
  • A program that causes a computer to execute an image process, the program causing the computer to execute: generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or in an overlapped manner and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.

Abstract

The present disclosure provides a new and improved image processing apparatus that can easily generate contents having three-dimensional images. The image processing apparatus generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively. The image processing apparatus converts the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images, and obtains data of the three-dimensional image. In addition, the image processing apparatus displays an editing screen used to generate the plurality of plane images. The editing screen displays the plurality of plane images individually or in an overlapped manner and displays the plane images by providing tabs to the plane images, respectively.

Description

    IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
  • The present disclosure relates to an image processing apparatus, an image processing method, and a program and particularly, to a technology for generating a three-dimensional image.
  • In recent years, an image display apparatus that displays an image (three-dimensional image: so-called 3D image) perceived by a user as a three-dimensional image is released and begins to spread (for example, Patent Literature 1). The apparatus that can display the three-dimensional image is not limited to a television and other image display apparatuses. Of personal computers, there are types that can display a three-dimensional image.
  • Among applications running in the personal computer, there are applications that can generate contents having three-dimensional images. If the contents are generated by the applications and the user views the contents in a predetermined manner, the user can perceive images included in the contents as the three-dimensional images.
  • JP 2010-210712A
  • Summary
  • However, according to the related art, in generating contents having three-dimensional images, it is necessary to set a positional relation using dedicated software. Accordingly, it is difficult for an end user to generate such contents.
  • In light of the foregoing, it is desirable to provide an image processing apparatus, an image processing method, and a computer program which are novel or improved ones and which enable contents having three-dimensional images to be generated easily.
  • An image processing apparatus of the present disclosure comprises an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively.
    An image processing apparatus of the present disclosure comprises a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit.
    An image processing apparatus of the present disclosure comprises a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit.
    An image processing apparatus of the present disclosure comprises an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively.
    An image processing apparatus of the present disclosure comprises an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
  • An image processing method of the present disclosure comprises generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where space positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
  • A program of the present disclosure that causes a computer to execute an image process, the program causing the computer to execute: generating a plurality of plane images and setting the virtual distances of a depth direction to the plurality of generated plane images, respectively; converting the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images; outputting data of the converted three-dimensional image; displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and accepting an operation to generate or edit images in the generated editing screen.
  • According to the present disclosure, a plurality of plane images are appropriately displayed so that a user can generate and display a desired three-dimensional image through a simple operation.
  • According to the present disclosure, there are provided an image processing apparatus, an image processing method, and a computer program which are novel or improved and which enable contents having three-dimensional images to be easily generated.
  • Fig. 1 is a block diagram illustrating an example of the configuration of an image processing apparatus according to an embodiment of the present disclosure. Fig. 2 is a diagram illustrating an example of an editing screen (example in which all layers are displayed) according to the embodiment of the present disclosure. Fig. 3 is an exemplary diagram illustrating an example of an image of each layer according to the embodiment of the present disclosure. Fig. 4 is an exemplary diagram illustrating an example of a depth display screen according to the embodiment of the present disclosure. Fig. 5 is an exemplary diagram illustrating a concept of converting images of a plurality of layers into three-dimensional images. Fig. 6 is an exemplary diagram illustrating a state in which images of a plurality of layers are converted into three-dimensional images. Fig. 7 is an exemplary diagram illustrating an example in which images of a plurality of layers are converted into three-dimensional left and right channel images. Fig. 8 is an exemplary diagram illustrating an example of three-dimensional images that are generated from images of a plurality of layers. Fig. 9 is an exemplary diagram illustrating a converted state of the case in which there is a layer in which the front side of a virtual display surface is set to a virtual position. Fig. 10 is an exemplary diagram illustrating an example of a three-dimensional image of the case where there is a layer in which the front side of a virtual display surface is set to a virtual position. Fig. 11 is a flowchart illustrating an example of the flow of an editing screen display process according to the embodiment of the present disclosure. Fig. 12 is an exemplary diagram illustrating an example of an editing screen (example in which a third layer is displayed) according to the embodiment of the present disclosure. Fig. 13 is an exemplary diagram illustrating an example of an editing screen (example in which a second layer is displayed) according to the embodiment of the present disclosure. Fig. 14 is an exemplary diagram illustrating an example of an editing screen (example in which a first layer is displayed) according to the embodiment of the present disclosure. Fig. 15 is a flowchart illustrating an example of the flow of a depth screen display process according to the embodiment of the present disclosure. Fig. 16 is an exemplary diagram illustrating an example (first example) of displaying a depth screen according to the embodiment of the present disclosure. Fig. 17 is an exemplary diagram illustrating an example (second example) of displaying a depth screen according to the embodiment of the present disclosure. Fig. 18 is an exemplary diagram illustrating an example (third example) of displaying a depth screen according to the embodiment of the present disclosure. Fig. 19 is an exemplary diagram illustrating an example (fourth example) of displaying a depth screen according to the embodiment of the present disclosure. Fig. 20 is a flowchart illustrating an example of the flow of a horizontal line setting process according to the embodiment of the present disclosure. Fig. 21 is an exemplary diagram illustrating an example of a horizontal line setting screen according to the embodiment of the present disclosure. Fig. 22 is an exemplary diagram illustrating an example (first example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure. Fig. 
23 is an exemplary diagram illustrating an example (second example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure. Fig. 24 is an exemplary diagram illustrating an example (third example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure. Fig. 25 is an exemplary diagram illustrating an example (fourth example) of a display screen of a specific layer at the time of setting a horizontal line according to the embodiment of the present disclosure. Fig. 26 is an exemplary diagram illustrating an example of a display screen at the time of capturing a camera image according to the embodiment of the present disclosure. Fig. 27 is an exemplary diagram illustrating a display example of a three-dimensional image where camera images are synthesized according to the embodiment of the present disclosure. Fig. 28 is an exemplary diagram illustrating an example of a display screen at the time of importing an image file according to the embodiment of the present disclosure. Fig. 29 is an exemplary diagram illustrating an example of an importing range setting at the time of importing an image file according to the embodiment of the present disclosure. Fig. 30 is an exemplary diagram illustrating a display example of a three-dimensional image in which images imported from an image file are synthesized according to the embodiment of the present disclosure. Fig. 31 is an exemplary diagram illustrating an example of a list display screen of productions according to the embodiment of the present disclosure. Fig. 32 is a block diagram illustrating an example of detailed hardware configuration of an image processing apparatus according to the embodiment of the present disclosure.
  • Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
    Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings, in the following order.
    <1. Embodiment of the present disclosure>
    <1-1. Configuration of an image processing apparatus (Fig. 1)>
    <1-2. Outline of a three-dimensional image (Figs. 3 to 10)>
    <1-3. Display example of an editing screen (Figs. 2 and Figs. 11 to 14)>
    <1-4. Display example of a depth adjustment screen (Figs. 15 to 19)>
    <1-5. Display example of a ground surface setting screen (Figs. 20 to 25)>
    <1-6. Example where a camera image is captured (Figs. 26 and 27)>
    <1-7. Example where an image file is imported (Figs. 28 to 30)>
    <1-8. Display example of a generated image list (Fig. 31)>
    <1-9. Specific example of hardware configuration (Fig. 32)>
  • <1. Embodiment of the present disclosure>
    <1-1. The functional configuration of an image processing apparatus>
    First, the configuration of an image processing apparatus according to an embodiment (hereinafter, called "this example") of the present disclosure will be described. Fig. 1 is a diagram illustrating the configuration of an image processing apparatus 100 according to this example.
  • An image processing apparatus 100 according to this example illustrated in Fig. 1 is configured to generate an image by a user operation and display and store the generated image. As illustrated in Fig. 1, the image processing apparatus 100 includes an image generating/processing unit 110, an image storage unit 120, an input unit 130, and an image display unit 140.
  • The image generating/processing unit 110 provides the user with an image generation screen through the image display unit 140 or generates a three-dimensional image from the image generated by the user. As illustrated in Fig. 1, the image generating/processing unit 110 that is included in the image processing apparatus 100 according to this example includes an image generating unit 112, a three-dimensional image converting unit 114, a three-dimensional image generating unit 116, and an editing screen generating unit 118. The editing screen generating unit 118 also functions as an image control unit that controls image generation in each unit. Alternatively, the image generating/processing unit 110 may include an image control unit that controls image generation in each unit, separately from the editing screen generating unit 118.
  • The image generating/processing unit 110 generates a plurality of plane images (for example, three plane images) on the basis of a user operation in a state in which an editing screen for image generation is displayed on the image display unit 140 and generates a three-dimensional image from the plurality of generated plane images (two-dimensional images). The editing screen for the image generation is a screen illustrated in Fig. 2 and displays an intermediate image in generation processing at the center and displays operation buttons or tabs around the image. The editing screen illustrated in Fig. 2 will be described below in detail. Information of the depth from the reference position (for example, virtual display surface) is set to each of the plurality of generated plane images and the three-dimensional image is generated on the basis of the depth.
    In addition, the image generating/processing unit 110 supplies the image data of the generated three-dimensional image to the image display unit 140, and the image display unit 140 displays the three-dimensional image. The user views the three-dimensional image using a predetermined method (for example, wearing time-division driven shutter glasses) and perceives the image displayed on the image display unit 140 as a three-dimensional image.
  • The image generating unit 112 displays the editing screen for the image generation on the image display unit 140 and generates an image by the user operation. If the image generating unit 112 generates images including a plurality of layers using the image generation screen provided by the image generating unit 112, the images that include the plurality of layers are converted into a three-dimensional image by the three-dimensional image converting unit 114 and the three-dimensional image generating unit 116. The images including the plurality of layers that are generated by the image generating unit 112 are stored in the image storage unit 120 according to the user operation.
  • The three-dimensional image converting unit 114 executes a conversion process for displaying the images including the plurality of layers transmitted from the image generating unit 112 as the three-dimensional image on the image display unit 140. The image processing apparatus 100 according to this example previously assumes the distance between eyes of the user and the distance between the user and the display surface and executes a conversion process for displaying the image as the three-dimensional image on the image display unit 140, on the basis of the virtual distance between the layers (information of the depth of each layer of the images). Specifically, in order for the three-dimensional image to be generated by performing a coordinate conversion on the images including the plurality of layers, the three-dimensional image converting unit 114 executes a coordinate conversion process with respect to the images including the plurality of layers.
  • At the time of the conversion process in the three-dimensional image converting unit 114, if the user adjusts the depth of each layer of the images while the three-dimensional image is displayed on the image display unit 140 and changes a depth state, the three-dimensional image converting unit 114 executes the conversion process in real time according to the change. Thereby, the user can adjust the depth of each layer of the images and confirm the three-dimensional image after the adjustment through the display on the editing screen in real time. An example of a process for adjusting the depth of each layer of the images will be described in detail below.
  • When the image processing apparatus 100 generates the three-dimensional image from the planar images including the plurality of layers generated by the user, the image processing apparatus 100 executes preview display of the three-dimensional image. Through the preview display, the user can grasp in advance how the images will be seen in three dimensions before storing the generated image as the three-dimensional image.
  • The three-dimensional image generating unit 116 generates the three-dimensional image from the images including the plurality of layers, on the basis of the conversion process executed by the three-dimensional image converting unit 114. The three-dimensional image generated by the three-dimensional image generating unit 116 is displayed on the image display unit 140 and is stored in the image storage unit 120 according to the operation of the input unit 130 from the user.
  • The editing screen generating unit 118 generates display data of the editing screen, on the basis of a reception state of an input operation in the input unit 130. The editing screen generating unit 118 supplies the display data generated by the editing screen generating unit 118 to the image display unit 140 and displays the editing screen.
  • The image storage unit 120 stores the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers. The images stored in the image storage unit 120 are read from the image storage unit 120 according to the operation of the input unit 130 by the user, are processed by the image generating/processing unit 110, and are displayed by the image display unit 140.
  • The input unit 130 includes various input devices with which the user performs input operations on the image processing apparatus 100, for example, a keyboard, a mouse, a graphic tablet, and a touch panel. The user can generate the images including the plurality of layers by operating the input unit 130 and can adjust the depth of each layer of the images when the images are converted into the three-dimensional image.
  • The image display unit 140 is a display that displays an image. For example, the image display unit 140 displays the images including the plurality of layers generated by the image generating/processing unit 110 or the three-dimensional image generated by converting the images including the plurality of layers. The image display unit 140 displays a screen to allow the images to be generated by the user of the image processing apparatus 100. An example of the display screen will be described below. A touch panel may be disposed on an image display surface of the image display unit 140 and the user may directly operate buttons in a displayed image. The touch panel that is included in the image display unit 140 functions as a part of the input unit 130. The image display unit 140 may be a device that is separated from the image processing apparatus 100.
  • The image display unit 140 may be configured using a display device that can display the three-dimensional image. The method that displays the three-dimensional image is not limited to a specific display method. For example, as a method that displays the three-dimensional image, a method that switches an image for a right eye and an image for a left eye at a high speed is known. As methods that transmit the three-dimensional image to the image display unit 140, a frame sequential method, a side-by-side method, a top-and-bottom method, and the like are known.
  • The images that are generated by the image generating/processing unit 110 may be output to the television receiver and other display devices that are connected to the image processing apparatus 100 and can display the three-dimensional image.
  • <1-2. Outline of generation of a three-dimensional image>
    Next, an outline of generation of the three-dimensional image by the image processing apparatus 100 according to the embodiment of the present disclosure will be described with reference to Figs. 3 to 10.
    First, as illustrated in Fig. 3, the image processing apparatus 100 according to this example has plane images 251, 252, and 253 including three layers of a long-distance view, a middle-distance view, and a short-distance view generated by the user. The image processing apparatus 100 synthesizes the plane images 251, 252, and 253 including the three layers of the long-distance view, the middle-distance view, and the short-distance view and the plane images are converted into a three-dimensional image by the three-dimensional image converting unit 114. As such, by converting the images including the three layers into the three-dimensional image, the user can generate the three-dimensional image without executing a complicated imaging process.
  • In an example of Fig. 3, objects a, b, c, and g respectively corresponding to a mountain, the sun, a road, and a horizontal line are drawn in the plane image 251 of the long-distance view, an object d of a tree is drawn in the plane image 252 of the middle-distance view, and objects e and f each corresponding to a dog and an insect are drawn in the plane image 253 of the short-distance view.
    The number of layers, three here, is exemplary; any plural number of layers may be used. A sketch of one possible data model for such layers is given below.
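  • Purely as an illustration of the structure just described, the sketch below models each plane image as a layer carrying a virtual depth. The names Layer, depth, and objects, and the convention that 0 is the virtual display surface with negative values on the inner side (matching the depth bar scale described later), are assumptions, not part of the disclosure.

```python
# A minimal sketch, not from the disclosure: one possible data model for
# the layered plane images. Depth convention assumed here: 0 = virtual
# display surface, negative = inner (long-distance) side, positive = front.

from dataclasses import dataclass, field

@dataclass
class Layer:
    name: str
    depth: float                                   # virtual distance in the depth direction
    objects: list = field(default_factory=list)    # objects drawn on this layer

scene = [
    Layer("long-distance view", depth=-100.0,
          objects=["mountain a", "sun b", "road c", "horizontal line g"]),
    Layer("middle-distance view", depth=-50.0, objects=["tree d"]),
    Layer("short-distance view", depth=10.0, objects=["dog e", "insect f"]),
]

# Render back to front so that nearer layers overpaint farther ones.
for layer in sorted(scene, key=lambda l: l.depth):
    print(f"{layer.name} at depth {layer.depth}: {layer.objects}")
```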
  • The depth, expressed as the distance from the reference position, is set to the image of each layer. Fig. 4 is a diagram illustrating a setting situation of the depth. The setting state illustrated in Fig. 4 is reduced and displayed in the 3D state thumbnail tab 202 of the editing screen described below. When the 3D state thumbnail tab 202 is selected, the setting state illustrated in Fig. 4 is enlarged and displayed on the editing screen.
  • The setting state of the depth illustrated in Fig. 4 will be described. A depth axis 301 is set in a direction orthogonal to the display screen and the depth positions of the layer images 311, 312, and 313 are set on the depth axis 301. In the example of Fig. 4, the depth axis 301 is drawn as an oblique axis with a predetermined angle. The image frame drawn with a broken line in Fig. 4 represents a virtual display surface 304 at the position of the image display surface of the display. The virtual display surface 304 lies at the depth position 301a on the depth axis 301. The position of the most front side of the depth axis 301 is shown as a front edge portion 302.
  • In the example of Fig. 4, the first layer image 311 of the long-distance view exists at the innermost depth position 301b on the depth axis 301 and the second layer image 312 of the middle-distance view exists at the depth position 301c near the center of the depth axis 301. The third layer image 313 of the short-distance view exists at the depth position 301d on the front side of the virtual display surface 304 on the depth axis 301. In this example, the layer image 313 of the short-distance view exists at a depth position on the front side of the virtual display surface 304. However, the layer image 313 of the short-distance view may instead be set deeper than the virtual display surface 304 (closer to the long-distance view).
    When the three layer images are prepared, the first layer image 311 may be automatically set, in an initial state, to a predetermined depth position suitable for the long-distance view. Likewise, the second layer image 312 may be automatically set to a predetermined depth position suitable for the middle-distance view and the third layer image 313 to a predetermined depth position suitable for the short-distance view.
  • As illustrated in Fig. 4, the three-dimensional image is generated such that the objects a to g in the layer images 311, 312, and 313 are disposed at the depth positions of the layer images in a virtual three-dimensional space.
    However, when the position of the horizontal line c is set to the horizontal line position 321 with respect to the layer image 311 of the long-distance view, the image portion below the horizontal line position 321 of the layer image 311 of the long-distance view is disposed in the three-dimensional space as an inclined surface protruding toward the front side. That is, the image portion below the horizontal line c becomes an inclined surface 303 that gradually protrudes toward the front side, from the depth position 301b of the layer image 311 to the front edge portion 302, as the image portion progresses downward. In the example of Fig. 4, the object g of the road that is drawn below the horizontal line position 321 in the layer image 311 is disposed on the inclined surface 303.
    In the example of Fig. 4, the inclined surface 303 protrudes to the front edge portion 302 of the front side of the third layer image 313. However, the inclined surface 303 may protrude to the virtual display surface 304. In this case, the layer (third layer image 313) of the most front side protrudes to the front side of a terminating point of the front side of the inclined surface 303.
  • As such, when the horizontal line is set with respect to the first layer image 311 of the long-distance view and the objects in the other layer images 312 and 313 are set to match the horizontal line, each object is automatically and appropriately disposed on the inclined surface 303, or a process for erasing inappropriate portions of the objects is executed. A specific example of the setting of the horizontal line and the associated processes will be described below; a schematic sketch of the inclined-surface interpolation is also given below.
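  • A minimal sketch of the inclined surface 303, under the assumption (not stated in the disclosure) that the ground depth is linearly interpolated between the long-distance layer at the horizontal line and the front edge portion. The function name ground_depth and all numeric values are hypothetical.

```python
# Hedged sketch: the inclined ground surface treated as a linear
# interpolation of depth between the horizontal line (at the long-distance
# layer) and the bottom of the frame (at the front edge portion). The
# linear form and all names/values are assumptions for illustration.

def ground_depth(y: float, horizon_y: float, bottom_y: float,
                 far_depth: float, front_depth: float) -> float:
    """Depth of the ground at screen row y (horizon_y <= y <= bottom_y).

    At the horizontal line the ground lies at far_depth (the long-distance
    layer); at the bottom of the frame it reaches front_depth (the front
    edge portion). y grows downward, as in screen coordinates."""
    t = (y - horizon_y) / (bottom_y - horizon_y)   # 0 at the horizon, 1 at the bottom
    return far_depth + t * (front_depth - far_depth)

# An object drawn below the horizontal line recedes smoothly into the distance:
for y in (200, 320, 440):
    print(y, ground_depth(y, horizon_y=200, bottom_y=440,
                          far_depth=-100.0, front_depth=10.0))
# -> 200: -100.0, 320: -45.0, 440: 10.0
```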
  • Next, an example of a process for converting the images into the three-dimensional image in the three-dimensional image converting unit 114 will be described.
    Fig. 5 is a diagram illustrating a concept of converting normal two-dimensional images including a plurality of layers into a three-dimensional image. Fig. 5 illustrates an aspect where two-dimensional images including a plurality of layers (plane images 251, 252, and 253) are converted into an image 250R for a right eye which the user views using the right eye and an image 250L for a left eye which the user views using the left eye. However, in Fig. 5, the process of the horizontal line described above is not illustrated. The three-dimensional image converting unit 114 calculates the drawing positions of the image for the right eye and the image for the left eye to generate the image 250R for the right eye and the image 250L for the left eye from the two-dimensional images.
  • Next, an example of a specific method to calculate the drawing positions of the image for the right eye and the image for the left eye will be described.
  • Figs. 6 to 8 are exemplary diagrams illustrating an example where normal two-dimensional images including a plurality of layers are converted into a three-dimensional image. Fig. 6 illustrates the coordinate conversion by which an image for a right eye and an image for a left eye are generated from the two-dimensional images including the three layers illustrated in Fig. 7, in the case where the three layers are deeper than the display surface, as illustrated in Fig. 8. Fig. 6 schematically illustrates a state in which each layer and the display surface are viewed from above.
  • As illustrated in Fig. 6, a distance E between the left eye and the right eye and a virtual viewing distance L are assumed in advance. The three-dimensional image converting unit 114 executes a projection coordinate conversion for the image for the right eye and a projection coordinate conversion for the image for the left eye, using the layer depths D1, D2, and D3 between the virtual display surface 259 and the layers. With the projection coordinate conversion, the pixel positions of the objects of each layer image on the virtual display surface 259 are calculated.
  • As such, the three-dimensional image converting unit 114 executes the projection coordinate conversion with respect to the virtual display surface 259 and the image processing apparatus 100 can convert the normal two-dimensional images including the plurality of layers into the three-dimensional image.
  • The examples of Figs. 6 to 8 are examples where each layer is set to the depth deeper than the depth of the virtual display surface 259. However, each layer may be set to the depth of the front side of the virtual display surface 259.
    Figs. 9 and 10 illustrate an example where two-dimensional images are converted into a three-dimensional image when the image of the layer of the short-distance view is disposed on the front side of a virtual display surface 259'. Fig. 9 schematically illustrates, as viewed from above, the coordinate conversion when an image for a right eye and an image for a left eye are generated from the two-dimensional images including the three layers illustrated in Fig. 10.
  • As illustrated in Fig. 9, for objects of a layer on the front side of the virtual display surface 259', the pixel positions at which the objects are projected onto the virtual display surface 259' for the right eye and the left eye are reversed left-and-right relative to the pixel positions for a layer deeper than the virtual display surface 259'.
    In this case, as illustrated in Fig. 10, a three-dimensional image is obtained in which the objects e and f disposed on the layer of the short-distance view appear to protrude to the front side of the display surface.
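  • The projection described with Figs. 6 and 9 can be made concrete with a small amount of geometry. In the hedged sketch below, the eyes are E apart at viewing distance L from the virtual display surface, and a point at depth_behind (positive = deeper than the surface, negative = in front of it) is projected onto the surface for each eye; the single formula then reproduces the left-right reversal for front-side layers noted above. The function names and the numeric values of E and L are assumptions.

```python
# Hedged sketch of the projection coordinate conversion. Eyes sit at
# x = -E/2 and x = +E/2, a distance L in front of the virtual display
# surface; a point at horizontal position x on a layer depth_behind the
# surface (negative = in front of it) is projected onto the surface by
# intersecting the eye-to-point ray with the surface. Names and the
# values of E and L are assumptions for illustration.

def project_to_display(x: float, depth_behind: float,
                       eye_x: float, L: float) -> float:
    """Intersection on the display surface of the ray from the eye at
    (eye_x, distance L in front of the surface) to (x, depth_behind)."""
    return eye_x + (x - eye_x) * L / (L + depth_behind)

def stereo_pair(x: float, depth_behind: float,
                E: float = 6.5, L: float = 60.0):
    left = project_to_display(x, depth_behind, eye_x=-E / 2, L=L)
    right = project_to_display(x, depth_behind, eye_x=+E / 2, L=L)
    return left, right

# Behind the surface: the left-eye pixel lies left of the right-eye pixel.
print(stereo_pair(x=0.0, depth_behind=30.0))    # approx (-1.08, 1.08)
# In front of the surface: the pixel positions are reversed left and
# right, so the point appears to protrude toward the viewer.
print(stereo_pair(x=0.0, depth_behind=-20.0))   # approx (1.63, -1.63)
```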
  • <1-3. Display example of the editing screen>
    Next, an image generation process using the editing screen that is needed to generate the three-dimensional image will be described.
    The editing screen is the screen illustrated in Fig. 2 that is displayed on the image display unit 140, on the basis of the process in the editing screen generating unit 118.
    A flowchart of Fig. 11 illustrates a flow of the process in the editing screen generating unit 118. The description will be given with reference to Fig. 11. First, it is determined whether the input unit 130 receives a user operation to display the editing screen (step S11). When the user operation is not received, a waiting state is maintained, and when the user operation to display the editing screen is received, the process for displaying the 3D editing screen illustrated in Fig. 2 is executed (step S12). In this state, in the 3D editing screen, all of the layers are overlapped and displayed as the intermediate image in generation processing. In addition, layer thumbnails corresponding to the number of layers are displayed.
  • In addition, it is determined whether the input unit 130 receives an operation to select any one of the displayed layer thumbnails (step S13). When the operation is not received, a waiting state is maintained and the image of the editing screen is not changed.
    When the operation to select any one of the layer thumbnails is received, the image of the layer that corresponds to the selected layer thumbnail is displayed on the editing screen (step S14). In this display of an individual layer, the images of the other layers are displayed simultaneously with decreased display brightness, so that the three-dimensional image as a whole can still be recognized.
  • In a state in which the image of the specific layer is displayed in step S14, it is determined whether an operation to change the layer to another layer by selecting the layer thumbnail exists (step S15). In this case, when it is determined that the operation to change the layer to another layer exists, a change process of the layer that is displayed on the editing screen is executed (step S16), the process returns to step S14, and an image of the layer after the change is displayed.
    When it is determined that the operation to change the layer to another layer does not exist in step S15, it is determined whether an operation to change the display to the overlapped display of all of the layers exists (step S17). In this case, when it is determined that the operation to change the display to the overlapped display of all of the layers exists, the display is changed to the overlapped display of all of the layers and the process returns to the 3D editing screen display in step S12.
  • In step S17, when it is determined that the operation to change the display to the overlapped display of all of the layers does not exist, the screen display of the selected layer in step S14 is continuously executed.
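  • Read as a whole, steps S11 to S17 form a small state machine over the display state of the editing screen. The sketch below is only a schematic restatement of the flowchart; the event names and states are hypothetical.

```python
# Hedged sketch: the editing-screen flow of steps S11 to S17 as a tiny
# state machine. States and event names are hypothetical.

def editing_screen(events):
    state = "ALL_LAYERS"                            # S12: all layers overlapped
    for event, arg in events:
        if event == "select_layer_thumbnail":       # S13, S15-S16
            state = f"LAYER_{arg}"                  # S14: show the selected layer
        elif event == "show_all_layers":            # S17
            state = "ALL_LAYERS"                    # back to the S12 display
        # any other event leaves the display unchanged (S18 not modeled)
        print(event, arg, "->", state)

editing_screen([("select_layer_thumbnail", 3),
                ("select_layer_thumbnail", 2),
                ("show_all_layers", None)])
```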
  • Next, a specific display example of the editing screen will be described with reference to Figs. 2 and 12 to 14. Fig. 2 illustrates an example of an editing screen in a state in which all layers of the intermediate image in generation processing are overlapped and displayed. In the editing screen, the editing image display 203 is shown as a relatively large image at the center. In the example of Fig. 2, an image where the images 251 to 253 of the three layers illustrated in Fig. 3 are overlapped is shown as the editing image display 203. In order to indicate that the image includes the images 251, 252, and 253 of the three layers, three layer edge portions, a first layer edge portion 203a, a second layer edge portion 203b, and a third layer edge portion 203c, are shown overlapped along the upper edge of the editing image display 203. Of the edge portions 203a to 203c of the three images, the edge portion (for example, edge portion 203a) of the image that corresponds to the selected tab is displayed on the front side and the edge portions of the other images are displayed on the inner side.
  • In addition, plural tabs 201, 202, and 211 to 213 to select a display image are disposed on the upper side of the editing image display 203 in the editing screen. The tabs 201, 202, and 211 to 213 display the images allocated to them when a user operation to select the corresponding tab display place, by a touch panel operation or the like, exists.
  • The image that is allocated to each tab will be described. The layer thumbnail tabs 211, 212, and 213 are tabs to individually display the layers. Specifically, the first layer thumbnail tab 211 is a tab that displays a first layer image, the second layer thumbnail tab 212 is a tab that displays a second layer image, and the third layer thumbnail tab 213 is a tab that displays a third layer image. In the thumbnail tabs 211 to 213, images of the layers are reduced and displayed as the thumbnail images. Therefore, when the image of each layer is modified, the thumbnail image is also modified in the same way.
  • The tab 201 that is displayed adjacent to the layer thumbnail tabs 211, 212, and 213 is a tab that adds the layer image. That is, when the tab 201 is selected, a layer image is newly added and an editing screen of the newly added layer image is displayed.
  • In addition, the 3D state thumbnail tab 202, which is disposed toward the right of the upper side of the editing image display 203, is a tab that displays the depth state of the image of each layer. When the 3D state thumbnail tab 202 is selected, the display screen of the depth state illustrated in Fig. 4 is enlarged and displayed in the place of the editing image display 203. In the 3D state thumbnail tab 202 itself, the display screen of the depth state illustrated in Fig. 4 is reduced and displayed. When the depth state is adjusted, the content of this reduced display changes according to the adjustment.
  • A peripheral portion of the editing image display 203 of the editing screen illustrated in Fig. 2 will be described. On the left end of the editing image display 203, a plurality of buttons 221 to 225 are disposed. In this example, a file importing button 221, a camera capturing button 222, a stamp button 223, a character input button 224, and a depth operation button 225 are prepared.
  • If a user operation to select the file importing button 221 exists, a process for importing a file image that is prepared in the image storage unit 120 or an external memory starts.
    If a user operation to select the camera capturing button 222 exists, a process for capturing image data from a camera device connected to the apparatus starts.
    A user operation of the stamp button 223 or the character input button 224 starts a process for inputting prepared figures or characters, respectively.
    If a user operation to select the depth operation button 225 exists, a depth adjustment screen is displayed on the editing image display 203. A display process of the depth adjustment screen will be described below.
  • As illustrated in Fig. 2, a generation start button 241, a save button 242, and a three-dimensional display button 243 are displayed on the right end of the editing image display 203. The generation start button 241 is a button to instruct to generate a new three-dimensional image. The save button 242 is a button to instruct to store an intermediate image in generation processing in the image storage unit 120. The three-dimensional display button 243 is a button to switch three-dimensional display and normal display (2D display) of the image displayed on the editing image display 203.
    On the lower side of the right end of the editing image display 203, a pen tool 290 is displayed. The pen tool 290 of this example displays a first pen 291, a second pen 292, a third pen 293, and an eraser 294. If a user operation to select the display place of one of the pens 291, 292, and 293 exists, a line can be drawn with the color or line type allocated to that pen. If a user operation to select the display place of the eraser 294 exists, the drawn line is erased. The pen tool 290 may have functions supporting other drawing or erasure operations.
  • As illustrated in Fig. 2, object display units 231 to 237 are provided on the lower end of the editing image display 203. In the plurality of object display units 231 to 237, for example, the seven most recently used objects are displayed. In the example of Fig. 2, each object of the intermediate image in generation processing is shown.
    Fig. 2 illustrates an example where images of the individual layers are overlapped and displayed in the editing image display 203. Even when the other images are displayed in the editing image display 203, the same display is performed around the editing image display 203. In the editing screen of Fig. 2, a tab that selects overlapped display where the image of each layer illustrated in Fig. 2 is overlapped is not provided in particular. However, the tab that selects the overlapped display may be prepared separately from the tab for the image of each layer. When the tab to select the overlapped display where the image of each layer is overlapped is provided, a thumbnail image where an overlapped image of the image of each layer is reduced and displayed is displayed on the tab. Therefore, when the image of each layer is modified, the thumbnail image where the overlapped image is reduced and displayed is modified according to the change.
  • Fig. 12 illustrates an example where the image of the third layer (image 253 of the short-distance view of Fig. 3) is displayed in the editing image display 203. The image display of the third layer is performed by selecting the third layer thumbnail tab 213 by a user operation.
    When the image 253 of the third layer is displayed, the objects e and f in the image of the third layer are displayed with set colors or brightness. The objects of the images of the other layers are overlapped and displayed after decreasing the display brightness. That is, only the image of the third layer is highlighted and displayed and the images of the other layers are grayed out and displayed.
    By adding objects to the image in the editing image display 203 or by drawing in the state illustrated in Fig. 12, the user generates or edits the image of the third layer.
  • Fig. 13 illustrates an example where the image of the second layer (image 252 of the middle-distance view of Fig. 3) is displayed in the editing image display 203. The image display of the second layer is performed by selecting the second layer thumbnail tab 212 by a user operation.
    When the image 252 of the second layer is displayed, the object d in the image of the second layer is displayed with set colors or brightness. The objects of the images of the other layers are overlapped and displayed after decreasing the display brightness.
    By adding objects to the image in the editing image display 203 or by drawing in the state illustrated in Fig. 13, the user generates or edits the image of the second layer.
  • Fig. 14 illustrates an example where the image of the first layer (image 251 of the long-distance view of Fig. 3) is displayed in the editing image display 203. The image display of the first layer is performed by selecting the first layer thumbnail tab 211 by a user operation.
    When the image 251 of the first layer is displayed, the objects a, c, and g in the image of the first layer are displayed with set colors or brightness. At the time of the display, the objects of the images of the other layers are overlapped and displayed after decreasing the display brightness.
    By adding objects to the image in the editing image display 203 or by drawing in the state illustrated in Fig. 14, the user generates or edits the image of the first layer.
    In the display examples of Figs. 12 to 14, the image of the selected layer is highlighted and the images of the other layers are grayed out. Alternatively, as the editing image, only the objects of the image of the selected layer may be displayed, and the objects of the images of the other layers may be hidden. A sketch of such dimmed compositing is given below.
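  • A minimal sketch of the highlighted/grayed-out display just described, assuming a fixed dimming factor and treating pure black as transparent for a crude overpaint; the names, the factor, and that transparency convention are all assumptions, not taken from the disclosure.

```python
# Hedged sketch: compositing the editing view so that only the selected
# layer keeps full brightness. The dimming factor and the "pure black is
# transparent" overpaint rule are assumptions for illustration.

def dim(pixel, factor=0.3):
    r, g, b = pixel
    return (int(r * factor), int(g * factor), int(b * factor))

def compose_editing_view(layers, selected_index):
    """layers: far-to-near list of equally sized lists of (r, g, b) pixels."""
    view = None
    for i, layer in enumerate(layers):
        shown = layer if i == selected_index else [dim(p) for p in layer]
        if view is None:
            view = list(shown)
        else:
            # crude overpaint: pure black counts as "nothing drawn here"
            view = [s if s != (0, 0, 0) else v for v, s in zip(view, shown)]
    return view

first = [(200, 200, 255), (0, 0, 0)]     # far layer: sky, empty
second = [(0, 0, 0), (40, 160, 40)]      # near layer: empty, tree
print(compose_editing_view([first, second], selected_index=1))
# -> [(60, 60, 76), (40, 160, 40)]: far layer dimmed, near layer at full brightness
```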
  • <1-4. Display example of a depth adjustment screen>
    Next, a flow of a depth adjustment process of an image of each layer will be described with reference to a flowchart of Fig. 15.
A depth adjustment process is started, for example, by a user operation to select the depth operation button 225 of the editing screen illustrated in Fig. 2. That is, as illustrated in Fig. 15, the editing screen generating unit 118 determines whether an operation to select the depth operation button 225 exists in a state in which the editing screen is displayed (step S21). When no such operation exists, the process waits until one occurs.
• When it is determined in step S21 that the operation to select the depth operation button 225 exists, the overlapped display of the images of all layers is performed in the editing image display 203, and a depth bar is displayed on the upper side of the editing image display 203 (step S22). The depth bar is a scale that illustrates the depth of each layer. As in this example, when three layers are used, the depth positions of the three layers are shown on the depth bar. On the lower side of the editing image display 203, depth adjustment buttons are displayed. A specific display example will be described below; in this example, for each layer, an adjustment button to move the depth of the image to the inner side and an adjustment button to move it to the front side are prepared and displayed.
• The editing screen generating unit 118 determines whether a user operation of any adjustment button exists (step S23). When no adjustment button is operated, the display of step S22 is continued. When an adjustment button is operated, the image of the layer corresponding to the operated button is displayed in the editing image display 203 (step S24). The depth position set to the image of that layer is changed according to the operation of the adjustment button, and the depth position shown on the depth bar is moved to the corresponding position (step S25). After the setting and the display are changed in step S25, the process returns to the display of step S22. The editing image display 203 may, however, show only the operated layer until a next operation occurs.
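The flow of steps S21 to S25 can be summarized as a small event loop. The sketch below is a hedged paraphrase in Python; the class, method, and event names are hypothetical, and only the state transitions follow the flowchart.

    class DepthAdjuster:
        def __init__(self, layer_depths):
            self.layer_depths = list(layer_depths)  # one depth value per layer
            self.active = False                     # S21: waiting for button 225

        def on_depth_button(self):
            # S21 -> S22: show the overlapped view with the depth bar.
            self.active = True
            self.show_depth_bar()

        def on_adjust(self, layer, direction):
            # S23 -> S24/S25: change the layer's set depth, then redraw.
            if not self.active:
                return
            self.layer_depths[layer] += 1 if direction == "front" else -1
            self.show_depth_bar()                   # back to the S22 display

        def show_depth_bar(self):
            print("depth positions:", self.layer_depths)

    adjuster = DepthAdjuster([-80, -40, -10])
    adjuster.on_depth_button()
    adjuster.on_adjust(1, "inner")   # the second layer moves one step inward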
  • Fig. 16 is a diagram illustrating a display example of the depth bar.
When the depth operation button 225 of the editing screen is operated, as illustrated in Fig. 16, the overlapped display of the images of all layers is performed in the editing image display 203, and the depth bar 401 is displayed on the upper side of the editing image display 203.
In this example, the depth positions of the images of the three layers are illustrated on one depth bar 401. That is, on the depth bar 401, the depth position 401a of the image of the first layer, the depth position 401b of the image of the second layer, and the depth position 401c of the image of the third layer are distinguished by display color.
On the scales given to the depth bar 401, "0" indicates the depth position of the virtual display surface; the inner side is indicated by minus values and the front side by plus values (plus display is not illustrated).
  • On the lower side of the editing image display 203, buttons that adjust the depth position are displayed for the image of each layer. Specifically, a depth adjustment button 411 that moves the depth to the front side and a depth adjustment button 412 that moves the depth to the inner side are displayed as adjustment buttons for the image of the first layer. A depth adjustment button 421 that moves the depth to the front side and a depth adjustment button 422 that moves the depth to the inner side are displayed as adjustment buttons for the image of the second layer. A depth adjustment button 431 that moves the depth to the front side and a depth adjustment button 432 that moves the depth to the inner side are displayed as adjustment buttons for the image of the third layer.
• When the user operates the depth adjustment buttons 411 to 432 at their display places, the depth setting of the image of the corresponding layer is changed. For example, the depth position of the image of the first layer is changed by operating the depth adjustment buttons 411 and 412, and the display of the depth position 401a of the image of the first layer on the depth bar 401 moves as illustrated by an arrow La.
In addition, the depth position of the image of the second layer is changed by operating the depth adjustment buttons 421 and 422, and the display of the depth position 401b of the image of the second layer on the depth bar 401 moves as illustrated by an arrow Lb.
Likewise, the depth position of the image of the third layer is changed by operating the depth adjustment buttons 431 and 432, and the display of the depth position 401c of the image of the third layer on the depth bar 401 moves as illustrated by an arrow Lc.
• The adjustment by the depth adjustment buttons 411 to 432 is limited so that each layer can be moved only up to the depth positions of the images of the adjacent layers. For example, the range of the movement La of the image of the first layer extends from the deepest position to the position adjacent to the depth position of the image of the adjacent second layer.
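A sketch of this range limit, assuming the scale convention described above (0 at the virtual display surface, negative values toward the inner side) and an illustrative overall depth range:

    def clamp_depth(depths, i, proposed, min_gap=1, depth_min=-100, depth_max=25):
        # depths are ordered from the first (innermost) layer to the frontmost;
        # depth_min/depth_max stand in for the apparatus's full depth range.
        lower = depths[i - 1] + min_gap if i > 0 else depth_min
        upper = depths[i + 1] - min_gap if i < len(depths) - 1 else depth_max
        return max(lower, min(proposed, upper))

    depths = [-80, -40, -10]
    print(clamp_depth(depths, 1, -85))   # -> -79, stopped next to the first layer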
  • Fig. 17 illustrates another display example of the depth bar.
In this example, the image of the selected layer (here, the image of the third layer) is highlighted in the editing image display 203. On the depth bar 501, only the depth position 502 of the highlighted layer is illustrated. In the example of Fig. 17, the depth bar 501 covers the entire depth range set by the image processing apparatus 100, and a position display 503 of the virtual display surface is illustrated.
In the example of Fig. 17, only the depth adjustment buttons 511 and 512 for the image of the third layer, which is the selected layer, are displayed. The depth position of the image of the highlighted layer is adjusted by operating the depth adjustment buttons 511 and 512, and the display position of the depth position 502 on the depth bar 501 changes as illustrated by an arrow Ld.
  • Fig. 18 illustrates another display example of the depth bar.
In the example of Fig. 18, similar to the example of Fig. 17, the image of the selected layer (here, the image of the second layer) is highlighted in the editing image display 203. On the depth bar 601, only the range in which the depth of that layer's image can be adjusted is displayed with scales, and a depth position 602 is displayed on the bar. That is, for the depth of the image of the second layer, the inner side is limited by the depth position of the image of the first layer and the front side is limited by the depth position of the image of the third layer. This limited adjustment range becomes the scale display range of the depth bar 601.
Therefore, when the depth position is adjusted by operating the depth adjustment buttons 611 and 612, the depth position 602 moves within the range of the displayed depth bar 601, as illustrated by an arrow Le.
  • Fig. 19 illustrates another display example of the depth bar.
In the example of Fig. 19, the overlapped display of the images of all layers is performed in the editing image display 203, and the depth position 702 of the image of the layer whose depth is being adjusted is illustrated on a depth bar 701. In addition, a position 703 of the virtual display surface is indicated.
Similar to the example of Fig. 16, depth adjustment buttons 711, 712, 721, 722, 731, and 732 are provided for each layer, and the depth position 702 changes as the depth is adjusted by the button operations, as illustrated by an arrow Lf.
• When the depth of a specific layer is adjusted by operating the depth adjustment buttons 711, 712, 721, 722, 731, and 732, the image of the layer subject to the depth adjustment is indicated in the editing image display 203 by an image frame 704 displayed at its four corners.
Near the center of the image, a numerical value display 705 (in this example, "-25") indicates the set depth position.
In this way, a display from which the depth setting can be recognized may be performed on the display image.
  • <1-5. Display example of a ground surface setting screen>
    Next, a flow of a ground surface setting process of an image will be described with reference to a flowchart of Fig. 20.
The ground surface setting process starts when the user operates a button (not illustrated in the drawings) to instruct setting of the ground surface in the editing screen illustrated in Fig. 2. That is, as illustrated in Fig. 20, the editing screen generating unit 118 determines whether an operation to instruct setting of the ground surface exists in a state in which the editing screen is displayed (step S31). When no such operation exists, the process waits until one occurs.
• When it is determined in step S31 that the ground surface setting operation exists, the overlapped display of the images of all layers is performed in the editing image display 203, and a slider bar for horizontal line adjustment is displayed vertically at one end of the editing image display 203 (step S32). A slider handle that indicates the position of the horizontal line is displayed on the slider bar, and it is determined whether a drag operation of the slider handle exists (step S33).
• When it is determined that the slider handle is operated, a process to change the position of the horizontal line is executed according to the operation (step S34). According to the changed position of the horizontal line, the portion below the horizontal line in the image of the innermost layer (the first layer) is set as the inclined surface in the three-dimensional space. The setting of the inclined surface is the process already described with reference to Fig. 4, and the inclined surface corresponds to the ground surface.
• Next, it is determined whether a mode to erase the portions below the ground surface in the images of the layers other than the first layer is set (step S35). When this mode is set, the objects at positions below the inclined surface (ground surface) in the images of the layers other than the first layer are erased (step S36).
• When it is determined in step S35 that the mode to erase the portions below the ground surface is not set, it is determined whether an object set to be disposed on the ground surface exists among the objects of the images of the individual layers (step S37). When such an object exists, the position of the lower end of that object is adjusted, in the image of the layer where the object exists, to the position where the layer crosses the inclined surface (step S38).
• When it is determined in step S33 that a drag operation does not exist, after the processes of steps S36 and S38 are executed, or when it is determined in step S37 that no object set to be disposed on the ground surface exists, the process returns to the horizontal line slider bar display of step S32.
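The geometry behind steps S34 to S38 can be sketched as follows, under the assumption that the inclined ground surface runs from the horizon line on the innermost layer down to the bottom of the screen at the frontmost depth, so that a simple linear interpolation gives the row where the surface crosses a layer at a given depth. The depth range values are illustrative, not taken from the patent.

    def ground_line_y(layer_depth, horizon_y, image_height,
                      innermost_depth=-100, frontmost_depth=0):
        # Return the row (0 = top of the image) where the inclined ground
        # surface crosses a layer placed at layer_depth.
        t = (layer_depth - innermost_depth) / (frontmost_depth - innermost_depth)
        return horizon_y + t * (image_height - horizon_y)

    # Example: horizon set at row 120 of a 360-row image; the second layer
    # at depth -40 is crossed by the ground surface at row 264.
    print(ground_line_y(-40, horizon_y=120, image_height=360))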
  • Next, a specific display example at the time of adjusting the horizontal line will be described.
Fig. 21 illustrates an example where the horizontal line adjustment bar 261, which is the horizontal line slider bar, is displayed in the editing image display 203 of the editing screen. On the horizontal line adjustment bar 261, the ground surface setting position display 262 is indicated. In this example, the ground surface setting position display 262 is matched with the ground line c drawn in the image of the first layer. This matching is performed by a user operation.
• As such, by setting the horizontal line, the ground surface below the horizontal line of the image of the first layer is set as the inclined surface illustrated in Fig. 4. The position of the ground surface may also be displayed in the images of the layers other than the first layer (the images of the second and third layers).
• For example, as illustrated in Fig. 22, a ground surface position display 271, indicating where the image 252 of the second layer crosses the ground surface (inclined surface), is shown by a broken line in the editing image display 203 of the editing screen. From this display, it can be determined whether the arrangement position of an object in the image 252 of that layer (in this example, the object d of a tree) is appropriate. That is, when the lower end of the object d of the tree is almost matched with the ground surface position display 271 shown by the broken line, an appropriate three-dimensional image is obtained. Meanwhile, when the lower end of the object d is above the ground surface position display 271, the tree appears to float; when it is below the display 271, the tree appears to sink into the ground surface. An unnatural three-dimensional image results in both cases. The ground surface position display 271 illustrated in Fig. 22 therefore effectively prevents the image from becoming an unnatural three-dimensional image.
• Fig. 23 illustrates another display example of the position of the ground surface. In the example of Fig. 23, only the portion above the position crossing the ground surface is displayed as the image of the second layer, and the portion below the ground surface is displayed as a non-display portion 272 (black display portion). In Fig. 23, the image of the third layer, on the front side of the second layer, is displayed, for example, with decreased brightness. However, the objects e and f in the image of the third layer may instead be hidden.
• Fig. 24 illustrates an example where, in the layers other than the first layer that is the long-distance view, the parts of objects below the ground surface (the inclined surface) are erased, and the result is displayed in the editing image display 203 of the editing screen. The process of Fig. 24 corresponds to step S36 of the flowchart of Fig. 20.
In the example of Fig. 24, the lower part of the object e, a dog in the third layer that is the short-distance view, falls below the ground surface. In this case, the object e is displayed with the part below the ground surface erased.
In this way, by not displaying the part of an object that falls below the ground surface, an unnatural display in which an object exists below the ground surface when the generated image is viewed three-dimensionally can be prevented. The partial erasure process of the object illustrated in Fig. 24 may be executed only in the three-dimensional image display, and the object may be displayed completely when the image of each layer is displayed individually.
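A minimal sketch of this erasure, assuming each layer is an RGBA numpy array and using the ground_line_y helper sketched earlier to find the crossing row:

    def erase_below_ground(layer_rgba, crossing_y):
        # Make everything below the layer's ground-crossing row transparent
        # so no part of an object shows under the ground surface (step S36).
        out = layer_rgba.copy()
        out[int(round(crossing_y)):, :, 3] = 0
        return out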
• Fig. 25 illustrates an example of an operation screen for matching the lower end of an object in the image of each layer with the ground surface (the inclined surface), displayed in the editing image display 203 of the editing screen. The process of Fig. 25 corresponds to step S38 of the flowchart of Fig. 20.
In the example of Fig. 25, an operation screen for the object e, the dog of the third layer that is the short-distance view, is illustrated. In this example, as illustrated in Fig. 25, position movement buttons 281 and 282, a returning button 283, an erasure button 284, and a ground surface adjustment button 285 are displayed around the object e.
• The position of the object e is modified when the user operates these buttons. When the operation to select the ground surface adjustment button 285 exists, the editing screen generating unit 118 executes a process for automatically matching the position of the lower end of the object e with the position where its layer crosses the ground surface.
Therefore, by selecting the ground surface adjustment button 285, the object e is automatically disposed on the ground surface, and an appropriate three-dimensional image can be generated.
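The automatic matching can be sketched as a vertical snap of the object's bounding box; the (x, y, w, h) box representation is a hypothetical stand-in for however the apparatus stores object positions:

    def snap_to_ground(obj_box, crossing_y):
        # Move the object vertically so its lower end sits exactly on the
        # row where its layer crosses the ground surface (step S38).
        x, y, w, h = obj_box
        return (x, int(round(crossing_y)) - h, w, h)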
  • <1-6. Example where a camera image is captured>
Figs. 26 and 27 illustrate an example where a camera image is taken into an image under generation.
For example, when an operation to select the camera capturing button 222 in the editing screen illustrated in Fig. 2 exists, as illustrated in Fig. 26, a camera capturing operation screen 810 is displayed on the editing screen. By an operation using the camera capturing operation screen 810, a process for reading a camera image from an external camera device connected to the image processing apparatus 100 (or from a storage device where the camera image is stored) is executed. As illustrated in Fig. 26, the read camera capturing image 811 is displayed. An extraction image 812, in which the background is removed from the camera capturing image 811, is obtained by an operation in the camera capturing operation screen 810.
By disposing the extraction image 812 on the image of any layer, the extraction image 812 can be placed as one of the objects in the image under generation, as illustrated in Fig. 27. The depth position of the layer where the extraction image is disposed is selected by the user using the camera capturing operation screen 810. Alternatively, the camera capturing image may be automatically disposed on the frontmost layer (the short-distance view).
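One simple way to obtain an extraction image like 812 is to key out a near-uniform background by color distance. This is only an assumed approach; the patent does not specify how the background is removed, and the threshold value is illustrative.

    import numpy as np

    def extract_foreground(frame_rgb, bg_color, threshold=40):
        # Return an RGBA image in which pixels close to bg_color become
        # transparent, leaving the captured subject as the extraction image.
        diff = frame_rgb.astype(np.int16) - np.array(bg_color, dtype=np.int16)
        dist = np.linalg.norm(diff, axis=-1)
        alpha = np.where(dist > threshold, 255, 0).astype(np.uint8)
        return np.dstack([frame_rgb, alpha])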
  • <1-7. Example where an image file is imported>
Figs. 28 to 30 illustrate an example where a file image is imported into an image under generation.
For example, when an operation to select the file importing button 221 in the editing screen illustrated in Fig. 2 exists, as illustrated in Fig. 28, an image file importing operation screen 820 is displayed in the editing screen. By an operation using the image file importing operation screen 820, a process for reading selected image data from an image file stored in the designated place is executed. As illustrated in Fig. 28, the imported image 821 is displayed. As illustrated in Fig. 29, an extraction image 822, partially extracted from the imported image 821, is obtained by an operation in the image file importing operation screen 820.
  • By disposing the extraction image 822 on the image of any layer, as illustrated in Fig. 30, the extraction image 822 can be disposed as one of the objects in the generated image. In this case, the depth position of the layer where the extraction image is disposed is selected by the user using the image file importing operation screen 820.
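The import-and-extract step maps naturally onto an image library's open and crop operations. A sketch using Pillow, with a placeholder path and crop box:

    from PIL import Image

    def import_and_extract(path, box):
        # box = (left, upper, right, lower); path and box are placeholders.
        imported = Image.open(path).convert("RGBA")   # the imported image 821
        return imported.crop(box)                     # the extraction image 822

    extraction = import_and_extract("imported_image.png", (50, 30, 250, 200))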
  • <1-8. Display example of a list of generated images>
The three-dimensional images generated using the editing screen by the processes described above are stored in the image storage unit 120 of the image processing apparatus 100. A list of the stored three-dimensional image data can be displayed on one screen.
Fig. 31 illustrates an example where a list of generated images is displayed. In this example, generated images 11, 12, 13, ... are reduced and displayed. Each image may be displayed as a two-dimensional image or as a three-dimensional image.
In the columns of the generated image display, number-of-layers displays 11a, 12a, 13a, ... indicate the number of layers of each generated image by figures. As the figure indicating the number of layers, a figure in which three images are overlapped is displayed when the number of layers is three.
Displaying the list of generated images makes it easy to select a generated image. A selected image may be displayed in the editing screen illustrated in Fig. 2 so that editing work can be performed on it.
  • <1-9. Specific example of hardware configuration>
    Next, a specific example of the hardware configuration of the image processing apparatus 100 according to this example will be described with reference to Fig. 32. Fig. 32 illustrates an example where the image processing apparatus 100 is configured as an information processing device such as a computer device.
  • The image processing apparatus 100 mainly includes a CPU 901, a ROM 903, a RAM 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, an image capturing device 918, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
• The CPU 901 functions as an operation processing device and a control device and controls all or part of the operations in the image processing apparatus 100 according to various programs stored in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs and operation parameters used by the CPU 901. The RAM 905 temporarily stores programs used in execution by the CPU 901 and parameters that change as appropriate during the execution. These devices are connected to each other by the host bus 907, which is configured by an internal bus such as a CPU bus.
  • The host bus 907 is connected to an external bus 911 such as a peripheral component interconnect/interface (PCI) bus through the bridge 909.
• The input device 915 is an operation unit such as a mouse, a keyboard, a touch panel, a button, a switch, or a lever operated by the user. The input device 915 may be a remote control unit (a so-called remote controller) using infrared rays or other radio waves, or an external connection apparatus 929 such as a mobile phone or a PDA that is compatible with the operation of the image processing apparatus 100. The input device 915 is configured using an input control circuit that generates an input signal on the basis of information input by the user using the operation unit and outputs the input signal to the CPU 901. By operating the input device 915, the user of the image processing apparatus 100 can input various data to the image processing apparatus 100 or instruct the image processing apparatus 100 to execute a process operation.
• The output device 917 is configured using a display device such as a liquid crystal display device, a plasma display device, an EL display device, or a lamp, a sound output device such as a speaker or a headphone, or a device such as a printer device, a mobile phone, or a facsimile that can visually or audibly notify the user of acquired information. The output device 917 outputs the results obtained by the various processes executed by the image processing apparatus 100. Specifically, the display device displays the results obtained by the various processes executed by the image processing apparatus 100 as text or images, while the sound output device converts an audio signal composed of reproduced sound data or acoustic data into an analog signal and outputs it.
  • For example, the image capturing device 918 is provided on the display device and the image processing apparatus 100 can capture a still image or a moving image of the user with the image capturing device 918. The image capturing device 918 includes a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor and converts light condensed by a lens into an electric signal and can capture a still image or a moving image.
  • The storage device 919 is a data storage device that is configured as an example of a storage unit of the image processing apparatus 100. For example, the storage device 919 is configured using a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 919 stores programs executed by the CPU 901, various data, and acoustic signal data or image signal data acquired from the outside.
• The drive 921 is a reader/writer for a storage medium and is incorporated in the image processing apparatus 100 or attached to its outside. The drive 921 reads information recorded on the mounted removable recording medium 927, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 905. The drive 921 can also record information on the mounted removable recording medium 927. The removable recording medium 927 is a DVD medium, a Blu-ray medium, a CompactFlash (registered trademark) (CF) card, a memory stick, a secure digital (SD) memory card, or the like. The removable recording medium 927 may be an integrated circuit (IC) card or an electronic apparatus on which a non-contact IC chip is mounted.
• The connection port 923 is a port to directly connect an apparatus to the image processing apparatus 100, such as a universal serial bus (USB) port, an IEEE 1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port. By connecting the external connection apparatus 929 to the connection port 923, the image processing apparatus 100 acquires acoustic signal data or image signal data directly from the external connection apparatus 929, or provides such data to the external connection apparatus 929.
• The communication device 925 is a communication interface configured by a communication device for connection with a communication network 931. The communication device 925 is, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth, or wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various types of communication. The communication device 925 can transmit and receive signals based on a predetermined protocol such as TCP/IP to and from the Internet or other communication apparatuses. The communication network 931 connected to the communication device 925 is configured by a network connected by wire or wirelessly; for example, the communication network 931 may be the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
• The example of the hardware configuration that can realize the functions of the image processing apparatus 100 according to this example has been described. Each of the components described above may be configured using general-purpose members or by hardware specialized for the function of the individual component. Therefore, the hardware configuration to be used may be changed as appropriate according to the technological level at the time of carrying out this embodiment.
• A program (software) that executes each process step performed by the image processing apparatus 100 according to this example may be generated and deployed on a general-purpose computer device to execute the same processes. The program may be stored in various media, or may be downloaded from a server to the computer device through the Internet.
    It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
  • Note that the following configurations are within the scope of the present disclosure.
    (1)
    An image processing apparatus comprising:
    an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively;
    a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit;
    a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit;
an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively; and
    an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
  • (2)
    The image processing apparatus according to (1),
    wherein the editing screen that is generated by the editing screen generating unit includes the tabs for the individual plane images that display thumbnail images where the plane images corresponding to the individual tabs are reduced, and
    the plane image corresponding to any tab is displayed on the editing screen by a selection operation of any tab in the input unit.
  • (3)
    The image processing apparatus according to (1) or (2),
    wherein the editing screen that is generated by the editing screen generating unit includes tabs that display thumbnail images where images showing the virtual distances of the plurality of plane images are reduced.
  • (4)
    The image processing apparatus according to any one of (1) to (3),
    wherein distance scales showing setting of the virtual distances of the plane images and operation positions to operate the virtual distances of the plane images by the input unit are displayed on the editing screen that is generated by the editing screen generating unit.
  • (5)
    The image processing apparatus according to any one of (1) to (4),
    wherein, in the distance scales, settings of the distances of the plurality of plane images are distinguished and displayed, and the operation positions are prepared for every plane image.
  • (6)
    The image processing apparatus according to any one of (1) to (5),
    wherein displaying to indicate the position of a virtual display surface is performed in the distance scales.
  • (7)
    The image processing apparatus according to any one of (1) to (6),
    wherein the three-dimensional image converting unit converts the plurality of plane images into a three-dimensional image where an image portion of a lower side from the horizontal line position set to one specific plane image of the plural plane images becomes an inclined surface gradually changing from the virtual distance.
  • (8)
    The image processing apparatus according to any one of (1) to (7),
    wherein horizontal line position scales showing setting of the horizontal line position are displayed on the editing screen generated by the editing screen generating unit and the horizontal line position shown by the horizontal line position scales is changed by receiving an operation in the input unit.
  • (9)
    The image processing apparatus according to any one of (1) to (8),
    wherein the plane images other than the specific plane image display the position crossing the inclined surface in the three-dimensional image, on the editing screen generated by the editing screen generating unit.
  • (10)
    The image processing apparatus according to any one of (1) to (9),
    wherein the three-dimensional image converting unit erases an object that becomes the lower side of the position crossing the inclined surface in the three-dimensional image, with respect to the plane images other than the specific plane image.
  • (11)
    The image processing apparatus according to any one of (1) to (10),
    wherein the image generating unit sets a lower end of a designated object in the plane images other than the specific plane image to the position matched with the position crossing the inclined surface in the three-dimensional image.
  • (12)
    An image processing apparatus comprising:
    an image control unit that controls an image displayed on a display unit;
    a three-dimensional image generating unit that generates a three-dimensional image where the space positions of objects of a plurality of plane images are set, from the plurality of plane images having the virtual distances in a depth direction, respectively; and
    an input unit that receives an operation from a user,
    wherein the image control unit displays thumbnail images in which the plurality of plane images and overlapped images, in which the plane images are displayed in an overlapped manner at a predetermined angle in a depth direction, are reduced, and
    the image control unit displays the plane image or the overlapped image corresponding to the selected thumbnail image in an editable state along the thumbnail images of the plurality of plane images and the overlapped images, when the input unit receives a command selecting the thumbnail image.
  • (13)
    The image processing apparatus according to (12),
    wherein the image control unit displays the plane images and the overlapped images in an overlapped manner and displays the thumbnail images as tabs of the plurality of plane images and the overlapped images, respectively.
  • (14)
    The image processing apparatus according to (12) or (13),
    wherein the image control unit further displays a tab to receive an addition of the plane images.
  • (15)
    The image processing apparatus according to any one of (12) to (14),
    wherein the image control unit displays the overlapped image on a front surface and displays only the tabs of the non-selected plane images, when the thumbnail image corresponding to the overlapped image is selected, and
    the input unit accepts an input to change the virtual distance of the selected plane image in a depth direction, among the plane images displayed in the overlapped image.
  • (16)
    The image processing apparatus according to any one of (12) to (15),
    wherein the image control unit displays a screen where only an image of the plane image in the screen is highlighted on a front surface, when the thumbnail image corresponding to the plane image is selected.
  • (17)
    The image processing apparatus according to any one of (12) to (16),
    wherein the image control unit displays a screen where images of the non-selected plane images in the screen are grayed out on a front surface, when the thumbnail image corresponding to the plane image is selected.
  • (18)
    The image processing apparatus according to any one of (12) to (17),
    wherein the image control unit displays distance scales showing the virtual distances of the plane images in the depth direction and displays a depth change operation input unit to operate the virtual distances of the plane images displayed in an editable state in the depth direction by the input unit, and
    the input unit accepts an operation to generate or edit the images in the plane images and accepts an operation with respect to the depth change operation input unit, when the plane images are displayed in the editable state.
  • (19)
    The image processing apparatus according to any one of (12) to (18),
    wherein the depth change operation input unit has a button to move the virtual distances of the plane images displayed in the editable state in the depth direction to the front side and a button to move the virtual distances to the inner side.
  • (20)
    The image processing apparatus according to any one of (12) to (19),
    wherein the depth change operation input unit is an object that corresponds to the plane image displayed on the distance scales in the editable state.
  • (21)
    The image processing apparatus according to any one of (12) to (20),
    wherein the image control unit displays a horizontal line change operation input unit to operate the horizontal line position set in the plane image displayed in an editable state by the input unit, and
    the input unit accepts an operation with respect to the horizontal line change operation input unit, when the plane image is displayed in an editable state.
  • (22)
    The image processing apparatus according to any one of (12) to (21),
    wherein the image control unit does not display an image of the lower side of the horizontal line position set in the plane image displayed in the editable state.
  • (23)
    The image processing apparatus according to any one of (12) to (22),
    wherein the three-dimensional image generating unit does not reflect an image of the lower side of the horizontal line position set in each plane image to a generated three-dimensional image.
  • (24)
    The image processing apparatus according to any one of (12) to (23),
    wherein the image control unit performs a control operation to reflect the edit result on the thumbnail image of the overlapped image, when the plane image corresponding to the selected thumbnail image is edited.
  • (25)
    The image processing apparatus according to any one of (12) to (24), further comprising:
    the display unit.
  • (26)
    An image processing method comprising:
    generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively;
    converting the plurality of plane images into a three-dimensional image where space positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images;
    outputting data of the converted three-dimensional image;
    displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
    accepting an operation to generate or edit images in the generated editing screen.
  • (27)
    A program that causes a computer to execute an image process, the program causing the computer to execute:
    generating a plurality of plane images and setting the virtual distances of a depth direction to the plurality of generated plane images, respectively;
    converting the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images;
    outputting data of the converted three-dimensional image;
    displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
    accepting an operation to generate or edit images in the generated editing screen.
  • a to f display object
    11, 12, 13 generated image
    11a, 12a, 13a number of layers display
    100 image processing apparatus
    110 image generating/processing unit
    112 image generating unit
    114 three-dimensional image converting unit
    116 three-dimensional image generating unit
    118 editing screen generating unit
    120 image storage unit
    130 input unit
    140 image display unit
    201 tab
    202 3D state thumbnail tab
203 editing image display
    203a first layer edge portion
    203b second layer edge portion
    203c third layer edge portion
    211 to 213 layer thumbnail tab
    221 file importing button
    222 camera capturing button
    223 stamp button
    224 character input button
    225 depth operation button
    231 to 237 object display unit
    241 generation start button
    242 save button
    243 three-dimensional display button
    250L image for left eye
    250R image for right eye
    251, 251' first layer image
    252, 252' second layer image
    253, 253' third layer image
    259, 259' display surface
    261 horizontal line adjustment bar
    262 ground surface setting position display
    271 ground surface position display in layer
    272 non-display portion
    281, 282 position movement button
    283 returning button
    284 erasure button
    285 ground surface adjustment button
    290 pen tool
    291 first pen
    292 second pen
    293 third pen
    294 eraser
    301 depth axis
    301a to 301d depth position
    302 front edge portion
    303 ground surface setting position
    304 virtual display surface
    311 first layer image
    312 second layer image
    313 third layer image
    321 horizontal line position
    401 depth bar display
    401a, 401b, 401c layer position
    411, 412, 421, 422, 431, 432 depth adjustment button
    501 depth bar display
502 layer depth position
503 virtual display surface position
    511, 512 depth adjustment button
    601 depth bar display
602 layer depth position
    611, 612 depth adjustment button
    701 depth bar display
702 layer depth position
703 virtual display surface position
    704 depth frame
    705 depth value display
    711, 712, 721, 722, 731, 732 depth adjustment button
    810 camera capturing operation screen
    811 camera capturing image
    812 extraction image
    820 image file importing operation screen
    821 imported image
    822 extraction image

Claims (27)

  1. An image processing apparatus comprising:
    an image generating unit that generates a plurality of plane images and sets virtual distances in a depth direction to the plurality of generated plane images, respectively;
    a three-dimensional image converting unit that converts the plurality of plane images into a three-dimensional image where objects' positions in space in each of the plurality of plane images are set, based on the virtual distances set to the plurality of plane images generated by the image generating unit;
    a three-dimensional image generating unit that outputs data of the three-dimensional image converted by the three-dimensional image converting unit;
an editing screen generating unit that displays the plurality of plane images generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen displayed by providing tabs to the plane images, respectively; and
    an input unit that receives an operation to generate or edit images in the editing screen generated by the editing screen generating unit.
  2. The image processing apparatus according to claim 1,
    wherein the editing screen that is generated by the editing screen generating unit includes the tabs for the individual plane images that display thumbnail images where the plane images corresponding to the individual tabs are reduced, and
    the plane image corresponding to any tab is displayed on the editing screen by a selection operation of any tab in the input unit.
  3. The image processing apparatus according to claim 2,
    wherein the editing screen that is generated by the editing screen generating unit includes tabs that display thumbnail images where images showing the virtual distances of the plurality of plane images are reduced.
  4. The image processing apparatus according to claim 1,
    wherein distance scales showing setting of the virtual distances of the plane images and operation positions to operate the virtual distances of the plane images by the input unit are displayed on the editing screen that is generated by the editing screen generating unit.
  5. The image processing apparatus according to claim 4,
    wherein, in the distance scales, settings of the distances of the plurality of plane images are distinguished and displayed, and the operation positions are prepared for every plane image.
  6. The image processing apparatus according to claim 4,
    wherein displaying to indicate the position of a virtual display surface is performed in the distance scales.
  7. The image processing apparatus according to claim 1,
    wherein the three-dimensional image converting unit converts the plurality of plane images into a three-dimensional image where an image portion of a lower side from the horizontal line position set to one specific plane image of the plural plane images becomes an inclined surface gradually changing from the virtual distance.
  8. The image processing apparatus according to claim 7,
    wherein horizontal line position scales showing setting of the horizontal line position are displayed on the editing screen generated by the editing screen generating unit and the horizontal line position shown by the horizontal line position scales is changed by receiving an operation in the input unit.
  9. The image processing apparatus according to claim 8,
    wherein the plane images other than the specific plane image display the position crossing the inclined surface in the three-dimensional image, on the editing screen generated by the editing screen generating unit.
  10. The image processing apparatus according to claim 8,
    wherein the three-dimensional image converting unit erases an object that becomes the lower side of the position crossing the inclined surface in the three-dimensional image, with respect to the plane images other than the specific plane image.
  11. The image processing apparatus according to claim 8,
    wherein the image generating unit sets a lower end of a designated object in the plane images other than the specific plane image to the position matched with the position crossing the inclined surface in the three-dimensional image.
  12. An image processing apparatus comprising:
    an image control unit that controls an image displayed on a display unit;
    a three-dimensional image generating unit that generates a three-dimensional image where the space positions of objects of a plurality of plane images are set, from the plurality of plane images having the virtual distances in a depth direction, respectively; and
    an input unit that receives an operation from a user,
    wherein the image control unit displays thumbnail images in which the plurality of plane images and overlapped images, in which the plane images are displayed in an overlapped manner at a predetermined angle in a depth direction, are reduced, and
    the image control unit displays the plane image or the overlapped image corresponding to the selected thumbnail image in an editable state along the thumbnail images of the plurality of plane images and the overlapped images, when the input unit receives a command selecting the thumbnail image.
  13. The image processing apparatus according to claim 12,
    wherein the image control unit displays the plane images and the overlapped images in an overlapped manner and displays the thumbnail images as tabs of the plurality of plane images and the overlapped images, respectively.
  14. The image processing apparatus according to claim 13,
    wherein the image control unit further displays a tab to receive an addition of the plane images.
  15. The image processing apparatus according to claim 12,
    wherein the image control unit displays the overlapped image on a front surface and displays only the tabs of the non-selected plane images, when the thumbnail image corresponding to the overlapped image is selected, and
    the input unit accepts an input to change the virtual distance of the selected plane image in a depth direction, among the plane images displayed in the overlapped image.
  16. The image processing apparatus according to claim 12,
    wherein the image control unit displays a screen where only an image of the plane image in the screen is highlighted on a front surface, when the thumbnail image corresponding to the plane image is selected.
  17. The image processing apparatus according to claim 12,
    wherein the image control unit displays a screen where images of the non-selected plane images in the screen are grayed out on a front surface, when the thumbnail image corresponding to the plane image is selected.
  18. The image processing apparatus according to claim 12,
    wherein the image control unit displays distance scales showing the virtual distances of the plane images in the depth direction and displays a depth change operation input unit to operate the virtual distances of the plane images displayed in an editable state in the depth direction by the input unit, and
    the input unit accepts an operation to generate or edit the images in the plane images and accepts an operation with respect to the depth change operation input unit, when the plane images are displayed in the editable state.
  19. The image processing apparatus according to claim 18,
    wherein the depth change operation input unit has a button to move the virtual distances of the plane images displayed in the editable state in the depth direction to the front side and a button to move the virtual distances to the inner side.
  20. The image processing apparatus according to claim 18,
    wherein the depth change operation input unit is an object that corresponds to the plane image displayed on the distance scales in the editable state.
  21. The image processing apparatus according to claim 12,
    wherein the image control unit displays a horizontal line change operation input unit to operate the horizontal line position set in the plane image displayed in an editable state by the input unit, and
    the input unit accepts an operation with respect to the horizontal line change operation input unit, when the plane image is displayed in an editable state.
  22. The image processing apparatus according to claim 21,
    wherein the image control unit does not display an image of the lower side of the horizontal line position set in the plane image displayed in the editable state.
  23. The image processing apparatus according to claim 21,
    wherein the three-dimensional image generating unit does not reflect an image of the lower side of the horizontal line position set in each plane image to a generated three-dimensional image.
  24. The image processing apparatus according to claim 13,
    wherein the image control unit performs a control operation to reflect the edit result on the thumbnail image of the overlapped image, when the plane image corresponding to the selected thumbnail image is edited.
  25. The image processing apparatus according to claim 12, further comprising:
    the display unit.
  26. An image processing method comprising:
    generating a plurality of plane images and setting virtual distances in a depth direction to the plurality of generated plane images, respectively;
    converting the plurality of plane images into a three-dimensional image where space positions of objects in each of the plurality of plane images are set, based on the virtual distances set to the plurality of generated plane images;
    outputting data of the converted three-dimensional image;
    displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
    accepting an operation to generate or edit images in the generated editing screen.
  27. A program that causes a computer to execute an image process, the program causing the computer to execute:
    generating a plurality of plane images and setting the virtual distances of a depth direction to the plurality of generated plane images, respectively;
    converting the plurality of plane images into a three-dimensional image where the space positions of objects of the plurality of plane images are set, on the basis of the virtual distances set to the plurality of generated plane images;
    outputting data of the converted three-dimensional image;
    displaying the plurality of generated plane images individually or to be overlapped and generating display data of an editing screen displayed by providing tabs to the plane images, respectively; and
    accepting an operation to generate or edit images in the generated editing screen.