CN103582903A - Image processing apparatus, image processing method, and program - Google Patents
- Publication number
- CN103582903A CN103582903A CN201280026377.3A CN201280026377A CN103582903A CN 103582903 A CN103582903 A CN 103582903A CN 201280026377 A CN201280026377 A CN 201280026377A CN 103582903 A CN103582903 A CN 103582903A
- Authority
- CN
- China
- Prior art keywords
- image
- plane
- view
- image processing
- editing screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
- H04N13/395—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present disclosure provides a new and improved image processing apparatus that can easily generate content containing three-dimensional images. The image processing apparatus generates a plurality of plane images and sets a virtual distance in the depth direction for each of the generated plane images. Based on the set virtual distances, the image processing apparatus converts the plane images into a three-dimensional image in which the spatial positions of the objects of the plane images are set, and obtains the data of that three-dimensional image. In addition, the image processing apparatus displays an editing screen for generating the plane images. The editing screen displays the plane images either individually or overlapped, with a tab provided for each plane image.
Description
Technical field
The present disclosure relates to an image processing apparatus, an image processing method, and a program, and particularly to technology for generating three-dimensional images.
Background technology
In recent years, image display apparatuses that can present images perceived by the user as three-dimensional (so-called 3D images) have been released and have begun to spread (see, for example, Patent Literature 1). Apparatuses that can display three-dimensional images are not limited to televisions and other image display apparatuses; some types of personal computers can display three-dimensional images as well.
Among application programs running on personal computers, there are application programs that can generate content containing three-dimensional images. When such content is generated by an application program and the user views the content in a predetermined manner, the user can perceive the images contained in the content as three-dimensional.
[reference listing]
[patent documentation]
PTL 1: JP 2010-210712A
Summary of the invention
[Technical Problem]
However, according to the related art, generating content containing three-dimensional images requires setting positional relationships with dedicated software. Accordingly, it is difficult for end users to generate such content.
In view of this, there is a need for a new and improved image processing apparatus, image processing method, and computer program that make it easy to generate content containing three-dimensional images.
An image processing apparatus of the present disclosure includes: an image generation unit that generates a plurality of plane images and sets, for each of the generated plane images, a virtual distance in the depth direction; a three-dimensional image conversion unit that, based on the virtual distances set for the plane images generated by the image generation unit, converts the plane images into a three-dimensional image in which the spatial positions of the objects in each of the plane images are set; a three-dimensional image generation unit that outputs the data of the three-dimensional image converted by the three-dimensional image conversion unit; an editing screen generation unit that generates display data of an editing screen on which the plane images generated by the image generation unit are displayed individually or overlapped, with a tab provided for each plane image; and an input unit that receives operations for generating or editing images on the editing screen generated by the editing screen generation unit.
An image processing method of the present disclosure includes: generating a plurality of plane images and setting, for each of the generated plane images, a virtual distance in the depth direction; converting the plane images, based on the set virtual distances, into a three-dimensional image in which the spatial positions of the objects in each of the plane images are set; outputting the data of the converted three-dimensional image; displaying the generated plane images individually or overlapped, and generating display data of an editing screen in which a tab is provided for each plane image; and receiving operations for generating or editing images on the generated editing screen.
A program of the present disclosure causes a computer to execute image processing, the program causing the computer to: generate a plurality of plane images and set, for each of the generated plane images, a virtual distance in the depth direction; convert the plane images, based on the set virtual distances, into a three-dimensional image in which the spatial positions of the objects in the plane images are set; output the data of the converted three-dimensional image; display the generated plane images individually or overlapped, and generate display data of an editing screen in which a tab is provided for each plane image; and receive operations for generating or editing images on the generated editing screen.
[Solution to Problem]
According to the present disclosure, a plurality of plane images are displayed appropriately, so that the user can generate and display a desired three-dimensional image with simple operations.
According to the present disclosure, a new and improved image processing apparatus, image processing method, and computer program are provided that make it easy to generate content containing three-dimensional images.
Brief description of the drawings
Fig. 1 is a block diagram illustrating an example of the configuration of an image processing apparatus according to an embodiment of the present disclosure.
Fig. 2 is a diagram illustrating an example of the editing screen according to the embodiment of the present disclosure (with all layers displayed in this example).
Fig. 3 is an explanatory diagram illustrating an example of each layer image according to the embodiment of the present disclosure.
Fig. 4 is an explanatory diagram illustrating an example of the depth display screen according to the embodiment of the present disclosure.
Fig. 5 is an explanatory diagram illustrating the concept of converting a multilayer image into a three-dimensional image.
Fig. 6 is an explanatory diagram illustrating the state of converting a multilayer image into a three-dimensional image.
Fig. 7 is an explanatory diagram illustrating an example in which a multilayer image is converted into three-dimensional left- and right-channel images.
Fig. 8 is an explanatory diagram illustrating an example of a three-dimensional image generated from a multilayer image.
Fig. 9 is an explanatory diagram illustrating the conversion state in a case where a certain layer is set to a virtual position on the front side of the virtual display surface.
Fig. 10 is an explanatory diagram illustrating an example of the three-dimensional image in a case where a certain layer is set to a virtual position on the front side of the virtual display surface.
Fig. 11 is a flowchart illustrating an example of the flow of editing screen display processing according to the embodiment of the present disclosure.
Fig. 12 is an explanatory diagram illustrating an example of the editing screen according to the embodiment of the present disclosure (with the third layer displayed in this example).
Fig. 13 is an explanatory diagram illustrating an example of the editing screen according to the embodiment of the present disclosure (with the second layer displayed in this example).
Fig. 14 is an explanatory diagram illustrating an example of the editing screen according to the embodiment of the present disclosure (with the first layer displayed in this example).
Fig. 15 is a flowchart illustrating an example of the flow of depth screen display processing according to the embodiment of the present disclosure.
Fig. 16 is an explanatory diagram illustrating a display example of the depth screen according to the embodiment of the present disclosure (first example).
Fig. 17 is an explanatory diagram illustrating a display example of the depth screen according to the embodiment of the present disclosure (second example).
Fig. 18 is an explanatory diagram illustrating a display example of the depth screen according to the embodiment of the present disclosure (third example).
Fig. 19 is an explanatory diagram illustrating a display example of the depth screen according to the embodiment of the present disclosure (fourth example).
Fig. 20 is a flowchart illustrating an example of the flow of horizon line setting processing according to the embodiment of the present disclosure.
Fig. 21 is an explanatory diagram illustrating an example of the horizon line setting screen according to the embodiment of the present disclosure.
Fig. 22 is an explanatory diagram illustrating an example of the display screen of a certain layer when the horizon line is set, according to the embodiment of the present disclosure (first example).
Fig. 23 is an explanatory diagram illustrating an example of the display screen of a certain layer when the horizon line is set, according to the embodiment of the present disclosure (second example).
Fig. 24 is an explanatory diagram illustrating an example of the display screen of a certain layer when the horizon line is set, according to the embodiment of the present disclosure (third example).
Fig. 25 is an explanatory diagram illustrating an example of the display screen of a certain layer when the horizon line is set, according to the embodiment of the present disclosure (fourth example).
Fig. 26 is an explanatory diagram illustrating an example of the display screen when capturing a camera image according to the embodiment of the present disclosure.
Fig. 27 is an explanatory diagram illustrating a display example of a three-dimensional image into which a camera image has been composited, according to the embodiment of the present disclosure.
Fig. 28 is an explanatory diagram illustrating an example of the display screen when importing an image file according to the embodiment of the present disclosure.
Fig. 29 is an explanatory diagram illustrating an example of setting the import range when importing an image file according to the embodiment of the present disclosure.
Fig. 30 is an explanatory diagram illustrating a display example of a three-dimensional image into which an image imported from an image file has been composited, according to the embodiment of the present disclosure.
Fig. 31 is an explanatory diagram illustrating an example of the list display screen of created works according to the embodiment of the present disclosure.
Fig. 32 is a block diagram illustrating an example of the detailed hardware configuration of the image processing apparatus according to the embodiment of the present disclosure.
Embodiment
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, structural elements having substantially the same function and structure are denoted by the same reference numerals, and repeated description of these structural elements is omitted.
The description will be given in the following order:
1. Embodiment of the present disclosure
1-1. Configuration of the image processing apparatus (Fig. 1)
1-2. Overview of three-dimensional image generation (Figs. 3 to 10)
1-3. Display example of the editing screen (Figs. 2 and 11 to 14)
1-4. Display example of the depth adjustment image (Figs. 15 to 19)
1-5. Display example of the ground surface setting screen (Figs. 20 to 25)
1-6. Example of capturing a camera image (Figs. 26 and 27)
1-7. Example of importing an image (Figs. 28 to 30)
1-8. Display example of the list of generated images (Fig. 31)
1-9. Specific example of the hardware configuration (Fig. 32)
1. Embodiment of the present disclosure
1-1. Functional configuration of the image processing apparatus
First, the configuration of an image processing apparatus according to an embodiment of the present disclosure (hereinafter referred to as "the present example") will be described. Fig. 1 is a diagram illustrating the configuration of an image processing apparatus 100 according to the present example.
The image processing apparatus 100 according to the present example, shown in Fig. 1, is configured to generate images based on user operations, and to display and store the generated images. As shown in Fig. 1, the image processing apparatus 100 includes an image generation/processing unit 110, an image storage unit 120, an input unit 130, and an image display unit 140.
The image generation/processing unit 110 presents an image generation screen to the user through the image display unit 140, and generates a three-dimensional image from the images created by the user. As shown in Fig. 1, the image generation/processing unit 110 included in the image processing apparatus 100 of the present example comprises an image generation unit 112, a three-dimensional image conversion unit 114, a three-dimensional image generation unit 116, and an editing screen generation unit 118. The editing screen generation unit 118 also serves as an image control unit that controls image generation in each unit. Alternatively, the image generation/processing unit 110 may include an image control unit, separate from the editing screen generation unit 118, that controls image generation in each unit.
While the editing screen for image generation is displayed on the image display unit 140, the image generation/processing unit 110 generates a plurality of plane images (for example, three plane images) based on user operations, and generates a three-dimensional image from the generated plane images (two-dimensional images). The editing screen is the screen shown in Fig. 2; the intermediate image being generated is displayed in the center, with operation buttons and tabs arranged around the image. The editing screen illustrated in Fig. 2 will be described in detail below. Depth information from a reference position (for example, the virtual display surface) is set for each of the generated plane images, and the three-dimensional image is generated based on these depths.
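The flow described above — plane images given virtual distances from a reference position and then combined into a three-dimensional image — can be sketched roughly as follows. All names, the data layout, and the sign convention for depth are assumptions made for illustration, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class PlaneImage:
    label: str                     # e.g. "long-distance view" (hypothetical field)
    pixels: object = None          # 2-D image data (placeholder)
    virtual_distance: float = 0.0  # signed depth from the virtual display surface

def set_virtual_distances(layers, distances):
    """Set the depth-direction virtual distance for each generated plane image."""
    for layer, d in zip(layers, distances):
        layer.virtual_distance = d

def to_three_dimensional(layers):
    """Order the layers back to front by virtual distance, fixing the spatial
    position of each layer's objects; a real converter would additionally
    compute per-eye drawing positions from these depths."""
    return sorted(layers, key=lambda l: l.virtual_distance, reverse=True)
```

A usage sketch: three layers with distances 200, 80, and -40 would come out ordered long-distance, middle-distance, short-distance, matching the layer roles of Fig. 3.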
In addition, the image generation/processing unit 110 supplies the image data of the generated three-dimensional image to the image display unit 140, and the image display unit 140 displays the three-dimensional image. The user views the image by a predetermined method (for example, wearing time-division driven shutter glasses), and thereby perceives the image displayed on the image display unit 140 as three-dimensional.
The three-dimensional image conversion unit 114 performs conversion processing so that the multilayer image supplied from the image generation unit 112 can be presented on the image display unit 140 as a three-dimensional image. The image processing apparatus 100 of the present example presupposes the distance between the user's eyes and the distance between the user and the display surface, and performs the conversion processing for display on the image display unit 140 based on the virtual distances of the layers (the depth information of each layer image). Specifically, to generate the three-dimensional image by applying a coordinate transform to the multilayer image, the three-dimensional image conversion unit 114 performs coordinate transform processing on the multilayer image.
During the conversion processing, if the user adjusts the depth of a layer image and changes the depth state while the three-dimensional image is displayed on the image display unit 140, the three-dimensional image conversion unit 114 performs the conversion processing in real time according to the change. The user can thus adjust the depth of each layer image and immediately confirm the adjusted three-dimensional image through the display on the editing screen. An example of the processing for adjusting the depth of each layer image is described in detail below.
When the image processing apparatus 100 generates a three-dimensional image from the multilayer plane images created by the user, the image processing apparatus 100 performs a preview display of the three-dimensional image. Through the preview display, the user can understand in advance, before storing the three-dimensional image, what the generated image will look like when seen in three dimensions.
The three-dimensional image generation unit 116 generates the three-dimensional image from the multilayer image based on the conversion processing performed by the three-dimensional image conversion unit 114. According to operations from the user via the input unit 130, the three-dimensional image generated by the three-dimensional image generation unit 116 is displayed on the image display unit 140 and stored in the image storage unit 120.
The editing screen generation unit 118 generates the display data of the editing screen based on the state of input operations received by the input unit 130. The editing screen generation unit 118 supplies the generated display data to the image display unit 140, and the editing screen is displayed.
The image display unit 140 is a display that shows images. For example, the image display unit 140 displays the multilayer image generated by the image generation/processing unit 110, or the three-dimensional image generated by converting the multilayer image. The image display unit 140 displays screens that allow the user of the image processing apparatus 100 to generate images; examples of these display screens are described below. A touch screen may be placed on the image display surface of the image display unit 140 so that the user can directly operate buttons in the displayed image; in that case, the touch screen included in the image display unit 140 serves as part of the input unit 130. The image display unit 140 may also be a device separate from the image processing apparatus 100.
The image display unit 140 can be configured with a display device capable of showing three-dimensional images, and the method of displaying three-dimensional images is not limited to any particular display method. For example, one known display method switches between the right-eye image and the left-eye image at high speed. Known methods for transmitting a three-dimensional image to the image display unit 140 include the frame-sequential method, the side-by-side method, and the top-and-bottom method.
The image generated by the image generation/processing unit 110 can also be output to a television receiver or another display device that is connected to the image processing apparatus 100 and can show three-dimensional images.
1-2. Overview of three-dimensional image generation
Next, an overview of three-dimensional image generation by the image processing apparatus 100 according to the embodiment of the present disclosure will be described with reference to Figs. 3 to 10.
First, as shown in Fig. 3, the image processing apparatus 100 of the present example holds plane images of three layers 251, 252, and 253, created by the user, comprising a long-distance view, a middle-distance view, and a short-distance view. The image processing apparatus 100 composites these three layers of plane images, and the three-dimensional image conversion unit 114 converts the plane images into a three-dimensional image. By converting the three-layer image into a three-dimensional image in this way, the user can generate a three-dimensional image without performing complicated imaging processing.
In the example of Fig. 3, objects a, b, c, and g, corresponding to a mountain range, the sun, a road, and the horizon line, are drawn in the plane image 251 of the long-distance view; an object d, a tree, is drawn in the plane image 252 of the middle-distance view; and objects e and f, corresponding to a dog and an insect, are drawn in the plane image 253 of the short-distance view.
The three-layer configuration is only an example; any plural number of layers may be used.
The illustrated depth is set for each layer image as a distance from a reference position. Fig. 4 is a diagram illustrating how the depth is set. The setting state shown in Fig. 4 corresponds to the 3D-state thumbnail tab of the editing screen described below, where the state in the figure is displayed in reduced form. When the 3D-state thumbnail tab 202 is selected, the setting state shown in Fig. 4 is enlarged and displayed on the editing screen.
The depth setting state of Fig. 4 will now be described. A depth axis 301 is set in the direction orthogonal to the display screen, and the depth positions of the layer images 311, 312, and 313 are set along the depth axis 301. In the example of Fig. 4, the depth axis 301 is drawn as an axis inclined at a predetermined angle. The image frame drawn with a broken line in Fig. 4 shows the virtual display surface 304 at the position of the display's image display surface; the virtual display surface 304 lies at depth position 301a on the depth axis 301. The position at the front end of the depth axis 301 is denoted the leading edge position 302.
In the example of Fig. 4, the first layer image 311 of the long-distance view lies at the innermost depth position 301b on the depth axis 301, and the second layer image 312 of the middle-distance view lies at the middle depth position 301c. The third layer image 313 of the short-distance view lies at depth position 301d, on the front side of the virtual display surface 304 on the depth axis 301. In this example, the layer image 313 of the short-distance view is placed in front of the virtual display surface 304; however, it may instead be placed on the inner side of the virtual display surface 304 (closer to the long-distance view).
When the three layer images are prepared in this way, the layer images in the initial state can be automatically set to predetermined depth positions: a position suited to the long-distance view for the first layer image 311, a position suited to the middle-distance view for the second layer image 312, and a position suited to the short-distance view for the third layer image 313.
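As a concrete illustration of this automatic initial placement, the mapping below assigns each layer role a preset depth. The numeric values and all names are assumptions, since the patent only states that suitable predetermined positions are used:

```python
# Signed distance from the virtual display surface (position 301a);
# positive = inner side (behind the screen), negative = front side.
DEFAULT_DEPTHS = {
    "long_distance": 200.0,    # first layer image 311
    "middle_distance": 80.0,   # second layer image 312
    "short_distance": -40.0,   # third layer image 313 (in front, as in Fig. 4)
}

def assign_initial_depths(layer_roles):
    """Automatically set each prepared layer to its predetermined initial depth."""
    return [DEFAULT_DEPTHS[role] for role in layer_roles]
```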
As shown in Fig. 4, the three-dimensional image is generated such that the objects a to g in the layer images 311, 312, and 313 are placed at the depth positions of their layer images in the virtual three-dimensional space.
However, when the position of the horizon line c is set as the horizon position 321 for the layer image 311 of the long-distance view, the image portion below the horizon position 321 in the layer image 311 is inclined in the three-dimensional space so as to project toward the front side. That is, the image portion below the horizon line c becomes an inclined surface 303 that gradually protrudes toward the front side, from the depth position 301b of the layer image 311 toward the leading edge position 302, as the portion descends. In the example of Fig. 4, the object g, drawn on the road below the horizon position 321 of the layer image 311, is placed on the inclined surface 303.
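The inclined surface can be read as a linear depth ramp below the horizon line: at the horizon the ground keeps the far layer's depth, and it comes forward to the leading edge at the bottom of the image. A hedged sketch under that reading, with assumed parameter names and a y axis that grows downward:

```python
def ground_depth(y, horizon_y, bottom_y, layer_depth, leading_edge_depth):
    """Depth of a point on the far layer at vertical position y (y increases
    downward). Above the horizon the layer keeps its own depth; below it,
    depth ramps linearly from layer_depth at the horizon line to
    leading_edge_depth at the bottom edge of the image."""
    if y <= horizon_y:
        return layer_depth
    t = (y - horizon_y) / (bottom_y - horizon_y)  # 0 at horizon, 1 at bottom
    return layer_depth + t * (leading_edge_depth - layer_depth)
```

With this sketch, an object such as g sitting just below the horizon stays nearly as deep as layer image 311, while objects drawn near the bottom edge are carried almost to the leading edge position.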
In the example of Fig. 4, the inclined surface 303 projects toward the leading edge portion 302 in front of the third layer image 313. However, the inclined surface 303 may instead project only as far as the virtual display surface 304. In that case, the front-side layer (the third layer image 313) projects frontward of the terminating point of the front side of the inclined surface 303.
In this way, when the horizon line is set for the first layer image 311 of the long-distance view and the objects of the other layer images 312 and 313 are set to match the horizon line, each object is automatically and appropriately placed on the inclined surface 303. Alternatively, processing may be performed to remove inappropriate portions of objects. Specific examples of horizon line setting and the processing related to the horizon line are described below.
Next, an example of the processing by which the three-dimensional image conversion unit 114 converts images into a three-dimensional image will be described.
Fig. 5 is a diagram illustrating the concept of converting an ordinary two-dimensional multilayer image into a three-dimensional image. In the scheme shown in Fig. 5, the two-dimensional multilayer image (plane images 251, 252, and 253) is converted into a right-eye image 250R, viewed with the user's right eye, and a left-eye image 250L, viewed with the left eye. (The horizon processing described above is not shown in Fig. 5.) The three-dimensional image conversion unit 114 calculates the drawing positions of the right-eye image and the left-eye image in order to generate the right-eye image 250R and the left-eye image 250L from the two-dimensional image.
Next, a specific method for calculating the drawing positions of the right-eye and left-eye images will be described.
Figs. 6 to 8 illustrate an example of converting an ordinary two-dimensional multilayer image into a three-dimensional image. Fig. 6 shows the coordinate transform used when generating the right-eye and left-eye images from the two-dimensional image of three layers shown in Fig. 7, in the case where the three layers are transformed to positions deeper than the display surface, as shown in Fig. 8. Fig. 6 schematically shows each layer and the display surface viewed from above.
As shown in Fig. 6, the distance E between the left and right eyes and the virtual viewing distance L are presupposed. Using the layer depths D1, D2, and D3 between the virtual display surface 259 and the layers, the three-dimensional image conversion unit 114 performs a projective coordinate transform for the right-eye image and a projective coordinate transform for the left-eye image. With the projective coordinate transform, the pixel position on the virtual display surface 259 of each layer image's objects is calculated.
In this way, the three-dimensional image conversion unit 114 performs the projective coordinate transform with respect to the virtual display surface 259, and the image processing apparatus 100 can convert the ordinary two-dimensional multilayer image into a three-dimensional image.
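The projective coordinate transform of Figs. 6 to 9 follows from similar triangles between an eye, an object point on a layer, and the virtual display surface. The patent gives no explicit formula, so the sketch below is a reconstruction under assumed conventions: the screen sits at depth 0, depth is positive behind the screen, and the eyes sit at ±E/2 at viewing distance L in front of the screen.

```python
def project_to_screen(x: float, depth: float, eye_x: float, view_dist: float) -> float:
    """Project a layer point at horizontal position x and signed depth
    (positive = behind the virtual display surface, negative = in front)
    onto the display surface, as seen from an eye at (eye_x, -view_dist).
    Intersecting the eye-to-object line with the screen plane gives:
        x_screen = (L * x + D * eye_x) / (L + D)
    """
    return (view_dist * x + depth * eye_x) / (view_dist + depth)

def stereo_positions(x: float, depth: float, eye_sep: float, view_dist: float):
    """Return the (left_x, right_x) drawing positions for one object point."""
    left = project_to_screen(x, depth, -eye_sep / 2.0, view_dist)
    right = project_to_screen(x, depth, +eye_sep / 2.0, view_dist)
    return left, right
```

With this convention the disparity right − left equals depth·E/(L + depth): positive for layers behind the virtual display surface, and negative — that is, the left and right drawing positions swap sides, as described for Fig. 9 — for layers placed in front of it.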
In the examples of Figs. 6 to 8, the depth of every layer is set deeper than the virtual display surface 259. However, a layer may also be set to a depth on the front side of the virtual display surface 259.
Figs. 9 and 10 show an example of converting a two-dimensional image into a three-dimensional image when the image of the short-distance-view layer is placed on the front side of the virtual display surface 259'. Fig. 9 schematically shows, viewed from above, the coordinate transform used when generating the right-eye and left-eye images from the two-dimensional image of three layers shown in Fig. 10.
As shown in Fig. 9, when the objects of a layer image on the front side of the virtual display surface 259' are projected onto the virtual display surface 259', the pixel positions of the right-eye and left-eye images are reversed left and right compared with those of a layer deeper than the virtual display surface 259'.
In this case, as shown in Fig. 10, a three-dimensional image is obtained in which the objects e and f placed on the short-distance-view layer appear to protrude in front of the display surface.
1-3. Display example of the editing screen
Next, image generation processing using the editing screen, which is required to generate a three-dimensional image, will be described.
The editing screen is the screen shown in Fig. 2, displayed on the image display unit 140 based on processing in the editing screen generation unit 118; the flowchart of Fig. 11 shows the flow of the processing in the editing screen generation unit 118. The description follows Fig. 11. First, it is determined whether the input unit 130 has received a user operation to display the editing screen (step S11). While no user operation is received, the waiting state is maintained; when a user operation to display the editing screen is received, processing to display the 3D editing screen shown in Fig. 2 is executed (step S12). In this state, all layers are superimposed and shown in the intermediate image under generation displayed on the 3D editing screen. In addition, layer thumbnails corresponding to the number of layers are displayed.
Next, it is determined whether an operation selecting one of the displayed layer thumbnails has been received through the input unit 130 (step S13). When no such operation is received, the waiting state is maintained and the image on the editing screen is not changed.
When an operation selecting one of the layer thumbnails is received, the image of the layer corresponding to the selected layer thumbnail is displayed on the editing screen (step S14). In this case, however, the image of the selected layer is displayed while the display brightness of the images of the other layers is reduced, so that the whole three-dimensional image can still be recognized.
In the state in which the image of a certain layer is displayed in step S14, it is determined whether there is an operation, made by selecting a layer thumbnail, to change the displayed layer to another layer (step S15). When it is determined that such an operation exists, processing to change the layer displayed on the editing screen is executed (step S16), the process returns to step S14, and the image of the changed layer is displayed.
When it is determined in step S15 that there is no operation to change the layer, it is determined whether there is an operation to change the display to the superimposed display of all layers (step S17). When such an operation exists, the display is changed to the superimposed display of all layers, and the process returns to the 3D editing screen display of step S12.
When it is determined in step S17 that there is no operation to change to the superimposed display of all layers, the screen display of the layer selected in step S14 continues.
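The editing-screen flow above can be sketched as a small state transition function. This is an illustrative reading of steps S11 to S17 only; the state names and event strings are hypothetical and do not appear in the patent.

```python
# Hypothetical sketch of the editing-screen loop (steps S11-S17).
# States: "waiting", "overlay_all_layers", "show_layer_<n>".
def editing_screen_state(state, event):
    """Return the next display state for a given user event."""
    if state == "waiting" and event == "show_editing_screen":
        return "overlay_all_layers"                              # S11 -> S12
    if state == "overlay_all_layers" and event.startswith("select_layer_"):
        return event.replace("select_layer_", "show_layer_")     # S13 -> S14
    if state.startswith("show_layer_"):
        if event.startswith("select_layer_"):                    # S15 -> S16 -> S14
            return event.replace("select_layer_", "show_layer_")
        if event == "show_all_layers":                           # S17 -> S12
            return "overlay_all_layers"
    return state  # no matching operation: keep the current display
```

Any other event leaves the display unchanged, matching the "waiting state" branches of the flowchart.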
Next, specific display examples of the editing screen will be described with reference to Fig. 2 and Figs. 12 to 14. Fig. 2 shows an example of the editing screen in a state in which the intermediate images of all layers being generated are superimposed and displayed. In the editing screen, an in-process edited-image display 203 is shown as a relatively large image at the central portion. In the example of Fig. 2, the image obtained when the three layer images 251 to 253 shown in Fig. 3 are superimposed is shown as the edited-image display 203. To illustrate that the image includes the three layer images 251, 252 and 253, three overlapping edge portions, namely a first-layer edge portion 203a, a second-layer edge portion 203b, and a third-layer edge portion 203c, are arranged at the upper edge of the edited-image display 203. Among the three edge portions 203a to 203c, the edge portion of the image corresponding to the selected tab (for example, edge portion 203a) is displayed at the front, and the edge portions of the other images are displayed behind it.
In addition, a plurality of tabs 201, 202 and 211 to 213 for selecting the displayed image are arranged on the upper side of the in-process edited-image display 203 of the editing screen. When a user operation selecting the display position of one of the tabs 201, 202 and 211 to 213 is made by a touch-screen operation or the like, the display is switched to the image assigned to the selected tab.
The images assigned to the respective tabs will be described. The layer thumbnail tabs 211, 212 and 213 are tabs that display the individual layers. Specifically, the first-layer thumbnail tab 211 displays the first-layer image, the second-layer thumbnail tab 212 displays the second-layer image, and the third-layer thumbnail tab 213 displays the third-layer image. On each of the thumbnail tabs 211 to 213, the image of the corresponding layer is reduced and shown as a thumbnail image. Therefore, when a layer image is adjusted, the thumbnail image is adjusted in the same way.
The tab 201 displayed near the layer thumbnail tabs 211, 212 and 213 is a tab for adding a layer image. That is, when the tab 201 is selected, a layer image is newly added and an editing screen for the newly added layer image is displayed.
In addition, the 3D state thumbnail tab 202, arranged toward the right side of the upper side of the in-process edited-image display 203, is a tab that displays the depth state of each layer image. When the 3D state thumbnail tab 202 is selected, the display screen of the depth state shown in Fig. 4 is enlarged and displayed at the position of the edited-image display 203. On the 3D state thumbnail tab 202 itself, the depth-state display screen of Fig. 4 is shown in reduced form. When the depth state is adjusted, the reduced display on the 3D state thumbnail tab 202 changes accordingly.
The periphery of the in-process edited-image display 203 of the editing screen shown in Fig. 2 will now be described. At the left end of the edited-image display 203, a plurality of buttons 221 to 225 are arranged. In this example, a file import button 221, a camera capture button 222, a stamp button 223, a character input button 224, and a depth operation button 225 are prepared.
When a user operation selecting the file import button 221 is made, processing to import an image file prepared in the image storage unit 120 or an external memory is started.
When a user operation selecting the camera capture button 222 is made, processing to capture image data from a camera device connected to the apparatus is started.
For the stamp button 223 and the character input button 224, processing to input a prepared figure or character is started by a user operation on the corresponding button.
When a user operation selecting the depth operation button 225 is made, a depth adjustment screen is displayed in the edited-image display 203. The display processing of the depth adjustment screen will be described below.
As shown in Fig. 2, a generation start button 241, a save button 242, and a 3D display button 243 are displayed at the right end of the edited-image display 203. The generation start button 241 is a button that instructs generation of a new three-dimensional image. The save button 242 is a button that instructs storage of the intermediate image being generated in the image storage unit 120. The 3D display button 243 is a button that switches the image shown in the edited-image display 203 between three-dimensional display and normal display (2D display).
On the lower side of the right end of the edited-image display 203, a paintbrush tool 290 is displayed. The paintbrush tool 290 in this example shows a first paintbrush 291, a second paintbrush 292, a third paintbrush 293, and an eraser 294. When a user operation selecting the display position of one of the paintbrushes 291, 292 and 293 is made, lines can be drawn with the color and line type assigned to that paintbrush. When a user operation selecting the display position of the eraser 294 is made, drawn lines are erased. The paintbrush tool 290 may have functions supporting other drawing or erasing operations.
As shown in Fig. 2, object display sections 231 to 237 are provided at the lower end of the edited-image display 203. In the plurality of object display sections 231 to 237, for example, the seven objects most recently used by the user are displayed. In the example of Fig. 2, the objects of the intermediate image being generated are displayed. Fig. 2 shows an example in which the images of the layers are superimposed and displayed in the edited-image display 203; even when another image is displayed in the edited-image display 203, the same display is performed near the edited-image display 203. In the editing screen of Fig. 2, no tab is specifically provided for selecting the superimposed display in which the layer images shown in Fig. 2 overlap. However, a tab for selecting the superimposed display may be prepared separately from the tabs for the individual layer images. When a tab for selecting the superimposed display of the layer images is provided, a thumbnail image in which the superimposed images of the layers are reduced is shown on that tab. Therefore, when a layer image is adjusted, the reduced thumbnail of the superimposed images changes accordingly.
Figure 12 shows an example in which the third-layer image (the short-distance-view image 253 in Fig. 3) is displayed in the edited-image display 203. When the third-layer thumbnail tab 213 is selected by a user operation, the third-layer image is displayed.
When the third-layer image 253 is displayed, the objects e and f in the third-layer image are shown with their set colors and brightness. The objects of the images of the other layers are superimposed and displayed with reduced display brightness. That is, only the third-layer image is highlighted, and the images of the other layers are grayed out.
In the state shown in Figure 12, the third-layer image can be generated or edited by operations that add objects to, or draw on, the image in the in-process edited-image display 203.
Figure 13 shows an example in which the second-layer image (the middle-distance-view image 252 in Fig. 3) is displayed in the edited-image display 203. When the second-layer thumbnail tab 212 is selected by a user operation, the second-layer image is displayed.
When the second-layer image 252 is displayed, the object d in the second-layer image is shown with its set color and brightness. The objects of the images of the other layers are superimposed and displayed with reduced display brightness.
In the state shown in Figure 13, the second-layer image can be generated or edited by operations that add objects to, or draw on, the image in the in-process edited-image display 203.
Figure 14 shows an example in which the first-layer image (the long-distance-view image 251 in Fig. 3) is displayed in the edited-image display 203. When the first-layer thumbnail tab 211 is selected by a user operation, the first-layer image is displayed.
When the first-layer image 251 is displayed, the objects a, c and g in the first-layer image are shown with their set colors and brightness. The objects of the images of the other layers are superimposed and displayed with reduced display brightness.
In the state shown in Figure 14, the first-layer image can be generated or edited by operations that add objects to, or draw on, the image in the in-process edited-image display 203.
In the display examples of Figures 12 to 14, the image of the selected layer is highlighted and the other layers are grayed out. Alternatively, as the edited image, only the objects of the image of the selected layer may be shown, with the objects of the images of the other layers hidden.
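The "highlight the selected layer, gray out the rest" display can be sketched as a back-to-front compositing pass in which the brightness of every non-selected layer is scaled down. This is a minimal illustration, not the patent's implementation; the dictionary-of-brightnesses representation and the dim factor of 0.4 are assumptions standing in for real layer bitmaps.

```python
def composite_with_dimming(layers, selected_index, dim_factor=0.4):
    """Composite layer object brightnesses back to front, dimming all
    layers except the selected one (as in Figs. 12-14).

    `layers` is a back-to-front list of {object_name: brightness}
    dicts; later (front) layers overwrite earlier ones.
    """
    shown = {}
    for i, layer in enumerate(layers):
        scale = 1.0 if i == selected_index else dim_factor
        for name, brightness in layer.items():
            shown[name] = brightness * scale
    return shown
```

With the third (front) layer selected, only its objects keep full brightness while the first and second layers are grayed out.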
1-4. Display example of the depth adjustment screen
Next, the flow of the depth adjustment processing for each layer image will be described with reference to the flowchart of Figure 15.
The depth adjustment processing is started, for example, when the user selects the depth operation button 225 in the editing screen shown in Fig. 2. That is, as shown in Figure 15, the editing screen generation unit 118 determines whether an operation selecting the depth operation button 225 exists in the state in which the editing screen is displayed (step S21). When no such operation exists, a waiting state is maintained until one occurs.
When it is determined in step S21 that an operation selecting the depth operation button 225 exists, the superimposed display of the images of all layers is performed as the in-process edited-image display 203, and a depth bar is displayed on the upper side of the in-process edited-image display 203 (step S22). The depth bar is a scale that shows the depth of each layer. When the depth bar is configured for three layer images, as in this example, the depth positions of the three layers are shown on the depth bar. On the lower side of the edited-image display 203, depth adjustment buttons are displayed. A specific display example will be described below; in this example, an adjustment button that moves the image depth toward the back and an adjustment button that moves the image depth toward the front are prepared and displayed for each layer.
The editing screen generation unit 118 determines whether a user operation on any adjustment button exists (step S23). When it is determined that no adjustment-button operation exists, the display of step S22 continues. When it is determined that an adjustment-button operation exists, the image of the layer corresponding to the operated button is shown as the in-process edited-image display 203 (step S24). The depth position set for the image of the corresponding layer is changed according to the operation of the adjustment button, and the depth position shown on the depth bar changes accordingly (step S25). After the setting or display is changed in step S25, the display returns to that of step S22. However, the in-process edited-image display 203 may show only the operated layer until the next operation occurs.
Figure 16 is a diagram showing a display example of the depth bar. When an operation of the depth operation button 225 of the editing screen occurs, as shown in Figure 16, the superimposed display of the images of all layers is performed as the in-process edited-image display 203, and a depth bar 401 is displayed on the upper side of the edited-image display 203.
In this example, the depth positions of the images of the three layers are shown on a single depth bar 401. That is, on the depth bar 401, the depth position 401a of the first-layer image, the depth position 401b of the second-layer image, and the depth position 401c of the third-layer image are indicated by different display colors.
On the scale given to the depth bar 401, "0" represents the depth position of the virtual display surface; the back side is indicated by negative values, and the front side is indicated by positive values (the plus sign is not displayed).
On the lower side of the edited-image display 203, buttons for adjusting the depth position are displayed for each layer image. Specifically, a depth adjustment button 411 that moves the depth toward the front and a depth adjustment button 412 that moves the depth toward the back are displayed as the adjustment buttons for the first-layer image. A depth adjustment button 421 that moves the depth toward the front and a depth adjustment button 422 that moves the depth toward the back are displayed as the adjustment buttons for the second-layer image. A depth adjustment button 431 that moves the depth toward the front and a depth adjustment button 432 that moves the depth toward the back are displayed as the adjustment buttons for the third-layer image.
When a user operation selecting the display position of one of the depth adjustment buttons 411 to 432 is made, the depth setting of the corresponding layer image is changed. For example, the depth position of the first-layer image is changed by operating the depth adjustment buttons 411 and 412, and the display of the depth position 401a of the first-layer image on the depth bar 401 moves as shown by arrow La.
Similarly, the depth position of the second-layer image is changed by operating the depth adjustment buttons 421 and 422, and the display of the depth position 401b of the second-layer image on the depth bar 401 moves as shown by arrow Lb.
Likewise, the depth position of the third-layer image is changed by operating the depth adjustment buttons 431 and 432, and the display of the depth position 401c of the third-layer image on the depth bar 401 moves as shown by arrow Lc.
The depth adjustment by the buttons 411 to 432 is limited at positions close to the depth positions of the images of the adjacent layers. For example, the movable range La of the first-layer image extends from the deepest position to a position close to the depth position of the adjacent second-layer image.
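The constraint that a layer cannot be moved past its neighbours can be sketched as a clamp on the requested depth. The sign convention follows the depth bar of Figure 16 (negative behind the virtual display surface, positive in front); the `min_gap` margin is an illustrative assumption for "close to the depth position of the adjacent layer".

```python
def clamp_depth(new_depth, layer_index, depths, min_gap=1):
    """Clamp a requested depth so a layer stays between its neighbours.

    `depths` is ordered from the back layer (index 0) to the front
    layer; negative values lie behind the virtual display surface.
    """
    lo = depths[layer_index - 1] + min_gap if layer_index > 0 else float("-inf")
    hi = (depths[layer_index + 1] - min_gap
          if layer_index < len(depths) - 1 else float("inf"))
    return max(lo, min(new_depth, hi))
```

The backmost layer is unbounded toward the back and the frontmost layer toward the viewer, matching the movable range La described above.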
Figure 17 shows another display example of the depth bar.
In this example, the image of the selected layer (the third-layer image in this example) is highlighted in the edited-image display 203. On the depth bar 501, only the depth position 502 of the highlighted layer is shown. In the example of Figure 17, the entire depth range set by the image processing apparatus 100 is shown on the depth bar 501, together with a position display 503 of the virtual display surface.
In the example of Figure 17, only the depth adjustment buttons 511 and 512 for the image of the selected third layer are displayed as depth adjustment buttons. The depth position of the image of the highlighted layer is adjusted by operating the depth adjustment buttons 511 and 512, and the display position of the depth position 502 on the depth bar 501 changes as shown by arrow Ld.
Figure 18 shows another display example of the depth bar.
In the example of Figure 18, similarly to the example of Figure 17, the image of the selected layer (the second-layer image in this example) is highlighted in the edited-image display 203. On the depth bar 601, only the range over which the depth of this layer image can be adjusted is shown with a scale, and the depth position 602 is displayed on the depth bar 601. That is, the depth of the second-layer image is limited on the back side by the depth position of the first-layer image and on the front side by the depth position of the third-layer image. The range over which the depth can be adjusted within these limits becomes the scale display range of the depth bar 601.
Therefore, when the depth position is adjusted by operating the depth adjustment buttons 611 and 612, the depth position 602 moves within the range of the displayed depth bar 601, as shown by arrow Le.
Figure 19 shows another display example of the depth bar.
In the example of Figure 19, the superimposed display of the images of all layers is performed in the in-process edited-image display 203, and the depth position 702 of the layer image whose depth is being adjusted is shown on the depth bar 701. In addition, the position 703 of the virtual display surface is indicated. Similarly to the example of Figure 16, depth adjustment buttons 711, 712, 721, 722, 731 and 732 are provided for each layer; the depth is adjusted by button operation and the depth position 702 changes, as shown by arrow Lf.
When the depth of a certain layer is adjusted by operating the depth adjustment buttons 711, 712, 721, 722, 731 and 732, the position of the image of the layer subject to the depth adjustment is indicated in the edited-image display 203 by an image frame 704 at the four corners.
At approximately the center of the image, a display 705 of the numerical value indicating the set depth position ("25" in this example) is performed.
In this way, a display from which the depth setting can be recognized directly in the displayed image can be performed.
1-5. Display example of the ground surface setting screen
Next, the flow of the ground surface setting processing of an image will be described with reference to the flowchart of Figure 20.
The ground surface setting processing is started when the user operates a button (not shown) in the editing screen shown in Fig. 2 to instruct setting of the ground surface. That is, as shown in Figure 20, in the state in which the editing screen is displayed, the editing screen generation unit 118 determines whether an operation instructing the setting of the ground surface exists (step S31). When no such operation exists, a waiting state is maintained until the corresponding operation occurs.
When it is determined in step S31 that an operation to set the ground surface exists, the superimposed display of the images of all layers is performed as the in-process edited-image display 203, and a slider bar for horizon adjustment is displayed vertically at one end of the in-process edited-image display 203 (step S32). A slide handle indicating the horizon position is shown on the slider bar for horizon adjustment, and it is determined whether a drag operation of the slide handle exists (step S33).
When it is determined that an operation of the slide handle exists, processing to change the position of the horizon is executed according to the operation (step S34). According to the change of the horizon position, the area below the horizon in the image of the deepest layer (the first layer) is set as a three-dimensionally inclined surface. The setting of the inclined surface is the processing described with reference to Fig. 4, and the inclined surface corresponds to the ground surface.
Next, it is determined whether a mode for removing the portions below the ground surface in the images of the layers other than the first layer is set (step S35). When the mode for removing the portions below the ground surface is set, the portions of objects located below the inclined surface (ground surface) in the images of the layers other than the first layer are removed (step S36).
When it is determined in step S35 that the mode for removing the portions below the ground surface is not set, it is determined whether an object set to be placed on the ground surface exists among the objects of the images of the layers (step S37). When it is determined that such an object exists, the lower end position of the corresponding object is adjusted to the position where the image of the layer containing the object intersects the inclined surface (step S38).
When it is determined in step S33 that no drag operation exists, after the processing of steps S36 and S38 is executed, and when it is determined in step S37 that no object set to be placed on the ground surface exists, the process returns to the horizon slider-bar processing of step S32.
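With the ground surface modelled as an inclined plane running from the horizon line at the back to the bottom of the image at the front, the screen height of the ground line in any layer can be computed by linear interpolation. This is a hypothetical sketch under that linear-plane assumption; the parameter names are illustrative, and y coordinates are measured downward from the top of the image.

```python
def ground_line_y(depth, horizon_y, bottom_y, depth_back, depth_front):
    """Screen y of the inclined ground plane at a given layer depth.

    The plane meets the horizon line at the deepest position and the
    image bottom at the frontmost position (a linear approximation).
    """
    t = (depth - depth_back) / (depth_front - depth_back)  # 0 = back, 1 = front
    return horizon_y + t * (bottom_y - horizon_y)
```

This yields, for each layer depth, the dotted ground-surface position line of the kind shown in Figure 22.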
Next, a specific display example at the time of horizon adjustment will be described.
Figure 21 shows an example in which a horizon adjustment bar 261, serving as the horizon slider bar, is displayed in the in-process edited-image display 203 of the editing screen. On the horizon adjustment bar 261, a ground surface setting position display 262 is shown. In this example, the ground surface setting position display 262 is matched with the ground line c drawn in the first-layer image. This matching is performed by a user operation.
In this way, the area below the horizon set by the horizon adjustment becomes the ground surface of the first-layer image, as the inclined surface shown in Fig. 4. The position of the ground surface can be displayed in the images of the layers other than the first layer (the second-layer and third-layer images).
For example, as shown in Figure 22, a ground surface position display 271, indicating where the second-layer image 252 intersects the ground surface (inclined surface), is shown with a dotted line in the in-process edited-image display 203 of the editing screen. This ground surface position display makes it possible to determine whether the position of an object in the image 252 of this layer (the tree object d in this example) is appropriate. That is, when the lower end of the tree object d almost matches the dotted ground surface position display 271, an appropriate three-dimensional image is obtained. By contrast, when the lower end of the tree object d is above the dotted ground surface position display 271, the tree appears to float; when the lower end of the object d is below the ground surface position display 271, the tree appears to sink into the ground. In both cases an unnatural three-dimensional image results. As shown in Figure 22, performing the ground surface position display 271 is therefore effective in preventing the image from becoming an unnatural three-dimensional image.
Figure 23 shows another display example of the position of the ground surface. In the example of Figure 23, only the portion above the position intersecting the ground surface is shown as the second-layer image, and the portion below the ground surface is shown as a non-display portion 272 (black display portion). In Figure 23, the third-layer image on the front side of the second layer is shown with reduced brightness; however, the objects e and f in the third-layer image may instead be hidden.
Figure 24 shows an example in which objects located below the ground surface, that is, the inclined surface set in the first layer serving as the long-distance view, are removed, and the result is shown as the in-process edited-image display 203 of the editing screen. The processing of Figure 24 corresponds to the processing of step S36 of the flowchart of Figure 20.
In the example of Figure 24, the bottom of the dog object e in the third layer, the short-distance view, lies below the ground surface. In this case, the object e is displayed with the portion of its bottom that lies below the ground surface removed.
In this way, the portions of objects below the ground surface are not displayed, so that the unnatural appearance of an object protruding below the ground surface when the generated image is viewed three-dimensionally can be avoided. The partial removal of an object shown in Figure 24 may be performed only in the three-dimensional image display, and the corresponding object may be shown in full when the image of each layer is displayed separately.
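The removal of the part of an object below the ground line (step S36) amounts to clipping the object's pixel rows at the ground-line height in its layer. The 1-D row model below is an illustrative stand-in for a per-pixel mask, with y increasing downward from the top of the image.

```python
def visible_rows(object_top, object_bottom, ground_y):
    """Rows of an object that remain visible after removing the part
    below the ground line (step S36). `object_top` < `object_bottom`;
    rows at or below `ground_y` are cut off.
    """
    cutoff = min(object_bottom, ground_y)
    return list(range(object_top, cutoff))
```

An object entirely above the ground line is untouched; one entirely below it disappears, as with the dog's feet in Figure 24.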
Figure 25 shows an example of the operation screen used when the lower end of an object in a layer image is matched to the ground surface, that is, the inclined surface, and displayed as the in-process edited-image display 203 of the editing screen. The processing of Figure 25 corresponds to the processing of step S38 of the flowchart of Figure 20.
In the example of Figure 25, an operation screen relating to the dog object e in the third layer, the short-distance view, is displayed. In this example, as shown in Figure 25, position movement buttons 281 and 282, a return button 283, a remove button 284, and a ground surface adjustment button 285 are displayed near the object e.
The user performs an operation to select one of the buttons, and the position of the object e is adjusted. When an operation selecting the ground surface adjustment button 285 occurs, the editing screen generation unit 118 automatically executes processing to match the lower end position of the object e to the position where it intersects the ground surface.
Thus, by simply selecting the ground surface adjustment button 285, the corresponding object e is automatically placed on the ground surface, and an appropriate three-dimensional image can be generated.
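The automatic snap of step S38 can be sketched as a single vertical translation: move the object so that its lower end coincides with the ground line at its x position. The `ground_y_at` callable is a hypothetical helper mapping an x coordinate to the ground-line y in the layer's image; y coordinates increase downward.

```python
def snap_to_ground(object_x, object_height, ground_y_at):
    """Return the new top y of an object so that its lower end sits
    exactly on the ground line at `object_x` (step S38 behaviour)."""
    return ground_y_at(object_x) - object_height
```

With a flat ground line at y = 300 and an object 80 rows tall, the object's top is moved to y = 220 so the tree neither floats nor sinks.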
1-6. Example of capturing a camera image
Figures 26 and 27 show an example in which a camera image is used in the intermediate image being generated.
For example, when an operation selecting the camera capture button 222 in the editing screen shown in Fig. 2 occurs, a camera capture operation screen 810 is displayed in the editing screen, as shown in Figure 26. Through operations on the camera capture operation screen 810, processing to read a camera image from an external camera device (or a storage device storing camera images) connected to the image processing apparatus 100 is executed. As shown in Figure 26, the read image is shown as a camera captured image display 811. Through an operation on the camera capture operation screen 810, an extracted image 812, obtained by removing the background from the camera captured image 811, is obtained.
By placing the extracted image 812 in the image of an arbitrary layer, the extracted image 812 can be placed as one object of the intermediate image being processed, as shown in Figure 27. The user selects the depth position of the layer in which the extracted image is placed using the camera capture operation screen 810. Alternatively, the camera captured image may automatically be placed in the frontmost layer (the short-distance view).
1-7. Example of importing an image file
Figures 28 to 30 show an example in which an image file is imported into the intermediate image being generated.
For example, when an operation selecting the file import button 221 of the editing screen shown in Fig. 2 occurs, an image file import operation screen 820 is displayed in the editing screen, as shown in Figure 28. Through operations on the image file import operation screen 820, processing to read the image data of a selected image file from a designated storage location is executed. As shown in Figure 28, the read data is shown as an imported image 821. As shown in Figure 29, an extracted image 822, obtained by extracting a portion of the imported image 821, is obtained by an operation on the image file import operation screen 820.
By placing the extracted image 822 in the image of an arbitrary layer, the extracted image 822 can be placed as one object of the image being generated, as shown in Figure 30. In this case, the user selects the depth position of the layer in which the extracted image is placed using the image file import operation screen 820.
1-8. Display example of the list of generated images
The three-dimensional images generated using the editing screen in the above-described processing are stored in the image storage unit 120 of the image processing apparatus 100. A list of the stored three-dimensional image data can be displayed on one screen.
Figure 31 shows an example of displaying the list of generated images. In this example, the generated images 11, 12, 13, ... are reduced and displayed. In this case, either a two-dimensional image or a three-dimensional image may be shown in the display of each image.
In line with each generated image, layer-count displays 11a, 12a, 13a, ... indicate the number of layers graphically. In this case, when the number of layers is three, three superimposed image figures are shown to represent the layer count.
By displaying the list of generated images, a generated image can easily be selected. The selected image can be displayed in the editing screen shown in Fig. 2, and editing work can be performed on it.
The specific examples of 1-9. hardware configuration
Next, with reference to Figure 32, describe according to the specific examples of the hardware configuration of the image processing equipment 100 of this example.Figure 32 shows the example that image processing equipment 100 is configured to signal conditioning package (for example computer installation).
CPU901 is as operational processes device and control device, and all or part of controlling in image processing equipment 100 according to each program of storage in ROM903, RAM905, memory storage 919 and removable recording medium 927 operates.Program or operating parameter that ROM903 storage can be used by CPU901.RAM905 is mainly stored in the program used in the execution of CPU901 or the parameter of appropriate change in commission.These devices are connected to each other by the host bus 907 of the internal bus configuration by as cpu bus.
With the voice output of for example display device, for example loudspeaker and the earphone of liquid crystal indicator, plasm display device, EL display device and lamp or such as device, the mobile phone of printing equipment and can visually or with the facsimile recorder of sound notification user institute obtaining information configure output unit 917.The result that each processing that output unit 917 outputs are carried out by image processing equipment 100 is obtained.Especially, display device shows by image processing equipment 100 and carries out the result that each processing is obtained with text or image.Meanwhile, the sound signal that voice output configures the voice data with reproducing or audible data is converted to simulating signal, and exports this simulating signal.
An image capture device 918 is provided, for example, in the display device, and the image processing apparatus 100 can capture still images or moving images of the user with the image capture device 918. The image capture device 918 includes a charge-coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, converts light collected by a lens into an electric signal, and can capture still images or moving images.
An example of a hardware configuration capable of realizing the functions of the image processing apparatus 100 according to this embodiment has been described. Each component may be configured using general-purpose parts, or with hardware dedicated to the function of the individual component. The hardware configuration to be used can therefore be changed as appropriate according to the technical level at the time this embodiment is implemented.
A program (software) for the processing steps executed by the image processing apparatus 100 according to this embodiment can be created and run: the program can be deployed on a general-purpose computer, which can then perform the same processing. The program may be stored on various media, or downloaded from a server to the computer over the Internet.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Note that the following configurations also fall within the scope of the present disclosure.
(1) An image processing apparatus comprising:
an image generation unit that generates a plurality of plane images and sets, for each of the generated plane images, a pseudo-distance in the depth direction;
a three-dimensional image conversion unit that converts the plurality of plane images into a three-dimensional image based on the set pseudo-distances of the plurality of plane images generated by the image generation unit, spatial positions of objects in each of the plurality of plane images being set in the three-dimensional image;
a three-dimensional image generation unit that outputs data of the three-dimensional image converted by the three-dimensional image conversion unit;
an editing screen generation unit that displays the plurality of plane images generated by the image generation unit individually or in an overlapping manner, and generates display data of an editing screen in which a tab is provided for each displayed plane image; and
an input unit that receives operations for generating or editing an image in the editing screen generated by the editing screen generation unit.
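Configuration (1) describes layered plane images, each assigned a pseudo-distance in the depth direction, being converted into a three-dimensional image. The disclosure does not fix the conversion math, so the following is only a minimal sketch under an assumed horizontal-parallax model: each layer is shifted between the two eye images by an amount that shrinks as its pseudo-distance grows. The `PlaneImage` class, the `scale` constant, and the shift formula are illustrative assumptions, not the patented method.

```python
from dataclasses import dataclass

@dataclass
class PlaneImage:
    pixels: list            # rows of (r, g, b, a) tuples; a == 0 means transparent
    pseudo_distance: float  # user-set distance in the depth direction

def parallax_shift(distance, eye, scale=4.0):
    """Horizontal shift in pixels for one eye (-1 = left, +1 = right).
    Farther layers (larger pseudo-distance) shift less."""
    return int(eye * scale / max(distance, 1.0))

def compose_eye(layers, width, height, eye):
    """Paint layers back to front, each offset by its own parallax,
    producing one image of a stereo pair."""
    out = [[(0, 0, 0, 0)] * width for _ in range(height)]
    for layer in sorted(layers, key=lambda l: -l.pseudo_distance):
        dx = parallax_shift(layer.pseudo_distance, eye)
        for y in range(height):
            for x in range(width):
                sx = x - dx  # source column for this output column
                if 0 <= sx < width and layer.pixels[y][sx][3] != 0:
                    out[y][x] = layer.pixels[y][sx]  # nearer layers overwrite
    return out
```

Painting in order of decreasing pseudo-distance makes nearer layers overwrite farther ones, which matches the occlusion behavior one would expect from the spatial positions set in the three-dimensional image.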
(2) The image processing apparatus according to (1),
wherein the editing screen generated by the editing screen generation unit includes tabs displaying a thumbnail image for each plane image, each thumbnail image being a reduced version of the plane image corresponding to that tab, and a selection operation on any tab through the input unit causes the plane image corresponding to that tab to be displayed on the editing screen.
(3) The image processing apparatus according to (1) or (2),
wherein the editing screen generated by the editing screen generation unit includes a tab displaying a thumbnail image in which the plurality of plane images, arranged at their pseudo-distances, are shown in reduced form.
(4) The image processing apparatus according to any one of (1) to (3),
wherein a distance scale showing the pseudo-distance settings of the plane images, and operation sections for operating the pseudo-distances of the plane images through the input unit, are displayed on the editing screen generated by the editing screen generation unit.
(5) The image processing apparatus according to any one of (1) to (4),
wherein the distance settings of the plurality of plane images are displayed in a distinguishable manner under the distance scale, and an operation section is prepared for each plane image.
(6) The image processing apparatus according to any one of (1) to (5),
wherein a display indicating the position of a virtual display surface is performed under the distance scale.
(7) The image processing apparatus according to any one of (1) to (6),
wherein the three-dimensional image conversion unit converts the plurality of plane images into a three-dimensional image in which the image portion below a horizon position set for a specific plane image among the plurality of plane images becomes an inclined surface whose depth changes gradually according to the pseudo-distances.
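Configuration (7) turns the region below the horizon of the specific plane image into an inclined surface whose depth changes gradually, and (9) to (11) then use the row at which another layer's pseudo-distance meets that surface. The disclosure only says the depth "changes gradually"; the sketch below assumes a linear interpolation from the layer's pseudo-distance at the horizon down to a near distance of zero at the bottom edge, which is one simple way to realize such an incline.

```python
def ground_depth(y, horizon_y, image_h, far_distance, near_distance=0.0):
    """Depth of a ground-surface pixel row: the layer's pseudo-distance
    at the horizon, coming linearly forward to near_distance at the
    bottom edge, so the rows below the horizon form an inclined surface."""
    if y <= horizon_y:
        raise ValueError("row is above the horizon, not part of the ground")
    t = (y - horizon_y) / (image_h - 1 - horizon_y)  # 0 at horizon, 1 at bottom
    return far_distance + t * (near_distance - far_distance)

def intersection_row(layer_distance, horizon_y, image_h, far_distance):
    """Row where a layer at layer_distance meets the inclined surface;
    objects below this row would sit under the ground (cf. (9) and (10))."""
    t = (far_distance - layer_distance) / far_distance  # inverse of ground_depth with near = 0
    return horizon_y + t * (image_h - 1 - horizon_y)
```

With this model, removing the part of another layer below `intersection_row(...)`, or snapping an object's lower end to that row, corresponds to the behaviors described in (10) and (11).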
(8) The image processing apparatus according to any one of (1) to (7),
wherein a horizon position scale showing the setting of the horizon position is displayed on the editing screen generated by the editing screen generation unit, and the horizon position shown on the horizon position scale is changed by an operation received through the input unit.
(9) The image processing apparatus according to any one of (1) to (8),
wherein, for the plane images other than the specific plane image on the editing screen generated by the editing screen generation unit, the positions at which they intersect the inclined surface in the three-dimensional image are displayed.
(10) The image processing apparatus according to any one of (1) to (9),
wherein, for the plane images other than the specific plane image, the three-dimensional image conversion unit removes objects that fall below the position at which the plane image intersects the inclined surface in the three-dimensional image.
(11) The image processing apparatus according to any one of (1) to (10),
wherein the image generation unit sets the lower end of a designated object in a plane image other than the specific plane image so as to match the position at which that plane image intersects the inclined surface in the three-dimensional image.
(12) An image processing apparatus comprising:
an image control unit that controls an image displayed on a display unit;
a three-dimensional image generation unit that generates a three-dimensional image in which spatial positions of objects of a plurality of plane images are respectively set according to pseudo-distances of the plurality of plane images in the depth direction; and
an input unit that receives operations from a user;
wherein the image control unit displays reduced thumbnail images of the plurality of plane images and of a superimposed image in which the plane images are displayed overlapping one another at a predetermined angle in the depth direction, and, when the input unit receives an order selecting a thumbnail image, the image control unit displays the plane image or superimposed image corresponding to the selected thumbnail image in an editable state, together with the thumbnail images of the plurality of plane images and the superimposed image.
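Configuration (12) describes a tabbed editor: one thumbnail per plane image plus one for the superimposed view, where selecting a thumbnail switches which view is editable while every thumbnail remains visible. A toy state model of that behavior is sketched below; the class name `EditorState` and the tab label `"superimposed"` are illustrative choices, not terms from the disclosure.

```python
class EditorState:
    """Minimal model of the tab behavior described in (12)."""

    def __init__(self, layer_names):
        # one tab per plane image, plus one for the superimposed view
        self.tabs = list(layer_names) + ["superimposed"]
        self.editable = "superimposed"  # start on the overlay view

    def select(self, tab):
        """Selecting any thumbnail makes that view the editable one."""
        if tab not in self.tabs:
            raise ValueError(f"unknown tab: {tab}")
        self.editable = tab

    def visible_tabs(self):
        # all thumbnails remain displayed alongside the editable view
        return list(self.tabs)
```

The key property, captured by `visible_tabs`, is that switching the editable view never hides the other thumbnails, matching the "together with the thumbnail images" wording.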
(13) The image processing apparatus according to (12),
wherein the image control unit displays the plane images and the superimposed image in an overlapping manner, and displays the thumbnail images respectively as tabs for the plurality of plane images and the superimposed image.
(14) The image processing apparatus according to (12) or (13),
wherein the image control unit also displays a tab for receiving the addition of a plane image.
(15) The image processing apparatus according to any one of (12) to (14),
wherein, when the thumbnail image corresponding to the superimposed image is selected, the image control unit displays the superimposed image and shows on the front surface only the tabs of the unselected plane images, and
the input unit receives an input for changing, within the superimposed image, the pseudo-distance in the depth direction of a plane image selected from among the displayed plane images.
(16) The image processing apparatus according to any one of (12) to (15),
wherein, when the thumbnail image corresponding to a plane image is selected, the image control unit displays a screen in which only the image of that plane image is highlighted on the front surface.
(17) The image processing apparatus according to any one of (12) to (16),
wherein, when the thumbnail image corresponding to a plane image is selected, the image control unit displays a screen in which the images of the unselected plane images are grayed out on the front surface.
(18) The image processing apparatus according to any one of (12) to (17),
wherein the image control unit displays a distance scale showing the pseudo-distances of the plane images in the depth direction, and displays a depth change operation input section for operating, through the input unit, the pseudo-distance in the depth direction of the plane image displayed in the editable state, and
while a plane image is displayed in the editable state, the input unit accepts operations for generating or editing the image in the plane image, and accepts operations on the depth change operation input section.
(19) The image processing apparatus according to any one of (12) to (18),
wherein the depth change operation input section has a button for moving the pseudo-distance in the depth direction of the plane image displayed in the editable state toward the front side, and a button for moving the pseudo-distance toward the inner side.
(20) The image processing apparatus according to any one of (12) to (19),
wherein the depth change operation input section is an object on the distance scale corresponding to the plane image displayed in the editable state.
(21) The image processing apparatus according to any one of (12) to (20),
wherein the image control unit displays a horizon change operation input section for operating, through the input unit, the horizon position set in the plane image displayed in the editable state, and
while a plane image is displayed in the editable state, the input unit accepts operations on the horizon change operation input section.
(22) The image processing apparatus according to any one of (12) to (21),
wherein the image control unit does not display the image below the horizon position set in the plane image displayed in the editable state.
(23) The image processing apparatus according to any one of (12) to (22),
wherein the three-dimensional image generation unit does not reflect the image below the horizon position set in each plane image into the generated three-dimensional image.
(24) The image processing apparatus according to any one of (12) to (23),
wherein, when the plane image corresponding to the selected thumbnail image is edited, the image control unit performs control to reflect the edited result in the thumbnail image of the superimposed image.
(25) The image processing apparatus according to any one of (12) to (24), further comprising:
a display unit.
(26) An image processing method comprising:
generating a plurality of plane images, and setting a pseudo-distance in the depth direction for each of the generated plane images;
converting the plurality of plane images into a three-dimensional image based on the set pseudo-distances of the generated plane images, spatial positions of objects in each of the plurality of plane images being set in the three-dimensional image;
outputting data of the converted three-dimensional image;
displaying the generated plane images individually or in an overlapping manner, and generating display data of an editing screen in which a tab is provided for each displayed plane image; and
accepting operations for generating or editing an image in the generated editing screen.
(27) A program that causes a computer to execute image processing, the program causing the computer to execute:
generating a plurality of plane images, and setting a pseudo-distance in the depth direction for each of the generated plane images;
converting the plurality of plane images into a three-dimensional image based on the set pseudo-distances of the generated plane images, spatial positions of objects in the plurality of plane images being set in the three-dimensional image;
outputting data of the converted three-dimensional image;
displaying the generated plane images individually or in an overlapping manner, and generating display data of an editing screen in which a tab is provided for each displayed plane image; and
accepting operations for generating or editing an image in the generated editing screen.
List of Reference Numerals
a to f display objects
11, 12, 13 generated images
11a, 12a, 13a layer-count displays
100 image processing apparatus
110 image generation/processing unit
112 image generation unit
114 three-dimensional image conversion unit
116 three-dimensional image generation unit
118 editing screen generation unit
120 image storage unit
130 input unit
140 image display unit
201 tab
202 3D state thumbnail tab
203 image display during editing
203a first-layer edge position
203b second-layer edge position
203c third-layer edge position
211 to 213 layer thumbnail tabs
221 file import button
222 camera capture button
223 stamp button
224 character import button
225 depth operation button
231 to 237 object display sections
241 generation start button
242 save button
243 three-dimensional display button
250L left-eye image
250R right-eye image
251, 251' first-layer image
252, 252' second-layer image
253, 253' third-layer image
259, 259' display surface
261 horizon adjustment bar
262 ground surface set position display
271 in-layer ground surface position display
272 non-display portion
281, 282 position move buttons
283 return button
284 remove button
285 ground surface adjustment button
290 brush tool
291 first brush
292 second brush
293 third brush
294 eraser
301 depth axis
301a to 301d depth positions
302 front edge position
303 ground surface set position
304 virtual display surface
311 first-layer image
312 second-layer image
313 third-layer image
321 horizon position
401 depth bar display
401a, 401b, 401c layer positions
411, 412, 421, 422, 431, 432 depth adjustment buttons
501 depth bar display
502 virtual display surface position
503 layer position
511, 512 depth adjustment buttons
601 depth bar display
603 layer position
611, 612 depth adjustment buttons
701 depth bar display
702 virtual display surface position
703 layer position
704 depth frame
705 depth value display
711, 712, 721, 722, 731, 732 depth adjustment buttons
810 camera capture function screen
811 camera-captured image
812 extracted image
820 image file import operation screen
821 imported image
822 extracted image
Claims (27)
1. An image processing apparatus comprising:
an image generation unit that generates a plurality of plane images and sets, for each of the generated plane images, a pseudo-distance in the depth direction;
a three-dimensional image conversion unit that converts the plurality of plane images into a three-dimensional image based on the set pseudo-distances of the plurality of plane images generated by the image generation unit, spatial positions of objects in each of the plurality of plane images being set in the three-dimensional image;
a three-dimensional image generation unit that outputs data of the three-dimensional image converted by the three-dimensional image conversion unit;
an editing screen generation unit that displays the plurality of plane images generated by the image generation unit individually or in an overlapping manner, and generates display data of an editing screen in which a tab is provided for each displayed plane image; and
an input unit that receives operations for generating or editing an image in the editing screen generated by the editing screen generation unit.
2. The image processing apparatus according to claim 1,
wherein the editing screen generated by the editing screen generation unit includes tabs displaying a thumbnail image for each plane image, each thumbnail image being a reduced version of the plane image corresponding to that tab, and
a selection operation on any tab through the input unit causes the plane image corresponding to that tab to be displayed on the editing screen.
3. The image processing apparatus according to claim 2,
wherein the editing screen generated by the editing screen generation unit includes a tab displaying a thumbnail image in which the plurality of plane images, arranged at their pseudo-distances, are shown in reduced form.
4. The image processing apparatus according to claim 1,
wherein a distance scale showing the pseudo-distance settings of the plane images, and operation sections for operating the pseudo-distances of the plane images through the input unit, are displayed on the editing screen generated by the editing screen generation unit.
5. The image processing apparatus according to claim 4,
wherein the distance settings of the plurality of plane images are displayed in a distinguishable manner under the distance scale, and an operation section is prepared for each plane image.
6. The image processing apparatus according to claim 4,
wherein a display indicating the position of a virtual display surface is performed under the distance scale.
7. The image processing apparatus according to claim 1,
wherein the three-dimensional image conversion unit converts the plurality of plane images into a three-dimensional image in which the image portion below a horizon position set for a specific plane image among the plurality of plane images becomes an inclined surface whose depth changes gradually according to the pseudo-distances.
8. The image processing apparatus according to claim 7,
wherein a horizon position scale showing the setting of the horizon position is displayed on the editing screen generated by the editing screen generation unit, and the horizon position shown on the horizon position scale is changed by an operation received through the input unit.
9. The image processing apparatus according to claim 8,
wherein, for the plane images other than the specific plane image on the editing screen generated by the editing screen generation unit, the positions at which they intersect the inclined surface in the three-dimensional image are displayed.
10. The image processing apparatus according to claim 8,
wherein, for the plane images other than the specific plane image, the three-dimensional image conversion unit removes objects that fall below the position at which the plane image intersects the inclined surface in the three-dimensional image.
11. The image processing apparatus according to claim 8,
wherein the image generation unit sets the lower end of a designated object in a plane image other than the specific plane image so as to match the position at which that plane image intersects the inclined surface in the three-dimensional image.
12. An image processing apparatus comprising:
an image control unit that controls an image displayed on a display unit;
a three-dimensional image generation unit that generates a three-dimensional image in which spatial positions of objects of a plurality of plane images are respectively set according to pseudo-distances of the plurality of plane images in the depth direction; and
an input unit that receives operations from a user;
wherein the image control unit displays reduced thumbnail images of the plurality of plane images and of a superimposed image in which the plane images are displayed overlapping one another at a predetermined angle in the depth direction, and,
when the input unit receives an order selecting a thumbnail image, the image control unit displays the plane image or superimposed image corresponding to the selected thumbnail image in an editable state, together with the thumbnail images of the plurality of plane images and the superimposed image.
13. The image processing apparatus according to claim 12,
wherein the image control unit displays the plane images and the superimposed image in an overlapping manner, and displays the thumbnail images respectively as tabs for the plurality of plane images and the superimposed image.
14. The image processing apparatus according to claim 13,
wherein the image control unit also displays a tab for receiving the addition of a plane image.
15. The image processing apparatus according to claim 12,
wherein, when the thumbnail image corresponding to the superimposed image is selected, the image control unit displays the superimposed image and shows on the front surface only the tabs of the unselected plane images, and
the input unit receives an input for changing, within the superimposed image, the pseudo-distance in the depth direction of a plane image selected from among the displayed plane images.
16. The image processing apparatus according to claim 12,
wherein, when the thumbnail image corresponding to a plane image is selected, the image control unit displays a screen in which only the image of that plane image is highlighted on the front surface.
17. The image processing apparatus according to claim 12,
wherein, when the thumbnail image corresponding to a plane image is selected, the image control unit displays a screen in which the images of the unselected plane images are grayed out on the front surface.
18. The image processing apparatus according to claim 12,
wherein the image control unit displays a distance scale showing the pseudo-distances of the plane images in the depth direction, and displays a depth change operation input section for operating, through the input unit, the pseudo-distance in the depth direction of the plane image displayed in the editable state, and
while a plane image is displayed in the editable state, the input unit accepts operations for generating or editing the image in the plane image, and accepts operations on the depth change operation input section.
19. The image processing apparatus according to claim 18,
wherein the depth change operation input section has a button for moving the pseudo-distance in the depth direction of the plane image displayed in the editable state toward the front side, and a button for moving the pseudo-distance toward the inner side.
20. The image processing apparatus according to claim 18,
wherein the depth change operation input section is an object on the distance scale corresponding to the plane image displayed in the editable state.
21. The image processing apparatus according to claim 12,
wherein the image control unit displays a horizon change operation input section for operating, through the input unit, the horizon position set in the plane image displayed in the editable state, and
while a plane image is displayed in the editable state, the input unit accepts operations on the horizon change operation input section.
22. The image processing apparatus according to claim 21,
wherein the image control unit does not display the image below the horizon position set in the plane image displayed in the editable state.
23. The image processing apparatus according to claim 21,
wherein the three-dimensional image generation unit does not reflect the image below the horizon position set in each plane image into the generated three-dimensional image.
24. The image processing apparatus according to claim 13,
wherein, when the plane image corresponding to the selected thumbnail image is edited, the image control unit performs control to reflect the edited result in the thumbnail image of the superimposed image.
25. The image processing apparatus according to claim 12, further comprising:
a display unit.
26. An image processing method comprising:
generating a plurality of plane images, and setting a pseudo-distance in the depth direction for each of the generated plane images;
converting the plurality of plane images into a three-dimensional image based on the set pseudo-distances of the generated plane images, spatial positions of objects in each of the plurality of plane images being set in the three-dimensional image;
outputting data of the converted three-dimensional image;
displaying the generated plane images individually or in an overlapping manner, and generating display data of an editing screen in which a tab is provided for each displayed plane image; and
accepting operations for generating or editing an image in the generated editing screen.
27. A program that causes a computer to execute image processing, the program causing the computer to execute:
generating a plurality of plane images, and setting a pseudo-distance in the depth direction for each of the generated plane images;
converting the plurality of plane images into a three-dimensional image based on the set pseudo-distances of the generated plane images, spatial positions of objects in the plurality of plane images being set in the three-dimensional image;
outputting data of the converted three-dimensional image;
displaying the generated plane images individually or in an overlapping manner, and generating display data of an editing screen in which a tab is provided for each displayed plane image; and
accepting operations for generating or editing an image in the generated editing screen.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-126792 | 2011-06-06 | ||
JP2011126792A JP2012094111A (en) | 2010-09-29 | 2011-06-06 | Image processing device, image processing method and program |
PCT/JP2012/001012 WO2012169097A1 (en) | 2011-06-06 | 2012-02-16 | Image processing apparatus, image processing method, and program |
Publications (1)
Publication Number | Publication Date
---|---
CN103582903A | 2014-02-12
Family
ID=47296843
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201280026377.3A Pending CN103582903A (en) | 2011-06-06 | 2012-02-16 | Image processing apparatus, image processing method, and program |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2718905A1 (en) |
CN (1) | CN103582903A (en) |
WO (1) | WO2012169097A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11989398B2 (en) * | 2021-10-22 | 2024-05-21 | Ebay Inc. | Digital content view control system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08227464A (en) * | 1994-12-21 | 1996-09-03 | Sanyo Electric Co Ltd | Method for generating dummy three-dimensional dynamic image |
US20020018065A1 (en) * | 2000-07-11 | 2002-02-14 | Hiroaki Tobita | Image editing system and method, image processing system and method, and recording media therefor |
CN1679346A (en) * | 2002-08-29 | 2005-10-05 | 夏普株式会社 | Device capable of easily creating and editing a content which can be viewed in three-dimensional way |
WO2010062623A2 (en) * | 2008-10-27 | 2010-06-03 | Microsoft Corporation | Child window surfacing and management |
CN102053786A (en) * | 2009-10-30 | 2011-05-11 | 索尼公司 | Information processing device, image display method, and computer program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4449183B2 (en) * | 2000-07-11 | 2010-04-14 | ソニー株式会社 | Image editing system, image editing method, and storage medium |
JP2006293598A (en) * | 2005-04-08 | 2006-10-26 | Canon Inc | Document processing system |
JP2009054018A (en) * | 2007-08-28 | 2009-03-12 | Ricoh Co Ltd | Image retrieving device, image retrieving method, and program |
Application events (2012):
- 2012-02-16: CN application CN201280026377.3A filed (publication CN103582903A, status pending)
- 2012-02-16: PCT application PCT/JP2012/001012 filed (publication WO2012169097A1)
- 2012-02-16: EP application EP12796143.1A filed (publication EP2718905A1, subsequently withdrawn)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108885377A (en) * | 2018-06-14 | 2018-11-23 | 京东方科技集团股份有限公司 | Show equipment and its driving method |
US11126027B2 (en) | 2018-06-14 | 2021-09-21 | Boe Technology Group Co., Ltd. | Display apparatus and driving method thereof |
CN108885377B (en) * | 2018-06-14 | 2021-12-24 | 京东方科技集团股份有限公司 | Display device and driving method thereof |
Also Published As
Publication number | Publication date |
---|---|
EP2718905A1 (en) | 2014-04-16 |
WO2012169097A1 (en) | 2012-12-13 |
Legal Events
Code | Title | Description
---|---|---
C06 | Publication |
PB01 | Publication |
C10 | Entry into substantive examination |
SE01 | Entry into force of request for substantive examination |
C02 | Deemed withdrawal of patent application after publication (patent law 2001) |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 2014-02-12