JP2003250039A - Image processing apparatus, image processing method, and recording medium - Google Patents

Image processing apparatus, image processing method, and recording medium

Info

Publication number
JP2003250039A
JP2003250039A (application JP2002046972A)
Authority
JP
Japan
Prior art keywords
image
area
region
scale
outer edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2002046972A
Other languages
Japanese (ja)
Inventor
Akira Mimori
Tsuneyo Sano
Hikari Takigasaki
明 三森
常世 佐野
光 瀧ヶ崎
Original Assignee
Tokyo Electric Power Co Inc:The
東京電力株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tokyo Electric Power Co Inc:The, 東京電力株式会社 filed Critical Tokyo Electric Power Co Inc:The
Priority to JP2002046972A priority Critical patent/JP2003250039A/en
Publication of JP2003250039A publication Critical patent/JP2003250039A/en
Application status: Pending

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/403Edge-driven scaling

Abstract

(57) [Summary] [Problem] To provide an image processing apparatus, an image processing method, and a recording medium in which, even when a part of an original image is enlarged, the display information of the original image is not erased from the image after the enlargement processing. [Solution] An image processing unit (13) comprises: enlarging/deforming means for enlarging the image of a region R1 specified by an operation unit (12) so as to correspond to a region R2 containing the region R1, and for deforming the image of a region R3, whose outer edge contour contains the region R2 and whose inner edge contour is the contour of the region R1, so as to correspond to a region R4 whose outer edge contour is the outer edge contour of the region R3 and whose inner edge contour is the contour of the region R2; and image synthesizing means for synthesizing the image enlarged in correspondence with the region R2, the image deformed in correspondence with the region R4, and the image of the area of the original image G5 other than the regions R2 and R4.

Description

Description: BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image processing apparatus, an image processing method, and a recording medium in which, even when a part of an original image is enlarged, the display information of the original image is not erased from the image after the enlargement processing.

2. Description of the Related Art

In recent years, with the spread of computers, images are frequently displayed on displays. In an application having an image display function, or in an image creation/editing application, a user may reduce an image to display the whole of it or, conversely, enlarge an image to display a part of it in detail. For example, when a map is displayed, a reduced view is used to obtain information about a wide area, and an enlarged view is used to obtain information about a narrow area; normally, the magnification is switched so that the display appears at the desired scale.

[0004] When a map displayed on a screen is switched from a reduced view to an enlarged view, much of the wide-area information shown on the reduced view becomes hidden. To see the hidden image information again, it is necessary either to return from the enlarged view to the reduced view, or to scroll the enlarged view until the desired portion is displayed. Besides ordinary maps, when various system diagrams, flow charts, design drawings, and the like are viewed on a computer, the display is likewise switched between enlarged and reduced views.

Alternatively, images of different scales can be displayed simultaneously in different areas of the screen. In this case, however, two display areas must be secured on one screen, so each display area must be made smaller.

There is also a known technique in which part of the area where an image is displayed is specified, and the specified area is enlarged and pasted onto the original image for display. In this technique, as shown in FIG. 1(a), suppose that a region R1 (indicated by a circle in FIG. 1(a)) is designated within the area where the original image G5 is displayed. The image G1 of the designated region R1 is clipped and, as shown in FIG. 1(b), enlarged to produce an enlarged image G2.
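The clipping-and-enlargement step just described can be sketched as follows. This is a minimal illustration, not from the patent: the nearest-neighbor sampling, the integer factor, and the function name are my assumptions.

```python
# Minimal sketch of producing the enlarged image G2 from the clipped image G1:
# nearest-neighbor enlargement of a small raster image by an integer factor s.
def enlarge_nn(g1, s):
    """Enlarge each pixel of g1 into an s-by-s block."""
    return [[row[x // s] for x in range(len(row) * s)]
            for row in g1 for _ in range(s)]

g1 = [[1, 2],
      [3, 4]]
for row in enlarge_nn(g1, 2):
    print(row)  # each source pixel becomes a 2x2 block
```

The enlarged result occupies s times the width and height of the clip, which is precisely why pasting it back over the original hides the surrounding content.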
This enlarged image G2 is then pasted onto the original image G5. However, when the enlarged image G2 is pasted directly onto the original image G5 as shown in FIG. 1(c), the part of the original image G5 in the region R3 sandwiched between the outer edge contour of the region R1 and the outer edge contour of the region R2 (the hatched area in FIG. 1(d)) is erased.

SUMMARY OF THE INVENTION

The present invention has been made in view of the above problem, and its object is to provide an image processing apparatus, an image processing method, and a recording medium capable of displaying an enlarged image of an area of an original image without losing the information of the original image.

To achieve the above object, an image processing apparatus according to the present invention comprises: an image display unit for displaying an original image; an operation unit having an interface (pointing means such as a mouse or a stylus pen) for designating a partial region of the image displayed on the image display unit; and an image processing unit for enlarging the region designated by the operation unit and displaying it on the image display unit. The image processing unit comprises: enlarging/deforming means for enlarging the image of a first area specified by the operation unit so as to correspond to a second area containing the first area, and for deforming the image of a third area, whose outer edge contour contains the second area and whose inner edge contour is the contour of the first area, so as to correspond to a fourth area whose outer edge contour is the outer edge contour of the third area and whose inner edge contour is the contour of the second area; and image synthesizing means for synthesizing the image enlarged in correspondence with the second area, the image deformed in correspondence with the fourth area, and the image of the area of the original image excluding the second and fourth areas.

For example, in a guide map, by enlarging only the vicinity of the destination without losing the peripheral information, it is easy to follow both the overall rough route and the detailed route near the destination. In an ordinary image, partial emphasis is facilitated by enlarging only a part. Even in text data, small characters can be made easier to read by partial enlargement.

In the image processing apparatus according to the present invention, the enlarging/deforming means may deform the image of the third area so that the scale and/or scale change rate of the image deformed in correspondence with the fourth area is continuous, at the outer edge contour, with the scale and/or scale change rate of the original image and, at the inner edge contour, with the scale and/or scale change rate of the image enlarged in correspondence with the second area. As a result, the image changes smoothly across the boundary between the second area and the fourth area and across the boundary between the fourth area and the area not subjected to the enlargement/deformation processing, so the resulting image does not become difficult to see.

The image processing method of the present invention comprises: a step of enlarging the image of a first area designated by an operator so as to correspond to a second area containing the first area; a step of deforming the image of a third area, whose outer edge contour contains the second area and whose inner edge contour is the contour of the first area, so as to correspond to a fourth area whose outer edge contour is the outer edge contour of the third area and whose inner edge contour is the contour of the second area; and a step of combining the image enlarged in correspondence with the second area, the image deformed in correspondence with the fourth area, and the image of a fifth area of the original image excluding the second and fourth areas. In the deforming step, the image of the third area may be deformed so that the scale and/or scale change rate of the image deformed in correspondence with the fourth area is continuous, at the outer edge contour, with the scale and/or scale change rate of the original image and, at the inner edge contour, with the scale and/or scale change rate of the image enlarged in correspondence with the second area.

The recording medium of the present invention stores a program for executing each of the above processes. The program can be applied to a computer application such as a database, or incorporated as a plug-in of, for example, a WWW browser.

In each of the above inventions, "scale and/or scale change rate" means "at least one of scale and scale change rate". Thus, "the scale and/or scale change rate of the image deformed in correspondence with the fourth area is continuous with the scale and/or scale change rate of the original image at the outer edge contour" means "the scale of the deformed image is continuous with the scale of the original image at the outer edge contour" and/or "the scale change rate of the deformed image is continuous with the scale change rate of the original image at the outer edge contour"; that is, scales correspond to scales and scale change rates to scale change rates. The same applies to the inner edge contour. However, between the outer edge contour and the inner edge contour,
it is not always necessary to make the same choice between "scale" and "scale change rate"; for example, the scale may be made continuous at the outer edge contour while the scale change rate is made continuous at the inner edge contour (or vice versa). Of course, it is preferable to unify them when performing the arithmetic processing.

FIG. 2 shows a map database system using the image processing apparatus of the present invention. In FIG. 2, the image processing apparatus 1 comprises a display unit 11, an operation unit 12, an image processing unit 13, a memory 14 (a ROM 14a and a RAM 14b), and a mass storage device 15. The display unit 11 displays the original image (a map). The operation unit 12 is used to specify a predetermined area of the map in the present embodiment. The image processing unit 13 can clip a predetermined area of the original image and perform enlargement processing and deformation processing on the clipped image. The ROM 14a stores image processing programs such as a map creation processing program and a scale conversion processing program. The RAM 14b is used as a temporary storage area and a work area for various calculation results. The mass storage device 15 stores a large number of maps in the form of a database (DB).

The basic operation of the image processing apparatus 1 shown in FIG. 2 will be described with reference to the flowchart of FIG. 3 and the explanatory diagrams of FIGS. 4 and 5. First, the image processing unit 13 reads the data of the original image G5 to be processed from the mass storage device 15 into the RAM 14b and displays the original image G5 on the display unit 11. The image processing unit 13 then acquires the information (image G1) of the region R1 (see FIG. 4(a-1)) to be enlarged in the original image G5, as specified by the user from the operation unit 12 (ST101). The user can use an input device such as a mouse or a pen for this input. The image processing unit 13 acquires the enlargement ratio information input by the user from the operation unit 12 and stores it in the RAM 14b (ST102). Further, from the information on the specified region R1 and the enlargement ratio specified by the user, the image processing unit 13 obtains the region R2 that results when the region R1 is enlarged, and stores this region information in a predetermined area of the RAM 14b. Next, the image processing unit 13 specifies the region R3, whose outer edge contour is a contour containing the outer edge contour of the region R2 and whose inner edge contour is the contour of the region R1 (see FIG. 1(d)), and the region R4, whose outer edge contour is the outer edge contour of the region R3 and whose inner edge contour is the contour of the region R2 (ST103: see FIG. 4(b-1)).
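For the similar-ellipse embodiment described later, the regions fixed in ST103 can be sketched as follows. This is a hypothetical illustration: the function names and the classification of a source pixel into R1, R3, or the unchanged surround are mine, with ellipse A bounding R1, B = s·A bounding R2, and C = t·A as the shared outer edge of R3 and R4, and the ellipse assumed axis-aligned for simplicity.

```python
import math

def ellipse_param(x, y, ox, oy, p, q):
    """Return u such that (x, y) lies on the copy of ellipse A scaled by u."""
    return math.hypot((x - ox) / p, (y - oy) / q)

def classify_source(x, y, ox, oy, p, q, t):
    """Classify a pixel of the original image before remapping."""
    u = ellipse_param(x, y, ox, oy, p, q)
    if u <= 1.0:
        return "R1"        # inside ellipse A: enlarged onto R2
    if u <= t:
        return "R3"        # between A and C: deformed onto R4
    return "unchanged"     # outside C: left as in the original image

# Ellipse A centred at the origin with p = 2, q = 1; outer ellipse C = 3*A.
print(classify_source(1.0, 0.0, 0.0, 0.0, 2.0, 1.0, 3.0))  # "R1"
print(classify_source(3.0, 0.0, 0.0, 0.0, 2.0, 1.0, 3.0))  # "R3"
```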
Here, the size of the outer edge contour of the region R4 may be set in a fixed manner (for example, to a predetermined multiple of the size of the region R2), or the user may designate a setting value from the operation unit 12. Thereafter, the image processing unit 13 acquires the data of all figures, characters, symbols, and the like on the original image G5 (here, on the mesh containing the region R2 and the region R4) (ST104). In the case of a power transmission and distribution system diagram, for example, the data acquired here includes the management numbers, constituent coordinates, types, and thicknesses of figures such as electric wires and cables; the management numbers, constituent coordinates, and types of symbols; and the management numbers, base coordinates, font types, and sizes of character string data. The image processing unit 13 defines a new mesh, separate from the currently displayed mesh, in memory and copies the acquired data onto it in the RAM 14b (ST105), so that the subsequent processing (coordinate conversion calculation) can be performed on the copied mesh. If the result of the coordinate conversion need not be stored, the subsequent processing may be performed on the currently displayed mesh, and no separate copy into the RAM 14b is required. In this embodiment, the currently displayed mesh is also stored in a predetermined area of the RAM 14b. Hereinafter, the image of the mesh to be processed, stored in the RAM 14b, is referred to as the work mesh.

Next, for the work mesh, the image data contained in the region R1 and the data contained in the region R3 are obtained (ST106). That is, the image processing unit 13 clips the image G1 of the region R1 as shown in FIG. 4(a-1) and enlarges it as shown in FIGS. 4(a-2) and (a-3) (enlargement processing) to create the enlarged image G2. Also, as shown in FIG. 4(b-1), the image G3 of the region R3 is clipped and mapped (deformed) onto the region R4 as shown in FIGS. 4(b-2) and (b-3) (ST107), creating the deformed image G4. For objects whose shape deformation under the coordinate conversion poses no problem, such as polygons and line data representing buildings, electric wires, and cables, the image of the region R1 is mapped so as to correspond to the region R2 and the image of the region R3 is deformed so as to correspond to the region R4, based on calculation formulas described later. For the polygons and lines, the image processing unit 13 calculates a scale value at each point of the region R2 and the region R4 (ST108). Here, the scale of the region R2 is a single value over the whole region. The scale of the region R4 is continuous with the scale of the original image at the outer edge contour and continuous with the scale of the image enlarged in correspondence with the region R2 at the inner edge contour. The scale change rate can likewise be made continuous with that of the original image at the outer edge contour and with that of the enlarged image at the inner edge contour. For the polygons and lines of the image mapped onto the regions R2 and R4, the line widths are then set according to the calculated scale values (ST109). For example, the line widths of polygons and lines can be set so that they appear to have a constant thickness, without depending on the scale value in the regions R2 and R4.

On the other hand, if the object is a character or a symbol, steps 110 to 112 are performed instead of steps 107 to 109. That is, for an object such as a power distribution equipment symbol or character string data, whose distortion under the coordinate conversion would be visually objectionable, the destination of the base coordinates (for example, the center coordinates or the upper-left corner coordinates of the character or symbol) is calculated (ST110). For the characters and symbols after the coordinate conversion, the scale values at their positions in the regions R2 and R4 are calculated (ST111). Then, for the characters and symbols newly placed in the regions R2 and R4, their line widths are set according to the calculated scale values (ST112). The shapes of the characters and symbols can also be set so that they keep their normal appearance, without depending on the scale value in the regions R2 and R4.

The images G2 and G4 that have undergone the enlargement processing and the deformation processing in steps 107 to 109 and 110 to 112 are mapped onto the corresponding areas of the work mesh (ST113), and the work mesh is displayed in place of the currently displayed mesh (ST114). When the mesh was copied in step 105, the data of the work mesh is registered in the DB so that it can be read as needed.

FIG. 5(a) shows an image obtained by mapping the enlarged image G2 and the deformed image G4 onto the original image G5. When the deformation processing is performed on the region R3, the scale and scale change rate of the image can, as shown in FIG. 5(b), be made continuous with those of the original image at the outer edge contour and with those of the image G2 enlarged in correspondence with the region R2 at the inner edge contour. Alternatively, when the deformation processing is performed on the image of the region R3, the scale can, as shown in FIG. 5(c), be set so that the scale change rate of the image is discontinuous with that of the original image at the outer edge contour and with that of the image G2 enlarged in correspondence with the region R2 at the inner edge contour.

An embodiment in which the regions R1 to R4 are bounded by similar ellipses will now be described with reference to FIG. 6. In addition,
for reference, FIGS. 6(b-1), (b-2), (b-3), and (b-4) show the regions R1, R2, R3, and R4 extracted from FIG. 6(a).

Ellipse A: contour of the region R1; inner edge contour of the region R3
Ellipse B: contour of the region R2; inner edge contour of the region R4
Ellipse C: outer edge contour of the region R3; outer edge contour of the region R4
O (Xo, Yo): center (origin) of the ellipses
p: major-axis length of the ellipse A
q: minor-axis length of the ellipse A
s × p, s × q: major- and minor-axis lengths of the ellipse B
t × p, t × q: major- and minor-axis lengths of the ellipse C
M (Xm, Ym): an arbitrary point in the region R1
I (Xi, Yi): coordinates to which the pixel at the coordinates M (Xm, Ym) of the region R1 is re-mapped in the region R2
N (Xn, Yn): an arbitrary point in the region R3
J (Xj, Yj): coordinates to which the pixel at the coordinates N (Xn, Yn) of the region R3 is re-mapped in the region R4
P (Xp, Yp): major-axis end point of the ellipse A
Q (Xq, Yq): minor-axis end point of the ellipse A
S (Xs, Ys): major-axis end point of the ellipse B
T (Xt, Yt): major-axis end point of the ellipse C
Straight line K: straight line passing through the origin O and the point M
Straight line L: straight line passing through the origin O and the point N
A (Xa, Ya): intersection of the ellipse A and the straight line L
B (Xb, Yb): intersection of the ellipse B and the straight line L
C (Xc, Yc): intersection of the ellipse C and the straight line L
θ: angle between the major axis of the ellipse and a straight line parallel to Y = 0

The axis lengths p and q are calculated by equations (1-1) and (1-2) [mathematical formulas: see original document]. The angle θ is obtained from the point O (Xo, Yo) and the point P (Xp, Yp) by equations (2-1), (2-2), (2-3), and (2-4) [mathematical formulas: see original document].

An algorithm for enlarging the region R1 onto the region R2 is described below. The ellipse A, the contour of the region R1, is expressed by equation (3-1), and the ellipse B, the contour of the region R2 (the enlargement of the region R1), is expressed by equation (3-2) [mathematical formulas: see original document]. Here, an arbitrary coordinate M
(Xm, Ym) in the region R1 lies inside the ellipse A and satisfies expression (4) [mathematical formula: see original document]. When the pixels of the region R1 are re-mapped onto the region R2, the pixel at the coordinates M (Xm, Ym) moves along the straight line K passing through the origin O and the coordinates M, in the direction away from the origin O, by an amount depending on its distance from the origin O. FIG. 6(a) shows the pixel at the coordinates M (Xm, Ym) mapped to the coordinates I (Xi, Yi). The pixel of the region R1 at the origin O does not move when re-mapped, and a pixel of the region R1 on the ellipse A moves to the intersection of the straight line K and the ellipse B. The coordinates I (Xi, Yi) are expressed by equation (5) [mathematical formula: see original document]. As a result, the pixel at an arbitrary coordinate M (Xm, Ym) in the region R1 is moved to the coordinates I (Xi, Yi) in the region R2 (FIGS. 6(b-1) and (b-2)).

Next, the case where the pixels of the image in the region R3 are mapped (here, deformed) onto the region R4 is described. The ellipse C, slightly larger than the ellipse B, is given by equation (6) [mathematical formula: see original document]. Consider an arbitrary pixel at the coordinates N (Xn, Yn) in the region R3. Since this pixel is outside the ellipse A and inside the ellipse C, expressions (7-1) and (7-2) hold [mathematical formulas: see original document]. Mapping the pixels of the region R3 onto the region R4 means moving the pixel at the coordinates N (Xn, Yn) along the straight line L passing through the origin O and the coordinates N, in the direction away from the origin O. If the pixel is on the ellipse A, it moves onto the ellipse B; if it is on the ellipse C, it does not move. As described with reference to FIG. 5, the scale and scale change rate of the image of the region R4 can be made continuous with those of the original image at the outer edge contour and with those of the image enlarged in correspondence with the region R2 at the inner edge contour; alternatively, the scale can be set so that the scale change rate is discontinuous with that of the original image at the outer edge contour and with that of the enlarged image at the inner edge contour. If J (Xj, Yj) denotes the point to which the pixel at the coordinates N (Xn, Yn) in the region R3 has moved, Xj and Yj are expressed by equations (8-1) and (8-2) [mathematical formulas: see original document].

Next, FIGS. 7(a) to 7(f) show an example of the operation by which the user specifies the region R1 and the region R2. First, the user clicks the mouse to designate one end point of the area to be enlarged (FIG. 7(a)). Next, the user drags with the mouse button held down to the other end point of the area to be enlarged and releases the drag there (FIG. 7(b)). At this point an ellipse is displayed by default, with the two end points of its major axis and one end point (the midpoint) of its minor axis specified (FIG. 7(c)). The user then drags the minor-axis end point up or down with the mouse button held down to adjust the size of the ellipse A; the ellipse A is fixed at the point where the mouse button is released. The region R1 is thus determined (FIG. 7(d)). Thereafter, a rectangle surrounding the ellipse A is displayed and its four corner points are specified (FIG. 7(e)). The user drags any one of the four corners outward with the mouse button held down to adjust the size of the enlarged ellipse; the ellipse B is fixed at the point where the mouse button is released. The region R2 is thus determined (FIG. 7(f)). The ellipse C is determined by a predetermined enlargement ratio based on the ratio between the ellipse A and the ellipse B. The regions R1, R2, R3, and R4 are thus all determined.

The positions of characters and various symbols are specified by pixels serving as base points. If the coordinates of the constituent pixels of the characters and symbols in the region R1 and the region R3 were converted by the above algorithm, their original shapes would be distorted. Therefore, in consideration of legibility, characters and symbols are converted at their base points by the above algorithm and rendered, without breaking their shapes, at a size corresponding to the scale value at the destination coordinates. For example, assume there is a square symbol as shown in FIG. 8(a). Its shape is determined by the base-point coordinates (for example, the coordinates a of the upper-left corner) and the other constituent coordinates (for example, the coordinates b, c, and d of the remaining three corners). If all of these coordinates were converted by the above algorithm (the four corners after conversion being a', b', c', and d'), the scale would vary from coordinate to coordinate, as shown in FIG. 8(b), and the character or symbol would be distorted. Instead, as shown in FIG. 8(c), only the base-point coordinate a is moved to the point a' by the above algorithm.
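This base-point rule can be sketched as follows. It is a hedged sketch, not the patent's implementation: the function names are mine, a uniform enlargement about the origin stands in for the (elided) coordinate transform, and the remaining vertices keep their offsets from the base point scaled by the scale value at the base point.

```python
# Hedged sketch of the base-point rule for characters and symbols (FIG. 8(c)):
# only the base point a passes through the coordinate transform; the other
# vertices keep their offsets from a, scaled by the scale value at the base
# point, so the glyph translates and resizes without distortion.
def enlarge(pt, origin, s):
    """Uniform enlargement about origin (stand-in for the R1 -> R2 mapping)."""
    ox, oy = origin
    return (ox + s * (pt[0] - ox), oy + s * (pt[1] - oy))

def move_symbol(vertices, origin, s):
    base = vertices[0]                   # base point a (e.g. top-left corner)
    new_base = enlarge(base, origin, s)  # a -> a' via the transform
    scale_at_base = s                    # scale value at the destination of a
    return [new_base] + [
        (new_base[0] + scale_at_base * (x - base[0]),
         new_base[1] + scale_at_base * (y - base[1]))
        for x, y in vertices[1:]
    ]

square = [(4.0, 4.0), (5.0, 4.0), (5.0, 5.0), (4.0, 5.0)]  # corners a, b, c, d
print(move_symbol(square, (0.0, 0.0), 2.0))  # stays a square, twice the size
```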
The other constituent coordinates move while maintaining their positional relationship, scaled according to the scale value at the base-point coordinates. As a result, characters and symbols can be moved while their appearance remains smooth. The scale value Z' at the destination of the base-point coordinates is given by equation (9), in which z is the scale before processing and s is the scale after processing at the base point [mathematical formula: see original document]. The characters and symbols may then be rendered according to the scale at the destination.

FIG. 9 shows an example of actual image conversion using the apparatus of the above embodiment. As shown in FIG. 9(a), a predetermined area of a map displayed at a uniform scale (the area surrounded by a circle in the figure) is designated and enlarged; as shown in FIG. 9(b), the designated area is enlarged while the surrounding area is compressed and deformed. As is clear from a comparison of the original image of FIG. 9(a) with the processed image of FIG. 9(b), all the information displayed in the original image remains visible. In this example, the scale of the area around the enlargement target changes stepwise rather than continuously; of course, it may be made continuous. As another example, a one-dimensional original image may also be processed, as shown in FIGS. 10(a) and 10(b). In FIG. 10(a), when the character string is enlarged naively, part of the original character string is erased by the enlarged image; when the apparatus of the above embodiment is used, as shown in FIG. 10(b), all the information displayed in the original image remains visible.

Since the present invention is configured as described above, even when an enlarged image of an area of the original image is pasted and displayed, the information of the original image is not lost. Therefore, in a guide map or the like, by enlarging only the vicinity of the destination without losing the surrounding information, it is easy to follow both the overall rough route and the detailed route near the destination. Further, in an ordinary image, partial emphasis is facilitated by enlarging only a part, and small characters in text data are made easier to read by partial enlargement.
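The enlargement and deformation mappings described above can be sketched as one radial remapping. Because equations (5) and (8) are marked "see original document", the linear radial interpolation below is my assumption; it is chosen only to satisfy the stated boundary conditions (a point on the ellipse A lands on B, a point on C does not move, and the region R1 is enlarged uniformly), and the ellipse is taken as axis-aligned for simplicity.

```python
import math

# Hedged reconstruction of the combined R1 -> R2 and R3 -> R4 mapping for
# similar ellipses A, B = s*A, C = t*A centred at (ox, oy) with axes p, q.
def remap(x, y, ox, oy, p, q, s, t):
    u = math.hypot((x - ox) / p, (y - oy) / q)   # scaled copy of A through (x, y)
    if u == 0.0:
        return (x, y)                            # the origin does not move
    if u <= 1.0:
        v = s * u                                # R1 -> R2: uniform enlargement
    elif u <= t:
        v = s + (u - 1.0) * (t - s) / (t - 1.0)  # R3 -> R4: compressed annulus
    else:
        return (x, y)                            # outside ellipse C: unchanged
    k = v / u                                    # move along the ray through O
    return (ox + k * (x - ox), oy + k * (y - oy))

# Boundary checks: a point on A maps onto B; a point on C stays fixed.
print(remap(2.0, 0.0, 0.0, 0.0, 2.0, 1.0, 2.0, 3.0))  # on A (p=2) -> (4.0, 0.0)
print(remap(6.0, 0.0, 0.0, 0.0, 2.0, 1.0, 2.0, 3.0))  # on C (t*p=6) -> (6.0, 0.0)
```

Because v varies continuously in u and equals s·u at u = 1, the scale is continuous at the inner edge contour; at the outer edge contour (u = t) the pixel positions are continuous but the scale change rate jumps, matching the discontinuous variant of FIG. 5(c).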

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram showing a conventional inconvenience associated with image enlargement: (a) shows the original image and the area to be enlarged, (b) the image obtained by clipping the area to be enlarged, (c) the enlarged image pasted on the original image, and (d) the area of the original image whose information is erased by the enlargement.

FIG. 2 is a diagram showing a map database system using the image processing apparatus of the present invention.

FIG. 3 is a flowchart illustrating the flow of processing of the image processing apparatus of FIG. 2.

FIG. 4 is an explanatory diagram of the enlargement and deformation processing in the image processing apparatus of FIG. 2: (a-1), (a-2), and (a-3) show the enlargement processing for the enlargement area, and (b-1), (b-2), and (b-3) show the deformation processing for the deformation area.

FIG. 5 illustrates the mapping of the enlarged image and the deformed image onto the original image: (a) shows the mapped image, and (b) and (c) show characteristics of the scale and scale change rate of the image in the deformation processing.

FIG. 6 is a detailed explanatory diagram of the enlargement and deformation processing in the image processing apparatus of FIG. 2: (b-1), (b-2), (b-3), and (b-4) show the regions formed by the ellipses in (a).

FIGS. 7(a) to 7(f) are diagrams illustrating the operation procedure by which a user specifies an elliptical region to be enlarged and the size of the enlargement.

FIGS. 8(a) to 8(c) are diagrams showing processing for preventing characters and symbols from being distorted by the deformation.

FIG. 9 is a diagram showing an example of a specific processing result: (a) is the map before processing and (b) the map after processing.

FIGS. 10(a) and 10(b) are diagrams illustrating another example of a specific processing result: (a) shows a character string before processing and (b) the character string after processing.

[Description of Reference Numerals]
1 Image processing device
11 Display unit
12 Operation unit
13 Image processing unit
14 Memory
14a ROM
14b RAM
15 Mass storage device

Continuation of front page: (72) Inventor Hikaru Takigasaki, 4-1 Egasakicho, Tsurumi-ku, Yokohama, Kanagawa Prefecture, Tokyo Electric Power Company System Research Laboratory. F-terms (reference): 5B057 BA24 CA12 CB12 CD05 CE08; 5C076 AA13 AA21 AA23 CA02 CA10 CB05; 5C082 AA01 BA12 BB13 BB42 CA54 CA56 CB06 DA22 DA42 DA86 DA89 MM10

Claims (1)

  1. An image processing apparatus comprising: an image display unit for displaying an original image; an operation unit having an interface for designating a partial area of the image displayed on the image display unit; and an image processing unit that enlarges the area designated by the operation unit and displays it on the image display unit, wherein the image processing unit comprises: enlargement/deformation means for enlarging the image of a first area designated by the operation unit in correspondence with a second area that includes the first area, and for deforming the image of a third area, whose outer edge contour includes the second area and whose inner edge contour is the contour of the first area, in correspondence with a fourth area whose outer edge contour is the outer edge contour of the third area and whose inner edge contour is the contour of the second area; and image synthesizing means for synthesizing the image enlarged in correspondence with the second area, the image deformed in correspondence with the fourth area, and the image of a fifth area of the original image excluding the second area and the fourth area.
  2. The image processing apparatus according to claim 1, wherein the enlargement/deformation means deforms the image of the third area so that the scale and/or scale change rate of the image deformed in correspondence with the fourth area is continuous with the scale and/or scale change rate of the original image at the outer edge contour, and is continuous with the scale and/or scale change rate of the image enlarged in correspondence with the second area at the inner edge contour.
  3. An image processing method comprising: enlarging the image of a first area designated by an operator in correspondence with a second area that includes the first area; deforming the image of a third area, whose outer edge contour includes the second area and whose inner edge contour is the contour of the first area, in correspondence with a fourth area whose outer edge contour is the outer edge contour of the third area and whose inner edge contour is the contour of the second area; and synthesizing the image enlarged in correspondence with the second area, the image deformed in correspondence with the fourth area, and the image of a fifth area of the original image excluding the second area and the fourth area.
  4. The image processing method according to claim 3, wherein in the deforming step, the image of the third area is deformed so that the scale and/or scale change rate of the image deformed in correspondence with the fourth area is continuous with the scale and/or scale change rate of the original image at the outer edge contour, and is continuous with the scale and/or scale change rate of the image enlarged in correspondence with the second area at the inner edge contour.
  5. A computer-readable recording medium storing a program for causing a computer to execute the processing according to claim 3 or 4.
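For a circular designated area, the processing in the claims amounts to a radial remapping: inside the enlarged second area the image is uniformly magnified, and in the surrounding annulus (the fourth area) the magnified content is blended back to the original scale so that the scale is continuous at both the inner and outer contours. The sketch below is an illustrative assumption, not the patent's disclosed implementation: the function name, the numeric radii, and the use of a cubic Hermite blend (which matches the scale, i.e. the first derivative, at both boundaries) are all choices made for this example.

```python
def radial_source(r, r1, r2, r3):
    """Map an output-image radius r to a source-image radius.

    r1: radius of the designated first area (original content)
    r2: radius of the enlarged second area (r1 magnified to r2)
    r3: outer radius of the annular fourth area
    All names and the Hermite blend are illustrative assumptions.
    """
    if r <= r2:
        # Uniform magnification by r2/r1 inside the second area.
        return r * (r1 / r2)
    if r >= r3:
        # Fifth area: the original image is left untouched.
        return r
    # Annulus [r2, r3] maps back to source annulus [r1, r3].
    # Cubic Hermite: s(r2)=r1, s(r3)=r3, s'(r2)=r1/r2 (magnified
    # scale), s'(r3)=1 (original scale), so the scale is continuous
    # at both the inner and outer contours.
    t = (r - r2) / (r3 - r2)
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    dx = r3 - r2
    return h00 * r1 + h10 * dx * (r1 / r2) + h01 * r3 + h11 * dx * 1.0
```

Applying this map per pixel (converting to polar coordinates around the center of the designated area, looking up the source radius, and sampling the original image there) yields the composite of claims 1 and 3; matching the scale change rate as well, as in claims 2 and 4, would require a higher-order blend.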
JP2002046972A 2002-02-22 2002-02-22 Image processing apparatus, image processing method, and recording medium Pending JP2003250039A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002046972A JP2003250039A (en) 2002-02-22 2002-02-22 Image processing apparatus, image processing method, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002046972A JP2003250039A (en) 2002-02-22 2002-02-22 Image processing apparatus, image processing method, and recording medium

Publications (1)

Publication Number Publication Date
JP2003250039A true JP2003250039A (en) 2003-09-05

Family

ID=28660191

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002046972A Pending JP2003250039A (en) 2002-02-22 2002-02-22 Image processing apparatus, image processing method, and recording medium

Country Status (1)

Country Link
JP (1) JP2003250039A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006313511A (en) * 2005-04-07 2006-11-16 Sony Corp Image processor, image processing method and computer program
JP2007048196A (en) * 2005-08-12 2007-02-22 Seiko Epson Corp Image processor, image processing program and image processing method
JP2009043252A (en) * 2007-07-17 2009-02-26 Nagoya Institute Of Technology Map display device and system
US7715656B2 (en) 2004-09-28 2010-05-11 Qualcomm Incorporated Magnification and pinching of two-dimensional images
JP2010243605A (en) * 2009-04-01 2010-10-28 Denso Corp Map display apparatus
JP2011054008A (en) * 2009-09-03 2011-03-17 Suzuki Motor Corp Distorted image correction device and method
JP2012065202A (en) * 2010-09-16 2012-03-29 Toshiba Corp Apparatus, method, and program for processing image
JP2013196009A (en) * 2012-03-15 2013-09-30 Toshiba Corp Image processing apparatus, image forming process, and program
JP2016097925A (en) * 2014-11-26 2016-05-30 国立研究開発法人 海上・港湾・航空技術研究所 Maneuvering support system and craft equipped with the same

Similar Documents

Publication Publication Date Title
US5187776A (en) Image editor zoom function
US8830272B2 (en) User interface for a digital production system including multiple window viewing of flowgraph nodes
US7256801B2 (en) Elastic presentation space
KR100209841B1 (en) Visual enhancement method for display
US5619632A (en) Displaying node-link structure with region of greater spacings and peripheral branches
JP2865751B2 (en) Display screen scroll method
JP4777788B2 (en) System and method for dynamically zooming and rearranging display items
US7714859B2 (en) Occlusion reduction and magnification for multidimensional data presentations
DE69732663T2 (en) Method for generating and changing 3d models and correlation of such models with 2d pictures
DE60109434T2 (en) Systems and method for generating visual illustrations of graphical data
CN1322473C (en) Method and system for displaying visual content in virtual three-dimensienal space
Sarkar et al. Graphical fisheye views
US5883635A (en) Producing a single-image view of a multi-image table using graphical representations of the table data
JP4341408B2 (en) Image display method and apparatus
Carpendale et al. A framework for unifying presentation space
EP0694878B1 (en) A method and apparatus for increasing the displayed detail of a tree structure
US7728848B2 (en) Tools for 3D mesh and texture manipulation
US9153062B2 (en) Systems and methods for sketching and imaging
US6870545B1 (en) Mixed but indistinguishable raster and vector image data types
US7472354B2 (en) Graphical user interface having an attached toolbar for drag and drop editing in detail-in-context lens presentations
US6437799B1 (en) Method and apparatus for logical zooming of a directed graph
US5473740A (en) Method and apparatus for interactively indicating image boundaries in digital image cropping
US6577330B1 (en) Window display device with a three-dimensional orientation of windows
RU2377663C2 (en) Dynamic window architecture
EP0887771B1 (en) Method and apparatus for composing layered synthetic graphics filters

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20050518

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20050524

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20051004