WO2011083676A1 - Object processing device and object selection method - Google Patents
Object processing device and object selection method
- Publication number
- WO2011083676A1 (PCT/JP2010/072890)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- identification information
- identification
- objects
- information
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04802—3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
Definitions
- The present invention relates to an object processing apparatus and an object selection method, and more particularly to an object processing apparatus and an object selection method for supporting selection of a desired object, with a pointing device, from among a plurality of objects displayed on a screen.
- a technique for enlarging or reducing an image including such an object is provided.
- When the image 104 including the plurality of objects 101 to 103 is reduced, the individual objects 101 to 103 included in the image 104 are reduced, and at the same time, the interval between the objects is also reduced.
- Since the distance between the objects is smaller than the size of the objects 101 to 103, as the reduction ratio increases, the reduced objects 101 to 103 may overlap each other.
- When objects overlap, their vertical stacking order is determined according to a predetermined rule; that is, part of the lower object is hidden under part of the upper object. In this case, when the mouse cursor is moved to the portion where the two objects overlap, the upper object is selected. Therefore, to select the lower object, it is necessary to place the cursor on a portion of the lower object that does not overlap the upper object.
- In Patent Document 1, a technique has been proposed by which a desired object can be selected even when the object is completely hidden below another object.
- In that technique, the mouse cursor is placed on the portion where two objects overlap; when the left mouse button is pressed, the upper object is selected, and when the right mouse button is pressed, the lower object is selected.
- JP-A-9-223241; JP-A-11-299106; JP-A-2000-308370
- The present invention has been made to solve such a problem, and its purpose is to make it possible to select a desired object by a very simple operation even when the object is completely hidden under another object.
- To this end, in the present invention, identification layout information is generated by assigning object identification information to each dot associated with the plurality of objects on the same two-dimensional layout as the image.
- For an object that does not overlap another object, the identification information of that object is assigned to each dot corresponding to the object.
- For objects that overlap, the identification information of each object is assigned to the dots corresponding to its non-overlapping part, the overlapping part is divided into a plurality of small areas, and the identification information of the respective objects is assigned to the dots corresponding to those small areas. Then, the identification information corresponding to the dot designated on the image displayed on the display device is acquired from the identification layout information, and the object corresponding to the acquired identification information is selected.
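The core of this scheme resembles what graphics programming calls a picking buffer. As a minimal sketch (the names, the rectangle representation, and the grid size are illustrative, not from the patent), a grid the same size as the image holds each object's identification information at every dot it covers; note that this naive version still lets the topmost object win over the entire overlap area, which is exactly what the small-area subdivision described above refines:

```python
# Hypothetical sketch of an identification layout: a 2-D grid, same size as
# the displayed image, holding an object ID at every dot covered by an object.
def make_identification_layout(width, height, objects):
    """objects: list of (object_id, x, y, w, h) axis-aligned rectangles,
    drawn in order; later entries land on top, mirroring the display."""
    layout = [[None] * width for _ in range(height)]  # None = background
    for object_id, x, y, w, h in objects:
        for row in range(y, min(y + h, height)):
            for col in range(x, min(x + w, width)):
                layout[row][col] = object_id
    return layout

def pick(layout, dot_x, dot_y):
    """Return the identification information at the cursor dot, or None."""
    return layout[dot_y][dot_x]

layout = make_identification_layout(
    16, 16, [("obj1", 2, 2, 6, 6), ("obj2", 6, 6, 6, 6)])
print(pick(layout, 3, 3))   # obj1
print(pick(layout, 7, 7))   # obj2: topmost wins in the naive version
print(pick(layout, 0, 0))   # None: background dot
```

Looking up the cursor dot in this buffer, rather than hit-testing the drawn image, is what allows a hidden object to remain selectable once its dots are reassigned by the subdivision rule.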
- By using the identification layout information generated separately from the image displayed on the display device, it is possible to identify the object at the position where the cursor is located on the image.
- The identification layout information is generated by dividing the overlapping portion into a plurality of small areas and assigning the identification information of each object to each of them. Therefore, even if a plurality of objects overlap on the layout of the displayed image, the pieces of identification information corresponding to the respective objects do not overlap on the layout of the identification layout information generated to identify those objects.
- Each object can be identified by the identification information acquired from the position of the cursor. Therefore, even if an object is completely hidden on the display below another object, a desired object can be selected by an extremely simple operation.
- FIG. 1 is a diagram showing an example of reducing an image containing a plurality of objects. FIG. 2 is a block diagram showing a functional configuration example of an image processing apparatus provided with the object processing apparatus according to this embodiment. FIG. 3 is a diagram showing an example of the original image generated by the image generation unit.
- FIG. 2 is a block diagram illustrating a functional configuration example of the image processing apparatus 100 including the object processing apparatus according to the present embodiment.
- the image processing apparatus 100 according to the present embodiment is built in, for example, a personal computer.
- The image processing apparatus 100 according to the present embodiment includes, as its functional configuration, an image generation unit 11, an identification information generation unit 12, an identification information storage unit 13, a display control unit 14, an identification information acquisition unit 15, an object selection unit 16, and an operation control unit 17.
- These functional blocks 11 to 17 can each be realized by any of hardware, a DSP (Digital Signal Processor), or software.
- When realized by software, the image processing apparatus 100 according to the present embodiment actually comprises a CPU or MPU, RAM, ROM, and the like of a computer, and can be realized by running a program stored in the RAM or ROM.
- the image generation unit 11 uses the original data stored in the original data storage unit 200 to generate an image including a plurality of objects.
- the original data is data including a plurality of objects and layout information indicating their arrangement
- Using the original data, the image generation unit 11 generates an image of a predetermined size (hereinafter referred to as an original image) including the plurality of objects.
- the image generation unit 11 reduces the original image to generate a reduced image, or enlarges the original image to generate an enlarged image.
- FIG. 3 is a diagram illustrating an example of the original image and the reduced image generated by the image generation unit 11.
- FIG. 3A shows an original image
- FIG. 3B shows a reduced image.
- an image in which three objects 21, 22, and 23 exist on the background 24 is illustrated.
- When the original image including the plurality of objects 21 to 23 is reduced, the individual objects 21 to 23 included in the original image are reduced together with the background 24, and at the same time, the interval between the objects 21 to 23 is also reduced.
- As a result, the reduced objects 21 to 23 overlap one another.
- When an object is selected, a predetermined process related to the selected object is executed by a computer, for example, a personal computer equipped with the image processing apparatus 100.
- Based on the image (original image, reduced image, or enlarged image) generated by the image generation unit 11, the identification information generation unit 12 generates identification information that can identify each of the plurality of objects on the image, and generates identification layout information by assigning the identification information to each dot associated with the plurality of objects on the same two-dimensional layout (bitmap layout). In the present embodiment, color information is used as the identification information of each dot generated by the identification information generation unit 12.
- the identification information generation unit 12 stores the generated identification layout information in the identification information storage unit 13.
- When generating the identification layout information, the identification information generation unit 12 determines whether or not there is an overlap between objects. For an object that does not overlap another, the identification information of that object is assigned to each dot of the identification layout information corresponding to the object.
- For objects that overlap, the identification information of each object is assigned to the dots of the identification layout information corresponding to its non-overlapping portion. Further, the overlapping portion is divided into a plurality of small areas, and the identification information of the respective objects is assigned to the dots of the identification layout information corresponding to those small areas.
- When there is no overlap, the identification information of the objects 21 to 23 is assigned to each dot of the identification layout information corresponding to the objects 21 to 23. That is, first identification information is assigned to each dot corresponding to the first object 21, second identification information to each dot corresponding to the second object 22, and third identification information to each dot corresponding to the third object 23.
- FIG. 4 is a diagram showing the identification layout information generated from the original image shown in FIG. 3A. As shown in FIG. 4, when there is no overlap between the plurality of objects 21 to 23, identification layout information is generated by assigning the first to third identification information 31 to 33 to the dots occupying the same areas as the objects 21 to 23 on the same two-dimensional layout as the original image. Note that no identification information is assigned to the portion corresponding to the background 24.
- When the three objects 21 to 23 overlap, the first to third identification information 31 to 33 are respectively assigned to the dots of the identification layout information corresponding to the non-overlapping portions.
- For the overlapping portions, the corresponding area on the identification layout information is divided into a plurality of small areas, and the identification information of each of the overlapping objects is assigned to the dots corresponding to those small areas.
- FIG. 5 is a diagram illustrating an example in which identification information of each object is given by dividing a portion where the objects overlap each other into small areas.
- Where the first object 21 and the second object 22 overlap, the identification information generation unit 12 divides the overlapping area on the identification layout information into two small areas 31a and 32a, assigns the first identification information 31 to the small area 31a closer to the first object 21, and assigns the second identification information 32 to the small area 32a closer to the second object 22.
- the two small regions 31a and 32a are divided so that their areas are equal to each other. Since both the first object 21 and the second object 22 are rectangular, the two small areas 31a and 32a are also rectangular.
- Similarly, where the first object 21 and the third object 23 overlap, the identification information generation unit 12 divides the overlapping area on the identification layout information into two small areas 31b and 33b, assigns the first identification information 31 to the small area 31b closer to the first object 21, and assigns the third identification information 33 to the small area 33b closer to the third object 23.
- the two small regions 31b and 33b are also divided so that their areas are equal to each other.
- The first object 21 is rectangular, while the third object 23 is elliptical; therefore, the shapes of the two small areas 31b and 33b are not both rectangular. In the example of FIG. 5, the small area 33b on the side close to the third object 23 is rectangular and the remaining overlapping area forms the small area 31b. However, the shapes of the small areas 31b and 33b are not limited to this.
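For the simplest case of two overlapping axis-aligned rectangles, the equal-area split described above can be sketched as follows. The choice of split axis and the "closer" test are assumptions of this sketch, not details fixed by the text:

```python
# Illustrative sketch of the overlap-splitting rule: the intersection of two
# rectangles is divided into two equal-area halves, each tagged for the
# object it lies closer to (here: the object whose centre is nearer).
def split_overlap(rect_a, rect_b):
    """rect = (x0, y0, x1, y1), half-open. Returns (region_a, region_b):
    the halves of the intersection nearer to rect_a and rect_b."""
    ax0, ay0, ax1, ay1 = rect_a
    bx0, by0, bx1, by1 = rect_b
    ix0, iy0 = max(ax0, bx0), max(ay0, by0)
    ix1, iy1 = min(ax1, bx1), min(ay1, by1)
    if ix0 >= ix1 or iy0 >= iy1:
        return None, None  # rectangles do not overlap
    # Split along the axis on which the two centres differ most.
    if abs((ax0 + ax1) - (bx0 + bx1)) >= abs((ay0 + ay1) - (by0 + by1)):
        mid = (ix0 + ix1) // 2
        halves = ((ix0, iy0, mid, iy1), (mid, iy0, ix1, iy1))
        a_first = (ax0 + ax1) <= (bx0 + bx1)  # A's centre is to the left
    else:
        mid = (iy0 + iy1) // 2
        halves = ((ix0, iy0, ix1, mid), (ix0, mid, ix1, iy1))
        a_first = (ay0 + ay1) <= (by0 + by1)  # A's centre is above
    return halves if a_first else (halves[1], halves[0])

# obj1 occupies x 0..8, obj2 x 4..12 (same rows): overlap is x 4..8.
region_a, region_b = split_overlap((0, 0, 8, 6), (4, 0, 12, 6))
print(region_a)  # (4, 0, 6, 6): left half, nearer obj1
print(region_b)  # (6, 0, 8, 6): right half, nearer obj2
```

Each returned half would then be filled with the corresponding object's identification information on the layout, giving both objects a selectable region inside the overlap.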
- FIG. 6 is a diagram showing the identification layout information generated from the reduced image shown in FIG. 3B. The identification layout information generated from the reduced image has the same size and the same two-dimensional layout as the reduced image; FIG. 6 shows it enlarged for convenience of explanation.
- Because of the subdivision of the overlapping portions, the areas of the first to third identification information 31 to 33 in the identification layout information take shapes different from those of the first to third objects 21 to 23. That is, the area of the first identification information 31 has a shape in which the small areas 32a and 33b are missing from the area of the first object 21; the area of the second identification information 32 lacks the small area 31a from the area of the second object 22; and the area of the third identification information 33 lacks the small area 31b from the area of the third object 23.
- the display control unit 14 controls the display device 300 to display the image generated by the image generation unit 11.
- The identification information acquisition unit 15 acquires the identification information (color information) corresponding to the dot designated by the cursor of the mouse 400 on the image displayed on the display device 300 by the display control unit 14, from the identification layout information stored in the identification information storage unit 13. Since the identification layout information has the same form as image data (bitmap data) in which color information is assigned to each dot of a two-dimensional layout, the color information can be acquired using, for example, the getPixel function of the BitmapData class.
- The object selection unit 16 selects the object corresponding to the identification information (color information) acquired by the identification information acquisition unit 15. That is, the object selection unit 16 selects, as an object, the group of dots indicated by the identification information acquired from the identification layout information. For example, when a dot in the area of the first identification information 31 is designated, the first object 21 is selected.
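A minimal Python analogue of this color-keyed lookup and selection, standing in for the BitmapData.getPixel call mentioned above (the color values, the color-to-object table, and the buffer dimensions are hypothetical):

```python
# Sketch of selection via a colour-coded identification layout. The layout is
# a flat row-major array of 24-bit colours, like a bitmap; 0 = background.
WIDTH, HEIGHT = 8, 4
COLOR_TO_OBJECT = {0xFF0000: "object21", 0x00FF00: "object22"}  # hypothetical

layout = [0] * (WIDTH * HEIGHT)
for x in range(0, 4):              # object21's dots carry its colour
    layout[1 * WIDTH + x] = 0xFF0000

def get_pixel(buf, x, y):
    """Analogue of BitmapData.getPixel: the colour at dot (x, y)."""
    return buf[y * WIDTH + x]

def select_object(buf, x, y):
    """Map the colour at the cursor dot back to an object, if any."""
    return COLOR_TO_OBJECT.get(get_pixel(buf, x, y))

print(select_object(layout, 2, 1))  # object21
print(select_object(layout, 5, 3))  # None: background dot
```

Because the lookup reads the identification layout rather than the displayed bitmap, it works even when the displayed pixels of several objects are visually indistinguishable or occluded.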
- When the object selection unit 16 selects an object, the operation control unit 17 performs control so that a predetermined operation related to the selected object is performed. For example, the operation control unit 17 controls the image generation unit 11 and the display control unit 14 so that the selected object is highlighted. Specifically, under the control of the operation control unit 17, the image generation unit 11 redraws the image so that the selected object appears in the foreground in a specific highlight color, and the display control unit 14 causes the display device 300 to display the redrawn image.
- the operation control unit 17 performs control so that a predetermined operation related to the object is performed.
- As the predetermined operation related to the object, for example, the application associated with the object is activated, or the display control unit 14 is controlled to display information about the object on the display device 300.
- The association between an object and an application, and the information related to the object, are stored in the original data held in the original data storage unit 200. The operation control unit 17 refers to these pieces of information to execute the predetermined operation.
- FIG. 7 is a flowchart illustrating an operation example of the image processing apparatus 100 according to the present embodiment.
- the flowchart shown in FIG. 7 starts when the image processing apparatus 100 is activated and an image display is instructed.
- the image generation unit 11 generates an image including a plurality of objects using the original data stored in the original data storage unit 200 (step S1).
- an original image is generated.
- the identification information generation unit 12 generates identification layout information based on the image generated by the image generation unit 11, and stores it in the identification information storage unit 13 (step S2).
- the display control unit 14 causes the display device 300 to display the image generated by the image generation unit 11 (step S3).
- the identification information acquisition unit 15 determines whether or not the position of the mouse cursor has been designated on the image (step S4).
- If no position is designated, the image generation unit 11 determines whether the user has issued an image generation instruction such as image reduction or image enlargement (step S5).
- an instruction for image reduction or image enlargement can be performed by, for example, operating the mouse 400 and dragging a boundary portion of the image display area to arbitrarily reduce or enlarge the image display area.
- an instruction to reduce or enlarge an image may be given by operating the mouse 400 and selecting a desired reduction rate or enlargement rate from a menu.
- If it is determined that an image generation instruction has not been issued, the process jumps to step S11. On the other hand, if an image generation instruction has been issued, the process returns to step S1, and the image generation unit 11 generates a reduced image or an enlarged image of the size designated by the operation of the mouse 400. Then, the identification information generation unit 12 regenerates the identification layout information based on the image generated by the image generation unit 11 and stores it in the identification information storage unit 13 (step S2). Subsequently, the display control unit 14 causes the display device 300 to display the regenerated image (step S3).
- When the mouse cursor position is designated on the image, the identification information acquisition unit 15 acquires the identification information corresponding to the designated dot from the identification layout information in the identification information storage unit 13 (step S6).
- Next, the object selection unit 16 selects the object corresponding to the identification information acquired by the identification information acquisition unit 15 (step S7). Then, the operation control unit 17 controls the image generation unit 11 and the display control unit 14 to highlight the object selected by the object selection unit 16 (step S8). Thereafter, the operation control unit 17 determines whether or not the selected object has been clicked with the mouse 400 (step S9).
- If the object has been clicked, the operation control unit 17 performs control so as to perform a predetermined operation related to the clicked object (application activation, display of related information, etc.) (step S10).
- In step S11, the image processing apparatus 100 determines whether the user has instructed to end the operation. If no end instruction has been issued, the process returns to step S4. When an end instruction is given, the processing of the flowchart shown in FIG. 7 ends.
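The flow of steps S1 to S11 can be compressed into a sketch like the following, with the functional blocks reduced to stub callables; the event representation and function names are assumptions of this sketch, not the patent's implementation:

```python
# Compressed sketch of the flowchart of FIG. 7 (steps S1-S11).
def run(events, generate, make_layout, display, lookup, act):
    image = generate(None)            # S1: generate image from original data
    layout = make_layout(image)       # S2: generate identification layout
    display(image)                    # S3: display the image
    for kind, payload in events:      # S4/S5/S9/S11 decisions
        if kind == "resize":          # S5: reduction/enlargement instructed
            image = generate(payload)          # back to S1
            layout = make_layout(image)        # S2 again
            display(image)                     # S3 again
        elif kind == "cursor":        # S4: cursor position designated
            ident = lookup(layout, payload)    # S6: get identification info
            act("highlight", ident)            # S7-S8: select and highlight
        elif kind == "click":         # S9: selected object clicked
            act("operate", payload)            # S10: predetermined operation
        elif kind == "quit":          # S11: end instruction
            break

# Demo with trivial stubs standing in for the functional blocks 11-17.
log = []
run(
    events=[("cursor", (3, 2)), ("click", "obj"), ("quit", None)],
    generate=lambda scale: "image",
    make_layout=lambda image: "layout",
    display=lambda image: log.append("shown"),
    lookup=lambda layout, dot: "ident",
    act=lambda what, arg: log.append(what),
)
print(log)  # ['shown', 'highlight', 'operate']
```

The key structural point carried over from the flowchart is that every resize regenerates the identification layout (S2) before the image is redisplayed, so the lookup buffer always matches what is on screen.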
- As described above, in this embodiment, identification layout information for identifying the plurality of objects included in an image is generated, for mouse-position determination, separately from the image itself.
- This identification layout information is obtained by adding identification information of each object to each dot associated with a plurality of objects on the same two-dimensional layout as the image.
- the overlapping area is divided into a plurality of small areas, and identification information of each object is assigned to each dot corresponding to the plurality of small areas.
- identification information corresponding to the dot designated by the mouse cursor on the image displayed on the display device 300 is acquired from the identification layout information, and an object corresponding to the acquired identification information is selected.
- By using the identification layout information generated separately from the image displayed on the display device 300, it is possible to identify the object at the position where the cursor is located on the image. Since the image information at the cursor position is not itself acquired, the objects included in the image can be identified even when the entire image, including the objects, is generated as bitmap data.
- On the layout of the identification layout information generated to identify the plurality of objects, the pieces of identification information do not overlap (see FIG. 6). Therefore, even overlapping objects can be identified by the identification information acquired at the cursor position, and even an object completely hidden underneath another object can be selected by a very simple operation of merely moving the cursor.
- the coordinate information is not acquired from the mouse 400, but the identification information is acquired from the identification layout information.
- With a method that acquires coordinate information from the mouse 400, which is an external device, the coordinate information must be acquired each time the cursor moves even slightly, which takes considerable time.
- In contrast, in this embodiment the identification information can be acquired instantaneously following the movement of the mouse cursor, so the determination speed of the mouse position is improved. As a result, the series of operations from displaying an image, through determining the mouse position, to highlighting the selected object can be performed extremely fast.
- FIG. 8 is a block diagram illustrating a functional configuration example of an image processing apparatus 100 ′ to which the object processing apparatus according to the present embodiment is applied.
- Components given the same reference numerals as in FIG. 2 have the same functions, and redundant description is omitted here.
- The image processing apparatus 100' illustrated in FIG. 8 includes, as its functional configuration, an image generation unit 11, an identification information generation unit 12', an identification information storage unit 13, a display control unit 14, an identification information acquisition unit 15', an object selection unit 16', and an operation control unit 17'.
- Each of these functional blocks can be realized by any of a hardware configuration, a DSP, and software.
- the identification information generation unit 12 'generates two types of identification layout information.
- the first identification layout information is information for identifying a group including a plurality of objects.
- the second identification layout information is information for identifying individual objects included in the group.
- First, the identification information generation unit 12' classifies the plurality of objects included in the image generated by the image generation unit 11 into a plurality of groups. Then, for each of the plurality of groups, the identification information generation unit 12' generates the first identification layout information by assigning group identification information to each dot corresponding to all the objects included in the group, on the same two-dimensional layout as the image. In addition, for each of the objects included in a group, the identification information generation unit 12' generates the second identification layout information by assigning object identification information to each dot corresponding to the object, on the same two-dimensional layout as the image.
- FIG. 9 is a diagram illustrating an example of an image generated by the image generation unit 11 and a plurality of objects and groups included in the image.
- 81A, 82A, 83A are mark objects representing icons
- 81B, 82B, 83B are text objects representing titles
- 84 is a chart image such as a graph (in this example, it is not an object selectable by the mouse cursor).
- three graphs are displayed as the chart image 84.
- a plurality of objects in the image are classified into three groups. That is, the two objects 81A and 81B are classified into the first group 81, the two objects 82A and 82B are classified into the second group 82, and the two objects 83A and 83B are classified into the third group 83.
- Which object belongs to which group is indicated by, for example, original data.
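Under the assumption that objects are axis-aligned rectangles, building the two layouts for one such group might look like the following sketch (the coordinates, grid size, and the names "icon"/"title" are illustrative stand-ins for objects like 81A and 81B):

```python
# Sketch of the two identification layouts of the applied example: the first
# tags every dot of a bounding rectangle around the whole group; the second
# tags only the dots of each individual object.
def bounding_box(rects):
    """Smallest rectangle enclosing all (x0, y0, x1, y1) rectangles."""
    xs0, ys0, xs1, ys1 = zip(*rects)
    return min(xs0), min(ys0), max(xs1), max(ys1)

def fill(layout, rect, ident, width):
    x0, y0, x1, y1 = rect
    for y in range(y0, y1):
        for x in range(x0, x1):
            layout[y * width + x] = ident

W, H = 12, 6
first = [None] * (W * H)   # group identification layout
second = [None] * (W * H)  # object identification layout

group = {"icon": (1, 1, 3, 3), "title": (4, 1, 10, 2)}  # one group's objects
fill(first, bounding_box(list(group.values())), "group81", W)
for name, rect in group.items():
    fill(second, rect, name, W)

print(first[1 * W + 6])   # group81: dot lies inside the group rectangle
print(second[1 * W + 6])  # title: dot belongs to the title object
print(second[2 * W + 6])  # None: inside the group rect but on no object
```

The third lookup illustrates why two layouts are useful: a dot can belong to the group's rectangle without lying on any individual object, so hovering can still identify the group.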
- FIG. 10 is a diagram illustrating an example of first identification layout information and second identification layout information generated by the identification information generation unit 12 ′ based on the image shown in FIG. 9.
- FIG. 10A shows the first identification layout information
- FIG. 10B shows the second identification layout information.
- When generating the first identification layout information, the identification information generation unit 12' assigns first identification information 91 to each dot in a rectangular area within a predetermined range surrounding all the objects 81A and 81B in the first group 81. In addition, the identification information generation unit 12' assigns second identification information 92 to each dot in a rectangular area within a predetermined range surrounding all the objects 82A and 82B in the second group 82. Further, the identification information generation unit 12' assigns third identification information 93 to each dot in a rectangular area within a predetermined range surrounding all the objects 83A and 83B in the third group 83.
- When generating the second identification layout information, the identification information generation unit 12' assigns fourth and fifth identification information 94A and 94B to the dots corresponding to the individual objects 81A and 81B included in the first group 81. In addition, it assigns sixth and seventh identification information 95A and 95B to the dots corresponding to the individual objects 82A and 82B included in the second group 82, and eighth and ninth identification information 96A and 96B to the dots corresponding to the individual objects 83A and 83B included in the third group 83.
- The identification information generation unit 12' stores the first identification layout information and the second identification layout information generated as described above in the identification information storage unit 13. In the application example shown in FIG. 8 as well, color information is used as the identification information of each dot generated by the identification information generation unit 12'. Specifically, similar colors within a predetermined range are grouped into the same color group, and similar colors of the same color group are assigned as the identification information of the dots in the identification layout information. A similar color here refers to a color that can hardly be distinguished by the human eye but can be distinguished by a computer.
- the identification information generation unit 12 ′ gives color information of similar colors belonging to the same color group to each dot corresponding to each object included in one group.
- That is, the identification information generation unit 12' generates the second identification layout information by assigning color information of mutually different similar colors to the dots corresponding to the respective objects included in one group.
- It generates the first identification layout information by assigning, to each dot in the rectangular area of the predetermined range surrounding all the objects in the group, a similar color different from the similar colors used in the second identification layout information.
- For example, the identification information generation unit 12' assigns, to the dots on the second identification layout information corresponding to the two objects 81A and 81B included in the first group 81, color information of similar colors that belong to the same color group but differ from each other, as the fourth and fifth identification information 94A and 94B. To the dots on the first identification layout information corresponding to the rectangular area surrounding the two objects 81A and 81B, color information of a similar color different from the fourth and fifth identification information 94A and 94B is assigned as the first identification information 91.
- Similarly, the identification information generation unit 12' assigns, to the dots on the second identification layout information corresponding to the two objects 82A and 82B included in the second group 82, color information of similar colors belonging to the same color group (a group different from that of the first group 81) but differing from each other, as the sixth and seventh identification information 95A and 95B. To the dots on the first identification layout information corresponding to the rectangular area surrounding the two objects 82A and 82B, color information of a similar color different from the sixth and seventh identification information 95A and 95B is assigned as the second identification information 92.
- to each dot on the second identification layout information corresponding to each of the two objects 83A and 83B included in the third group 83, the identification information generation unit 12′ assigns color information of similar colors that belong to the same color group (a group different from those set for the first group 81 and the second group 82) but differ from each other, as eighth and ninth identification information 96A and 96B, respectively. To each dot on the first identification layout information corresponding to the rectangular area surrounding the two objects 83A and 83B, it assigns color information of a similar color different from the eighth and ninth identification information 96A and 96B, as third identification information 93.
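The per-group color assignment above can be expressed as a short sketch. This is a hypothetical model, not code from the patent: the RGB encoding (one shared channel per color group, a second channel distinguishing members), the rectangle coordinates, and the grid-of-tuples map layout are all illustrative assumptions.

```python
def make_group_colors(group_index, n_members):
    """Return (group_color, member_colors) for one color group.

    All colors of a group share the same R value (the "similar color"
    family); the group's own color and each member color differ in B,
    so a single pixel's color reveals both group and member.
    """
    base_r = 40 * (group_index + 1)              # distinct family per group
    group_color = (base_r, 0, 0)                 # first identification info
    member_colors = [(base_r, 0, 20 * (m + 1))   # per-object similar colors
                     for m in range(n_members)]
    return group_color, member_colors


def build_layout_maps(width, height, groups):
    """groups: one list of object rectangles (x0, y0, x1, y1) per group.

    Returns (first_map, second_map): the first map colors the rectangular
    area surrounding all objects of each group; the second map colors each
    individual object with its own similar color.
    """
    first = [[None] * width for _ in range(height)]
    second = [[None] * width for _ in range(height)]
    for gi, rects in enumerate(groups):
        gcolor, mcolors = make_group_colors(gi, len(rects))
        # rectangular area surrounding all objects of the group -> first map
        gx0 = min(r[0] for r in rects); gy0 = min(r[1] for r in rects)
        gx1 = max(r[2] for r in rects); gy1 = max(r[3] for r in rects)
        for y in range(gy0, gy1 + 1):
            for x in range(gx0, gx1 + 1):
                first[y][x] = gcolor
        # each object gets a different similar color in the second map
        for (x0, y0, x1, y1), mc in zip(rects, mcolors):
            for y in range(y0, y1 + 1):
                for x in range(x0, x1 + 1):
                    second[y][x] = mc
    return first, second
```

A dot inside the group rectangle but outside every object thus carries only the group color in the first map, while dots on an object additionally carry that object's own color in the second map.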
- the identification information acquisition unit 15′ acquires, from the identification layout information stored in the identification information storage unit 13, the identification information (color information) corresponding to the dot designated by the cursor of the mouse 400 on the image displayed on the display device 300 by the display control unit 14. For example, the identification information acquisition unit 15′ acquires color information from the first identification layout information when the mouse cursor is merely positioned at a desired position, and acquires color information from the second identification layout information when a click is performed at the position of the mouse cursor.
- when color information is acquired from the first identification layout information by the identification information acquisition unit 15′, the object selection unit 16′ selects the group of objects corresponding to that color information. When color information is acquired from the second identification layout information by the identification information acquisition unit 15′, the object selection unit 16′ selects the object corresponding to that color information. That is, the object selection unit 16′ selects the dot region indicated by the color information acquired from the first identification layout information as a group of objects, and the dot region indicated by the color information acquired from the second identification layout information as an individual object.
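The hover/click distinction and the reverse lookup from color back to a group or object can be sketched as follows. The `color_to_group` and `color_to_object` tables are assumed helpers that would be filled in while the layout maps are generated; they are not structures named in the patent.

```python
def pick(first_map, second_map, x, y, clicked,
         color_to_group, color_to_object):
    """Resolve what the cursor at (x, y) designates.

    Merely positioning the cursor (clicked=False) reads the first
    identification layout map and yields a group; a click reads the
    second map and yields an individual object.
    """
    if clicked:
        color = second_map[y][x]
        return ("object", color_to_object.get(color))
    color = first_map[y][x]
    return ("group", color_to_group.get(color))
```

The same screen position therefore resolves to either a whole group or a single object, depending only on the input event.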
- when the object selection unit 16′ selects an object or a group, the operation control unit 17′ performs control so that a predetermined operation related to the selected object or group is carried out. For example, when a desired group of objects is selected by the object selection unit 16′, the operation control unit 17′ controls the image generation unit 11 and the display control unit 14 so that the selected group is highlighted.
- when a desired object is selected, the operation control unit 17′ activates an application associated with the selected object, or controls the display control unit 14 so that information about the object is displayed on the display device 300.
- in this way, each object and each group can be identified by the identification information acquired from the position of the cursor.
- in the example above, whether color information is acquired from the first or the second identification layout information is determined simply by whether the mouse cursor is merely positioned at a desired position or is clicked there; however, the present invention is not limited to this.
- for example, in the first identification layout information, no color information for identifying the group may be given to the regions where the individual objects are located within the rectangular region representing the position of the group, with color information assigned only to the remaining area of the rectangular region. In this case, when a dot where an object is located is designated by the mouse cursor, color information is acquired from the second identification layout information, and when a dot in the remaining area of the rectangular region is designated, color information is acquired from the first identification layout information.
- for example, when the position of an object is designated by the mouse cursor, the application related to that object may be activated, while when the remaining area of the rectangular region is designated by the mouse cursor, the group may be highlighted. In this way, the content of the operation controlled by the operation control unit 17′ can be changed merely by moving the mouse cursor.
- in the application example described above, the chart image 84 such as a graph was treated as not being an object selectable by the mouse cursor; however, the present invention is not limited to this. For example, one graph represented by the chart image 84 may be treated as a group, and a plurality of break points on the graph as the individual objects in that group.
- in this case, the first identification layout information is generated by assigning different similar colors of the same color group to each dot on the graph, and the second identification layout information is generated by giving, to each dot corresponding to a break point, the same color information as that assigned to the break-point portion in the first identification layout information.
- with this, when the graph is designated with the mouse cursor, the graph is highlighted, and when a break-point portion is designated with the mouse cursor, the information of that break point can be displayed as a label. The information necessary for label display is indicated by the original data. If this information is stored in association with the second identification layout information, the corresponding label information can also be acquired when the color information corresponding to the position of the mouse cursor is obtained from the second identification layout information.
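Keying the label data by the same color that identifies each break point makes label retrieval a by-product of the color lookup, as the paragraph above suggests. A minimal sketch, with invented colors and label strings (the dictionary name and values are illustrative, not from the patent):

```python
# Label data taken from the original data, keyed by each break point's
# color in the second identification layout information (values invented).
breakpoint_labels = {
    (40, 0, 20): "Jan: 120 units",
    (40, 0, 40): "Feb: 135 units",
}

def label_at(second_map, x, y):
    """Return the label for the break point under the cursor, if any."""
    return breakpoint_labels.get(second_map[y][x])
```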
- in the embodiments described above, an example has been described in which color information is used as the identification information constituting the identification layout information; however, the present invention is not limited to this. That is, information other than color information may be used as long as it can identify each of the plurality of objects (the objects and groups in the application example) included in the image.
- the display device 300 may be provided with a touch panel, and dots on the image may be designated by touching the touch panel with a touch pen or a finger.
- in the above embodiments, the identification layout information is generated based on an image including a plurality of objects; however, the present invention is not limited to this. For example, the identification layout information may be generated based on the original data.
- the object processing apparatus and the object selection method of the present invention can be used for a computer system having a function of selecting a desired object from a plurality of objects displayed on a screen by using a pointing device.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
Claims (5)
- An object processing device comprising:
an identification information generation unit that, for an image including a plurality of objects, generates identification information capable of identifying each of the plurality of objects, and generates identification layout information in which the identification information is assigned to each dot associated with the plurality of objects on the same two-dimensional layout as the image;
an identification information acquisition unit that acquires, from the identification layout information, the identification information corresponding to a dot designated on the image displayed on a display device; and
an object selection unit that selects the object corresponding to the identification information acquired by the identification information acquisition unit,
wherein the identification information generation unit determines whether or not the objects overlap, and generates the identification layout information by assigning, for an object without overlap, the identification information of that object to each dot corresponding to the object, while for a plurality of overlapping objects, assigning the identification information of each of the plurality of objects to the dots corresponding to the non-overlapping portions, and dividing the overlapping portion into a plurality of small regions and assigning the identification information of the plurality of objects to the dots corresponding to the plurality of small regions.
- The object processing device according to claim 1, wherein the identification information of each dot generated by the identification information generation unit is color information.
- The object processing device according to claim 1, wherein the identification information generation unit classifies the plurality of objects included in the image into a plurality of groups, and generates, for each of the plurality of groups, first identification layout information in which identification information capable of identifying each of the plurality of groups is assigned to each dot corresponding to a predetermined region surrounding all objects included in the group on the same two-dimensional layout as the image, and second identification layout information in which, for each object included in the group, identification information capable of identifying the individual object is assigned to each dot corresponding to that individual object on the same two-dimensional layout as the image,
and wherein the object selection unit selects the group of objects corresponding to the identification information acquired from the first identification layout information by the identification information acquisition unit, and selects the object corresponding to the identification information acquired from the second identification layout information by the identification information acquisition unit.
- The object processing device according to claim 3, wherein the identification information of each dot generated by the identification information generation unit is color information in which similar colors within a predetermined range are grouped into the same color group, and the identification information generation unit generates the second identification layout information by assigning color information of similar colors belonging to the same color group to each dot corresponding to one group and to each object included in it, and in doing so assigning, to each dot corresponding to each object included in the one group, color information of a similar color different from that of the first identification layout information.
- An object selection method comprising:
an identification information generation step of, for an image including a plurality of objects, generating identification information capable of identifying each of the plurality of objects, and generating identification layout information in which the identification information is assigned to each dot associated with the plurality of objects on the same two-dimensional layout as the image;
an identification information acquisition step of acquiring, from the identification layout information, the identification information corresponding to a dot designated on the image displayed on a display device; and
an object selection step of selecting the object corresponding to the identification information acquired in the identification information acquisition step,
wherein, in the identification information generation step, it is determined whether or not the objects overlap, and the identification layout information is generated by assigning, for an object without overlap, the identification information of that object to each dot corresponding to the object, while for a plurality of overlapping objects, assigning the identification information of each of the plurality of objects to the dots corresponding to the non-overlapping portions, and dividing the overlapping portion into a plurality of small regions and assigning the identification information of the plurality of objects to the dots corresponding to the plurality of small regions.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10842189.2A EP2523082A4 (en) | 2010-01-07 | 2010-12-20 | OBJECT PROCESSING DEVICE AND OBJECT SELECTION METHOD |
US13/505,549 US8787661B2 (en) | 2010-01-07 | 2010-12-20 | Object processing device and object selection method |
CN201080051185.9A CN102640099B (zh) | 2010-01-07 | 2010-12-20 | 对象处理装置和对象选择方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-002088 | 2010-01-07 | ||
- JP2010002088A JP5417185B2 (ja) | 2010-01-07 | Object processing device and object selection method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011083676A1 true WO2011083676A1 (ja) | 2011-07-14 |
Family
ID=44305413
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/JP2010/072890 WO2011083676A1 (ja) | 2010-12-20 | Object processing device and object selection method |
Country Status (5)
Country | Link |
---|---|
US (1) | US8787661B2 (ja) |
EP (1) | EP2523082A4 (ja) |
JP (1) | JP5417185B2 (ja) |
CN (1) | CN102640099B (ja) |
WO (1) | WO2011083676A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN104820558A (zh) * | 2015-05-11 | 2015-08-05 | 北京白鹭时代信息技术有限公司 | Method and apparatus for picking an occluded image |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP6188370B2 (ja) | 2013-03-25 | 2017-08-30 | International Business Machines Corporation | Object classification method, apparatus, and program |
- WO2018072459A1 (zh) * | 2016-10-18 | 2018-04-26 | 华为技术有限公司 | Screenshot capture and reading method and terminal |
- CN114332311B (zh) * | 2021-12-05 | 2023-08-04 | 北京字跳网络技术有限公司 | Image generation method and apparatus, computer device, and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPS62140172A (ja) * | 1985-12-13 | 1987-06-23 | Canon Inc | Image composition method |
- JPH11265246A (ja) * | 1998-03-18 | 1999-09-28 | Omron Corp | Multi-window display device, multi-window display method, and medium storing a multi-window display program |
- WO2007126096A1 (ja) * | 2006-04-24 | 2007-11-08 | Sony Corporation | Image processing device and image processing method |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPS57209563A (en) * | 1981-06-19 | 1982-12-22 | Hitachi Ltd | Diagram editing system |
- JPH02287614A (ja) * | 1989-04-27 | 1990-11-27 | Sanyo Electric Co Ltd | Window management system |
- JP2966531B2 (ja) * | 1990-12-14 | 1999-10-25 | Fujitsu Ltd | Window moving method |
- JPH09223241A (ja) | 1996-02-19 | 1997-08-26 | Oki Electric Ind Co Ltd | Object selection method |
- US5892511A (en) * | 1996-09-30 | 1999-04-06 | Intel Corporation | Method for assisting window selection in a graphical user interface |
- US6002397A (en) * | 1997-09-30 | 1999-12-14 | International Business Machines Corporation | Window hatches in graphical user interface |
- US6691126B1 (en) * | 2000-06-14 | 2004-02-10 | International Business Machines Corporation | Method and apparatus for locating multi-region objects in an image or video database |
- JP3939550B2 (ja) * | 2001-12-28 | 2007-07-04 | Ricoh Co Ltd | Object consistency management method and system |
- US7215828B2 (en) * | 2002-02-13 | 2007-05-08 | Eastman Kodak Company | Method and system for determining image orientation |
- US7263220B2 (en) * | 2003-02-28 | 2007-08-28 | Eastman Kodak Company | Method for detecting color objects in digital images |
- JP4241647B2 (ja) * | 2005-03-04 | 2009-03-18 | Canon Inc | Layout control device, layout control method, and layout control program |
- WO2006129554A1 (en) * | 2005-05-30 | 2006-12-07 | Fujifilm Corporation | Album creating apparatus, album creating method and program |
- JP2007065356A (ja) * | 2005-08-31 | 2007-03-15 | Toshiba Corp | Composite object display device, composite object display method, and program |
2010
- 2010-01-07 JP JP2010002088A patent/JP5417185B2/ja active Active
- 2010-12-20 CN CN201080051185.9A patent/CN102640099B/zh active Active
- 2010-12-20 EP EP10842189.2A patent/EP2523082A4/en not_active Withdrawn
- 2010-12-20 US US13/505,549 patent/US8787661B2/en active Active
- 2010-12-20 WO PCT/JP2010/072890 patent/WO2011083676A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP2523082A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN104820558A (zh) * | 2015-05-11 | 2015-08-05 | 北京白鹭时代信息技术有限公司 | Method and apparatus for picking an occluded image |
- CN104820558B (zh) * | 2015-05-11 | 2018-03-30 | 北京白鹭时代信息技术有限公司 | Method and apparatus for picking an occluded image |
Also Published As
Publication number | Publication date |
---|---|
US20120213433A1 (en) | 2012-08-23 |
EP2523082A4 (en) | 2015-12-09 |
CN102640099A (zh) | 2012-08-15 |
US8787661B2 (en) | 2014-07-22 |
JP5417185B2 (ja) | 2014-02-12 |
JP2011141748A (ja) | 2011-07-21 |
EP2523082A1 (en) | 2012-11-14 |
CN102640099B (zh) | 2015-05-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10528236B2 (en) | Creating a display pattern for multiple data-bound graphic objects | |
US8566741B2 (en) | Internal scroll activation and cursor adornment | |
US8823744B2 (en) | Method for indicating annotations associated with a particular display view of a three-dimensional model independent of any display view | |
US20130055126A1 (en) | Multi-function affine tool for computer-aided design | |
- JP4759081B2 (ja) | Chart drawing device and chart drawing method | |
US20130019200A1 (en) | Methods for combination tools that zoom, pan, rotate, draw, or manipulate during a drag | |
CN102169408A (zh) | 链接手势 | |
CN102169407A (zh) | 上下文复用手势 | |
US10275910B2 (en) | Ink space coordinate system for a digital ink stroke | |
CN102141888A (zh) | 盖印手势 | |
CN102141887A (zh) | 画笔、复写和填充手势 | |
CN102725711A (zh) | 边缘手势 | |
CN102169365A (zh) | 裁剪、打孔和撕裂手势 | |
US20120317509A1 (en) | Interactive wysiwyg control of mathematical and statistical plots and representational graphics for analysis and data visualization | |
US10613725B2 (en) | Fixing spaced relationships between graphic objects | |
CN110473273B (zh) | 矢量图形的绘制方法、装置、存储介质及终端 | |
US10855481B2 (en) | Live ink presence for real-time collaboration | |
- JP5981175B2 (ja) | Drawing display device and drawing display program | |
- JP5417185B2 (ja) | Object processing device and object selection method | |
- US10475223B2 (en) | Generating multiple data-bound graphic objects | |
- JP2013012063A (ja) | Display control device | |
- JP6373710B2 (ja) | Graphic processing device and graphic processing program | |
- JP6526851B2 (ja) | Graphic processing device and graphic processing program | |
- JP4907156B2 (ja) | Three-dimensional pointing method, three-dimensional pointing device, and three-dimensional pointing program | |
US11023110B2 (en) | Creating an axis for data-bound objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080051185.9 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10842189 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2010842189 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010842189 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13505549 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |