WO2014027408A1 - Image display device, image display method, and program - Google Patents

Image display device, image display method, and program

Info

Publication number
WO2014027408A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
unit
image display
display device
Application number
PCT/JP2012/070765
Other languages
French (fr)
Japanese (ja)
Inventor
竜也 飯塚
Original Assignee
Necカシオモバイルコミュニケーションズ株式会社
Application filed by Necカシオモバイルコミュニケーションズ株式会社 filed Critical Necカシオモバイルコミュニケーションズ株式会社
Priority to PCT/JP2012/070765
Publication of WO2014027408A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to an image display device that includes an input unit such as a touch panel and performs processing in accordance with an operation instruction received via the input unit.
  • Portable electronic devices represented by tablet terminals are known (see Patent Documents 1 to 4).
  • the portable electronic device includes a display unit such as a liquid crystal display, a touch panel input unit provided on the screen, and a control unit that receives an operation instruction via the touch panel input unit and executes display processing according to the operation instruction.
  • the operator can perform work using a desired region of the image displayed on the screen through an input operation using the touch panel input unit.
  • JP 2008-139711 A (Patent Document 1), JP 2005-100084 A (Patent Document 2), JP 06-259191 A (Patent Document 3), JP 05-250090 A (Patent Document 4)
  • An object of the present invention is to provide an image display device, an image display method, and a program capable of solving the above-described problems and improving work efficiency.
  • a display unit that displays an image
  • an input unit that receives an operation on the image and outputs an operation instruction according to the received operation
  • an image display device is provided that includes a control unit that receives a plurality of operation instructions from the input unit and executes a plurality of processes based on these operation instructions in parallel.
  • an image display device includes: a display unit that displays an image; and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation.
  • an image display method for receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
  • a computer of an image display device comprising: a display unit that displays an image; and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation.
  • a program is provided that causes the computer to receive a plurality of operation instructions from the input unit and to execute a plurality of processes based on these operation instructions in parallel.
  • FIG. 1 is a block diagram illustrating a configuration of an image display device according to a first embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a procedure of the image display processing performed by the image display device illustrated in FIG. 1.
  • FIG. 3 is a block diagram showing the configuration of an image display device according to a second embodiment of the present invention.
  • FIG. 4 is a schematic diagram illustrating an arrangement example of the first to fourth direction instruction buttons of the image display device illustrated in FIG. 3.
  • FIG. 5 is a flowchart showing a procedure of the drawing direction change process performed by the image display device shown in FIG. 3.
  • FIG. 6A is a schematic diagram showing the display form of drawn objects before the drawing direction change process shown in FIG. 5 is executed.
  • FIG. 6B is a schematic diagram showing the display form of the drawn objects after the drawing direction change process is executed.
  • FIG. 7 is a diagram showing an example of the drawing top-and-bottom information.
  • 1 Control unit, 2 Display unit, 3 Input unit
  • FIG. 1 is a block diagram showing a configuration of an image display apparatus according to the first embodiment of the present invention.
  • the image display device shown in FIG. 1 includes a display unit 2 that displays an image, an input unit 3 that receives an operation on the image and outputs an operation instruction according to the received operation, and a control unit 1 that executes processing on the image based on the operation instruction received via the input unit 3.
  • the display unit 2 is a display device represented by a liquid crystal display or the like.
  • a projector or the like may be used as the display unit 2. In this case, an image from the projector is projected on the table.
  • the input unit 3 is, for example, a multi-touch type input unit, and can receive input instructions from a plurality of operators.
  • the multi-touch type input unit can simultaneously detect contact with a plurality of locations on the touch panel, and can instruct the position and movement of each contact location.
  • the contact is contact of an indicator such as a finger or a pen.
  • Detecting means for detecting the movement of an indicator such as a finger or a pen may be used as the input unit 3.
  • This detection means is an imaging means represented by an infrared sensor or a CCD camera, for example.
  • the detection means outputs an operation instruction for an image on the screen or the table based on the movement of the indicator.
  • the control unit 1 receives a plurality of operation instructions with different operation targets via the input unit 3, and executes processes corresponding to the operation instructions in parallel.
  • the control unit 1 may simultaneously execute some of the plurality of operation instructions received via the input unit 3. For example, the operation instruction of each operator does not always end instantaneously, and the length of each operation instruction does not match. An operation instruction of one operator may be long, and an operation instruction of another operator may be short. When a part of each of these operation instructions overlaps, the control unit 1 may execute the part of the overlapped operation instructions at the same time.
  • when the input unit 3 is the above detection means, the control unit 1 holds information that associates specific movements of the indicator with the content of operation instructions, and may determine the operation instruction from the detection means by referring to this information.
  • FIG. 2 is a flowchart showing one procedure of image display processing of the control unit 1.
  • when an operation instruction is input through the input unit 3 (step S10), the control unit 1 determines whether or not a plurality of operation instructions have been received simultaneously (step S11).
  • here, simultaneous reception of a plurality of operation instructions includes the case where at least parts of the respective operation instructions are received by the input unit 3 at the same time.
  • when a plurality of operation instructions have been received at the same time, the control unit 1 executes the processing based on each operation instruction simultaneously (step S12).
  • when a plurality of operation instructions have not been received simultaneously, the control unit 1 executes the processing based on the single received operation instruction (step S13).
  • the input unit 3 outputs a plurality of operation instructions according to the input operation for each object.
  • the control unit 1 receives an operation instruction (specifically, contact with a finger, a pen, or the like) for each object from the input unit 3.
  • when one image includes images of a plurality of parts (components), each part may be treated as an object.
  • the control unit 1 can determine, for each object, the operation instruction for that object. Specifically, based on the display position of each object on the screen (or table) and the contact position of each operation instruction, the control unit 1 can determine which object each operation instruction corresponds to.
  • when operation instructions for the objects are input at the same time, the control unit 1 executes the processing for each object in parallel based on each operation instruction.
  • Processing for the object includes processing such as rotation, enlargement, reduction, movement, and color change.
  • Rotation processing is processing that rotates an object in a direction instructed through contact with a finger or a pen.
  • the enlargement / reduction process is a process of enlarging or reducing the object in accordance with an operation instruction such as increasing or decreasing the interval between the contact points of two fingers.
  • the moving process is a process of moving the object in the direction instructed through contact with a finger or a pen.
  • the color changing process is a process of changing the color of an object to a desired color through contact with a finger or a pen.
  • an object is designated and a color selection screen is displayed, and the color of the object is changed to a color selected from the color selection screen.
  • Object designation, display of a color selection screen, color selection, etc. are all executed according to instructions via contact with a finger or pen.
  • the screen is divided into a plurality of areas, and the input unit 3 outputs an operation instruction (specifically, a contact with a finger or a pen) for each area.
  • the control unit 1 receives an operation instruction for each area from the input unit 3.
  • the control unit 1 can determine, for each area, the operation instruction for that area. Specifically, based on the position information of each area on the screen and the contact position of each operation instruction, the control unit 1 can determine which area each operation instruction corresponds to.
  • when operation instructions for a plurality of areas are input at the same time, the control unit 1 executes, in parallel, the processing (rotation, enlargement, reduction, movement, color change, and so on) for the corresponding areas based on the respective operation instructions.
  • when the display unit 2 displays a plurality of images separated by layer (or by window), the input unit 3 outputs an operation instruction (specifically, a contact with a finger or a pen) for each layer.
  • the control unit 1 receives an operation instruction for each layer from the input unit 3.
  • the control unit 1 can determine, for each layer, the operation instruction for that layer. Specifically, based on the position information of each layer on the screen and the contact position of each operation instruction, the control unit 1 can determine which layer each operation instruction corresponds to. When layers partially overlap, the position information of the displayed (visible) area of each layer is used.
  • when operation instructions for a plurality of layers are input at the same time, the control unit 1 executes, in parallel, the processing (rotation, enlargement, reduction, movement, color change, and so on) for the corresponding layers based on the respective operation instructions.
  • the input unit 3 outputs a plurality of operation instructions according to an input operation for each of a plurality of arbitrarily set areas on the image.
  • the arbitrary area is designated by using a closed figure (circle, ellipse, quadrangle, etc.) having an arbitrary shape surrounded by a straight line or a curve.
  • a closed figure is designated through contact with a finger or the like.
  • the control unit 1 receives an operation instruction for each arbitrary region on the image from the input unit 3.
  • the control unit 1 can determine, for each arbitrarily set region, the operation instruction for that region. Specifically, based on the position information of each region on the screen and the contact position of each operation instruction, the control unit 1 can determine which arbitrarily set region each operation instruction corresponds to.
  • when operation instructions for a plurality of arbitrarily set regions are input at the same time, the control unit 1 executes, in parallel, the processing (rotation, enlargement, reduction, movement, color change, and so on) for the corresponding regions based on the respective operation instructions.
  • the detection means detects the movement of an indicator (such as a finger or a pen) for issuing an operation instruction, and outputs an operation instruction corresponding to the detected movement.
  • the control unit 1 accepts a plurality of operation instructions according to the movement of the indicator with respect to each operation target output from the detection means.
  • the image display device can be applied to a terminal device such as a tablet terminal or a mobile phone terminal, and can also be applied to an electronic device (for example, a game machine) that operates independently.
  • the image display apparatus of the present embodiment can be applied to a conference system, an expert system (decision support system), an image creation system, a CAD (Computer Aided Design) system, and the like.
  • Each operator puts the created card (image) in the summary area. For example, the operator touches (selects) a card (image) and drags it in the direction in which the operator wants to move, thereby placing the card (image) in the summary area.
  • the image display apparatus of the present embodiment may be configured using a computer (CPU: Central Processing unit) that operates according to a program.
  • the program can cause a computer to execute at least the image display processing of the control unit 1.
  • the program may be provided using a computer-readable recording medium, for example, an optical disc such as a CD (Compact Disc) or a DVD (Digital Video Disc), a USB (Universal Serial Bus) memory, a memory card, or the like. It may be provided via a network (for example, the Internet).
  • FIG. 3 is a block diagram showing a configuration of an image display apparatus according to the second embodiment of the present invention.
  • the image display device includes a control unit 1, a display unit 2, an input unit 3, and a storage unit 4.
  • the storage unit 4 is a storage device represented by a semiconductor memory or the like.
  • the display unit 2 and the input unit 3 are basically the same as those described in the first embodiment.
  • the input unit 3 includes first to fourth direction instruction buttons provided on the top, bottom, left, and right (four sides) of the end region of the screen (or table).
  • FIG. 4 schematically shows an arrangement example of the first to fourth direction instruction buttons.
  • first to fourth direction instruction buttons 31 to 34 are provided around the display surface 21.
  • the first direction instruction button 31 is disposed adjacent to the upper side of the display surface 21.
  • when the first direction instruction button 31 is pressed, the input unit 3 supplies to the control unit 1 a first direction instruction signal indicating the drawing direction in which the upper side of the display surface 21 is treated as the bottom of drawn objects.
  • the second direction instruction button 32 is disposed adjacent to the lower side portion of the display surface 21.
  • when the second direction instruction button 32 is pressed, the input unit 3 supplies to the control unit 1 a second direction instruction signal indicating the drawing direction in which the lower side of the display surface 21 is treated as the bottom of drawn objects.
  • the third direction instruction button 33 is disposed adjacent to the left side portion of the display surface 21.
  • when the third direction instruction button 33 is pressed, the input unit 3 supplies to the control unit 1 a third direction instruction signal indicating the drawing direction in which the left side of the display surface 21 is treated as the bottom of drawn objects.
  • the fourth direction instruction button 34 is disposed adjacent to the right side portion of the display surface 21.
  • when the fourth direction instruction button 34 is pressed, the input unit 3 supplies to the control unit 1 a fourth direction instruction signal indicating the drawing direction in which the right side of the display surface 21 is treated as the bottom of drawn objects.
  • in addition to the function, described in the first embodiment, of executing processes based on a plurality of operation instructions in parallel, the control unit 1 includes a drawing direction determination unit 11, a drawing direction changing unit 12, and a top-and-bottom information creation unit 13.
  • the top-and-bottom information creation unit 13 identifies the top-and-bottom orientation (display orientation), with respect to a reference direction, of a drawn object that has been written. For example, when an operation instruction for the image is received via the input unit 3, the top-and-bottom information creation unit 13 identifies the orientation of the drawn object that is the target of the operation instruction based on the position of that drawn object.
  • specifically, the top-and-bottom information creation unit 13 determines which of the upper, lower, left, and right sides of the display surface 21 shown in FIG. 4 the drawn object is closest to, and identifies the top-and-bottom orientation (display orientation) with respect to the reference direction by treating the nearest side as the bottom of the drawn object.
  • for each drawn object, the top-and-bottom information creation unit 13 creates drawing top-and-bottom information 41 that includes identification information and information indicating the top-and-bottom orientation (display orientation), and stores the created drawing top-and-bottom information 41 in the storage unit 4.
  • when the drawing direction determination unit 11 receives one of the first to fourth direction instruction signals from the input unit 3, it determines the drawing direction based on the received direction instruction signal.
  • the drawing direction changing unit 12 compares the drawing direction determined by the drawing direction determination unit 11 with the top-and-bottom information of each drawn object in the drawing top-and-bottom information 41 stored in the storage unit 4, and rotates any drawn object whose orientation does not match the drawing direction so that it is aligned with the drawing direction.
  • FIG. 5 is a flowchart showing a procedure of the drawing direction changing process.
  • the drawing direction determination unit 11 determines whether one of the first to fourth direction instruction signals has been received from the input unit 3 as a display shaping request (step S20).
  • the drawing direction determination unit 11 determines the drawing direction (display direction) according to the received direction instruction signal (step S21).
  • the drawing direction changing unit 12 acquires the top-and-bottom information of one drawn object from the drawing top-and-bottom information 41 stored in the storage unit 4 (step S22), and determines whether or not the acquired top-and-bottom information matches the drawing direction (display direction) determined in step S21 (step S23).
  • if the top-and-bottom information does not match the drawing direction (display direction) in step S23, the drawing direction changing unit 12 rotates the drawn object to match the drawing direction (display direction) (step S24).
  • if it is determined in step S23 that the top-and-bottom information matches the drawing direction (display direction), or after the process of step S24 has been executed, the drawing direction changing unit 12 determines whether the top-and-bottom information of all the drawn objects included in the drawing top-and-bottom information 41 has been checked (step S25).
  • if it is determined in step S25 that there is top-and-bottom information of a drawn object that has not yet been checked, the processing of steps S22 to S24 is performed on the top-and-bottom information of that unchecked drawn object.
  • if it is determined in step S25 that the top-and-bottom information of all the drawn objects has been checked, the drawing direction change process ends.
  • FIG. 6A shows a display form of drawn objects before execution of the drawing direction change process.
  • FIG. 6B shows a display form of the drawn objects after execution of the drawing direction change process.
  • FIG. 7 shows the drawing top-and-bottom information 41 for the display form shown in FIG. 6A.
  • characters “A”, “B”, and “C” are displayed on the display surface 21 as drawn objects.
  • the orientation of the character “A” is set so that the lower side of the display surface 21 is its bottom.
  • the direction of the top and bottom of the character “B” is set so that the right side portion of the display surface 21 faces down.
  • the letter “C” has its top and bottom orientation set so that the left side of the display surface 21 is on the bottom.
  • the top and bottom information of the characters “A”, “B”, and “C” is stored in the storage unit 4 as the drawing top and bottom information 41 as shown in FIG.
  • in FIG. 7, the identification information of each drawn object is indicated by “A”, “B”, and “C”, and its top-and-bottom information is indicated by an arrow.
  • when the second direction instruction button 32 is pressed, a second direction instruction signal is supplied from the input unit 3 to the control unit 1.
  • the display direction is specified based on the second direction instruction signal, and the orientation of each drawn object is aligned according to the specified display direction.
  • the characters “A”, “B”, and “C” are displayed in a top-and-bottom orientation with the lower side of the display surface 21 facing down.
  • a plurality of drawn objects having different top and bottom directions can be displayed in an arbitrary direction, so that the contents of the drawn object can be easily grasped.
  • the top-and-bottom information creation unit 13 may be omitted.
  • in that case, each operator may register the top-and-bottom information of the drawn objects that the operator has operated in the storage unit 4 through an input operation using the input unit 3.
  • the image display apparatus of the present embodiment may be configured using a computer (CPU: Central Processing unit) that operates according to a program.
  • the program can cause a computer to execute at least image display processing and drawing direction change processing of the control unit 1.
  • the program may be provided using a computer-readable recording medium, for example, an optical disc such as a CD (Compact Disc) or a DVD (Digital Video Disc), a USB (Universal Serial Bus) memory, a memory card, or the like. It may be provided via a network (for example, the Internet).
  • a part or all of the image display device of each of the above-described embodiments may take forms such as the following supplementary notes 1 to 11, but is not limited to these forms.
  • Appendix 1: An image display device comprising: a display unit that displays an image; an input unit that receives an operation on the image and outputs an operation instruction according to the received operation; and a control unit that receives a plurality of operation instructions from the input unit and executes a plurality of processes based on the operation instructions in parallel.
  • Appendix 2 The image display device according to appendix 1, wherein the control unit simultaneously executes at least a part of each of the plurality of operation instructions.
  • Appendix 3: The image display device according to appendix 1 or 2, wherein the image includes a plurality of operation targets, and the input unit outputs the plurality of operation instructions in response to input operations on the plurality of operation targets.
  • Appendix 4: The image display device according to appendix 3, wherein the plurality of operation targets are images of a plurality of parts, a plurality of regions, images of a plurality of layers, or a combination of two or more of these.
  • Appendix 5: The image display device according to appendix 4, wherein at least one of the plurality of regions is an arbitrarily set region.
  • Appendix 6: The image display device according to any one of appendices 1 to 5, wherein the input unit includes a touch panel that simultaneously detects contacts of a plurality of indicators used for the input operations, and outputs the plurality of operation instructions based on the contacts of the plurality of indicators.
  • Appendix 7: The image display device according to any one of the above appendices, wherein the input unit includes detection means that detects movements of a plurality of indicators used for the input operations and outputs the plurality of operation instructions based on the detected movements of the indicators.
  • Appendix 8: The image display device according to any one of appendices 1 to 7, further comprising a storage unit that stores top-and-bottom information indicating the display direction, with respect to a reference direction, of each of a plurality of drawn objects that are operation targets on the image, wherein the control unit refers to the top-and-bottom information of the plurality of drawn objects stored in the storage unit and aligns their display directions in an arbitrary direction.
  • Appendix 9: The image display device according to appendix 8, wherein the control unit includes: a drawing direction determination unit that, when any of first to fourth direction instruction buttons is pressed, determines a drawing direction based on the pressed direction instruction button; and a drawing direction changing unit that compares the drawing direction determined by the drawing direction determination unit with the top-and-bottom information of the plurality of drawn objects stored in the storage unit, and performs rotation control to align, with the drawing direction, the display direction of any of the plurality of drawn objects whose display direction differs from the drawing direction.
  • Appendix 10: An image display method performed in an image display device that includes a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, the method comprising receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
  • Appendix 11: A program that causes a computer of an image display device, the image display device including a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, to execute a process of receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.

Abstract

An image display device is provided with: a display unit (2) for displaying an image; an input unit (3) for receiving an operation with respect to an image, and outputting an operation instruction corresponding to the received operation; and a control unit (1) for receiving a plurality of operation instructions from the input unit (3) and executing in parallel a plurality of processes on the basis of the operation instructions.

Description

Image display device, image display method, and program
The present invention relates to an image display device that includes an input unit such as a touch panel and performs processing in accordance with an operation instruction received via the input unit.
Portable electronic devices represented by tablet terminals are known (see Patent Documents 1 to 4).
The portable electronic device includes a display unit such as a liquid crystal display, a touch panel input unit provided on the screen, and a control unit that receives an operation instruction via the touch panel input unit and executes display processing according to the operation instruction.
According to the above portable electronic device, for example, the operator can perform work using a desired region of the image displayed on the screen through an input operation using the touch panel input unit.
Patent Document 1: JP 2008-139711 A; Patent Document 2: JP 2005-100084 A; Patent Document 3: JP 06-259191 A; Patent Document 4: JP 05-250090 A
A plurality of operators may work using one device at the same time. In the devices described in Patent Documents 1 to 4, when an operator wants to perform work while another operator is working, the operator must wait for the other operator's work to be completed. Since a plurality of operators cannot work at the same time in this way, there is a problem that work efficiency is poor.
An object of the present invention is to provide an image display device, an image display method, and a program capable of solving the above problem and improving work efficiency.
In order to achieve the above object, according to one aspect of the present invention, there is provided an image display device including: a display unit that displays an image; an input unit that receives an operation on the image and outputs an operation instruction according to the received operation; and a control unit that receives a plurality of operation instructions from the input unit and executes a plurality of processes based on these operation instructions in parallel.
According to another aspect of the present invention, there is provided an image display method performed in an image display device that includes a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, the method comprising receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
According to still another aspect of the present invention, there is provided a program that causes a computer of an image display device, which includes a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, to execute a process of receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
FIG. 1 is a block diagram showing the configuration of an image display device according to a first embodiment of the present invention.
FIG. 2 is a flowchart showing a procedure of the image display processing performed by the image display device shown in FIG. 1.
FIG. 3 is a block diagram showing the configuration of an image display device according to a second embodiment of the present invention.
FIG. 4 is a schematic diagram showing an arrangement example of the first to fourth direction instruction buttons of the image display device shown in FIG. 3.
FIG. 5 is a flowchart showing a procedure of the drawing direction change process performed by the image display device shown in FIG. 3.
FIG. 6A is a schematic diagram showing the display form of drawn objects before the drawing direction change process shown in FIG. 5 is executed.
FIG. 6B is a schematic diagram showing the display form of the drawn objects after the drawing direction change process shown in FIG. 5 is executed.
FIG. 7 is a diagram showing an example of the drawing top-and-bottom information.
1 Control unit
2 Display unit
3 Input unit
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
(First Embodiment)
FIG. 1 is a block diagram showing the configuration of an image display device according to the first embodiment of the present invention.
The image display device shown in FIG. 1 includes a display unit 2 that displays an image, an input unit 3 that receives an operation on the image and outputs an operation instruction according to the received operation, and a control unit 1 that executes processing on the image based on the operation instruction received via the input unit 3.
The display unit 2 is a display device represented by a liquid crystal display or the like. Alternatively, a projector or the like may be used as the display unit 2; in this case, the image from the projector is projected onto a table.
The input unit 3 is, for example, multi-touch input means and can receive input instructions from a plurality of operators. The multi-touch input means can, for example, simultaneously detect contacts at a plurality of locations on a touch panel, and a position or a movement can be indicated at each contact location. Here, a contact is a contact of an indicator such as a finger or a pen.
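As a purely illustrative aid (not part of the patent), the following minimal Python sketch shows one way the contacts reported by such multi-touch input means could be represented; the class name and fields are assumptions chosen for this example.
    from dataclasses import dataclass

    @dataclass
    class Contact:
        """One contact point detected by the multi-touch input means."""
        contact_id: int   # stays stable while the finger or pen remains on the panel
        x: float          # contact position on the screen (or on the projected table)
        y: float

    # Several contacts can be reported in the same frame, each with its own position
    # and movement, so instructions from several operators can arrive together.
    frame = [Contact(contact_id=0, x=120.0, y=80.0),
             Contact(contact_id=1, x=640.0, y=410.0)]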
Detection means that detects the movement of an indicator such as a finger or a pen may also be used as the input unit 3. This detection means is, for example, imaging means represented by an infrared sensor or a CCD camera. The detection means outputs an operation instruction for the image on the screen or on the table based on the movement of the indicator.
The control unit 1 receives, via the input unit 3, a plurality of operation instructions whose operation targets differ, and executes the processing corresponding to each operation instruction in parallel. The control unit 1 may simultaneously execute parts of the plurality of operation instructions received via the input unit 3. For example, each operator's operation instruction does not necessarily end instantaneously, and the lengths of the operation instructions do not necessarily match; one operator's operation instruction may be long while another operator's is short. When parts of such operation instructions overlap, the control unit 1 may execute the overlapping parts of the operation instructions at the same time.
When the input unit 3 is the above detection means, the control unit 1 holds information that associates specific movements of the indicator with the content of operation instructions, and may determine the operation instruction from the detection means by referring to this information.
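To illustrate the idea of a table associating specific indicator movements with operation instructions, here is a hedged Python sketch; the movement names and the helper function are hypothetical and are not defined by the patent.
    from typing import Optional

    # Hypothetical association between a classified indicator movement and
    # the content of the corresponding operation instruction.
    MOVEMENT_TO_INSTRUCTION = {
        "drag": "move",
        "two_point_spread": "enlarge",
        "two_point_pinch": "reduce",
        "circular_sweep": "rotate",
    }

    def instruction_for(movement: str) -> Optional[str]:
        """Look up the operation instruction for a movement reported by the detection means."""
        return MOVEMENT_TO_INSTRUCTION.get(movement)

    assert instruction_for("two_point_pinch") == "reduce"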
FIG. 2 is a flowchart showing one procedure of the image display processing of the control unit 1.
Referring to FIG. 2, when an operation instruction is input through the input unit 3 (step S10), the control unit 1 determines whether or not a plurality of operation instructions have been received simultaneously (step S11). Here, simultaneous reception of a plurality of operation instructions includes the case where at least parts of the respective operation instructions are received by the input unit 3 at the same time.
When a plurality of operation instructions have been received at the same time, the control unit 1 executes the processing based on each operation instruction simultaneously (step S12).
When a plurality of operation instructions have not been received simultaneously, the control unit 1 executes the processing based on the single received operation instruction (step S13).
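As one way to picture steps S10 to S13, the sketch below runs simultaneously received instructions in parallel with a thread pool; using threads is an assumption made for this example only, since the patent does not prescribe a concrete parallelization mechanism.
    from concurrent.futures import ThreadPoolExecutor

    def handle_instructions(instructions, execute):
        """Rough outline of steps S10 to S13."""
        if len(instructions) > 1:                  # S11: several instructions received together?
            with ThreadPoolExecutor() as pool:     # S12: execute their processing in parallel
                list(pool.map(execute, instructions))
        elif instructions:                         # S13: a single instruction is simply executed
            execute(instructions[0])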
According to the above image display processing, when a plurality of operators simultaneously input, through the input unit 3, operation instructions for the image displayed on the screen (or table), the processing according to those operation instructions is executed simultaneously. Therefore, when an operator wants to perform work while another operator is working, the work can be performed without waiting for the other operator's work to be completed, and as a result, work efficiency improves.
Next, methods that enable simultaneous reception and simultaneous processing of a plurality of operation instructions will be described specifically. Here, the operation is described on the assumption that the input unit 3 is multi-touch input means.
For example, when the display unit 2 displays a plurality of objects, the input unit 3 outputs a plurality of operation instructions according to input operations on the respective objects. The control unit 1 receives, from the input unit 3, an operation instruction (specifically, a contact of a finger, a pen, or the like) for each object. Here, when one image includes images of a plurality of parts (components), each part may be treated as an object.
The control unit 1 can determine, for each object, the operation instruction for that object. Specifically, based on the display position of each object on the screen (or table) and the contact position of each operation instruction, the control unit 1 can determine which object each operation instruction corresponds to.
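A simple way to realize this correspondence is a hit test against each object's displayed bounds; the following Python sketch is only an illustration under the assumption that objects are described by axis-aligned bounding boxes.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DisplayedObject:
        name: str
        x: float        # top-left corner of the object's bounding box on the screen
        y: float
        width: float
        height: float

    def object_at(objects, cx: float, cy: float) -> Optional[DisplayedObject]:
        """Return the object whose displayed area contains the contact position (cx, cy)."""
        for obj in objects:
            if obj.x <= cx <= obj.x + obj.width and obj.y <= cy <= obj.y + obj.height:
                return obj
        return None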
When operation instructions for the objects are input at the same time, the control unit 1 executes the processing for each object in parallel based on each operation instruction.
Processing on an object includes processing such as rotation, enlargement, reduction, movement, and color change.
The rotation process rotates an object in the direction instructed through contact with a finger, a pen, or the like.
The enlargement or reduction process enlarges or reduces an object in accordance with an operation instruction such as widening or narrowing the interval between the contact points of two fingers.
The movement process moves an object in the direction instructed through contact with a finger, a pen, or the like.
The color change process changes the color of an object to a desired color through contact with a finger, a pen, or the like. In this process, an object is designated and a color selection screen is displayed, and the color of the object is changed to the color selected on the color selection screen. The object designation, the display of the color selection screen, and the color selection are all executed according to instructions given through contact with a finger, a pen, or the like.
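For instance, the enlargement or reduction factor can be derived from how the distance between the two contact points changes; the short sketch below illustrates that idea and is not taken from the patent.
    import math

    def scale_from_pinch(p0_old, p1_old, p0_new, p1_new) -> float:
        """Scale factor implied by a two-finger gesture: a widening gap enlarges
        the object, a narrowing gap reduces it (points are (x, y) tuples)."""
        old = math.dist(p0_old, p1_old)
        new = math.dist(p0_new, p1_new)
        return new / old if old else 1.0

    # Two fingers moving apart -> factor greater than 1, i.e. enlargement.
    factor = scale_from_pinch((100, 100), (200, 100), (80, 100), (220, 100))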
There is also the following other method that enables simultaneous reception and simultaneous processing of a plurality of operation instructions.
The screen is divided into a plurality of regions, and the input unit 3 outputs an operation instruction (specifically, a contact of a finger, a pen, or the like) for each region. The control unit 1 receives the operation instruction for each region from the input unit 3.
The control unit 1 can determine, for each region, the operation instruction for that region. Specifically, based on the position information of each region on the screen and the contact position of each operation instruction, the control unit 1 can determine which region each operation instruction corresponds to.
When operation instructions for a plurality of regions are input at the same time, the control unit 1 executes, in parallel, the processing (rotation, enlargement, reduction, movement, color change, and so on) for the corresponding regions based on the respective operation instructions.
Furthermore, there is yet another method that enables simultaneous reception and simultaneous processing of a plurality of operation instructions.
For example, when the display unit 2 displays a plurality of images separated by layer (or by window), the input unit 3 outputs an operation instruction (specifically, a contact of a finger, a pen, or the like) for each layer. The control unit 1 receives the operation instruction for each layer from the input unit 3.
The control unit 1 can determine, for each layer, the operation instruction for that layer. Specifically, based on the position information of each layer on the screen and the contact position of each operation instruction, the control unit 1 can determine which layer each operation instruction corresponds to. When layers partially overlap, the position information of the displayed (visible) area of each layer is used.
When operation instructions for a plurality of layers are input at the same time, the control unit 1 executes, in parallel, the processing (rotation, enlargement, reduction, movement, color change, and so on) for the corresponding layers based on the respective operation instructions.
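One possible way to resolve which layer a contact belongs to when layers overlap is to test them from front to back and take the first hit, so that only the visible part of a lower layer can be selected; this Python sketch is an illustration under that assumption.
    def layer_at(layers, cx: float, cy: float):
        """Pick the layer an instruction targets.  `layers` is assumed to be ordered
        front to back, each entry holding its bounds as (x, y, width, height)."""
        for layer in layers:                       # front-to-back: the frontmost hit wins
            x, y, w, h = layer["bounds"]
            if x <= cx <= x + w and y <= cy <= y + h:
                return layer
        return None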
Furthermore, there is still another method that enables simultaneous reception and simultaneous processing of a plurality of operation instructions.
The input unit 3 outputs a plurality of operation instructions according to input operations on a plurality of arbitrarily set regions on the image. Here, an arbitrary region is designated using a closed figure of arbitrary shape (a circle, an ellipse, a quadrangle, or the like) enclosed by straight lines or curves. The closed figure is designated through contact with a finger or the like. The control unit 1 receives, from the input unit 3, the operation instruction for each arbitrary region on the image.
The control unit 1 can determine, for each arbitrary region, the operation instruction for that region. Specifically, based on the position information of each arbitrary region on the screen and the contact position of each operation instruction, the control unit 1 can determine which arbitrary region each operation instruction corresponds to.
When operation instructions for a plurality of arbitrary regions are input at the same time, the control unit 1 executes, in parallel, the processing (rotation, enlargement, reduction, movement, color change, and so on) for the corresponding arbitrary regions based on the respective operation instructions.
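Deciding whether a contact falls inside such a traced closed figure amounts to a point-in-polygon test; the sketch below uses the standard ray-casting method on the figure's vertices and is offered only as an illustration.
    def inside_closed_figure(polygon, cx: float, cy: float) -> bool:
        """Ray-casting test: True when (cx, cy) lies inside the closed figure,
        given as a list of (x, y) vertices traced by the operator."""
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            if (y1 > cy) != (y2 > cy):                      # the edge crosses the horizontal ray
                x_cross = x1 + (cy - y1) * (x2 - x1) / (y2 - y1)
                if cx < x_cross:
                    inside = not inside
        return inside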
Each of the methods described above can also be realized using the above-described detection means as the input unit 3. In this case, the detection means detects the movement of an indicator (a finger, a pen, or the like) used to give an operation instruction, and outputs an operation instruction corresponding to the detected movement. The control unit 1 accepts the plurality of operation instructions, output from the detection means, that correspond to the movements of the indicators with respect to the respective operation targets.
The image display device of the present embodiment can be applied to terminal devices such as tablet terminals and mobile phone terminals, and can also be applied to electronic devices that operate independently (for example, game machines).
The image display device of the present embodiment can also be applied to a conference system, an expert system (decision support system), an image creation system, a CAD (Computer Aided Design) system, and the like.
Furthermore, as a system to which the image display device of the present embodiment is applied, a multiple simultaneous operation system that digitizes the KJ method is conceivable.
The multiple simultaneous operation system is briefly described below.
In the KJ method, a large amount of information is organized using cards. In the multiple simultaneous operation system, images (cards) displayed on one table are used instead. The table displays a summary area for collecting cards and an area where each operator creates cards. The following operations (1) to (5) are then performed.
(1) Each operator creates a card (image) in his or her own work area.
(2) Each operator sends the created card (image) to the summary area. For example, the operator touches (selects) a card (image) and drags it in the direction in which the operator wants to move it, thereby sending the card (image) to the summary area.
(3) The person in charge of summarizing places the card (image) at an appropriate position in the summary area.
(4) The positions of the cards (images) in the summary area are changed at any time based on the addition of other cards (images) and on the results of discussion.
(5) Similar cards (images) in the summary area are divided into several groups. In grouping the cards (images), the groups may be organized hierarchically.
The above operations (1) to (5) may be performed simultaneously.
The image display device of the present embodiment may be configured using a computer (CPU: Central Processing Unit) that operates according to a program. The program only needs to be capable of causing the computer to execute at least the image display processing of the control unit 1. The program may be provided using a computer-readable recording medium, for example, an optical disc such as a CD (Compact Disc) or a DVD (Digital Video Disc), a USB (Universal Serial Bus) memory, or a memory card, or may be provided via a communication network (for example, the Internet).
(Second Embodiment)
When a plurality of operators surround a screen (or a table on which an image is displayed) and write drawn objects (images including characters, figures, and the like) by performing input operations from the four sides, the orientation of the drawn objects written by the respective operators (the up, down, left, and right orientation of a drawn object when the screen or table is viewed from a given direction) differs from operator to operator. Since the orientations of the drawn objects are not unified in this way, the content of each drawn object is hard to grasp when the screen or table is viewed from a given direction.
FIG. 3 is a block diagram showing the configuration of an image display device according to the second embodiment of the present invention.
Referring to FIG. 3, the image display device includes a control unit 1, a display unit 2, an input unit 3, and a storage unit 4. The storage unit 4 is a storage device represented by a semiconductor memory or the like.
The display unit 2 and the input unit 3 are basically the same as those described in the first embodiment. However, the input unit 3 further includes first to fourth direction instruction buttons provided along the top, bottom, left, and right edges (four sides) of the end region of the screen (or table).
FIG. 4 schematically shows an arrangement example of the first to fourth direction instruction buttons.
Referring to FIG. 4, the first to fourth direction instruction buttons 31 to 34 are provided around the display surface 21.
The first direction instruction button 31 is disposed adjacent to the upper side of the display surface 21. When the first direction instruction button 31 is pressed, the input unit 3 supplies to the control unit 1 a first direction instruction signal indicating the drawing direction in which the upper side of the display surface 21 is treated as the bottom of drawn objects.
The second direction instruction button 32 is disposed adjacent to the lower side of the display surface 21. When the second direction instruction button 32 is pressed, the input unit 3 supplies to the control unit 1 a second direction instruction signal indicating the drawing direction in which the lower side of the display surface 21 is treated as the bottom of drawn objects.
The third direction instruction button 33 is disposed adjacent to the left side of the display surface 21. When the third direction instruction button 33 is pressed, the input unit 3 supplies to the control unit 1 a third direction instruction signal indicating the drawing direction in which the left side of the display surface 21 is treated as the bottom of drawn objects.
The fourth direction instruction button 34 is disposed adjacent to the right side of the display surface 21. When the fourth direction instruction button 34 is pressed, the input unit 3 supplies to the control unit 1 a fourth direction instruction signal indicating the drawing direction in which the right side of the display surface 21 is treated as the bottom of drawn objects.
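The four buttons can be thought of as a mapping from the pressed edge to the direction instruction signal supplied to the control unit; the enum and the returned dictionary below are an assumed encoding used only for illustration.
    from enum import Enum

    class Edge(Enum):
        TOP = 1      # first direction instruction button 31
        BOTTOM = 2   # second direction instruction button 32
        LEFT = 3     # third direction instruction button 33
        RIGHT = 4    # fourth direction instruction button 34

    def direction_signal(pressed: Edge) -> dict:
        """Signal the input unit supplies when a button is pressed: the edge next to
        the pressed button becomes the 'down' side of drawn objects."""
        return {"signal_number": pressed.value, "down_side": pressed.name}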
In addition to the function, described in the first embodiment, of executing processes based on a plurality of operation instructions in parallel, the control unit 1 includes a drawing direction determination unit 11, a drawing direction changing unit 12, and a top-and-bottom information creation unit 13.
 天地情報作成部13は、基準方向に対する書き込まれた描画物の天地の向き(表示向き)を特定する。例えば、天地情報作成部13は、入力部3を介して画像に対する操作指示を受け付けると、その操作指示の対象である描画物の位置に基づいてその描画物の天地(表示向き)を特定する。 The top / bottom information creation unit 13 identifies the top / bottom direction (display direction) of the written object with respect to the reference direction. For example, when receiving an operation instruction for an image via the input unit 3, the top / bottom information creation unit 13 specifies the top / bottom (display orientation) of the drawing based on the position of the drawing that is the target of the operation instruction.
 具体的には、天地情報作成部13は、描画物が図4に示した表示面21の上下左右の辺部のどの辺部に近いかを判定し、最も近い辺部側を描画物の下側として、基準方向に対する天地の向き(表示向き)を特定する。天地情報作成部13は、描画物毎に、その識別情報と天地の向き(表示向き)を示す情報を含む描画物天地情報41を作成し、その作成した描画物天地情報41を記憶部4に格納する。 Specifically, the top and bottom information creation unit 13 determines which side of the top, bottom, left, and right sides of the display surface 21 shown in FIG. 4 is close, and places the nearest side below the drawing. As the side, the top and bottom direction (display direction) with respect to the reference direction is specified. The top-and-bottom information creation unit 13 creates, for each drawing object, drawing object top-and-bottom information 41 including identification information and information indicating the top and bottom direction (display direction), and the created drawing and top-and-bottom information 41 is stored in the storage unit 4. Store.
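 A minimal sketch of this nearest-edge rule, continuing the illustrative code above and again using invented names (infer_orientation, orientation_info, register_object) rather than anything defined by the embodiment, might be:

```python
def infer_orientation(x: float, y: float,
                      screen_w: float, screen_h: float) -> Orientation:
    """Return the edge of the display surface closest to a drawn object at (x, y),
    treating that edge as the object's bottom (its top/bottom information)."""
    distances = {
        Orientation.TOP_EDGE_DOWN: y,                # distance to the top edge
        Orientation.BOTTOM_EDGE_DOWN: screen_h - y,  # distance to the bottom edge
        Orientation.LEFT_EDGE_DOWN: x,               # distance to the left edge
        Orientation.RIGHT_EDGE_DOWN: screen_w - x,   # distance to the right edge
    }
    return min(distances, key=distances.get)

# The drawn-object top/bottom information 41 can then be modeled as a simple
# mapping from an object's identification information to its orientation.
orientation_info: dict[str, Orientation] = {}

def register_object(obj_id: str, x: float, y: float,
                    screen_w: float, screen_h: float) -> None:
    """Top/bottom information creation unit: record the orientation of one drawn object."""
    orientation_info[obj_id] = infer_orientation(x, y, screen_w, screen_h)
```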
 Upon receiving any of the first to fourth direction instruction signals from the input unit 3, the drawing direction determination unit 11 determines the drawing direction based on the received direction instruction signal.
 The drawing direction changing unit 12 compares the drawing direction determined by the drawing direction determination unit 11 with the top/bottom information of each drawn object in the drawn-object top/bottom information 41 stored in the storage unit 4, and rotates any drawn object whose top/bottom orientation does not match the drawing direction so that it is aligned with the drawing direction.
 Next, the drawing direction change process for aligning the top/bottom orientations of the drawn objects with a designated drawing direction will be described.
 FIG. 5 is a flowchart showing a procedure of the drawing direction change process.
 First, the drawing direction determination unit 11 determines whether any of the first to fourth direction instruction signals has been received from the input unit 3 as a display shaping request (step S20).
 When one of the first to fourth direction instruction signals has been received as a display shaping request, the drawing direction determination unit 11 determines the drawing direction (display orientation) according to the received direction instruction signal (step S21).
 Next, the drawing direction changing unit 12 acquires the top/bottom information of one drawn object from the drawn-object top/bottom information 41 stored in the storage unit 4 (step S22), and determines whether the acquired top/bottom information matches the drawing direction (display orientation) determined in step S21 (step S23).
 If it is determined in step S23 that the top/bottom information does not match the drawing direction (display orientation), the drawing direction changing unit 12 rotates the drawn object in accordance with the drawing direction (display orientation) (step S24).
 If it is determined in step S23 that the top/bottom information matches the drawing direction (display orientation), or after the process of step S24 has been executed, the drawing direction changing unit 12 determines whether the top/bottom information of all the drawn objects contained in the drawn-object top/bottom information 41 has been checked (step S25).
 If it is determined in step S25 that there is top/bottom information of a drawn object that has not yet been checked, the processes of steps S22 to S24 are executed for the top/bottom information of the unchecked drawn object.
 If it is determined in step S25 that the top/bottom information of all the drawn objects has been checked, the drawing direction change process ends.
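 Steps S20 to S25 amount to a single pass over the stored top/bottom information. A compact sketch of that loop, building on the illustrative helpers above and using a hypothetical rotate_object placeholder for the actual redraw performed by the display unit, could be:

```python
def rotate_object(obj_id: str, from_dir: Orientation, to_dir: Orientation) -> None:
    """Placeholder for the rotation the drawing direction changing unit performs
    in step S24; a real device would redraw the object on the display here."""
    print(f"rotating {obj_id}: {from_dir.name} -> {to_dir.name}")

def change_drawing_direction(pressed_button: int) -> None:
    # S20/S21: a direction instruction signal arrives and fixes the drawing direction.
    drawing_direction = on_button_pressed(pressed_button)
    # S22-S25: walk every entry of the top/bottom information table.
    for obj_id, obj_orientation in orientation_info.items():
        # S23: compare the object's orientation with the requested drawing direction.
        if obj_orientation != drawing_direction:
            # S24: rotate mismatching objects so they line up with the drawing direction.
            rotate_object(obj_id, obj_orientation, drawing_direction)
            orientation_info[obj_id] = drawing_direction
```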
 The drawing direction change process will now be described with a specific example.
 FIG. 6A shows how the drawn objects are displayed before the drawing direction change process is executed, and FIG. 6B shows how they are displayed after it is executed. FIG. 7 shows the drawn-object top/bottom information 41 for the display state shown in FIG. 6A.
 In the display state shown in FIG. 6A, the characters "A", "B", and "C" are displayed on the display surface 21 as drawn objects. The character "A" is oriented so that the lower edge of the display surface 21 is its bottom. The character "B" is oriented so that the right edge of the display surface 21 is its bottom. The character "C" is oriented so that the left edge of the display surface 21 is its bottom.
 The top/bottom information of the characters "A", "B", and "C" is stored in the storage unit 4 as the drawn-object top/bottom information 41 shown in FIG. 7. In the example shown in FIG. 7, the identification information of each drawn object (indicated by "A", "B", and "C" in the figure) is stored in association with top/bottom information (an arrow) indicating its top/bottom orientation.
 When the operator presses the second direction instruction button 32, the second direction instruction signal is supplied from the input unit 3 to the control unit 1. The control unit 1 identifies the display orientation based on the second direction instruction signal and aligns the top/bottom orientation of each drawn object with the identified display orientation. As a result, as shown in FIG. 6B, the characters "A", "B", and "C" are all displayed with the lower edge of the display surface 21 as their bottom.
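 Applied to the example of FIGS. 6A to 7, the illustrative sketch above would behave as follows; the coordinates are invented solely to place each character near the edge described in the text.

```python
# FIG. 6A: "A" near the bottom edge, "B" near the right edge, "C" near the left edge.
register_object("A", x=400, y=550, screen_w=800, screen_h=600)  # bottom edge down
register_object("B", x=760, y=300, screen_w=800, screen_h=600)  # right edge down
register_object("C", x=40,  y=300, screen_w=800, screen_h=600)  # left edge down

# Pressing the second direction instruction button (32) aligns everything with the
# lower edge, as in FIG. 6B; only "B" and "C" need to be rotated.
change_drawing_direction(32)
# rotating B: RIGHT_EDGE_DOWN -> BOTTOM_EDGE_DOWN
# rotating C: LEFT_EDGE_DOWN -> BOTTOM_EDGE_DOWN
```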
 According to the drawing direction change process described above, a plurality of drawn objects whose top/bottom orientations differ can be displayed aligned in an arbitrary direction, so the contents of the drawn objects can be grasped easily.
 Note that the top/bottom information creation unit 13 may be omitted from the present embodiment. In that case, each operator may register the top/bottom information of the drawn objects he or she has operated in the storage unit 4 through an input operation using the input unit 3.
 The image display device of the present embodiment may also be configured using a computer (CPU: Central Processing Unit) that operates according to a program. The program is capable of causing the computer to execute at least the image display process and the drawing direction change process of the control unit 1. The program may be provided on a computer-readable recording medium, for example an optical disc such as a CD (Compact Disc) or a DVD (Digital Video Disc), a USB (Universal Serial Bus) memory, or a memory card, or may be provided via a communication network (for example, the Internet).
A part or all of the image display device of each of the above-described embodiments may take forms such as the following supplementary notes 1 to 11, but is not limited to these forms.
(Appendix 1)
An image display device comprising:
a display unit that displays an image;
an input unit that receives an operation on the image and outputs an operation instruction according to the received operation; and
a control unit that receives a plurality of operation instructions from the input unit and executes a plurality of processes based on the operation instructions in parallel.
(Appendix 2)
The image display device according to appendix 1, wherein the control unit simultaneously executes at least a part of each of the plurality of operation instructions.
(Appendix 3)
The image display device according to appendix 1 or 2, wherein the image includes a plurality of operation targets, and the input unit outputs the plurality of operation instructions in response to input operations on the plurality of operation targets.
(Appendix 4)
The image display device according to appendix 3, wherein the plurality of operation targets are images of a plurality of parts, a plurality of regions, or images of a plurality of layers, or a combination of two or more thereof.
(Appendix 5)
The image display device according to appendix 4, wherein at least one of the plurality of regions is an arbitrarily set region.
(Appendix 6)
The image display device according to any one of appendices 1 to 5, wherein the input unit includes a touch panel that simultaneously detects contact by a plurality of indicators used to perform the input operation, and outputs the plurality of operation instructions based on the contact by the plurality of indicators.
(Appendix 7)
The image display device according to any one of appendices 1 to 5, wherein the input unit includes detection means for detecting movements of a plurality of indicators used to perform the input operation and outputting the plurality of operation instructions based on the detected movements of the indicators.
(Appendix 8)
The image display device according to any one of appendices 1 to 7, further comprising a storage unit that stores, for each of a plurality of drawn objects that are operation targets on the image, top/bottom information indicating the display orientation of that drawn object with respect to a reference direction,
wherein, upon receiving a display shaping request that requests display shaping to an arbitrary display orientation, the control unit refers to the top/bottom information of the plurality of drawn objects stored in the storage unit and aligns the display orientations of the plurality of drawn objects in the arbitrary direction.
(Appendix 9)
The image display device according to appendix 8, further comprising:
a first direction instruction button for designating a first display orientation;
a second direction instruction button for designating a second display orientation opposite to the first display orientation;
a third direction instruction button for designating a third display orientation orthogonal to the first display orientation; and
a fourth direction instruction button for designating a fourth display orientation opposite to the third display orientation,
wherein the control unit includes:
a drawing direction determination unit that, when any of the first to fourth direction instruction buttons is pressed, determines a drawing direction based on the pressed direction instruction button; and
a drawing direction changing unit that compares the drawing direction determined by the drawing direction determination unit with the top/bottom information of the plurality of drawn objects stored in the storage unit and, for any drawn object whose display orientation differs from the drawing direction, performs rotation control to align the display orientation of that drawn object with the drawing direction.
(Appendix 10)
An image display method performed by an image display device that includes a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, the method comprising:
receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
(Appendix 11)
A program that causes a computer of an image display device, the image display device including a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, to execute a process of:
receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
 The present invention has been described above with reference to the embodiments, but the present invention is not limited to the embodiments described above. Various modifications that can be understood by those skilled in the art may be made to the configuration and operation of the present invention without departing from the spirit of the present invention.
 The entire disclosure of Japanese Patent Application No. 2011-037147, filed on February 23, 2011, is incorporated herein.

Claims (10)

  1.  An image display device comprising:
      a display unit that displays an image;
      an input unit that receives an operation on the image and outputs an operation instruction according to the received operation; and
      a control unit that receives a plurality of operation instructions from the input unit and executes a plurality of processes based on the operation instructions in parallel.
  2.  The image display device according to claim 1, wherein the control unit simultaneously executes at least a part of each of the plurality of operation instructions.
  3.  The image display device according to claim 1 or 2, wherein the image includes a plurality of operation targets, and the input unit outputs the plurality of operation instructions in response to input operations on the plurality of operation targets.
  4.  The image display device according to claim 3, wherein the plurality of operation targets are images of a plurality of parts, a plurality of regions, or images of a plurality of layers, or a combination of two or more thereof.
  5.  The image display device according to any one of claims 1 to 4, wherein the input unit includes a touch panel that simultaneously detects contact by a plurality of indicators used to perform the input operation, and outputs the plurality of operation instructions based on the contact by the plurality of indicators.
  6.  The image display device according to any one of claims 1 to 4, wherein the input unit includes detection means for detecting movements of a plurality of indicators used to perform the input operation and outputting the plurality of operation instructions based on the detected movements of the indicators.
  7.  The image display device according to any one of claims 1 to 6, further comprising a storage unit that stores, for each of a plurality of drawn objects that are operation targets on the image, top/bottom information indicating the display orientation of that drawn object with respect to a reference direction,
      wherein, upon receiving a display shaping request that requests display shaping to an arbitrary display orientation, the control unit refers to the top/bottom information of the plurality of drawn objects stored in the storage unit and aligns the display orientations of the plurality of drawn objects in the arbitrary direction.
  8.  The image display device according to claim 7, further comprising:
      a first direction instruction button for designating a first display orientation;
      a second direction instruction button for designating a second display orientation opposite to the first display orientation;
      a third direction instruction button for designating a third display orientation orthogonal to the first display orientation; and
      a fourth direction instruction button for designating a fourth display orientation opposite to the third display orientation,
      wherein the control unit includes:
      a drawing direction determination unit that, when any of the first to fourth direction instruction buttons is pressed, determines a drawing direction based on the pressed direction instruction button; and
      a drawing direction changing unit that compares the drawing direction determined by the drawing direction determination unit with the top/bottom information of the plurality of drawn objects stored in the storage unit and, for any drawn object whose display orientation differs from the drawing direction, performs rotation control to align the display orientation of that drawn object with the drawing direction.
  9.  An image display method performed by an image display device that includes a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, the method comprising:
      receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
  10.  A program that causes a computer of an image display device, the image display device including a display unit that displays an image and an input unit that receives an operation on the image and outputs an operation instruction according to the received operation, to execute a process of:
      receiving a plurality of operation instructions from the input unit and executing a plurality of processes based on these operation instructions in parallel.
PCT/JP2012/070765 2012-08-15 2012-08-15 Image display device, image display method, and program WO2014027408A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/070765 WO2014027408A1 (en) 2012-08-15 2012-08-15 Image display device, image display method, and program

Publications (1)

Publication Number Publication Date
WO2014027408A1 true WO2014027408A1 (en) 2014-02-20

Family

ID=50685473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/070765 WO2014027408A1 (en) 2012-08-15 2012-08-15 Image display device, image display method, and program

Country Status (1)

Country Link
WO (1) WO2014027408A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0659813A (en) * 1992-08-07 1994-03-04 Fuji Xerox Co Ltd Electronic information picture drawing device
WO1997035248A1 (en) * 1996-03-15 1997-09-25 Hitachi, Ltd. Display and its operating method
JP2004171504A (en) * 2002-11-17 2004-06-17 Sega Corp Information terminal with touch panel

Similar Documents

Publication Publication Date Title
JP2012174112A (en) Image display device, image display method, and program
JP6132644B2 (en) Information processing apparatus, display control method, computer program, and storage medium
US8749497B2 (en) Multi-touch shape drawing
JP5584372B2 (en) Display device, user interface method and program
US20160283054A1 (en) Map information display device, map information display method, and map information display program
JP5333397B2 (en) Information processing terminal and control method thereof
JP2014215737A5 (en) Information processing apparatus, display control method, computer program, and storage medium
US9623329B2 (en) Operations for selecting and changing a number of selected objects
RU2613739C2 (en) Method, device and terminal device for apis movement control
KR102205283B1 (en) Electro device executing at least one application and method for controlling thereof
JP2014235698A (en) Information processing apparatus and information processing apparatus control method
US8631317B2 (en) Manipulating display of document pages on a touchscreen computing device
WO2014034031A1 (en) Information input device and information display method
CN106527915A (en) Information processing method and electronic equipment
CN104423836A (en) Information processing apparatus
JP2018136650A (en) Object moving program
JP2015035092A (en) Display controller and method of controlling the same
CN104679389B (en) Interface display method and device
JP5620895B2 (en) Display control apparatus, method and program
TWI405104B (en) Method of turning over three-dimensional graphic object by use of touch sensitive input device
JP6028375B2 (en) Touch panel device and program.
WO2014027408A1 (en) Image display device, image display method, and program
JPWO2018179552A1 (en) ANALYZING DEVICE HAVING TOUCH PANEL DEVICE, ITS DISPLAY CONTROL METHOD, AND PROGRAM
JP6327834B2 (en) Operation display device, operation display method and program
JP2015076068A (en) Display device, display control method therefor, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12891356

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12891356

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP