JP5127547B2 - Display object control device, display object control program, and display device - Google Patents

Display object control device, display object control program, and display device

Info

Publication number
JP5127547B2
Authority
JP
Japan
Prior art keywords
object
display
display screen
position data
touch member
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2008109415A
Other languages
Japanese (ja)
Other versions
JP2009259110A (en)
Inventor
Keiji Ishimori (石森 圭二)
Original Assignee
Toshiba Corporation (株式会社東芝)
Toshiba Solutions Corporation (東芝ソリューション株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corporation and Toshiba Solutions Corporation
Priority to JP2008109415A
Publication of JP2009259110A
Application granted
Publication of JP5127547B2
Application status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Description

  The present invention relates to a display object control device, a display object control program, and a display device that control an object displayed by an application program on a touch panel display device that detects a touched position.

  With the development of information terminals in recent years, touch panel type display devices have become widespread. With a touch panel display device, the user can operate the information terminal connected to the display device by pressing the screen with a touch member such as a finger or a stylus. The same operations can thus be performed without operating a special device such as a mouse. Since the user of a touch panel display device can operate the information terminal intuitively, such devices are easy for anyone to handle and have the advantage of being approachable even for people who are unfamiliar with operating information terminals. Touch panel type display devices are used in bank ATMs, car navigation systems, vending machines for tickets and the like, copy machines, and so on.

  Since these devices are widely used by the general public, it is required that anyone can handle them easily. For example, there is a display device that detects that the operator's fingertip has approached, identifies the region at which the fingertip is aimed, and changes the display on the screen (see, for example, Patent Document 1). In the display device described in Patent Document 1, as shown in FIG. 3 and FIG. 4 of Patent Document 1, the approached portion is displayed in an enlarged form together with more detailed information, so that the operator's selection area can be narrowed down.

Also, there is an information display device that, when it detects that a hand has entered, determines the zone where a finger is present and displays in enlarged form the operation switch display unit located in that zone (see, for example, Patent Document 2). In the information display device described in Patent Document 2, the zone where a finger is present is detected from the signals of a plurality of infrared sensors, and the operation switch display unit located in that zone is enlarged.
Patent Document 1: JP 2003-44223 A; Patent Document 2: JP 2006-103363 A

  However, with the known technology described above, the object that the user intends to press cannot always be selected appropriately.

  In a general touch panel display device, pressing with a tool that has a sharp tip, such as a stylus, can reliably press even a small object, but it may be troublesome to take out such a tool. When pressing with a finger or the like, in particular, the contact area is not a sharp point, so there is a problem that it is difficult to select the target object appropriately.

  Further, with the display device described in Patent Document 1, although the operator's selection area can be narrowed down, not all of the information can be shown on the screen displayed first. Specifically, if the entire predetermined amount of information is to fit on the display screen, the characters become small; conversely, if the characters are made large enough for the operator to read easily, the entire amount of information cannot be displayed and the information shown on the display screen must be limited. Furthermore, in the method described in Patent Document 2, a switch is enlarged whenever the distance between the finger and the display screen falls below a predetermined value, regardless of whether it is the switch the user actually wants. The user's operation therefore becomes complicated when a different switch is to be pressed: while one switch is enlarged, another switch cannot be selected, so the user must repeatedly move the finger toward and away from the display screen until the intended switch is enlarged.

  As described above, with the conventional methods it is sometimes difficult to press a button with only a finger, without using a special device. In addition, even when a button can be pressed with a finger as in Patent Document 1 and Patent Document 2, the amount of information displayed on the screen may be limited, or the pressing operation may become complicated. There is therefore a demand for a technique that helps the user operate objects on the screen easily without restricting the information displayed on the screen.

  Accordingly, an object of the present invention is to provide a display object control device, a display object control program, and a display device that allow a user to appropriately select an object displayed on a touch panel display device.

In order to solve the above-described problem, a first feature of the present invention relates to a display object control device that controls an object displayed by an application program on a touch panel display device. The display object control device according to the first feature of the present invention comprises: object position data storage means for associating an object identifier with object position data and storing them in an object position data storage device; approach operation detection means for sequentially acquiring, when a touch member approaches the display screen of the display device, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen; object acquisition means for reading the object position data storage device and acquiring the identifier of the object corresponding to the position coordinates acquired by the approach operation detection means; property changing means for calculating, for the object acquired by the object acquisition means, position data corresponding to the distance sequentially acquired by the approach operation detection means, and sequentially changing the position data of the object to the calculated new position data; and screen display means for displaying the object based on the new position data each time the new position data is sequentially changed by the property changing means.

Here, the approach operation detection means may acquire the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen via a proximity sensing device that senses the approach of the touch member to the display screen.

The property changing unit may further change the property so that the object acquired by the object acquiring unit is displayed in front of other objects.
Further, in the object position data, an increase/decrease value and a unit that determine the size of the object according to the distance between the touch member and the display screen may be associated with the identifier of the object. In this case, each time the distance between the touch member and the display screen decreases by the value of the unit, the property changing means increases the size of the object by the increase/decrease value, and each time the distance increases by the value of the unit, it decreases the size of the object by the increase/decrease value.
Further, the property changing means may change the size of the object only when the distance between the touch member and the display screen is equal to or less than a predetermined threshold value, and may calculate the position data of the object so that, even if the distance between the touch member and the display screen increases, the object does not become smaller than the size at which it was initially displayed on the display screen.

The second feature of the present invention relates to a display object control program for controlling an object displayed by an application program on a touch panel display device. The display object control program according to the second feature of the present invention causes a computer to function as: object position data storage means for associating an object identifier with object position data and storing them in an object position data storage device; approach operation detection means for sequentially acquiring, when a touch member approaches the display screen of the display device, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen; object acquisition means for reading the object position data storage device and acquiring the identifier of the object corresponding to the position coordinates acquired by the approach operation detection means; property changing means for calculating, for the object acquired by the object acquisition means, position data corresponding to the distance sequentially acquired by the approach operation detection means, and sequentially changing the position data of the object to the calculated new position data; and screen display means for displaying the object based on the new position data each time the new position data is sequentially changed by the property changing means.

  According to the present invention, it is possible to provide a display object control device, a display object control program, and a display device that allow a user to appropriately select an object displayed on a touch panel display device.

  Next, embodiments of the present invention will be described with reference to the drawings. In the following description of the drawings, the same or similar parts are denoted by the same or similar reference numerals.

(Best Embodiment)
As shown in FIG. 1, the display object control device 1 according to the best embodiment of the present invention includes a touch panel display device 105 that detects a touched position, and is a device that controls the objects displayed on the display device 105 by an application program. In the example illustrated in FIG. 1, the case where the display object control device 1 includes the display device 105 is described; however, the display object control device 1 may be connected to a plurality of display devices 105 and control the objects displayed on those display devices 105. In the best embodiment of the present invention, an “object” is an item used for screen display, such as a button, list box, or text box displayed on the display screen 105a, and is an item to be selected by the user.

  Here, the operation of the display object control device 1 according to the best embodiment of the present invention will be described with reference to FIG. 2. FIGS. 2A, 2C, and 2E are diagrams showing the display screen 105a of the display device 105 and the touch member 130. FIGS. 2B, 2D, and 2F are views of the display screen 105a seen from the horizontal direction, illustrating the distance between the touch member 130 and the display screen 105a of the display device 105. In FIG. 2, the case where the user tries to press the “sa” button with a finger is described.

  When the distance between the finger and the display screen 105a is larger than a predetermined value, as shown in FIG. 2B, the “sa” button has its normal size, as shown in FIG. 2A. When the distance between the finger and the display screen 105a becomes smaller than the predetermined value, as shown in FIG. 2D, the “sa” button becomes slightly larger, as shown in FIG. 2C. When the distance between the finger and the display screen 105a is reduced further, as shown in FIG. 2F, the “sa” button becomes larger still according to the distance between the finger and the display screen 105a, as shown in FIG. 2E. Here, the predetermined value is a threshold on the distance between the touch member 130 and the display screen 105a at which the display object control device 1 according to the best embodiment of the present application begins to enlarge the object.

  Thus, in the best embodiment of the present invention, the target button becomes larger as the distance between the finger and the display screen 105a decreases, so that the user can press it easily. By improving the visibility of only the object that the user is about to press, erroneous operations by the user can be prevented.

  As shown in FIG. 3, the display object control device 1 according to the best embodiment of the present invention includes a central processing control device 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, and an input/output interface 109, which are connected via a bus 110. An input device 104, a display device 105, a communication control device 106, a storage device 107, and a removable disk 108 are connected to the input/output interface 109.

  The central processing control device 101 reads a boot program for starting the display object control device 1 from the ROM 102 based on an input signal from the input device 104, executes it, and further reads the operating system stored in the storage device 107. The central processing control device 101 also controls the various devices based on input signals from the input device 104, the communication control device 106, and the like, reads programs and data stored in the RAM 103, the storage device 107, and the like, loads them into the RAM 103, and, based on the program commands read from the RAM 103, performs the calculation and processing of data that realize the series of processes described later.

  The input device 104 includes devices such as a keyboard and a mouse with which the operator inputs various operations. The input device 104 generates an input signal based on the operator's operation and transmits it to the central processing control device 101 via the input/output interface 109 and the bus 110. The display device 105 is a display device such as a liquid crystal display; it receives output signals from the central processing control device 101 via the bus 110 and the input/output interface 109 and displays, for example, the processing results of the central processing control device 101.

  Here, the display device 105 according to the best embodiment of the present invention is a touch panel type display device. When the display device 105 senses contact of the touch member 130, such as a finger or a stylus, on the display screen 105a, it acquires information on the touched position. The display device 105 then transmits the acquired position to the central processing control device 101 via the input/output interface 109 and the bus 110. Accordingly, the display device 105 can acquire the same information as a pointing device without using a separate pointing device such as a mouse. As methods for detecting contact with the screen, there are a pressure-sensitive type, which detects a change in pressure, and an electrostatic type, which detects an electric signal caused by static electricity; either method can be applied in the best embodiment of the present invention.

  The communication control device 106 is a device such as a LAN card or a modem, and connects the display object control device 1 to a communication network such as the Internet or a LAN. Data exchanged with the communication network via the communication control device 106 is transmitted to and received from the central processing control device 101 via the input/output interface 109 and the bus 110 as input signals or output signals.

  The storage device 107 is a semiconductor storage device or a magnetic disk device, and stores the programs and data executed by the central processing control device 101. The removable disk 108 is an optical disk or a flexible disk, and the signals read from or written to it by a disk drive are transmitted to and received from the central processing control device 101 via the input/output interface 109 and the bus 110.

  The storage device 107 of the display object control device 1 according to the best embodiment of the present invention stores an application program and a display object control program, and also stores object position data 21. When the application program is read and executed by the central processing control device 101 of the display object control device 1, the application execution means 11, the screen display means 12, the touch operation detection means 13, and the like are implemented in the display object control device 1. When the display object control program is read and executed by the central processing control device 101 of the display object control device 1, the approach operation detection means 14, the object acquisition means 15, and the property changing means 16 are implemented in the display object control device 1.

  The object position data 21 is data relating to the properties of the objects in the screen display data displayed by the application execution means 11. For example, the object position data 21 stores an object identifier and object position data in association with each other. The object position data 21 is stored in the storage device 107 by the object position data storage means. Each time the application execution means 11 displays a screen, the object position data storage means may store, based on the screen data displayed on the screen, the position data of each object included in that screen data in the object position data 21.

  The object position data 21 according to the best embodiment of the present invention has the data structure and data shown in FIG. 4, for example. Specifically, the object position data 21 associates a display position Y, a display position X, a height, a width, an increase/decrease value, and a unit with the object ID as a key. As shown in FIG. 5, the “display position Y” is the coordinate position in the vertical direction (Y direction) of the center of the object displayed on the display screen 105a. The “display position X” is the coordinate position in the horizontal direction (X direction) of the center of the object displayed on the display screen 105a. The “height” is the length of the object in the vertical direction (Y direction), and the “width” is the length of the object in the horizontal direction (X direction).

  The “increase/decrease value” and the “unit” are parameters that determine the size of the object according to the distance between the touch member 130 and the display screen 105a. In the best embodiment of the present invention, each time the distance between the touch member 130 and the display screen 105a changes by the value of the “unit”, the size of the object changes by the value of the “increase/decrease value”. Specifically, when the distance between the touch member 130 and the display screen 105a decreases by 3 mm, the height and width of the object are increased by 2 pixels; conversely, when the distance between the touch member 130 and the display screen 105a increases by 3 mm, the height and width of the object are decreased by 2 pixels. It is preferable, however, that the application execution means 11 controls the size so that the object does not become smaller than the size at which it was first registered.
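  As an illustration only, the following Python sketch models one record of the object position data 21 and the size-update rule described above (grow by the increase/decrease value for each unit of approach, shrink per unit of retreat, never below the initially registered size). The field and function names are assumptions made for this sketch and do not appear in the patent.

    from dataclasses import dataclass

    @dataclass
    class ObjectPositionData:
        """One record of the object position data 21 (illustrative field names)."""
        object_id: str
        display_y: int          # vertical (Y) coordinate of the object's center
        display_x: int          # horizontal (X) coordinate of the object's center
        height: int             # current height in pixels
        width: int              # current width in pixels
        increment_px: int = 2   # "increase/decrease value": pixels per unit of distance change
        unit_mm: float = 3.0    # "unit": distance change (mm) that triggers one increment
        initial_height: int | None = None   # size first registered; never shrink below this
        initial_width: int | None = None

        def __post_init__(self):
            if self.initial_height is None:
                self.initial_height = self.height
            if self.initial_width is None:
                self.initial_width = self.width

        def resize_for_distance_change(self, delta_mm: float) -> None:
            """Grow when the touch member moves closer (delta_mm < 0) and shrink when it
            moves away (delta_mm > 0), clamped to the initially registered size."""
            change = int(abs(delta_mm) // self.unit_mm) * self.increment_px
            if delta_mm < 0:    # touch member approached the screen
                self.height += change
                self.width += change
            else:               # touch member moved away from the screen
                self.height = max(self.initial_height, self.height - change)
                self.width = max(self.initial_width, self.width - change)

  With the values from the text (unit of 3 mm, increase/decrease value of 2 pixels), a decrease of 3 mm in the distance enlarges the object by 2 pixels in height and width, and an increase of 3 mm shrinks it by 2 pixels, but never below its original size.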

  The object position data 21 may further include threshold data on the distance between the touch member 130 and the display screen 105a for changing the size of the object. It is preferable that the display object control device 1 changes the size of the object only when the distance between the touch member 130 and the display screen 105a is equal to or smaller than this threshold.

  In the best embodiment of the present application, the case where the size of an object is determined according to the amount of change in the distance between the touch member 130 and the display screen 105a, by setting the “increase/decrease value” and the “unit”, is described; however, the method is not limited to this. For example, the size of the object may be calculated directly from the distance between the touch member 130 and the display screen 105a. In FIG. 4, the case where the “increase/decrease value” and the “unit” are set for each object has been described, but the “increase/decrease value” and the “unit” may instead be defined in common for all objects. As described above, various methods are conceivable for determining the size of the object in accordance with the distance between the touch member 130 and the display screen 105a.

  The application execution means 11 executes a general application. The application receives an operation instruction when the user touches the display screen 105a of the touch panel display device 105, and is executed based on that operation instruction. For example, when the display device 105 is the display device of a bank ATM, the application execution means 11 executes applications such as “withdrawal” and “balance inquiry” according to the menu selected by the user. When the display device 105 is the display device of a ticket vending machine for a transportation facility, the application execution means 11 executes applications such as “buy ticket” and “buy commuter pass” according to the menu selected by the user.

  The screen display means 12 displays the results of execution by the application execution means 11 and the like on the display screen 105a of the display device 105 based on instructions from the application execution means 11. It is preferable that the screen display means 12 also redisplays the screen every time a property of an object is changed by the property changing means 16. When the display device 105 is the display device of a bank ATM, for example, the screen display means 12 displays a menu such as “withdrawal” and “balance inquiry” on the display device 105. When the display device 105 is the display device of a transportation ticket vending machine, for example, the screen display means 12 displays a menu such as “buy ticket” and “buy commuter pass” on the display device 105. At this time, the screen display means 12 transmits the screen data, configured according to the properties of the objects changed by the property changing means 16 described later, to the display device 105 and causes the display device 105 to display it.

  When the touch operation detection means 13 detects the touch of the touch member 130 on the display screen 105a of the display device 105, it transmits the detection of the contact and the detected position information to the application execution means 11. The application execution means 11 executes the application according to the position information at which the touch of the touch member was detected.

  When the touch member 130 approaches the display screen 105a of the display device 105, the approach operation detection means 14 acquires the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen. Here, the touch member 130 is a member, such as the user's finger or a stylus, that touches the display screen 105a through the user's operation. In the best embodiment, the size of the touch member 130 is not particularly limited; a touch member 130 larger than the object displayed on the display screen may be used.

  It is preferable that the approach operation detection unit 14 intermittently detects the approach of the touch member 130 to the display screen 105a and sequentially acquires the distance between the touch member 130 and the display screen 105a. Further, the approach operation detection unit 14 may be any device as long as it can acquire the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a.

  The object acquisition unit 15 reads the object position data 21 from the storage device 107 and acquires the identifier of the object corresponding to the position coordinate acquired by the approaching operation detection unit 14. The object acquisition means 15 acquires the identifier of the object that the user is trying to contact. Specifically, the object acquisition unit 15 acquires the identifier of the object including the position coordinates acquired by the approach operation detection unit 14. When there is no object including the position coordinates, the object acquisition unit 15 may acquire the identifier of the object closest to the acquired position coordinates.
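  A minimal sketch of this lookup, reusing the illustrative record type above, might look as follows: it first returns the object whose rectangle contains the acquired position coordinates and otherwise falls back to the object whose center is closest. The function name is an assumption for this sketch.

    import math

    def acquire_object(objects, x, y):
        """Return the object containing (x, y); if none contains it, return the nearest one."""
        for obj in objects:
            if (abs(x - obj.display_x) <= obj.width / 2 and
                    abs(y - obj.display_y) <= obj.height / 2):
                return obj  # the coordinates fall inside this object's rectangle
        if not objects:
            return None
        # Fall back to the object whose center is closest to the acquired coordinates.
        return min(objects, key=lambda o: math.hypot(x - o.display_x, y - o.display_y))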

  The property changing means 16 calculates new position data for the object acquired by the object acquisition means 15 according to the distance acquired by the approach operation detection means 14, and changes the position data of the object to the new position data. Here, the property changing means 16 may further change the properties so that the object acquired by the object acquisition means 15 is displayed in front of other objects.

  Based on the properties changed by the property changing means 16, the screen display means 12 redisplays the screen on the display device 105. Specifically, the screen display means 12 displays the object that the user is trying to press on the display device 105 in an enlarged form, or in front of other objects, so that the user can operate the object easily.

  Here, the processing by the approach operation detection means 14 will be described in detail. Various devices are conceivable for obtaining the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a in the approach operation detection means 14. For example, a method of providing a camera in the vicinity of the display screen 105a and detecting the touch member 130 from image data captured by the camera, a method of detecting with a proximity sensing device, and a method of detecting with CCD elements arranged in parallel with the display screen 105a are all possible.

  In the method of providing a camera in the vicinity of the display screen 105a, the camera is installed, as shown in FIG. 6 for example, at a position from which it can capture the touch member 130 approaching the display screen 105a. Based on the image data acquired by the camera, the approach operation detection means 14 determines the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a from the size and position of the touch member 130 in the image data. In this method, reference image data of the touch member 130 at a known distance is acquired in advance. The reference image data may comprise a plurality of images, depending on the distance to the touch member 130 and the position on the display screen 105a that the touch member 130 is about to press. When the touch member 130 enters the range that can be photographed by the camera 120 in order to press the display screen 105a, the camera 120 acquires image data. The approach operation detection means 14 calculates the distance between the camera 120 and the touch member 130 by comparing the size of the touch member 130 in the reference image data with its size in the acquired image data. Further, as shown in FIG. 7, the approach operation detection means 14 uses trigonometric functions to obtain, from the distance between the camera 120 and the touch member 130 and the position of the touch member 130 in the acquired image data, the position coordinates of the touch member 130 on the display screen 105a and the distance between the display screen 105a and the touch member 130.
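  As an illustration of the kind of calculation involved, the following sketch assumes a camera whose optical axis lies in the plane of the display screen and a reference image of the touch member taken at a known distance; it estimates the camera-to-member distance from the ratio of apparent sizes and then resolves that distance into screen coordinates and a height above the screen with trigonometric functions. The function names and the camera geometry are assumptions made for this sketch, not details taken from the patent.

    import math

    def camera_to_member_distance(ref_distance_mm, ref_size_px, observed_size_px):
        """Apparent size is inversely proportional to distance, so comparing the member's
        size in the captured image with its size in the reference image gives the
        camera-to-member distance."""
        return ref_distance_mm * ref_size_px / observed_size_px

    def member_position(camera_distance_mm, elevation_rad, azimuth_rad):
        """Resolve the camera-to-member distance into a position (x, y) on the screen and
        a height h above the screen, where elevation is the angle above the screen plane
        and azimuth the angle within it (both derived from the member's pixel position
        and the camera's field of view)."""
        h = camera_distance_mm * math.sin(elevation_rad)       # distance from the screen
        ground = camera_distance_mm * math.cos(elevation_rad)  # distance along the screen plane
        x = ground * math.cos(azimuth_rad)
        y = ground * math.sin(azimuth_rad)
        return x, y, h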

  In the method using a proximity sensing device, the proximity sensing device is installed, for example, so that it can detect the state in which the touch member 130 approaches the display screen 105a. The proximity sensing device is, for example, a sensor. The approach operation detection means 14 acquires the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a via this proximity sensing device. In this method, a plurality of proximity sensing devices may be provided in order to acquire the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a.

  In the method of arranging CCD elements in parallel with the display screen 105a, the position on the display screen 105a that the touch member 130 is about to press is acquired from the image data generated by the CCD elements, and the distance between the display screen 105a and the touch member 130 is calculated from the size of the touch member 130 in the image data. In this method too, as in the case of installing a camera, reference image data may be prepared in advance in order to acquire the distance between the touch member 130 and the display screen 105a. This reference image data is preferably prepared for each distance between the touch member 130 and the display screen 105a, with the size of the touch member 130 in each reference image varying according to the distance. Instead of preparing reference image data, a table associating the size of the touch member 130 calculated from each reference image with the distance between the touch member 130 and the display screen 105a may be provided.

  With reference to FIG. 6, the operation of the display device 105 of the display object control device 1 according to the preferred embodiment of the present invention will be described in detail. In FIG. 6, a case will be described in which a camera is provided in the vicinity of the display screen 105a and the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a are acquired.

  The central processing control device 101 is connected to the display device 105 via the input/output interface 109 and the bus 110. The display device 105 includes display device control means 105b. The display device control means 105b displays a screen on the display screen 105a in accordance with display instructions from the screen display means 12 and the like of the central processing control device 101. Further, when the display device control means 105b detects contact of the touch member 130 with the display screen 105a, it outputs that information to the touch operation detection means 13.

  In FIG. 6, a camera 120 is provided in the vicinity of the display screen 105a. The camera 120 captures the touch member 130 that is about to press an object on the display screen 105a and transmits the captured image data to the approach operation detection means 14. The approach operation detection means 14 calculates, from the captured image data, the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a. The position coordinates calculated by the approach operation detection means 14 consist of a horizontal coordinate position X and a vertical coordinate position Y on the display screen 105a. Specifically, as shown in FIG. 8, the approach operation detection means 14 acquires the position coordinates (X, Y) of the touch member 130 on the display screen 105a and the distance (H) between the touch member 130 and the display screen 105a. The approach operation detection means 14 inputs the position coordinates (X, Y) of the touch member 130 on the display screen 105a to the object acquisition means 15, and inputs the distance (H) between the touch member 130 and the display screen 105a to the property changing means 16.

  With reference to FIG. 9, processing for controlling a display object according to the preferred embodiment of the present invention will be described.

  First, in step S101, the process waits for the application execution means 11 to execute an application. When execution of the application is started, the application execution means 11 registers the position data of the objects used for screen display in the object position data 21 in step S102. In step S103, the screen display means 12 displays the screen on the display device 105 based on the screen display instruction from the application execution means 11. It is preferable that the registered object position data 21 is deleted each time the application displays a screen, and that the registration process of the object position data 21 is executed based on the newly displayed objects. In step S102, it is also preferable to register position data in the object position data 21 only for the objects to be operated by the user.

  In step S104, when the approach operation detection means 14 detects the approach of the touch member 130 to the display screen 105a, the process proceeds to step S105. In step S105, the approach operation detection means 14 acquires the distance between the touch member 130 and the display screen 105a, and in step S106 it calculates the position coordinates of the touch member 130 on the display screen 105a. The processing in steps S105 and S106 may be performed by any of the methods described above, and depending on the calculation method the order of steps S105 and S106 may of course be reversed.

  Next, in step S107, the object acquisition means 15 acquires the identifier of the object that the user is trying to touch, based on the position coordinates of the touch member acquired in step S106. At this time, it is preferable to search for an object that contains the position coordinates of the touch member 130 acquired in step S106 and, if no such object is found, to search for the object closest to those position coordinates.

  When the identifier of the object that the user is trying to touch has been specified in step S107, the property changing means 16 changes the properties of that object in step S108. Specifically, the property changing means 16 changes the properties of the object in order to change its size based on the distance between the touch member 130 and the display screen 105a calculated in step S105, or to display the object that the user is trying to touch in front of other objects. At this time, the property changing means 16 may calculate the new size of the object with reference to the parameters stored in the object position data 21, or the new size of the object may be calculated by a predetermined calculation method stored in the program.

  In step S109, the screen display unit 12 displays a screen on the display device 105 according to the property changed in step S108, returns to step S104, and waits for the touch member 130 to approach. The processing from step S104 to step S109 is executed every time the touch member 130 approaches the display screen 105a.

  In step S104, when the touch operation detection means 13 detects that the touch member 130 has touched the display screen 105a of the display device 105, the process proceeds to step S110. In step S110, if an instruction to end the processing of the application is received through the user's touch operation, execution of the application is ended. If the processing is not to be ended, the process proceeds to step S111, where the application is executed based on the information about the user's touch operation acquired by the touch operation detection means 13, and the process returns to step S102. In step S102, when a new screen is displayed as a result of executing the application in step S111, information about each object displayed on that screen is registered in the object position data 21.
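  The flow of steps S101 to S111 can be summarized by the following sketch, written against the illustrative helpers above; the application, sensor, and renderer callbacks are assumptions standing in for the application execution means 11, the approach and touch operation detection means 14 and 13, and the screen display means 12.

    def control_loop(app, sensor, renderer):
        """Illustrative loop over the steps of FIG. 9.

        app      - exposes current_objects(), handle_touch(x, y), and a finished flag
        sensor   - poll() returns ('approach', x, y, h) or ('touch', x, y)
        renderer - draw(objects) redisplays the screen
        """
        objects = app.current_objects()              # S102: register object position data
        renderer.draw(objects)                       # S103: initial screen display
        previous_h = None
        while not app.finished:                      # S110: stop when the application ends
            event = sensor.poll()                    # S104: wait for an approach or a touch
            if event[0] == 'approach':
                _, x, y, h = event                   # S105/S106: distance and position coordinates
                target = acquire_object(objects, x, y)   # S107: object the user is aiming at
                if target is not None and previous_h is not None:
                    target.resize_for_distance_change(h - previous_h)  # S108: change the property
                previous_h = h
                renderer.draw(objects)               # S109: redisplay with the changed property
            else:                                    # touch detected
                _, x, y = event
                app.handle_touch(x, y)               # S111: execute the application
                objects = app.current_objects()      # back to S102 for the newly displayed screen
                previous_h = None
                renderer.draw(objects)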

  Thus, in the best embodiment of the present invention, the target button becomes larger as the distance between the touch member 130 and the display screen 105a decreases, so that the user can press it easily. Erroneous operations by the user can thereby be prevented by improving the visibility of the object.

  Further, the object becomes larger as its distance from the display screen becomes shorter. Each object can therefore be selected in an enlarged state by moving the touch member parallel to the display screen while maintaining its distance from the screen. Because the size of the object is determined according to the distance from the display screen, the user can move the touch member parallel to the screen while maintaining that distance, keeping the object at a size that does not cover adjacent objects, and select the desired object.

  In the best embodiment of the present invention, the touch member 130 may be a finger or a stylus, and its size is not particularly specified. The method for obtaining the position coordinates of the touch member 130 on the display screen 105a and the distance between the touch member 130 and the display screen 105a may also differ depending on the type of the touch member 130. In this case, it is preferable that the approach operation detection means 14 identifies the type of the touch member 130 and acquires the position coordinates and the distance by the method corresponding to that type.

(Modification)
In the best embodiment of the present invention, the case where the display object control device 1 includes the display device 105 has been described, but other modifications are also conceivable.

  Specifically, a touch panel display device that detects a touched position may itself be provided with a processing circuit equivalent to the display object control device 1 according to the best embodiment of the present invention. The display device according to this modification comprises, in addition to the means for realizing normal touch panel functions: object position data storage means for associating an object identifier with object position data and storing them in an object position data storage device; approach operation detection means for acquiring, when the touch member approaches the display screen, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen; object acquisition means for reading the object position data storage device and acquiring the identifier of the object corresponding to the position coordinates acquired by the approach operation detection means; property changing means for calculating new position data for the object acquired by the object acquisition means according to the distance acquired by the approach operation detection means, and changing the position data of the object to the new position data; and screen display means for displaying the object on the screen based on the properties changed by the property changing means.

  Further, as in the best embodiment, a camera installed in the vicinity of the display screen of the display device may also be provided. In this case, the approach operation detection means acquires, based on the image data captured by the camera, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen from the size and position of the touch member in the image data. A proximity sensing device that senses the approach of the touch member to the display screen may also be provided. In this case, the approach operation detection means acquires the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen via the proximity sensing device.

(Other embodiments)
The best mode for carrying out the invention and modifications thereof have been described above; however, the description and drawings forming part of this disclosure should not be understood as limiting the present invention. From this disclosure, various alternative embodiments, examples, and operational techniques will be apparent to those skilled in the art.

  For example, the display object control device described in the best embodiment of the present invention may be configured on a single piece of hardware as shown in FIG. 1, or on a plurality of pieces of hardware depending on its functions and the number of processes. It may also be realized on an existing information processing system.

  It goes without saying that the present invention includes various embodiments not described herein. Therefore, the technical scope of the present invention is defined only by the matters specifying the invention according to the claims, as reasonably derived from the above description.

FIG. 1 is a diagram illustrating the functional blocks of the display object control device according to the best embodiment of the present invention.
FIG. 2 is a diagram illustrating the outline of the processing of the display object control device according to the best embodiment of the present invention.
FIG. 3 is a diagram illustrating the hardware configuration of the display object control device according to the best embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of the data structure and data of the object position data of the display object control device according to the best embodiment of the present invention.
FIG. 5 is a diagram illustrating the items of the object position data of the display object control device according to the best embodiment of the present invention.
FIG. 6 is a diagram illustrating the processing of the display device and the central processing control device in the display object control device according to the best embodiment of the present invention.
FIG. 7 is a diagram illustrating a method of calculating, using a camera, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen in the display object control device according to the best embodiment of the present invention.
FIG. 8 is a diagram illustrating a method of calculating the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen in the display object control device according to the best embodiment of the present invention.
FIG. 9 is a flowchart illustrating the processing in the display object control device according to the best embodiment of the present invention.

Explanation of symbols

DESCRIPTION OF SYMBOLS
1 Display object control device
11 Application execution means
12 Screen display means
13 Touch operation detection means
14 Approach operation detection means
15 Object acquisition means
16 Property changing means
21 Object position data
101 Central processing control device
102 ROM
103 RAM
104 Input device
105 Display device
106 Communication control device
107 Storage device
108 Removable disk
109 Input/output interface
110 Bus

Claims (6)

  1. A display object control device for controlling an object displayed by an application program on a touch panel display device, comprising:
    object position data storage means for associating the identifier of the object with the position data of the object and storing them in an object position data storage device;
    approach operation detection means for sequentially acquiring, when a touch member approaches the display screen of the display device, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen;
    object acquisition means for reading the object position data storage device and acquiring the identifier of the object corresponding to the position coordinates acquired by the approach operation detection means;
    property changing means for calculating, for the object acquired by the object acquisition means, position data corresponding to the distance sequentially acquired by the approach operation detection means, and sequentially changing the position data of the object to the calculated new position data; and
    screen display means for displaying the object based on the new position data each time the new position data is sequentially changed by the property changing means.
  2. The display object control device according to claim 1, wherein the approach operation detection means acquires the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen via a proximity sensing device that senses the approach of the touch member to the display screen of the display device.
  3. The display object control device according to claim 1 or 2, wherein the property changing means further changes the properties so that the object acquired by the object acquisition means is displayed in front of other objects.
  4. The display object control device according to claim 1, wherein the object position data further associates, with the identifier of the object, an increase/decrease value and a unit that determine the size of the object according to the distance between the touch member and the display screen, and
      wherein the property changing means increases the size of the object by the increase/decrease value when the distance between the touch member and the display screen decreases by the value of the unit, and decreases the size of the object by the increase/decrease value when the distance increases by the value of the unit.
  5. The display object control device according to claim 4, wherein the property changing means changes the size of the object when the distance between the touch member and the display screen is equal to or less than a predetermined threshold value, and calculates the position data of the object so that, even if the distance between the touch member and the display screen increases, the object does not become smaller than the size at which it was initially displayed on the display screen.
  6. A display object control program for controlling an object displayed by an application program on a touch panel display device, the program causing a computer to function as:
    object position data storage means for associating the identifier of the object with the position data of the object and storing them in an object position data storage device;
    approach operation detection means for sequentially acquiring, when a touch member approaches the display screen of the display device, the position coordinates of the touch member on the display screen and the distance between the touch member and the display screen;
    object acquisition means for reading the object position data storage device and acquiring the identifier of the object corresponding to the position coordinates acquired by the approach operation detection means;
    property changing means for calculating, for the object acquired by the object acquisition means, position data corresponding to the distance sequentially acquired by the approach operation detection means, and sequentially changing the position data of the object to the calculated new position data; and
    screen display means for displaying the object based on the new position data each time the new position data is sequentially changed by the property changing means.
JP2008109415A 2008-04-18 2008-04-18 Display object control device, display object control program, and display device Active JP5127547B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008109415A JP5127547B2 (en) 2008-04-18 2008-04-18 Display object control device, display object control program, and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008109415A JP5127547B2 (en) 2008-04-18 2008-04-18 Display object control device, display object control program, and display device

Publications (2)

Publication Number Publication Date
JP2009259110A JP2009259110A (en) 2009-11-05
JP5127547B2 true JP5127547B2 (en) 2013-01-23

Family

ID=41386432

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008109415A Active JP5127547B2 (en) 2008-04-18 2008-04-18 Display object control device, display object control program, and display device

Country Status (1)

Country Link
JP (1) JP5127547B2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4795462B2 (en) 2009-11-12 2011-10-19 ファナック株式会社 Roll hem processing equipment using robot manipulator with force sensor
US8438531B2 (en) * 2009-12-01 2013-05-07 Cadence Design Systems, Inc. Visualization and information display for shapes in displayed graphical images
US8645901B2 (en) 2009-12-01 2014-02-04 Cadence Design Systems, Inc. Visualization and information display for shapes in displayed graphical images based on a cursor
US8533626B2 (en) 2009-12-01 2013-09-10 Cadence Design Systems, Inc. Visualization and information display for shapes in displayed graphical images based on user zone of focus
US20120050007A1 (en) * 2010-08-24 2012-03-01 Babak Forutanpour Methods and apparatus for interacting with an electronic device application by moving an object in the air over an electronic device display
JP5556515B2 (en) * 2010-09-07 2014-07-23 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2013016018A (en) * 2011-07-04 2013-01-24 Canon Inc Display control apparatus, control method, and program
CN104335156A (en) * 2012-06-05 2015-02-04 索尼公司 Information processing device, information processing method and recording medium upon which computer program has been recorded
JP5984718B2 (en) * 2013-03-04 2016-09-06 三菱電機株式会社 In-vehicle information display control device, in-vehicle information display device, and information display control method for in-vehicle display device
JP5950851B2 (en) * 2013-03-04 2016-07-13 三菱電機株式会社 Information display control device, information display device, and information display control method
JP5933468B2 (en) * 2013-03-04 2016-06-08 三菱電機株式会社 Information display control device, information display device, and information display control method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10269022A (en) * 1997-03-25 1998-10-09 Hitachi Ltd Portable information processor with communication function
JP4649932B2 (en) * 2004-09-30 2011-03-16 マツダ株式会社 Vehicle information display device
JP4479962B2 (en) * 2005-02-25 2010-06-09 ソニー エリクソン モバイル コミュニケーションズ, エービー Input processing program, portable terminal device, and input processing method

Also Published As

Publication number Publication date
JP2009259110A (en) 2009-11-05


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110224

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111221

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120221

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120406

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20121002

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20121030

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

Ref document number: 5127547

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20151109

Year of fee payment: 3

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350