WO1998030952A1 - Processeur informatique présentant une caractéristique d'interface utilisateur graphique - Google Patents

Processeur informatique présentant une caractéristique d'interface utilisateur graphique

Info

Publication number
WO1998030952A1
WO1998030952A1 PCT/JP1998/000111 JP9800111W WO9830952A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
cursor
stimulus
information
pointing operation
Prior art date
Application number
PCT/JP1998/000111
Other languages
English (en)
Japanese (ja)
Inventor
Toyotaro Tokimoto
Masatoshi Oishi
Original Assignee
Avix Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avix Inc. filed Critical Avix Inc.
Priority to AU54954/98A priority Critical patent/AU5495498A/en
Publication of WO1998030952A1 publication Critical patent/WO1998030952A1/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks

Definitions

  • The present invention relates to an information processing apparatus such as a personal computer or a workstation, and more specifically to an information processing apparatus characterized by the information input/output process between the display output screen and the pointing operation device in a graphical user interface. Background Art
  • In a graphical user interface (GUI), the user gives operation instructions to image elements such as icons and dialog boxes arranged on the screen.
  • Most operations can be performed simply by moving the cursor displayed on the screen, overlaying it on the target image element, and clicking or dragging.
  • a pointing operation device is indispensable for moving the cursor.
  • the mouse is the most popular.
  • The mouse is small enough to be held in the palm of the hand and is enclosed in a flat case.
  • a ball is built into the bottom.
  • the top of the case has one to three buttons.
  • a cable is pulled out from the front of the case, and the end is connected to a personal computer.
  • The rolling of the ball is detected by a movement detection unit such as a rotary encoder.
  • An electronic circuit such as a microcomputer inside the mouse converts this movement state into a movement signal that can be processed as a movement direction and a movement distance on the PC side.
  • the microcomputer also generates an event signal according to the pressing state of the button and controls the output of its own peripheral device ID signal. These signals are output to the cable via the serial interface.
  • The PC receives the mouse output signal via the serial interface at every sampling interval. When a movement signal is received, the coordinate data indicating the display position of the cursor is updated and the cursor is redrawn on the screen at the new coordinates. When an event signal is received, the PC performs the corresponding information processing and outputs the result to the display.
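For illustration only, this conventional receive-and-update cycle might be sketched in Python as follows; the class, function names, and screen resolution are assumptions and are not taken from the patent:

```python
# Sketch of the PC-side handling described above: a movement signal updates the
# cursor coordinates and the cursor is redrawn; an event signal triggers the
# corresponding processing.

SCREEN_W, SCREEN_H = 1024, 768          # assumed display resolution

class CursorState:
    def __init__(self):
        self.x, self.y = SCREEN_W // 2, SCREEN_H // 2

    def on_move_signal(self, dx, dy):
        # Update and clamp the coordinate data indicating the cursor position.
        self.x = min(max(self.x + dx, 0), SCREEN_W - 1)
        self.y = min(max(self.y + dy, 0), SCREEN_H - 1)
        print(f"redraw cursor at ({self.x}, {self.y})")   # stand-in for display output

    def on_event_signal(self, button, pressed):
        # Stand-in for the information processing performed on a button event.
        print(f"button {button} {'down' if pressed else 'up'} at ({self.x}, {self.y})")

cursor = CursorState()
cursor.on_move_signal(12, -5)      # one sampled movement report
cursor.on_event_signal(1, True)    # left-button press event
```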
  • The user visually locates the target icon and performs the various input operations while visually guiding the mouse cursor onto it.
  • Information processing results for the input operation are output to the display.
  • the output information is visually confirmed again.
  • the user performs information input / output operations while visually confirming them on the display.
  • The display is usually a high-definition display suited to the GUI.
  • A high-definition display can present a large amount of information, but each image element on the screen is small. The user must therefore watch the screen closely in order to place the cursor on the target image element, which puts a heavy burden on the user's eyes. Further miniaturization of the image elements demands delicate finger movements when moving the mouse, and the resulting fatigue in the user's arms and shoulders cannot be neglected. It can also interfere with accurate cursor pointing and cause unnecessary operation commands to be input to unintended image elements.
  • An object of the present invention is to provide an information processing apparatus that reduces the physical burden on a user operating in a GUI environment and offers a comfortable operating environment and accurate cursor pointing.
  • The information processing apparatus of the present invention has the following functions and controls in addition to the functions and information input/output control of a conventional GUI environment.
  • The pointing operation device used incorporates stimulus generating means for stimulating the fingers of the person operating it.
  • The image source information from which the screen shown on the display is generated and the coordinate data of the cursor are compared sequentially, and a drive signal for the stimulus generating means is generated and supplied to the pointing operation device at a timing associated with the relative relationship between the plurality of image elements on the screen and the display position of the cursor. As a result, the state of the screen can be confirmed through the stimulus.
  • A plurality of modes are set for the drive signal, and stimuli in modes corresponding to those drive-signal modes are given to the finger of the user.
  • The pointing operation device is a movable type in which the moving operation is performed through a mechanical movement input unit such as a mouse ball, and braking force generating means for resisting movement of the input unit is incorporated. A drive signal for the braking force generating means is then supplied in association with the relative relationship between the plurality of image elements on the screen and the display position of the cursor. As a result, the screen state can be confirmed through a feeling of resistance to the moving operation.
  • When the software has a function of managing the position information of the plurality of image elements on the screen, the above operation can be achieved by processing within that software.
  • the image state value at the pixel indicated by the cursor is extracted, and the difference between the image state values before and after the cursor is moved is calculated.
  • The information processing apparatus having the above-described GUI can also be achieved by incorporating into the operating system, in the form of a device driver, a peripheral management program that transmits a drive signal for the stimulus generating means when that difference exceeds a predetermined threshold value.
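As a rough sketch of the device-driver behaviour summarised above, the Python fragment below samples an image state value at the cursor pixel before and after a move and issues a stimulus drive signal when the change exceeds a threshold. The function names, the particular image state function (a plain RGB sum), and the threshold value are assumptions for illustration, not definitions from the patent:

```python
# Threshold test on the image state value at the pixel under the cursor.

THRESHOLD = 96                      # assumed threshold on the image state value

def image_state_value(framebuffer, x, y):
    # Here the image state value is simply the sum of the RGB gradation levels
    # at pixel (x, y); the patent leaves the exact function open.
    r, g, b = framebuffer[y][x]
    return r + g + b

def check_and_stimulate(framebuffer, prev_pos, new_pos, send_stimulus_signal):
    before = image_state_value(framebuffer, *prev_pos)
    after = image_state_value(framebuffer, *new_pos)
    if abs(after - before) > THRESHOLD:
        send_stimulus_signal()      # drive signal forwarded to the pointing device

if __name__ == "__main__":
    fb = [[(0, 0, 0)] * 4 for _ in range(4)]
    fb[1][2] = (255, 255, 255)      # a bright "image element" pixel
    check_and_stimulate(fb, (1, 1), (2, 1), lambda: print("stimulus signal"))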
  • FIG. 1 is a diagram showing a configuration of an information processing apparatus according to an embodiment of the present invention.
  • FIG. 2 shows an example of a pointing operation device in the information processing apparatus of the above embodiment, (A) is an external view thereof, and (B) is a configuration diagram thereof.
  • FIG. 3 is a schematic diagram of a graphical user interface in the information processing apparatus.
  • FIG. 4 is a diagram showing a screen displayed on the display when application software is executed on the information processing apparatus.
  • FIG. 5 is a diagram showing another example of the pointing operation device in the information processing device.
  • The basic configuration of an information processing apparatus 1 (hereinafter, the information processing apparatus 1) according to the present invention is the same as that of an information processing apparatus used in a conventional GUI environment, as shown in FIG. 1.
  • a display 50 as an output device and a mouse 100 as a pointing device are connected to a personal computer 10 as an information processing unit.
  • FIGS. 2(A) and 2(B) show an embodiment of the mouse (hereinafter referred to as the new mouse) 100 used in the information processing apparatus 1.
  • FIG. 2(A) is an external view of the new mouse.
  • The new mouse 100 has buttons 101a and 101b on top of a flat-bottomed case and a connection cable 102 at the front end. A ball is built into the bottom of the case.
  • The new mouse 100 also has a diaphragm 103, which applies a pulsed physical stimulus; it is installed in the rear part of the case, where the palm of the hand holding the new mouse 100 is always in contact.
  • (B) is a configuration diagram of the new mouse 100.
  • the microcomputer 120 including the serial interface section (hereinafter referred to as the mouse side IF) 121, the ball 107, and the movement detection section 108 have the same configuration as the conventional mouse.
  • In addition, the new mouse 100 incorporates stimulus generating means consisting of the diaphragm 103 and a piezoelectric vibrator 104 coupled to it, and braking force generating means consisting of a brake pad 105 and a laminated piezoelectric actuator 106 (hereinafter, the actuator) coupled to it.
  • the stimulus generating means gives a vibration stimulus to the hand of the person holding the new mouse 100 by transmitting the vibration of the piezoelectric vibrator to the diaphragm.
  • The braking force generating means presses the brake pad 105 against the ball 107 through the downward displacement of the actuator 106, giving a sense of resistance to the user's moving operation.
  • The microcomputer 120 also controls the driving of the stimulus generating means and the braking force generating means according to a drive signal for the stimulus generating means (hereinafter, the stimulus signal) or a drive signal for the braking force generating means (hereinafter, the braking signal) received from the personal computer via the mouse-side IF 121.
  • The power source of the stimulus generating means and the braking force generating means is not limited to piezoelectric elements; vibrations and displacements may also be generated by, for example, an eccentric cam fixed to the rotation shaft of a motor.
  • FIG. 3 shows an outline of the GUI (hereinafter referred to as the new GUI) realized by the information processing apparatus according to the present invention.
  • a plurality of icons are appropriately arranged on a screen 51 displayed on the display 50.
  • a cursor 53 is also displayed on the screen.
  • The cursor 53 is moved along the arrow 54.
  • On the locus of the cursor 53, there are icons 52a and 52b.
  • The personal computer 10 supplies a stimulus signal to the new mouse at the moment the cursor overlaps icon 52a or 52b. Following the trajectory of the cursor further, it heads toward the edge of the screen; when the cursor can no longer be moved, the personal computer supplies a braking signal to the new mouse.
  • The mode of the stimulus signal can be changed as appropriate according to the image element, for example by supplying stimulus signals of different modes to the new mouse for different image elements.
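The per-element hit-testing and edge braking described for FIG. 3 could be pictured with the following Python sketch; the icon rectangles, stimulus mode names, and screen size are invented for illustration:

```python
# Each cursor update, compare the cursor position with the icon rectangles and
# the screen edge: overlapping an icon yields a stimulus signal with a
# per-element mode, and reaching the screen edge yields a braking signal.

SCREEN_W, SCREEN_H = 1024, 768

ICONS = {                          # hypothetical layout: name -> (x, y, w, h, stimulus mode)
    "icon_52a": (100, 120, 32, 32, "short_pulse"),
    "icon_52b": (400, 300, 32, 32, "double_pulse"),
}

def signals_for_cursor(x, y):
    signals = []
    for name, (ix, iy, w, h, mode) in ICONS.items():
        if ix <= x < ix + w and iy <= y < iy + h:
            signals.append(("stimulus", mode, name))
    if x in (0, SCREEN_W - 1) or y in (0, SCREEN_H - 1):
        signals.append(("brake", "full", "screen_edge"))
    return signals

print(signals_for_cursor(110, 130))   # -> stimulus with the mode assigned to icon_52a
print(signals_for_cursor(0, 500))     # -> braking signal at the left screen edge
```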
  • FIG. 4 shows an example of the display state when the information processing apparatus 1 executes software compatible with the new GUI.
  • the new GUI compatible software being executed is spreadsheet software.
  • The cursor 53 and the execution window 55 of the new GUI compatible software are displayed on the screen 51.
  • the new GUI-compatible software manages image source information.
  • The image source information in this case consists of the attributes of all the image elements in the execution window 55, that is, the type of each image element (execution window 55, cell 56, button 57, and so on), its coordinate data, color data, size, and the like. The attributes of the image elements and the position information of the cursor 53 are monitored sequentially.
  • The cursor 53 is moved along the arrow 58. Following the trajectory of the cursor 53, it crosses the boundaries of the cells 56 several times. Since the positions of these boundaries are under the control of the new GUI compatible software, a stimulus signal is generated at the moment the cursor 53 crosses a cell boundary. When the cursor approaches the toolbar area 59 where the buttons 57 are arranged, a stimulus signal of a mode different from that used at the cell boundaries is transmitted. Further along the arrow 58, its tip points outward from the execution window 55, and a braking signal is generated for movement of the cursor 53 outside the execution window 55.
  • Similarly, a braking signal is generated when the cursor crosses a window frame, for example when the cursor in one window is moved to another window. Once the window frame has been crossed, transmission of the braking signal stops and the smooth cursor operation feeling returns.
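A minimal sketch of the checks described for the spreadsheet example might look as follows; the window, toolbar, and cell geometry are assumed values rather than anything specified in the patent:

```python
# Stimulus when the cursor crosses a cell boundary, a different stimulus mode
# inside the toolbar band, and braking when the cursor would leave the window.

WINDOW = (50, 50, 800, 600)          # assumed execution window: x, y, width, height
TOOLBAR_HEIGHT = 40                  # assumed toolbar band at the top of the window
CELL_W, CELL_H = 64, 20              # assumed spreadsheet cell size

def cell_of(x, y):
    return (x - WINDOW[0]) // CELL_W, (y - WINDOW[1] - TOOLBAR_HEIGHT) // CELL_H

def signal_for_move(prev, new):
    px, py = prev
    nx, ny = new
    wx, wy, ww, wh = WINDOW
    if not (wx <= nx < wx + ww and wy <= ny < wy + wh):
        return ("brake", "window_edge")               # movement outside the window
    if wy <= ny < wy + TOOLBAR_HEIGHT:
        return ("stimulus", "toolbar_mode")           # approaching the button area
    if cell_of(px, py) != cell_of(nx, ny):
        return ("stimulus", "cell_boundary_mode")     # crossed a cell boundary
    return None

print(signal_for_move((100, 200), (170, 200)))   # crosses into the next cell column
print(signal_for_move((100, 200), (100, 60)))    # enters the toolbar band
print(signal_for_move((100, 200), (10, 200)))    # would leave the window -> brake
```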
  • the new GUI environment can be changed according to the user's preference.
  • For example, the mode of the stimulus to be generated can be changed according to the type and color of the image element, and the braking signal can be controlled so that, instead of braking being applied suddenly at the edge of the screen or window, the braking force increases gradually from the area around the edge.
  • the level of the braking force and the position where the braking force starts to be generated can also be set as appropriate.
  • the above is the embodiment in the case where the application software corresponding to the new GUI is executed.
  • Next, a case will be described in which the new GUI is realized under an operating system (OS) that itself provides a GUI, taking Microsoft Corporation's Windows 95 (hereinafter, Win95) as an example of such an OS; here a device driver for the new mouse is incorporated into the OS.
  • This device driver controls the communication with the new mouse and sequentially monitors the attributes of the image elements managed by Win95 and the position information of the cursor. It also sets the new GUI environment and controls the timing of drive signal generation and transmission.
  • The device driver is installed using the plug and play capability of Win95, in the same way as other Win95-compatible peripheral devices.
  • Win95 requests the peripheral device to transmit a peripheral device identification ID.
  • the device driver for the new mouse is stored in an appropriate memory area and incorporated into Win95. This allows the control of the new GUI.
  • the new GUI has been initialized in advance, but fine-grained settings can be made by user input operations.
  • the stimulus mode can be selected according to the type and color of the image element, and can be set according to the user's preference.
  • Next, a technique will be described that can achieve the new GUI even when the OS, such as MS-DOS, does not assume a GUI and the application software does not support the new GUI.
  • In this case, the new GUI is achieved by incorporating into the OS a device driver different from the Win95 device driver described above.
  • This device driver monitors bit map information and cursor position information in the computer's video memory sequentially.
  • the bitmap information in the video memory corresponds to the above-mentioned image source information.
  • The image state value at the pixel indicated by the cursor is calculated, and the image state values before and after the cursor moves are compared. When the difference between the compared image state values exceeds a predetermined threshold value, control is performed to transmit a stimulus signal.
  • The image state value is calculated by extracting the color data at the pixel indicated by the cursor (hereinafter referred to as the hot point) from the bitmap information in the video memory.
  • This color data indicates the gradation level of the brightness of each of the three primary colors, red, green and blue (RGB), when the pixel is shown on the display.
  • Computers can output the three primary colors of red, green, and blue (RGB) to the display using 256 levels for each color.
  • The number of colors that can actually be shown depends on the number of display pixels of the display and the amount of video memory in the personal computer. The number of gradation levels is therefore reduced by omitting intermediate levels from the 256, which decreases the number of colors that can actually be displayed.
  • the number of gradations for each primary color differs depending on the selected color palette. Since the technical contents in this area are not essentially related to the present invention, they will not be described in detail. An example for calculating the image state value will be described below.
  • Let (Rxy, Gxy, Bxy) be the gradation levels of the RGB colors at the coordinates (x, y) in the video memory, which correspond to the position of a pixel on the display, and let the image state value at that pixel be g(x, y), a function of x and y.
  • If the cursor is at coordinates (x1, y1) before a move and at (x2, y2) after it, a difference such as D = |Rx1y1 - Rx2y2| + |Gx1y1 - Gx2y2| + |Bx1y1 - Bx2y2| is calculated, and a stimulus signal is generated when D exceeds a predetermined threshold value.
  • the threshold value is not limited to one.
  • the mode of the stimulus signal may be changed according to each of the above cases. Further, the color data may be used as it is as the image state value. In this case, it is possible to change the mode of the stimulus signal depending on the color, or to generate the stimulus signal when the image element has a specific color. Further, even with a small number of display colors, multicolor display can be visually performed by appropriately dispersing the display colors in adjacent pixels. In such a case, the color data of a plurality of pixels adjacent to the hot point can be used as an image state value by taking an average value.
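Putting these pieces together, an image state value derived from the RGB gradation levels at the hot point, with optional averaging over neighbouring pixels and several thresholds mapped to stimulus modes, could be sketched as below; the thresholds, mode names, and the particular state function are assumptions:

```python
# Image state value from the bitmap data around the hot point, compared across
# a cursor move against a list of thresholds to pick a stimulus mode.

THRESHOLDS = [(200, "strong_mode"), (60, "weak_mode")]   # assumed threshold/mode pairs

def rgb_at(bitmap, x, y):
    return bitmap[y][x]

def image_state_value(bitmap, x, y, radius=0):
    # Average the RGB sums over a small neighbourhood around the hot point,
    # which also covers the dithered multicolour display mentioned above.
    values, h, w = [], len(bitmap), len(bitmap[0])
    for yy in range(max(0, y - radius), min(h, y + radius + 1)):
        for xx in range(max(0, x - radius), min(w, x + radius + 1)):
            r, g, b = rgb_at(bitmap, xx, yy)
            values.append(r + g + b)
    return sum(values) / len(values)

def stimulus_mode(bitmap, prev_pos, new_pos):
    d = abs(image_state_value(bitmap, *new_pos) - image_state_value(bitmap, *prev_pos))
    for threshold, mode in THRESHOLDS:
        if d > threshold:
            return mode
    return None

bitmap = [[(0, 0, 0)] * 3, [(0, 0, 0), (255, 255, 255), (0, 0, 0)], [(0, 0, 0)] * 3]
print(stimulus_mode(bitmap, (0, 0), (1, 1)))   # -> "strong_mode"
```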
  • In this way, the new GUI environment can be provided even under an OS that does not assume a GUI, and even when the application software does not support the new GUI.
  • A mode in which a braking signal is sent can also be achieved: the coordinate data of the pixels at the edge of the display screen is obtained from the screen information in the video memory, and it is determined whether the cursor position before or after a move lies beyond that edge. In this case as well, the braking force can be changed so that it gradually increases as the cursor moves from near the edge toward the edge of the screen.
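The gradually increasing braking force near the screen edge could, for example, take the shape sketched below; the band width and the linear ramp are assumptions, not a formula from the patent:

```python
# Braking level ramps up from zero as the cursor enters a band near the screen
# edge and reaches full braking at the edge itself.

SCREEN_W, SCREEN_H = 1024, 768
EDGE_BAND = 40                      # assumed width of the ramp-up region in pixels

def braking_level(x, y):
    # Distance from the cursor to the nearest screen edge.
    distance = min(x, y, SCREEN_W - 1 - x, SCREEN_H - 1 - y)
    if distance >= EDGE_BAND:
        return 0.0                  # far from the edge: no braking
    return 1.0 - distance / EDGE_BAND   # 0.0 at the band boundary, 1.0 at the edge

print(braking_level(512, 384))   # centre of the screen -> 0.0
print(braking_level(10, 384))    # near the left edge   -> 0.75
```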
  • Incorporation into the operating system is similar to embedding other device drivers.
  • If the operating system is MS-DOS, the file name of the device driver is simply written into the config.sys file using an appropriate text editor, and MS-DOS is then restarted.
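For reference, registering a DOS device driver in this way amounts to adding a single line such as the following to config.sys; the drive, path, and file name shown are purely hypothetical:

```
DEVICE=C:\DRIVERS\NEWMOUSE.SYS
```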
  • the control signal output from the personal computer is ignored, so it can be used as before.
  • In the embodiments described above, the pointing operation device takes the form of the mouse 100, but other forms of pointing operation device may also be adopted, such as a joystick, track pad, graphics tablet puck or stylus pen, or trackball.
  • FIGS. 5(A) and 5(B) show an example in which the form of a trackball 100b is adopted.
  • (A) is an external view, and (B) shows an outline of the configuration.
  • the user inputs movement information by rotating the ball 107 with a finger.
  • Parts given the same reference numerals in the figure perform the same functions and operations as those of the mouse 100 described above.
  • Here the stimulating section 103 is provided on the button 101a, but it is not limited to this position and may be placed anywhere the user's finger touches. Industrial Applicability
  • The positions of the plurality of image elements arranged on the screen can be conveyed to the user operating the pointing operation device through stimuli other than visual ones. The burden on the user's eyes is therefore reduced even in a GUI whose screen contains fine image elements. Accurate pointing also becomes possible, and the burden on the user's body, especially the shoulders and arms, is reduced. This provides the user with a comfortable and accurate GUI environment.
  • According to the second invention, by setting a plurality of stimuli to be given to the user, different stimuli can be given according to the state of the screen and of the image elements. The stimulus can therefore be varied for each screen state and each type of image element, and the screen state can be grasped with a sense other than sight. In addition, the settings can be customized to the user's preference, achieving an even better GUI environment.
  • The screen state can also be confirmed by applying a brake to the user's moving operation of the pointing operation device. When the cursor on the screen reaches a movement limit, such as the edge of the screen, the user can therefore be notified of the screen state through a feeling of resistance to the moving operation of the pointing device.
  • The user's sense thus becomes more closely integrated with the screen, and the size of the display screen can be grasped intuitively. Furthermore, since braking is applied according to the type of image element, there is no need to slow the cursor down just before the target image element during a moving operation; the cursor can be placed on the image element quickly. Accuracy also improves when the image elements are scattered irregularly over the screen.
  • The GUI environment described above can be provided simply by connecting the pointing operation device of the present invention to a general-purpose personal computer.
  • The user can confirm the screen state and the positions of the image elements by means other than sight. There is therefore no need to prepare expensive dedicated software or a personal computer running a specific operating system, and the same GUI operation environment can be provided simply by adding the pointing operation device according to the present invention to an existing system, without imposing an excessive economic burden on the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An information processor that reduces the physical strain on a user operating it in a GUI environment, making that environment comfortable and increasing pointing accuracy. The processor is characterized in that its graphical user interface has a pointing device incorporating means for generating a physical stimulus and transmitting it to the user's finger; the image source information, from which the image displayed on the screen is generated, is compared with the coordinate data of the cursor that moves across the screen according to the movement information from the pointing device; and drive signals for driving the stimulus generating means are produced at a timing related to the relative relationship between a plurality of image elements on the screen and the display position of the cursor, and are transmitted to the pointing device.
PCT/JP1998/000111 1997-01-14 1998-01-14 Processeur informatique presentant une caracteristique d'interface utilisateur graphique WO1998030952A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU54954/98A AU5495498A (en) 1997-01-14 1998-01-14 Information processor having characteristic in graphical user interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP9004776A JPH10198513A (ja) 1997-01-14 1997-01-14 グラフィカル・ユーザーインタフェースに特徴を有する情報処理装置
JP9/4776 1997-01-14

Publications (1)

Publication Number Publication Date
WO1998030952A1 true WO1998030952A1 (fr) 1998-07-16

Family

ID=11593244

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1998/000111 WO1998030952A1 (fr) 1997-01-14 1998-01-14 Processeur informatique presentant une caracteristique d'interface utilisateur graphique

Country Status (4)

Country Link
JP (1) JPH10198513A (fr)
AU (1) AU5495498A (fr)
TW (1) TW394885B (fr)
WO (1) WO1998030952A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002211019A (ja) * 2001-01-15 2002-07-31 Fuji Photo Film Co Ltd 感熱プリンタ
JPWO2003081413A1 (ja) * 2002-03-26 2005-07-28 箕輪興亜株式会社 電子機器用情報入力装置
JP2008257294A (ja) * 2007-03-30 2008-10-23 Tokyo Institute Of Technology 触覚刺激生成方法
JP4715867B2 (ja) * 2008-06-04 2011-07-06 株式会社デンソー 操作表示制御システム。
JP2010157031A (ja) * 2008-12-26 2010-07-15 Nippon Telegr & Teleph Corp <Ntt> 情報処理装置とその操作支援方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04286015A (ja) * 1991-03-15 1992-10-12 Fujitsu Ltd マウスを用いた位置指示方式
JPH06102997A (ja) * 1992-09-24 1994-04-15 Hitachi Ltd ポインティングデバイス
JPH06202801A (ja) * 1991-06-04 1994-07-22 Agency Of Ind Science & Technol 電子制御機器におけるマウス型入力装置
JPH0681035U (ja) * 1993-04-15 1994-11-15 ブラザー工業株式会社 マウス
JPH08305490A (ja) * 1995-05-11 1996-11-22 Sharp Corp 反力出力型2次元座標入力装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04286015A (ja) * 1991-03-15 1992-10-12 Fujitsu Ltd マウスを用いた位置指示方式
JPH06202801A (ja) * 1991-06-04 1994-07-22 Agency Of Ind Science & Technol 電子制御機器におけるマウス型入力装置
JPH06102997A (ja) * 1992-09-24 1994-04-15 Hitachi Ltd ポインティングデバイス
JPH0681035U (ja) * 1993-04-15 1994-11-15 ブラザー工業株式会社 マウス
JPH08305490A (ja) * 1995-05-11 1996-11-22 Sharp Corp 反力出力型2次元座標入力装置

Also Published As

Publication number Publication date
AU5495498A (en) 1998-08-03
JPH10198513A (ja) 1998-07-31
TW394885B (en) 2000-06-21

Similar Documents

Publication Publication Date Title
JP3753744B2 (ja) アイソトニック及びアイソメトリックの機能を有する力フィードバックインタフェース
US7131073B2 (en) Force feedback applications based on cursor engagement with graphical targets
EP1066616B1 (fr) Molette de commande à retour d'effort
US6061004A (en) Providing force feedback using an interface device including an indexing function
US6906697B2 (en) Haptic sensations for tactile feedback interface devices
US8638315B2 (en) Virtual touch screen system
EP1036390B1 (fr) Méthode de commande d'un dispositif à retour de force dans un environnement d'hôte graphique multitaches
US20200310561A1 (en) Input device for use in 2d and 3d environments
US7379048B2 (en) Human-computer interface including efficient three-dimensional controls
US10474238B2 (en) Systems and methods for virtual affective touch
EP3333674A1 (fr) Systèmes et procédés pour la simulation d'élasticité avec rétroaction haptique
US8327294B2 (en) Method and system to reduce workload and skills required in usage of mouse or other pointing devices
EP3367216A1 (fr) Systèmes et procédés pour toucher affectif virtuel
WO1998030952A1 (fr) Processeur informatique présentant une caractéristique d'interface utilisateur graphique
WO2002057885A2 (fr) Commande d'effet de retour haptique pour l'amélioration de la navigation dans un environnement graphique
JP3084433U (ja) 触感マウス装置
JP2001092579A (ja) 情報表示装置
van Dam Post-Wimp user interfaces: The human connection
JPH08305490A (ja) 反力出力型2次元座標入力装置
JP3465259B2 (ja) リモートコントロールシステム
JP2021093003A (ja) 力覚提示装置、力覚提示方法、及びプログラム

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA CN KR SG US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase