CN101419528B - Data processing device - Google Patents

Data processing device

Info

Publication number
CN101419528B
CN101419528B (application CN2008101729066A)
Authority
CN
China
Prior art keywords
user
unit
visual impairment
video data
button
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2008101729066A
Other languages
Chinese (zh)
Other versions
CN101419528A (en)
Inventor
藤下真弘
前平洋利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brother Industries Ltd
Original Assignee
Brother Industries Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2007-305558A (external priority, granted as JP5092713B2)
Application filed by Brother Industries Ltd
Publication of CN101419528A (application publication)
Application granted
Publication of CN101419528B (granted patent publication)
Legal status: Active (current)
Anticipated expiration

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 20/00: Technologies relating to chemical industry
    • Y02P 20/50: Improvements relating to the production of bulk chemicals
    • Y02P 20/52: Improvements relating to the production of bulk chemicals using catalysts, e.g. selective catalysts

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A data processing device includes a displaying unit, a user type determining unit, and a display controlling unit. The displaying unit displays display data including first display data for visually impaired users and second display data for sighted users. The user type determining unit determines whether a user is visually impaired or sighted. The display controlling unit controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted.

Description

Data processing device
Cross-reference to related applications
This application claims priority from the following applications: Japanese Patent Application No. 2007-276110, filed on October 24, 2007, and Japanese Patent Application No. 2007-305558, filed on November 27, 2007. The entire contents of each of these priority applications are incorporated herein by reference.
Technical field
The present invention relates to a data processing device and a data processing system.
Background
A conventional data processing program known as a screen reader converts text data displayed on a personal computer (hereinafter simply "PC") into audible sound in order to help visually impaired people use an ordinary PC. One such data processing program designed to support visually impaired users of a PC is disclosed in Japanese Unexamined Patent Application Publication No. 2002-288071. This program provides a display area for acquiring and displaying World Wide Web (WWW) data within the operation screen of a browser. The display area is used to display a menu or other data containing links, so that a visually impaired user can acquire WWW data through simple operations. When the browser acquires menu data intended for visually impaired users in addition to the WWW data, the program can display this menu data in the operation screen of the browser, making the browser more user-friendly for visually impaired users. The menu data for visually impaired users displayed in the browser window is expressed as sound, while the screen reader simultaneously expresses the WWW data as sound.
However, in the technique of Japanese Unexamined Patent Application Publication No. 2002-288071, the menu items for visually impaired users displayed in the browser are also included in the WWW data displayed on the browser. The user therefore hears the same data repeated by the browser and the screen reader, making it more difficult to understand what information is displayed on the operation screen of the browser. Alternatively, if the browser provides both an operation screen for visually impaired users, which is easy for visually impaired people to use, and an operation screen for sighted users, which is easy for sighted people to view, the user is required to perform an operation to select or switch to the desired operation screen. This is particularly inconvenient for visually impaired users, for whom each operation is time-consuming.
Summary of the invention
In view of the foregoing, it is an object of the present invention to provide a data processing device capable of displaying, from among a plurality of operation screens including a screen for visually impaired users and a screen for sighted users, an operation screen that is user-friendly for visually impaired users.
In order to attain the above and other objects, the invention provides a data processing device including a displaying unit, a user type determining unit, and a display controlling unit. The displaying unit displays display data including first display data for visually impaired users and second display data for sighted users. The user type determining unit determines whether a user is visually impaired or sighted. The display controlling unit controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted.
According to another aspect, the invention provides a data processing method including: determining whether a user is visually impaired or sighted; displaying first display data for visually impaired users when the user is determined to be visually impaired; and displaying second display data for sighted users when the user is determined to be sighted.
According to another aspect, the invention provides a computer-readable recording medium storing a data processing program, the data processing program including instructions for: determining whether a user is visually impaired or sighted; displaying first display data for visually impaired users when the user is determined to be visually impaired; and displaying second display data for sighted users when the user is determined to be sighted.
According to another aspect, the invention provides a data processing system including a first data processing device and a second data processing device. The first data processing device includes a displaying unit that displays display data including first display data for visually impaired users and second display data for sighted users. The second data processing device includes a user type determining unit and a display controlling unit. The user type determining unit determines whether a user is visually impaired or sighted. The display controlling unit controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted.
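As an informal illustration only, and not part of the claimed embodiments, the arrangement of the displaying unit, user type determining unit, and display controlling unit described above can be sketched in Python as follows; the class and method names are assumptions introduced for this sketch.

```python
from dataclasses import dataclass

@dataclass
class DisplayData:
    for_visually_impaired: str   # first display data
    for_sighted: str             # second display data

class UserTypeDeterminingUnit:
    """Decides whether the current user is visually impaired or sighted."""
    def is_visually_impaired(self) -> bool:
        raise NotImplementedError  # e.g. inferred from PC settings and input behavior

class DisplayControllingUnit:
    """Chooses which display data the displaying unit is made to show."""
    def __init__(self, determiner: UserTypeDeterminingUnit):
        self.determiner = determiner

    def select(self, data: DisplayData) -> str:
        if self.determiner.is_visually_impaired():
            return data.for_visually_impaired   # first display data
        return data.for_sighted                 # second display data
```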
Brief description of the drawings
The particular features and advantages of the invention, as well as other objects, will become apparent from the following description taken in connection with the accompanying drawings, in which:
Fig. 1 is a block diagram showing the electrical structure of a PC according to a first embodiment of the present invention;
Fig. 2A is a screenshot showing an example of a UI for visually impaired users;
Fig. 2B is a screenshot showing an example of a UI for sighted users;
Fig. 3A is a flowchart illustrating steps in an initial setting process executed on the PC;
Fig. 3B is a screenshot showing an example of a dialog box;
Fig. 4 is a flowchart illustrating steps in a main process executed on the PC;
Fig. 5 is a flowchart illustrating steps in a user type determination process;
Fig. 6A is a flowchart illustrating steps in a usage state acquisition process;
Fig. 6B is a flowchart illustrating steps in an operation acquisition process;
Fig. 7 is a flowchart illustrating steps in a screen switching process A;
Fig. 8 is a flowchart illustrating steps in a screen switching process B;
Fig. 9A is a screenshot showing an example of a dialog box for visually impaired users;
Fig. 9B is a screenshot showing an example of a dialog box for sighted users;
Fig. 10 is a flowchart illustrating steps in a usage state acquisition process according to a first variation of the first embodiment;
Fig. 11A is a flowchart illustrating steps in a usage state acquisition process according to a second variation of the first embodiment;
Fig. 11B is a flowchart illustrating steps in an operation acquisition process according to a third variation of the first embodiment;
Fig. 11C is a flowchart illustrating steps in an operation acquisition process according to a fourth variation of the first embodiment;
Fig. 12 is a flowchart illustrating steps in a user type determination process according to a fifth variation of the first embodiment;
Fig. 13A is a screenshot showing an example of a UI for visually impaired users displayed on an LCD;
Fig. 13B is a screenshot showing an example of a UI for sighted users displayed on the LCD;
Fig. 14A is a flowchart illustrating steps in a function execution process executed by a PC according to a second embodiment;
Fig. 14B is a screenshot showing an example of a scanner function operation screen for visually impaired users displayed on the LCD;
Fig. 15A is a flowchart illustrating steps in a function execution process B according to the second embodiment;
Fig. 15B is a screenshot showing an example of a scanner function operation screen for sighted users displayed on the LCD;
Fig. 16A is a block diagram showing the electrical structure of a PC according to a third embodiment;
Fig. 16B is a flowchart illustrating steps in a user type monitoring process executed by the PC according to the third embodiment; and
Fig. 17 is a flowchart illustrating steps in an application startup process executed by the PC according to the third embodiment.
Embodiment
A data processing device according to embodiments of the present invention will be described while referring to the accompanying drawings, wherein like parts and components are designated by the same reference numerals to avoid duplicating description.
First, a data processing device according to a first embodiment of the present invention will be described with reference to Figs. 1 through 9B.
In the first embodiment, an application program X is installed on a PC 1. The application program X has a screen switching function for switching automatically between an operation screen for visually impaired users and an operation screen for sighted users. While running, the application program X determines whether the user of the PC 1 is visually impaired or sighted and switches its operation screen to the screen suited to that user.
Fig. 1 is a block diagram showing the electrical structure of the PC 1 on which the application program X according to the first embodiment is installed.
When the application program X is running on the PC 1 and displaying the operation screen that is easy for sighted users to use, the application program X can switch the operation screen to the screen that benefits visually impaired users when the user performs input operations. This configuration is convenient for visually impaired users, for whom each operation is time-consuming.
The application program X can use the various functions possessed by a multifunction peripheral (MFP) 100 connected to the PC 1 via a LAN 200, these functions including a facsimile function, a printer function, a scanner function, and a copier function. The MFP 100 is also connected to a telephone network 300 required for transmitting data with the facsimile function.
The PC 1 is mainly configured of a CPU 11, a ROM 12, a RAM 13, a hard disk 14, a keyboard 15, a mouse 16, an LCD 17, a speaker 18, a timing circuit 19, and a LAN interface 20, all of which are interconnected via a bus 26.
The CPU 11 controls the various functions possessed by each component of the PC 1 connected to the bus 26, according to fixed values and programs stored in the ROM 12, RAM 13, and hard disk 14, or according to signals exchanged via the LAN interface 20.
The ROM 12 is a non-rewritable memory storing control programs executed on the PC 1. The RAM 13 is a rewritable volatile memory that the CPU 11 uses to temporarily store various data while executing operations on the PC 1.
The RAM 13 is provided with a condition matching flag storage area 13a, a key operation flag storage area 13b, a mouse operation flag storage area 13c, a condition matching count storage area 13d, a key input count storage area 13e, a mouse input count storage area 13f, a mouse-over button name storage area 13g, and a user type storage area 13h.
The condition matching flag storage area 13a stores a condition matching flag indicating whether the user settings on the PC 1 include settings intended to assist visually impaired users. PC settings designed to assist visually impaired users may include, for example, settings made when installing software designed to help visually impaired people use a PC, or settings in the operating system (hereinafter simply "OS") for reducing the speed of the mouse cursor or increasing the size of text displayed on the LCD 17. The condition matching flag is set to "on" (for example, "1") when the above conditions are met, and is set to "off" (for example, "0") when the conditions are not met or when the flag is initialized.
The key operation flag storage area 13b stores a key operation flag indicating whether the user has performed a prescribed key operation on the keyboard 15 (for example, pressing a key). The key operation flag is set to "on" (for example, "1") when the user has performed the prescribed key operation on the keyboard 15, and is set to "off" (for example, "0") when the prescribed key operation has not yet been performed or when the flag is initialized.
The mouse operation flag storage area 13c stores a mouse operation flag indicating whether the user has performed a prescribed mouse operation with the mouse 16 (for example, a click). The mouse operation flag is set to "on" (for example, "1") when the user has performed the prescribed mouse operation with the mouse 16, and is set to "off" (for example, "0") when the prescribed mouse operation has not yet been performed or when the flag is initialized.
The condition matching count storage area 13d stores a count indicating the number of user settings on the PC 1 that were made to assist a visually impaired user. For example, the count stored in the condition matching count storage area 13d is incremented by "1" for each user setting on the PC designed to assist visually impaired users, such settings including the installation of software designed to help visually impaired people use a PC, an OS setting for reducing the mouse cursor speed, and an OS setting for increasing the size of text displayed on the LCD 17.
The key input count storage area 13e stores a count indicating the number of times the operator has performed prescribed key operations on the keyboard 15. For example, the count in the key input count storage area 13e is incremented by "1" each time the user performs a prescribed key operation on the keyboard 15.
The mouse input count storage area 13f stores a count indicating the number of times the user has performed prescribed mouse operations with the mouse 16. For example, the count stored in the mouse input count storage area 13f is incremented by "1" each time the user performs a prescribed mouse operation.
The mouse-over button name storage area 13g stores the names of prescribed buttons displayed on the LCD 17 over which the mouse cursor has passed, in the order in which the cursor passed over them. For example, when the mouse cursor passes over a button assigned the button name "Scan", the button name "Scan" is stored in the mouse-over button name storage area 13g. If the cursor subsequently passes over a button assigned the button name "Copy", the button name "Copy" is stored after "Scan", following the order in which the cursor passed over the buttons.
The user type storage area 13h stores a user type indicating whether the user is visually impaired or sighted. The type of user operating the PC 1 is determined in a user type determination process described later with reference to Fig. 5. If the user is determined in this process to be a visually impaired user, the user type storage area 13h stores a user type indicating a visually impaired user (for example, "1"). If the operator is determined to be a sighted user, a user type indicating a sighted user (for example, "2") is stored in the user type storage area 13h. The user type storage area 13h is initialized to "0".
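The storage areas 13a-13h described above amount to a small set of flags, counters, and an ordered list held in the RAM 13. The following Python sketch is a hypothetical model of that state; the field names and default values follow the description, but the representation itself is an assumption.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Ram13:
    condition_matching_flag: bool = False   # 13a: assistive settings detected on the PC 1
    key_operation_flag: bool = False        # 13b: prescribed key operation performed
    mouse_operation_flag: bool = False      # 13c: prescribed mouse operation performed
    condition_matching_count: int = 0       # 13d: number of assistive user settings
    key_input_count: int = 0                # 13e: prescribed key operations counted
    mouse_input_count: int = 0              # 13f: prescribed mouse operations counted
    mouse_over_button_names: List[str] = field(default_factory=list)  # 13g: in pass-over order
    user_type: int = 0                      # 13h: 0 = undetermined, 1 = visually impaired, 2 = sighted
```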
The hard disk 14 is a rewritable nonvolatile memory. Accordingly, data stored on the hard disk 14 is retained even after the power to the PC 1 is turned off. The hard disk 14 is provided with an OS storage area 14a, a screen reader storage area 14b, and an application program X storage area 14c.
The OS storage area 14a stores the OS and the OS settings. The OS is a basic program designed to manage the components connected to the bus 26 and is required when the operator uses the PC 1. For example, the OS manages the entire display area of the LCD 17 described below. The OS also provides a plurality of setting options associated with input operations, as described below; by selecting one of the choices offered for each setting, the user can customize the PC 1 for a more pleasant experience. For example, the user can set the speed of the mouse cursor displayed on the LCD 17 faster or slower than a prescribed default speed, and can set the size of text data displayed on the LCD 17 larger or smaller than a prescribed default text size. A program running on the PC 1 can display text, images, and the like within a limited display area on the LCD 17 allowed by the OS.
Accordingly, when a program running on the PC 1 is to display text or images on the LCD 17, the program must first request that the OS allocate a display area of the desired size. Hereinafter, a display area of the desired size allocated (permitted) by the OS will be called a window, and each program will be considered to acquire the window from the OS when the OS allocates the window. When a program running on the PC 1 acquires a window from the OS, the program can freely display text and images within the confines of that window.
Regardless of whether there are a plurality of programs running on the PC 1, each of which has acquired a window, or a single program that has acquired a plurality of windows, the OS manages each window independently so that each window can be displayed on the LCD 17.
When a program stops executing after acquiring a window, or when a window acquired from the OS becomes unnecessary, the program can remove the window from the LCD 17 by returning the acquired window to the OS. Hereinafter, the act of returning a window to the OS in order to remove the window from the LCD 17 will be referred to as closing the window.
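As a rough, hypothetical illustration of this acquire-and-close window protocol, the following Python sketch models a program requesting a window from the OS, drawing in it, and closing it; the interface shown is assumed for the sketch and is not the actual OS API.

```python
class Window:
    def __init__(self, width: int, height: int):
        self.width, self.height = width, height

    def draw_text(self, text: str) -> None:
        # Within the confines of its window, a program may freely display text and images.
        print(f"[window {self.width}x{self.height}] {text}")

class Os:
    """Hypothetical stand-in for the OS window management described above."""
    def __init__(self):
        self._windows = []

    def allocate_window(self, width: int, height: int) -> Window:
        window = Window(width, height)
        self._windows.append(window)   # the OS manages each window independently
        return window

    def close_window(self, window: Window) -> None:
        self._windows.remove(window)   # returning the window removes it from the LCD

# A program acquires a window, displays its operation screen in it, and closes it when done.
os_ = Os()
win = os_.allocate_window(640, 480)
win.draw_text("operation screen")
os_.close_window(win)
```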
As described above, the OS also provides a plurality of setting options associated with input operations; by selecting one of the choices offered for each setting, the user can customize the PC 1 for a more pleasant experience.
The screen reader storage area 14b stores a screen reader (voice output software) designed to convert text data displayed on the LCD 17 into sound. The screen reader is used to assist visually impaired users. When the user performs a key operation to select text data displayed on the LCD 17, the screen reader converts the selected text data into an audio signal and outputs the audio signal through the speaker 18. For example, if the user selects one of the buttons BT1-BT11 displayed on the LCD 17 (described later in greater detail with reference to Fig. 2A), the screen reader converts the button name of the selected button into an audio signal and audibly outputs the audio signal, enabling a visually impaired person to operate the PC 1 according to the sound.
The application program X storage area 14c stores the application program X, which facilitates use of the functions of the MFP 100 connected to the PC 1 via the LAN 200. Since the screen reader runs at the same time when a visually impaired user uses the application program X, the screen reader can convert text data displayed in the operation screens of the application program X into audio data outputted through the speaker 18. A visually impaired user can therefore operate the application program X according to the sound.
The application program X storage area 14c stores programs for executing the initial setting process shown in the flowchart of Fig. 3A, the main process shown in the flowchart of Fig. 4, the user type determination process shown in the flowchart of Fig. 5, the usage state acquisition process shown in the flowchart of Fig. 6A, the operation acquisition process shown in the flowchart of Fig. 6B, the screen switching process A shown in the flowchart of Fig. 7, and the screen switching process B shown in the flowchart of Fig. 8.
The application program X storage area 14c is provided with a visually impaired user UI storage area 14c1, a sighted user UI storage area 14c2, a switching flag storage area 14c3, a confirmation display flag storage area 14c4, and a UI type storage area 14c5.
The visually impaired user UI storage area 14c1 stores a user interface (UI) for visually impaired users (described later in greater detail with reference to Fig. 2A), which is an operation screen designed to facilitate input operations by visually impaired users. The sighted user UI storage area 14c2 stores a UI for sighted users (described later in greater detail with reference to Fig. 2B), which is an operation screen designed to facilitate input operations by sighted users.
The switching flag storage area 14c3 stores a switching flag indicating whether the operation screen displayed on the LCD 17 is to be switched to the UI for visually impaired users or to the UI for sighted users according to the characteristics of the user's input operations. When the switching flag is set to "on" (for example, "1"), the operation screen displayed on the LCD 17 is switched to the UI for visually impaired users or to the UI for sighted users according to the characteristics of input operations. If the switching flag is set to "off" (for example, "0"), the operation screen displayed on the LCD 17 is not switched.
The confirmation display flag storage area 14c4 stores a confirmation display flag indicating whether one of the dialog boxes described with reference to Figs. 9A and 9B is to be displayed on the LCD 17, before the operation screen on the LCD 17 is switched according to the characteristics of input operations, in order to confirm whether the user wishes to switch the operation screen.
When the operation screen displayed on the LCD 17 is to be switched according to the characteristics of the user's input operations, if the confirmation display flag is set to "on" (for example, "1"), dialog box B or C (see Figs. 9A and 9B) is displayed on the LCD 17 to confirm that the user wishes to switch the operation screen. If the operator inputs an instruction to switch the screen, the operation screen is then switched. However, if the confirmation display flag is set to "off" (for example, "0"), the dialog box B or C confirming whether the user wishes to switch the operation screen is not displayed, and the operation screen displayed on the LCD 17 is switched without requesting input from the operator.
The UI type storage area 14c5 stores a UI type indicating whether the UI for visually impaired users or the UI for sighted users is to be displayed on the LCD 17. When the UI type storage area 14c5 stores the UI type indicating the UI for visually impaired users (for example, "1"), the UI for visually impaired users is displayed on the LCD 17; when the UI type storage area 14c5 stores the UI type indicating the UI for sighted users (for example, "2"), the UI for sighted users is displayed on the LCD 17. The UI type stored in the UI type storage area 14c5 is initially set to "0". In other words, when the application program X is started, the operation screen corresponding to the UI type stored in the UI type storage area 14c5 is displayed on the LCD 17.
The keyboard 15 is an input device having a plurality of keys. When the operator presses these keys, input signals corresponding to the pressed keys are inputted into the PC 1. The mouse 16 is a pointing device used to indicate a display position with a mouse cursor displayed on the LCD 17. When the user moves the position of the mouse cursor with the mouse 16, an input signal corresponding to the amount of movement is inputted into the PC 1, and the display position of the mouse cursor is moved according to the input signal.
The mouse 16 has a mouse button that the user can operate at the position of the mouse cursor. When the user operates this button, a prescribed input signal is inputted into the PC 1. By operating the keyboard 15 and the mouse 16, the user can modify various settings in the OS and the like, and can use the application program X and the like.
The LCD 17 displays operating procedures and processing states during execution of operations, as well as data corresponding to keys pressed on the keyboard 15 and operations performed with the mouse 16. The speaker 18 outputs audio signals supplied by the screen reader as sound. The timing circuit 19 is a circuit well known in the art having a clock function for keeping the current time and date.
The LAN interface 20 is a circuit well known in the art for performing data communication with the various external devices connected to the LAN 200 by connecting one end of a LAN cable to a connection port of the LAN interface 20 and the other end to the LAN 200.
Next, the UI for visually impaired users and the UI for sighted users will be described with reference to Figs. 2A and 2B.
The UI for visually impaired users and the UI for sighted users are operation screens of the application program X. While the application program X is running, one of these operation screens is displayed on the LCD 17. By performing input operations in accordance with the UI for visually impaired users or the UI for sighted users, the user can use the various functions of the MFP 100.
In order to display one of the operation screens on the LCD 17, the application program X acquires from the OS a window corresponding to the shape of the operation screen and displays the operation screen in the acquired window. By performing input operations conforming to the UI for visually impaired users or the UI for sighted users within this window on the LCD 17, the user can use the functions of the MFP 100.
Fig. 2A is a screenshot showing an example of the UI for visually impaired users stored in the visually impaired user UI storage area 14c1, and Fig. 2B is a screenshot showing an example of the UI for sighted users stored in the sighted user UI storage area 14c2. The UI for visually impaired users has a limited number of images and a large amount of text data, enabling visually impaired users to perform all operations with the screen reader.
As shown in Fig. 2A, a selection box SB1 and a settings button BT1 are provided in the upper region of the UI for visually impaired users. The selection box SB1 displays text data indicating the MFP 100 that can be used with the application program X, and the settings button BT1 is used to set various options in the application program X.
On the left side, the UI for visually impaired users has: a Scan button BT2 for using the scanner function on the MFP 100; a Photo Media Capture button BT3 for using the photo media capture function on the MFP 100; a Copy button BT4 for using the copy function on the MFP 100; a PC-FAX button BT5 for using the PC-FAX function on the MFP 100; a Device Settings button BT6 for setting various operations in the MFP 100; and a Screen Switch button BT7 for switching the operation screen displayed on the LCD 17 to the UI for sighted users.
In the center region, the UI for visually impaired users includes: an Image button BT8 for displaying on the LCD 17 image data files scanned with the scanner function; an OCR button BT9 for using an OCR function to create text data files from image data files scanned with the scanner function; an E-mail button BT10 for sending e-mail; and a File button BT11 for displaying on the LCD 17 various files stored on the hard disk 14.
The Device Settings button BT6 is provided for making various settings in the MFP 100. The Screen Switch button BT7 is provided for switching the operation screen displayed on the LCD 17 to the UI for sighted users. The Screen Switch button BT7 is displayed on the LCD 17 when the UI for visually impaired users is displayed after execution of the process of S80 in the screen switching process A described later (see Fig. 7).
By providing the Screen Switch button BT7 in the UI for visually impaired users, a sighted user can switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users by operating this button, without having to wait for the CPU 11 to determine in the main process of Fig. 4, described later, whether the operator of the PC 1 is visually impaired or sighted. Hence, the addition of this button is convenient for sighted users.
In the first embodiment, the button name displayed on the Screen Switch button BT7 is formed of an image, so the screen reader cannot express the name as sound. Consequently, the presence of the Screen Switch button BT7 is not announced to visually impaired users. Since a visually impaired user cannot recognize the Screen Switch button BT7, the visually impaired user will not perform an input operation on the Screen Switch button BT7 that would switch the UI for visually impaired users to the UI for sighted users. This configuration therefore reduces the risk that a visually impaired user will mistakenly switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users.
Further, the Screen Switch button BT7 is configured to accept input operations only via the mouse 16. Since a visually impaired user has difficulty performing input operations with the mouse 16, the visually impaired user will not perform an input operation on the Screen Switch button BT7 that would switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users. This configuration therefore reduces the chance that a visually impaired user will mistakenly switch the UI for visually impaired users to the UI for sighted users.
In the first embodiment, the settings button BT1 is assigned the button name "Settings", the Scan button BT2 is assigned "Scan", the Photo Media Capture button BT3 is assigned "Photo Media Capture", the Copy button BT4 is assigned "Copy", the PC-FAX button BT5 is assigned "PC-FAX", the Device Settings button BT6 is assigned "Device Settings", the Screen Switch button BT7 is assigned "Screen Switch", the Image button BT8 is assigned "Image", the OCR button BT9 is assigned "OCR", the E-mail button BT10 is assigned "E-mail", and the File button BT11 is assigned "File".
The screen reader can convert all of the button names of the buttons BT1-BT11 and all of the text data displayed in the selection box SB1 provided on the UI into audio signals, in order to help visually impaired users perform input operations.
Using the UI for visually impaired users described above, the user can operate the buttons BT1-BT11 and the selection box SB1 with the keyboard 15 and the mouse 16 in order to use the functions on the MFP 100. Here, the user must first perform an operation to identify one of the buttons BT1-BT11 or the selection box SB1 as the operation target, and must subsequently input an instruction to execute the operation.
The button BT1-BT11 or selection box SB1 identified as the operation target, that is, the button or box placed in a state in which an execution instruction can be inputted, is referred to as having the "input focus". The input focus is said to move when a different button BT1-BT11 or the selection box SB1 is specified. When the input focus has been set to one of the buttons BT1-BT11 or the selection box SB1, a rectangular frame called a cursor is displayed along the outline of the button or box set as the input focus, so that the user can distinguish which button or box has been set as the input focus.
For example, when performing input operations with the keyboard 15, the user performs the identifying operation by pressing the Tab key provided on the keyboard 15. Each time the user presses the Tab key, the input focus moves through the buttons BT1-BT11 and the selection box SB1 one at a time in a prescribed order. The user presses the Tab key repeatedly until the input focus reaches the desired button or box. When the desired input focus has been set, the user presses the Enter key to execute the operation of the button BT1-BT11 or selection box SB1 selected as the input focus. To simplify the following description, inputting an execution instruction for a button displayed on the LCD 17 will be expressed simply as pressing that button, regardless of which UI is displayed (even in the case of the dialog box A described later with reference to Fig. 3B, for example) and regardless of whether the input operation is performed with the keyboard 15 or the mouse 16.
Each time the input focus is set to one of the buttons BT1-BT11 or the selection box SB1, the screen reader converts the button name of that button BT1-BT11, or the text data displayed in the selection box SB1 set as the input focus, into sound outputted from the speaker 18.
When the identifying operation is performed with the mouse 16, the identifying operation and the execution instruction are performed at the same time. For example, by moving the display position of the mouse cursor displayed on the LCD 17 over the display position of one of the buttons BT1-BT11 or the selection box SB1 and pressing the mouse button (an operation referred to as a "click"), the user can perform the identifying operation and input the execution instruction for that button BT1-BT11 or the selection box SB1.
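The Tab-driven movement of the input focus and the accompanying voice output can be pictured, purely as a hypothetical sketch, as cycling through a list of named controls and announcing each one; the control list and the announce function below are assumptions based on the button names given above.

```python
CONTROLS = ["Settings", "Scan", "Photo Media Capture", "Copy", "PC-FAX",
            "Device Settings", "Image", "OCR", "E-mail", "File", "selection box SB1"]
# The Screen Switch button BT7 is omitted on purpose: its label is an image that the
# screen reader cannot voice, and it accepts input only from the mouse.

def announce(text: str) -> None:
    print(f"(speech) {text}")        # stand-in for the screen reader driving the speaker 18

def press_tab(focus_index: int) -> int:
    """Move the input focus to the next control in the prescribed order and announce it."""
    focus_index = (focus_index + 1) % len(CONTROLS)
    announce(CONTROLS[focus_index])
    return focus_index

def press_enter(focus_index: int) -> None:
    """Input an execution instruction for the control that currently holds the input focus."""
    print(f"executing: {CONTROLS[focus_index]}")

# Pressing Tab three times and then Enter selects and executes the third control in the order.
focus = -1
for _ in range(3):
    focus = press_tab(focus)
press_enter(focus)
```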
Next, the UI for sighted users will be described with reference to Fig. 2B. Fig. 2B is a screenshot showing an example of the UI for sighted users, which is an operation screen of the application program X. By performing input operations in accordance with the UI for sighted users, the user can use the various functions of the MFP 100. The UI for sighted users is also configured so that the user can perform input operations with the mouse 16 on various images. Input operations can be completed in fewer steps with the mouse 16 than with the keyboard 15. Hence, a sighted user can perform input operations more easily with the mouse 16 than with the keyboard 15.
In the upper region of the UI for sighted users, a selection box SB21 and a settings button BT21 are provided. The MFP 100 that can be used by the application program X is displayed as text data in the selection box SB21, and the settings button BT21 is used to set various options in the application program X.
On the left side, the UI for sighted users is provided with: a Scan button BT22 for using the scanner function in the MFP 100; a Photo Media Capture button BT23 for using the photo media capture function in the MFP 100; a Copy button BT24 for using the copy function in the MFP 100; a PC-FAX button BT25 for using the PC-FAX function in the MFP 100; and a Device Settings button BT26 for setting options in the MFP 100.
On the right side, the UI for sighted users is provided with: an Image graphic button GBT21 for displaying on the LCD 17 image data files and the like of images scanned with the scanner function; an OCR graphic button GBT22 for using the OCR function to create text data files from image data files of images scanned with the scanner function; an E-mail graphic button GBT23 for sending e-mail; and a File graphic button GBT24 for displaying on the LCD 17 various files stored on the hard disk 14.
In the first embodiment, the settings button BT21 is assigned the button name "Settings", the Scan button BT22 is assigned "Scan", the Photo Media Capture button BT23 is assigned "Photo Media Capture", the Copy button BT24 is assigned "Copy", the PC-FAX button BT25 is assigned "PC-FAX", the Device Settings button BT26 is assigned "Device Settings", the Image graphic button GBT21 is assigned "Image", the OCR graphic button GBT22 is assigned "OCR", the E-mail graphic button GBT23 is assigned "E-mail", and the File graphic button GBT24 is assigned "File".
The screen reader is configured to convert the button names of the buttons BT21-BT26 and the text data displayed in the selection box SB21 provided in the UI for sighted users into audio signals. However, because the graphic buttons GBT21-GBT24 are configured as images, the screen reader cannot convert this data into audio signals.
Consequently, a visually impaired user has difficulty performing input operations on the graphic buttons GBT21-GBT24. Therefore, if an input operation is performed on one of the graphic buttons GBT21-GBT24, it can be assumed that a sighted user performed the input operation.
In other words, because the presence of the graphic buttons GBT21-GBT24 is not announced to visually impaired users, a visually impaired user has difficulty recognizing the graphic buttons GBT21-GBT24 and therefore will not perform input operations on these graphic buttons. Hence, if an input operation is performed on one of the graphic buttons GBT21-GBT24, it can be assumed that a sighted user performed the input operation.
The methods of performing an identifying operation with the keyboard 15 or the mouse 16 to identify one of the buttons BT21-BT26 or the selection box SB21, and of inputting an execution instruction, are identical to the methods described for the UI for visually impaired users, so a description of these methods will not be repeated.
Further, the graphic buttons GBT21-GBT24 cannot be identified through operations on the keyboard 15. In other words, an identifying operation and an execution instruction can be inputted for the graphic buttons GBT21-GBT24 only by moving the display position of the mouse cursor on the LCD 17 over the display position of the desired graphic button GBT21-GBT24 and performing a click.
Since a visually impaired user has difficulty performing input operations with the mouse 16, any input operation performed on the graphic buttons GBT21-GBT24 can be considered, with even greater certainty, to be an input operation performed by a sighted user.
Next, the initial setting process executed by the CPU 11 of the PC 1 will be described with reference to Fig. 3A. Fig. 3A is a flowchart illustrating steps in the initial setting process.
The initial setting process is executed in order to set the switching flag according to the user's input operations and to initialize the confirmation display flag and the UI type. The CPU 11 executes the initial setting process when the application program X is stored in the application program X storage area 14c.
In S1 of the initial setting process, the CPU 11 displays a dialog box A on the LCD 17 prompting the user to confirm whether to use the screen switching function, which switches to the operation screen best suited to the user according to the various user settings for the PC 1 and the characteristics of input operations. The dialog box A will be described here with reference to Fig. 3B.
A dialog box is an operation screen used to display a message requesting the user to input an instruction for a process to be executed, or a message requesting confirmation from the user, and to receive input from the user in response to these messages. In this initial setting process, the CPU 11 acquires a window from the OS and displays the dialog box A in the acquired window.
Fig. 3B is a screenshot showing an example of the dialog box A. The dialog box A is a window used to display a message asking the user to confirm, by inputting an instruction, whether to use the screen switching function.
The dialog box A is provided with a text box TB31, a button BT31, and a button BT32. The text box TB31 displays a message prompting the user to confirm whether to use the screen switching function; the user uses the button BT31 to input an instruction to use the screen switching function, and uses the button BT32 to input an instruction not to use the screen switching function.
The text box TB31 includes, for example, the text data "By monitoring the input operations performed by the user, the screen type can be switched automatically to a design suited to the user's method of operation." In addition, the button BT31 is assigned the button name "Yes", and the button BT32 is assigned the button name "No".
The screen reader is configured to convert the text data displayed in the text box TB31 and the button names of the buttons BT31 and BT32 provided in the dialog box A into audio signals, facilitating input operations by both sighted users and visually impaired users.
The switching flag stored in the switching flag storage area 14c3 is set to "on" when the user selects the button BT31 with an input operation, and is set to "off" when the user selects the button BT32. After one of the buttons BT31 and BT32 is selected, the window in which the dialog box A is displayed is closed.
Returning to the flowchart in Fig. 3A, in S2 the CPU 11 determines whether the user has performed an operation pressing the "Yes" button BT31 in the dialog box A displayed on the LCD 17. If the "Yes" button BT31 has been pressed (S2: YES), then in S3 the CPU 11 sets the switching flag stored in the switching flag storage area 14c3 to "on". However, if the "No" button BT32 has been pressed (S2: NO), then in S4 the CPU 11 sets the switching flag to "off".
If the switching flag has been set to "on", the CPU 11 switches the operation screen displayed on the LCD 17 to the UI for visually impaired users or to the UI for sighted users according to the characteristics of the user's input operations. However, if the switching flag is set to "off", the CPU 11 does not switch the operation screen displayed on the LCD 17.
In S5, the CPU 11 initializes the confirmation display flag stored in the confirmation display flag storage area 14c4 to "on", and in S6 stores the UI type indicating a sighted user ("2" in the first embodiment) in the UI type storage area 14c5. Subsequently, the CPU 11 ends the initial setting process.
Accordingly, when the UI type stored in the UI type storage area 14c5 indicates the UI for visually impaired users, the CPU 11 displays the UI for visually impaired users on the LCD 17, and when the UI type indicates the UI for sighted users, the CPU 11 displays the UI for sighted users on the LCD 17.
Through the initial setting process shown in Fig. 3A, the CPU 11 can set the switching flag according to the input operations performed by the user, and can initialize the values of the confirmation display flag and the UI type at the time the application program X is installed on the hard disk 14.
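Read as pseudocode, steps S1-S6 of the initial setting process reduce to the short Python sketch below; the dialog helper and the storage object are hypothetical stand-ins for the dialog box A and the storage areas 14c3-14c5.

```python
from types import SimpleNamespace

def initial_setting_process(storage, ask_yes_no) -> None:
    """Hypothetical rendering of S1-S6 of the initial setting process (Fig. 3A)."""
    # S1-S2: display dialog box A and wait for the user to press "Yes" (BT31) or "No" (BT32).
    use_switching = ask_yes_no(
        "By monitoring the input operations performed by the user, the screen type "
        "can be switched automatically to a design suited to the user's method of operation."
    )
    storage.switching_flag = use_switching        # S3 ("on") or S4 ("off")
    storage.confirmation_display_flag = True      # S5: initialize the confirmation display flag to "on"
    storage.ui_type = 2                           # S6: UI type for sighted users ("2" in the first embodiment)

# Example: a user who answers "Yes" enables the screen switching function.
storage = SimpleNamespace()
initial_setting_process(storage, ask_yes_no=lambda message: True)
```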
Next, the main process executed by the CPU 11 of the PC 1 will be described with reference to Fig. 4.
Fig. 4 is a flowchart illustrating steps in the main process. The main process is used to switch the current operation screen to the operation screen suited to the user according to the various user settings for the PC 1 and the characteristics of the user's input operations. The CPU 11 executes the main process once when the application program X is first executed. While the application program X is running, the CPU 11 executes the main process at prescribed intervals (for example, every 30 minutes).
In S11 of the main process, the CPU 11 initializes each of the storage areas 13a-13h provided in the RAM 13. In S12, the CPU 11 determines whether the switching flag stored in the switching flag storage area 14c3 is set to "on". If the switching flag is set to "on" (S12: YES), then in S13 the CPU 11 executes the user type determination process, which determines whether the user is a visually impaired user or a sighted user. However, if the switching flag is "off" (S12: NO), the CPU 11 ends the main process without executing S13-S20. The user type determination process will be described later in detail.
In S14, the CPU 11 acquires the user type determined in the user type determination process of S13. If the user type indicates a visually impaired user (S14: visually impaired user), that is, if the user type storage area 13h stores the user type indicating a visually impaired user, then in S15 the CPU 11 reads the UI type stored in the UI type storage area 14c5.
In S16, the CPU 11 determines whether the UI type read in S15 indicates a sighted user. If the UI type indicates a sighted user (S16: YES), then in S17 the CPU 11 executes the screen switching process A and subsequently ends the main process. The screen switching process A is used to switch the UI for sighted users displayed on the LCD 17 to the UI for visually impaired users. However, if the UI type indicates a visually impaired user (S16: NO), the CPU 11 ends the main process without executing the process of S17. The screen switching process A will be described later in detail.
Alternatively, if the user type determined in S13 indicates a sighted user (S14: sighted user), that is, if the user type storage area 13h stores the user type indicating a sighted user, then in S18 the CPU 11 reads the UI type stored in the UI type storage area 14c5.
In S19, the CPU 11 determines whether the UI type read in S18 indicates a visually impaired user. If the UI type indicates a visually impaired user (S19: YES), then in S20 the CPU 11 executes the screen switching process B and subsequently ends the main process. The screen switching process B is used to switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users. The screen switching process B will be described later in detail. However, if the UI type indicates a sighted user (S19: NO), the CPU 11 ends the main process without executing the process of S20.
Through the main process described with reference to Fig. 4, the CPU 11 can switch the operation screen to the operation screen suited to the user according to the various user settings for the PC 1 and the characteristics of the user's input operations. Hence, even when the UI for sighted users is displayed on the LCD 17, if a visually impaired user performs input operations, the operation screen is switched to the UI for visually impaired users, making this process more user-friendly for visually impaired users, for whom each operation is time-consuming. The process is also user-friendly for sighted users, because when a sighted user performs input operations, the UI for visually impaired users displayed on the LCD 17 is switched to the UI for sighted users.
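Steps S11-S20 of the main process can likewise be condensed into the following hypothetical Python sketch; the step references come from the description, while the function signatures and stubbed helpers are assumptions.

```python
VISUALLY_IMPAIRED, SIGHTED = 1, 2

def user_type_determination(ram) -> int: ...           # S13: sketched separately (Fig. 5)
def screen_switching_process_a(storage) -> None: ...   # S17: Fig. 7
def screen_switching_process_b(storage) -> None: ...   # S20: Fig. 8

def main_process(ram, storage) -> None:
    """Hypothetical rendering of S11-S20 of the main process (Fig. 4)."""
    ram.reset()                                   # S11: initialize storage areas 13a-13h (assumed helper)
    if not storage.switching_flag:                # S12: screen switching function not in use
        return
    user_type = user_type_determination(ram)      # S13-S14: determine and acquire the user type
    ui_type = storage.ui_type                     # S15 / S18: read the current UI type
    if user_type == VISUALLY_IMPAIRED and ui_type == SIGHTED:
        screen_switching_process_a(storage)       # S16-S17: switch to the UI for visually impaired users
    elif user_type == SIGHTED and ui_type == VISUALLY_IMPAIRED:
        screen_switching_process_b(storage)       # S19-S20: switch to the UI for sighted users
```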
Next, the user type determination process of S13 will be described with reference to Fig. 5.
Fig. 5 is a flowchart illustrating steps in the user type determination process. This process is executed in order to determine whether the user of the PC 1 is a visually impaired user or a sighted user, based on the various user settings for the PC 1 and the characteristics of the user's input operations.
In S31 of the user type determination process, the CPU 11 initializes the key input count storage area 13e and the mouse input count storage area 13f. In S32, the CPU 11 executes the usage state acquisition process. Here, the usage state acquisition process executed by the CPU 11 will be described with reference to Fig. 6A.
Fig. 6A is a flowchart illustrating steps in the usage state acquisition process. This process is executed in order to acquire the user settings for the PC 1 and to determine whether settings have been made that facilitate use by visually impaired people.
In S51 of the usage state acquisition process, the CPU 11 determines whether a screen reader is installed in (stored in) the screen reader storage area 14b. If a screen reader is installed in the screen reader storage area 14b (S51: YES), then in S52 the CPU 11 sets the condition matching flag stored in the condition matching flag storage area 13a of the RAM 13 to "on". However, if a screen reader is not installed (S51: NO), then in S53 the CPU 11 sets the condition matching flag to "off". Subsequently, the CPU 11 ends the usage state acquisition process.
Through the usage state acquisition process, the CPU 11 can set the condition matching flag to "on" when a screen reader is installed on the hard disk 14, indicating that the PC 1 is configured for a visually impaired user. Since screen readers are used mainly by visually impaired people, it is highly likely that a visually impaired user is operating the PC 1 when a screen reader has been installed on the PC 1.
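In this embodiment the usage state acquisition process reduces to a single check, as in the hypothetical Python sketch below; how the installation of the screen reader is detected is left as an assumed parameter.

```python
def usage_state_acquisition(ram, screen_reader_installed: bool) -> None:
    """Hypothetical rendering of S51-S53 of the usage state acquisition process (Fig. 6A)."""
    if screen_reader_installed:                 # S51: a screen reader is stored in area 14b
        ram.condition_matching_flag = True      # S52: the PC 1 appears configured for a visually impaired user
    else:
        ram.condition_matching_flag = False     # S53
```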
Returning to the flowchart in Fig. 5, in S33 the CPU 11 determines whether the condition matching flag stored in the condition matching flag storage area 13a is set to "on". If the condition matching flag is set to "on" (S33: YES), then in S34 the CPU 11 executes the operation acquisition process. If the condition matching flag is set to "off" (S33: NO), the CPU 11 skips the processes in S34-S41 and advances to S42.
Here, the operation acquisition process of S34 executed by the CPU 11 will be described with reference to Fig. 6B.
Fig. 6B is a flowchart illustrating steps in the operation acquisition process of S34. The operation acquisition process is executed in order to acquire input operations performed with the keyboard 15 and input operations performed with the mouse 16.
In S61 at the beginning of the operation acquisition process, the CPU 11 acquires a key operation performed with the keyboard 15 or a mouse operation performed with the mouse 16, and in S62 determines whether the acquired input operation was performed within the application program X. If the acquired input operation was performed within the application program X (S62: YES), the CPU 11 ends the operation acquisition process. However, if the acquired input operation was not performed within the application program X (S62: NO), the CPU 11 returns to S61 and repeats the processes in S61-S62.
Through the operation acquisition process shown in Fig. 6B, the CPU 11 can acquire, from among the various input operations performed with the keyboard 15 and the mouse 16, an input operation performed within the application program X.
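The loop of S61-S62 amounts to filtering the stream of input events for one aimed at the application program X; the following Python sketch is a hypothetical rendering, with the event format assumed.

```python
def operation_acquisition(next_input_event) -> dict:
    """Hypothetical rendering of S61-S62 (Fig. 6B): wait for an input operation inside application X."""
    while True:
        event = next_input_event()                  # S61: a key operation or mouse operation
        if event.get("target") == "application X":  # S62: was it performed within application X?
            return event                            # YES: this is the operation to evaluate
        # NO: the operation was aimed at some other program; keep waiting
```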
Returning to Fig. 5, in S35 the CPU 11 determines whether the input operation acquired in S34 is a press of one of the Tab key, the Ctrl+Tab combination, the space key, the Alt key, and the arrow keys. If the acquired input operation is one of these key operations (S35: YES), then in S36 the CPU 11 increments the value stored in the key input count storage area 13e by 1.
In S37 the CPU 11 determines whether the count stored in the key input count storage area 13e has exceeded "10". If the count has exceeded "10" (S37: YES), then in S38 the CPU 11 stores, in the user type storage area 13h, the user type indicating a visually impaired user ("1" in the first embodiment), and subsequently ends the user type determination process.
However, if the count is not greater than "10" (S37: NO), the CPU 11 returns to S34 and repeats the processing in S34-S37 described above.
If the CPU 11 determines in S35 that the acquired input operation is not one of the key operations listed above (S35: NO), then in S39 the CPU 11 determines whether the acquired input operation is a mouse click. If the acquired input operation is a click (S39: YES), then in S40 the CPU 11 increments the count in the mouse input count storage area 13f by 1. In S41 the CPU 11 determines whether the count stored in the mouse input count storage area 13f has exceeded "5". If the count has exceeded "5" (S41: YES), then in S42 the CPU 11 stores, in the user type storage area 13h, the user type indicating a sighted user ("2" in the first embodiment), and subsequently ends the user type determination process.
However, if the CPU 11 determines in S39 that the acquired input operation is not a click (S39: NO), the CPU 11 returns to S34 and repeats the processing in S34-S39 described above. Likewise, if the CPU 11 determines in S41 that the count is not greater than "5" (S41: NO), the CPU 11 returns to S34 and repeats the processing in S34-S41 described above.
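Assuming each acquired event reports its device and, for key presses, which key was pressed, the counting logic of S35-S42 can be sketched as follows; the thresholds of "10" and "5" are the values used in the first embodiment, while the event record itself is a hypothetical stand-in.

```python
from collections import namedtuple

InputEvent = namedtuple("InputEvent", "device detail")   # hypothetical event record
NAV_KEYS = {"Tab", "Ctrl+Tab", "Space", "Alt", "Up", "Down", "Left", "Right"}

def determine_user_type(events, key_threshold=10, click_threshold=5):
    key_count = 0    # key input count storage area 13e
    click_count = 0  # mouse input count storage area 13f
    for ev in events:                                         # S34: acquire an operation
        if ev.device == "keyboard" and ev.detail in NAV_KEYS:   # S35
            key_count += 1                                    # S36
            if key_count > key_threshold:                     # S37
                return "visually impaired"                    # S38: user type "1"
        elif ev.device == "mouse" and ev.detail == "click":   # S39
            click_count += 1                                  # S40
            if click_count > click_threshold:                 # S41
                return "sighted"                              # S42: user type "2"
    return None  # not enough operations observed yet

print(determine_user_type([InputEvent("keyboard", "Tab")] * 11))  # -> visually impaired
```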
Since a visually impaired user has difficulty performing input operations while viewing the operation screen, a visually impaired user is considered less likely than a sighted user to perform input operations with the mouse 16. Conversely, since a sighted user can perform input operations while viewing the operation screen, a sighted user is more likely than a visually impaired user to use the mouse 16, because tasks can often be completed with fewer operations on the mouse 16 than on the keyboard 15.
Accordingly, when it is determined in S37 that more input operations are being performed with the keyboard 15, it can be assumed that a visually impaired user is operating the PC 1; and when it is determined in S41 that more input operations are being performed with the mouse 16, it can be assumed that a sighted user is operating the PC 1.
The PC 1 is configured to accept input operations from both the keyboard 15 and the mouse 16. However, the characteristics of these input operations can be detected easily, because the difference in characteristics between input operations performed by visually impaired users and those performed by sighted users is significant.
Through the user type determination process shown in Fig. 5, the CPU 11 can determine whether the user is visually impaired or sighted based on the settings made for the user of the PC 1 and on the number of input operations performed with the keyboard 15 or with the mouse 16. Further, the CPU 11 can store, in the user type storage area 13h, the user type indicating a visually impaired user when determining that the user is visually impaired, and the user type indicating a sighted user when determining that the user is sighted.
Next, the screen switching process A of S17 executed by the CPU 11 of the PC 1 will be described with reference to Fig. 7.
Fig. 7 is a flowchart illustrating steps in the screen switching process A. This process is executed to switch the UI for sighted users displayed on the LCD 17 to the UI for visually impaired users.
This process also displays the screen switching button BT7 in the display area of the UI for visually impaired users when the operation screen is switched to that UI.
In S71 at the start of the screen switching process A, the CPU 11 determines whether the confirmation display flag stored in the confirmation display flag storage area 14c4 is set to ON. If the confirmation display flag is ON (S71: YES), then in S72 the CPU 11 displays a dialog box B on the LCD 17 prompting the user to confirm whether to switch the operation screen to the UI for visually impaired users. However, if the confirmation display flag is OFF (S71: NO), the CPU 11 skips the processing in S72-S75 and advances to S76.
Here, the dialog box B will be described with reference to Fig. 9A.
Fig. 9A is a screenshot showing an example of the dialog box B. The dialog box B is a window that displays a message prompting the user to confirm, by inputting an instruction, whether to switch the operation screen to the UI for visually impaired users.
The dialog box B is provided with a text box TB41, a check box CB41, a Yes button BT41, and a No button BT42. The text box TB41 displays a message prompting the user to confirm whether to switch the operation screen to the UI for visually impaired users. The check box CB41 allows the user to input an instruction not to display the dialog box B on the LCD 17 in the future. The Yes button BT41 allows the user to input an instruction to switch the operation screen to the UI for visually impaired users, while the No button BT42 allows the user to input an instruction not to switch the operation screen to that UI.
The text box TB41 displays, for example, the text "You can switch the screen to a window suited to keyboard operations compatible with voice-reading software." Further, the text "Do not show this message again" is displayed to the right of the check box CB41, and the button names "Yes" and "No" are assigned to the Yes button BT41 and the No button BT42, respectively.
This configuration facilitates input operations by visually impaired users, because the screen reader can convert into voice signals the text displayed in the text box TB41, the text displayed to the right of the check box CB41, and the button names of the buttons BT41 and BT42 provided in the dialog box B.
If the user performs an input operation to press the Yes button BT41, the operation screen on the LCD 17 is switched to the UI for visually impaired users. If the No button BT42 is pressed, the operation screen remains unchanged. When either button BT41 or BT42 is pressed, the window displaying the dialog box B is closed.
If the user has inserted a check mark in the check box CB41 when pressing the button BT41 or BT42, the CPU 11 sets the confirmation display flag stored in the confirmation display flag storage area 14c4 to OFF.
Returning to Fig. 7, in S73 the CPU 11 determines whether the Yes button BT41 was pressed in the dialog box B. If the Yes button BT41 was pressed (S73: YES), then in S74 the CPU 11 determines whether a check mark has been inserted in the check box CB41.
If a check mark has been inserted in the check box CB41 of the dialog box B (S74: YES), then in S75 the CPU 11 sets the confirmation display flag stored in the confirmation display flag storage area 14c4 to OFF.
However, if no check mark has been inserted in the check box CB41 (S74: NO), the CPU 11 skips S75 and advances to S76.
In S76 the CPU 11 stores, in the UI type storage area 14c5, the UI type indicating the UI for visually impaired users ("1" in the first embodiment). In S77 the CPU 11 switches the operation screen displayed on the LCD 17 to the operation screen corresponding to the UI type stored in the UI type storage area 14c5, that is, the UI for visually impaired users. Specifically, the CPU 11 closes the window in which the UI for sighted users was displayed, acquires a new window from the OS, and displays the UI for visually impaired users in the newly acquired window. In S80 the CPU 11 displays the screen switching button BT7 (see Fig. 2A) in the display area of the UI for visually impaired users on the LCD 17, and subsequently ends the screen switching process A.
However, if the No button BT42 was pressed in the dialog box B (S73: NO), then in S78 the CPU 11 determines whether a check mark has been inserted in the check box CB41.
If a check mark has been inserted in the check box CB41 (S78: YES), then in S79 the CPU 11 sets the confirmation display flag stored in the confirmation display flag storage area 14c4 to OFF. However, if no check mark has been inserted in the check box CB41 (S78: NO), the CPU 11 ends the screen switching process A without executing the processing in S79.
Through the screen switching process A shown in Fig. 7, if the confirmation display flag is set to ON when the UI for sighted users displayed on the LCD 17 is to be switched to the UI for visually impaired users, the CPU 11 can display the dialog box B on the LCD 17 to prompt the user to confirm whether to switch the operation screen to the UI for visually impaired users. Further, while the dialog box B is displayed, the CPU 11 switches the operation screen displayed on the LCD 17 to the UI for visually impaired users only on the condition that the user inputs an instruction to do so. Hence, this configuration can prevent the UI for visually impaired users from being displayed on the LCD 17 when the user is actually sighted.
In other words, since the screen switching button BT7 is provided in the display area of the UI for visually impaired users while that UI is displayed on the LCD 17, a sighted user can switch the UI for visually impaired users back to the UI for sighted users by operating the screen switching button BT7, without having to wait for the CPU 11 to determine the user type in the main process of Fig. 4.
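For reference, the overall control flow of S71-S80 can be condensed into the following sketch. The `state` dictionary stands in for the storage areas 14c4 and 14c5, and the `gui` object and its methods are hypothetical placeholders for the windowing and dialog calls of the actual toolkit.

```python
def screen_switching_process_a(state, gui):
    """Switch the sighted-user UI to the visually-impaired UI (Fig. 7, S71-S80)."""
    if state["confirm_display_flag"]:                       # S71
        answer, dont_ask_again = gui.show_dialog_b()        # S72: dialog box B
        if dont_ask_again:                                  # S74 / S78: check box CB41
            state["confirm_display_flag"] = False           # S75 / S79
        if answer != "yes":                                 # S73: No button BT42 pressed
            return                                          # keep the current UI
    state["ui_type"] = 1                                    # S76: UI for visually impaired users
    gui.close_window("UI for sighted users")                # S77: switch the operation screen
    gui.open_window("UI for visually impaired users")
    gui.add_button("screen switching button BT7")           # S80

class _StubGui:  # minimal stand-in so the sketch runs as written
    def show_dialog_b(self): return ("yes", False)
    def close_window(self, name): print("close:", name)
    def open_window(self, name): print("open:", name)
    def add_button(self, name): print("add:", name)

screen_switching_process_a({"confirm_display_flag": True, "ui_type": 2}, _StubGui())
```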
Next, the screen switching process B of S20 executed by the CPU 11 will be described with reference to Fig. 8.
Fig. 8 is a flowchart illustrating steps in the screen switching process B. This process is executed to switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users. In S81 at the start of the screen switching process B, the CPU 11 determines whether the confirmation display flag stored in the confirmation display flag storage area 14c4 is set to ON. If the confirmation display flag is ON (S81: YES), then in S82 the CPU 11 displays a dialog box C on the LCD 17 prompting the user to confirm whether to switch the operation screen to the UI for sighted users. However, if the confirmation display flag is OFF (S81: NO), the CPU 11 skips the processing in S82-S85 and advances to S86.
Here, the dialog box C will be described with reference to Fig. 9B. Fig. 9B is a screenshot showing an example of the dialog box C. The dialog box C is a window that displays a message prompting the user to confirm, by inputting an instruction, whether to switch the operation screen to the UI for sighted users.
The dialog box C is provided with a text box TB51, a check box CB51, a Yes button BT51, and a No button BT52. The text box TB51 displays a message prompting the user to confirm whether to switch the operation screen to the UI for sighted users. The check box CB51 allows the user to input an instruction not to display the dialog box C on the LCD 17 in the future. The Yes button BT51 allows the user to input an instruction to switch the operation screen to the UI for sighted users, while the No button BT52 allows the user to input an instruction not to switch the operation screen to that UI.
The text box TB51 displays, for example, the text "You can switch the screen to a visual window suited to mouse operations (note: the information in that screen cannot be read by the screen reader, and input operations cannot be performed with the keyboard)." Further, the text "Do not show this message again" is displayed to the right of the check box CB51, and the button names "Yes" and "No" are assigned to the Yes button BT51 and the No button BT52, respectively.
Since the text and other content displayed in the dialog box C are not read aloud to a visually impaired user, a visually impaired user will have difficulty recognizing the content of the dialog box C. Consequently, a visually impaired user is unlikely to perform an input operation on the Yes button BT51 in the dialog box C and thereby switch the UI for visually impaired users to the UI for sighted users. This configuration therefore reduces the chance that a visually impaired user operating the PC 1 will mistakenly switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users.
The screen reader does not convert into voice signals the text displayed in the text box TB51, the text displayed to the right of the check box CB51, or the button names of the buttons BT51 and BT52 provided in the dialog box C.
Further, the dialog box C is configured to accept only input operations performed with the mouse 16. Hence, the operation screen on the LCD 17 is switched to the UI for sighted users only when the Yes button BT51 is clicked with the mouse 16. In other words, since such an input operation is difficult for a visually impaired user to perform, it can be assumed that a user performing an input operation in the dialog box C is sighted. Because a visually impaired user has difficulty performing input operations with the mouse 16, a visually impaired user is unlikely to click the Yes button BT51 and switch the UI for visually impaired users to the UI for sighted users. This configuration therefore reduces the chance that a visually impaired user operating the PC 1 will mistakenly switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users. When the No button BT52 is pressed, the operation screen is not switched. When either the Yes button BT51 or the No button BT52 is pressed, the window displaying the dialog box C is closed.
Moreover, since the message about switching to the UI for sighted users is suppressed for visually impaired users, a visually impaired user is spared such unnecessary input operations, making the system more user friendly.
When the user performs an input operation with the mouse 16 to press the Yes button BT51, the operation screen on the LCD 17 is switched to the UI for sighted users. If the No button BT52 is pressed, the operation screen remains unchanged.
Further, if the user has inserted a check mark in the check box CB51 when pressing the Yes button BT51 or the No button BT52, the CPU 11 sets the confirmation display flag stored in the confirmation display flag storage area 14c4 to OFF.
Returning to Fig. 8, in S83 the CPU 11 determines whether the Yes button BT51 in the dialog box C displayed on the LCD 17 has been pressed by a click operation. If the Yes button BT51 has been clicked (S83: YES), then in S84 the CPU 11 determines whether a check mark has been inserted in the check box CB51.
If a check mark has been inserted in the check box CB51 (S84: YES), then in S85 the CPU 11 sets the confirmation display flag stored in the confirmation display flag storage area 14c4 to OFF.
However, if no check mark has been inserted in the check box CB51 (S84: NO), the CPU 11 skips S85 and advances to S86.
In S86 the CPU 11 stores, in the UI type storage area 14c5, the UI type indicating the UI for sighted users ("2" in the first embodiment). In S87 the CPU 11 switches the operation screen displayed on the LCD 17 to the operation screen corresponding to the UI type stored in the UI type storage area 14c5, that is, the UI for sighted users, and subsequently ends the screen switching process B. In other words, the CPU 11 closes the window in which the UI for visually impaired users was displayed, acquires a new window from the OS, displays the UI for sighted users in the newly acquired window, and ends the process.
But if CPU11 confirms in dialog box C, to supress not button BT52 (S83: deny) in S83, then at S88, CPU11 determines whether in check box CB51, to have inserted check mark.
If in check box CB51, insert check mark (S88: be), then at S89, CPU11 is set to engineering noise at the affirmation show tags of confirming show tags storage area 14c4 stored.
But (S88: not), then CPU11 finishes screen hand-off process B, and does not carry out the processing in S89 if in check box CB51, do not insert check mark.
If when the user's who is used for visual impairment that will on LCD17, show UI converts the UI that is used for observable user into; Confirm that show tags is set to " effectively "; Through the screen hand-off process B shown in Fig. 8; CPU11 can show dialog box C on LCD17, it is used to point out the user to confirm whether convert function screen into be used for observable user UI.And when showing dialog box C, CPU11 can import under the condition of the instruction that is transformed into the UI that is used for observable user the user, will on LCD17, the operation displayed screen conversion be the UI that is used for observable user.Therefore, because the user must use mouse 16 to carry out input operation so that convert function screen into be used for observable user UI, so the user of visual impairment can not convert function screen into be used for observable user UI by error.
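A corresponding sketch of S81-S87 highlights the deliberate asymmetry: dialog box C accepts only mouse clicks and its contents are withheld from the screen reader, so in practice only a sighted user can confirm the switch. The GUI calls and their keyword arguments are again hypothetical placeholders.

```python
def screen_switching_process_b(state, gui):
    """Switch the visually-impaired UI to the sighted-user UI (Fig. 8, S81-S87)."""
    if state["confirm_display_flag"]:                        # S81
        # Dialog box C: mouse-only, and not exposed to the screen reader, so only a
        # sighted user can realistically answer "yes".
        answer, dont_ask_again = gui.show_dialog_c(mouse_only=True,
                                                   expose_to_screen_reader=False)
        if dont_ask_again:                                   # S84 / S88: check box CB51
            state["confirm_display_flag"] = False            # S85 / S89
        if answer != "yes":                                  # S83: No button BT52 pressed
            return                                           # keep the current UI
    state["ui_type"] = 2                                     # S86: UI for sighted users
    gui.close_window("UI for visually impaired users")       # S87: switch the operation screen
    gui.open_window("UI for sighted users")
```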
In the first embodiment, the CPU 11 can determine whether the user is visually impaired or sighted based on the settings made for the user of the PC 1 and the characteristics of the user's input operations, and can switch the operation screen to the screen best suited to that user. Hence, when a visually impaired user performs input operations, the operation screen is switched to the UI for visually impaired users even if the UI for sighted users is currently displayed on the LCD 17, making the system more user friendly for visually impaired users, for whom each operation is time-consuming. Likewise, when a sighted user performs input operations, the operation screen is switched to the UI for sighted users even if the UI for visually impaired users is currently displayed on the LCD 17, making the system more user friendly for sighted users.
Further, by providing the screen switching button BT7 in the UI for visually impaired users, a sighted user can switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users by operating the screen switching button BT7 before the CPU 11 determines the user type in the user type determination process of Fig. 5. The addition of this button is therefore convenient for sighted users.
Next, a user state acquisition process executed by the CPU 11 of the PC 1 according to a first variation of the user state acquisition process of S32 (see Fig. 5) will be described with reference to Fig. 10. While the CPU 11 acquires a single user setting for the PC 1 in the user state acquisition process of the first embodiment, in the user state acquisition process according to the first variation the CPU 11 acquires a plurality of user settings for the PC 1 and determines whether each of these settings has been made to facilitate use by a visually impaired user.
Fig. 10 is a flowchart illustrating steps in the user state acquisition process according to the first variation of the first embodiment.
In S91 at the start of the user state acquisition process, the CPU 11 initializes the condition-met count storage area 13d. In S92 the CPU 11 determines whether the mouse cursor speed set in the OS is slower than the default setting established when the OS was installed. Specifically, the CPU 11 determines whether the mouse cursor speed is set to "slow". If the mouse cursor speed is set slower than the default (S92: YES), then in S93 the CPU 11 increments the count value stored in the condition-met count storage area 13d by 1. However, if the mouse cursor speed is not set slower than the default (S92: NO), the CPU 11 skips S93 and advances to S94.
In S94 the CPU 11 determines whether the text size setting in the OS, which sets the size of text displayed on the screen, is larger than the default setting established when the OS was installed. Specifically, the CPU 11 determines whether the text size is set to "large". If the text size is set larger than the default (S94: YES), then in S95 the CPU 11 increments the count value in the condition-met count storage area 13d by 1. However, if the text size is not set larger than the default (S94: NO), the CPU 11 skips S95 and advances to S96.
In S96 the CPU 11 determines whether a screen reader is installed (stored) in the screen reader storage area 14b. If a screen reader is installed in the screen reader storage area 14b (S96: YES), then in S97 the CPU 11 increments the count value stored in the condition-met count storage area 13d by 1. However, if no screen reader is installed (S96: NO), the CPU 11 skips S97 and advances to S98.
In S98 the CPU 11 determines whether the screen reader is running. If the screen reader is running (S98: YES), then in S99 the CPU 11 increments the count value stored in the condition-met count storage area 13d by 1. However, if the screen reader is not running (S98: NO), the CPU 11 skips S99 and advances to S100.
In S100 the CPU 11 determines whether the count stored in the condition-met count storage area 13d exceeds "2". If the count exceeds "2" (S100: YES), then in S101 the CPU 11 sets the condition-met flag stored in the condition-met flag storage area 13a to ON, and subsequently ends the user state acquisition process. However, if the count is not greater than "2" (S100: NO), then in S102 the CPU 11 sets the condition-met flag to OFF, and subsequently ends the user state acquisition process.
Through the user state acquisition process shown in Fig. 10, the CPU 11 acquires a plurality of user settings for the PC 1 and, when the number of settings made to facilitate use by a visually impaired user exceeds "2", can set the condition-met flag to ON on the assumption that the PC 1 has been configured for use by a visually impaired user. In other words, if the number of settings made on the PC 1 to facilitate use by a visually impaired user exceeds "2", the user is likely visually impaired.
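Assuming the relevant OS and screen-reader settings can be read into a simple dictionary, the counting of S91-S102 reduces to the following sketch; the dictionary keys are illustrative names, while the threshold of "2" is the value used in this variation.

```python
def condition_met(settings, threshold=2):
    """Return True when more than `threshold` accessibility-oriented settings are found."""
    count = 0                                                    # S91: reset the count
    count += settings.get("mouse_cursor_speed") == "slow"        # S92-S93
    count += settings.get("text_size") == "large"                # S94-S95
    count += settings.get("screen_reader_installed", False)      # S96-S97
    count += settings.get("screen_reader_running", False)        # S98-S99
    return count > threshold                                     # S100: flag ON (S101) or OFF (S102)

print(condition_met({"mouse_cursor_speed": "slow", "text_size": "large",
                     "screen_reader_installed": True}))          # -> True
```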
Next, operation acquisition processes according to second, third, and fourth variations of the operation acquisition process of the first embodiment will be described with reference to Figs. 11A-11C.
First, the operation acquisition process according to the second variation will be described. Fig. 11A is a flowchart illustrating steps in the operation acquisition process according to the second variation of the first embodiment. While the operation acquisition process of S34 in the first embodiment acquires input operations performed in the program itself, this process is executed to acquire input operations performed by the user in any of a plurality of application programs predetermined as targets of operation acquisition.
In S111 of the operation acquisition process, the CPU 11 acquires a key operation performed on the keyboard 15 or a mouse operation performed on the mouse 16. In S112 the CPU 11 determines whether the input operation acquired in S111 was performed in one of the target application programs. If the acquired input operation was performed in one of the target application programs (S112: YES), the CPU 11 ends the operation acquisition process. If not (S112: NO), the CPU 11 returns to S111 and repeats the processing in S111-S112.
Through the operation acquisition process shown in Fig. 11A, the CPU 11 can extract, from among the input operations performed with the keyboard 15 and mouse 16, those input operations performed in one of the application programs predetermined as targets of operation acquisition.
For example, suppose operation acquisition targets an application program in which input operations can be performed with either the keyboard 15 or the mouse 16, but in which a sighted user would mainly use the mouse 16. If input operations in that application are instead performed with the keyboard 15, the user is likely visually impaired. In other words, because the difference between the characteristics of input operations performed by visually impaired users and those performed by sighted users is considerable in such applications, it is possible to determine whether the user is visually impaired or sighted with greater accuracy. Examples of application programs in which input is performed mainly with a pointing device include image processing programs and the application program X.
Next, the operation acquisition process according to the third variation will be described. Fig. 11B is a flowchart illustrating steps in the operation acquisition process according to the third variation. This process acquires input operations performed in applications other than word processors such as MS Word (registered trademark) and spreadsheet software such as MS Excel (registered trademark).
In S113 of the operation acquisition process, the CPU 11 acquires a key operation performed on the keyboard 15 or a mouse operation performed on the mouse 16. In S114 the CPU 11 determines whether the input operation acquired in S113 was performed in a word processor. If the acquired input operation was performed in an application other than a word processor (S114: NO), then in S115 the CPU 11 determines whether the acquired input operation was performed in a spreadsheet program.
If the acquired input operation was performed in an application other than a spreadsheet program (S115: NO), the CPU 11 ends the operation acquisition process.
However, if the CPU 11 determines in S114 that the acquired input operation was performed in a word processor (S114: YES), or determines in S115 that the acquired input operation was performed in a spreadsheet program (S115: YES), the CPU 11 returns to S113 and repeats the processing in S113-S115.
Through the operation acquisition process shown in Fig. 11B, the CPU 11 can extract, from among the input operations performed with the keyboard 15 and mouse 16, those input operations performed in applications other than word processors and spreadsheet programs.
Here, input operations in word processors and spreadsheet programs are generally performed with the keyboard 15 regardless of whether the user is visually impaired or sighted, so the characteristics of those input operations show very little difference between the two groups. Therefore, by not acquiring input operations performed in word processors and spreadsheet programs, it is possible to determine whether the user is visually impaired or sighted with greater accuracy.
Next, the operation acquisition process according to the fourth variation will be described. Fig. 11C is a flowchart illustrating steps in the operation acquisition process of the fourth variation. In this process, input operations are not acquired while a word processor such as MS Word (registered trademark) or spreadsheet software such as MS Excel (registered trademark) is running on the PC 1.
In S116 of the operation acquisition process, the CPU 11 acquires a key operation performed on the keyboard 15 or a mouse operation performed on the mouse 16. In S117 the CPU 11 determines whether a word processor is running. If no word processor is running (S117: NO), then in S118 the CPU 11 determines whether a spreadsheet program is running. If no spreadsheet program is running (S118: NO), the CPU 11 ends the operation acquisition process.
If the CPU 11 determines in S117 that a word processor is running (S117: YES), or determines in S118 that a spreadsheet program is running (S118: YES), the CPU 11 returns to S116 and repeats the processing in S116-S118.
Through the operation acquisition process shown in Fig. 11C, the CPU 11 can acquire, from among the input operations performed with the keyboard 15 and mouse 16, input operations performed while no word processor or spreadsheet program is running.
Further, determining whether a word processor or spreadsheet program is running and not acquiring input operations while such a program is running is a simpler operation than determining, for each individual input operation, whether it was performed in a word processor or spreadsheet program.
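A sketch of the fourth variation, assuming a hypothetical `running_programs()` query in place of the real OS process check: operations captured while a word processor or spreadsheet program is running are simply discarded.

```python
EXCLUDED = {"word processor", "spreadsheet"}

def acquire_operation(events, running_programs):
    """Accept an input operation only while no word processor or spreadsheet runs."""
    for ev in events:                                   # S116: acquire an operation
        if EXCLUDED.isdisjoint(running_programs()):     # S117-S118: neither is running
            return ev                                   # use this operation
        # a word processor or spreadsheet is running -> discard it and keep monitoring
    return None

# Example with a fixed process list in which no excluded program is running.
print(acquire_operation(["Alt key press"], lambda: {"web browser"}))
```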
Next, a user type determination process according to a fifth variation of the first embodiment will be described. This process is a variation of the user type determination process of the first embodiment (Fig. 5).
Fig. 12 is a flowchart illustrating steps in the user type determination process according to the fifth variation of the first embodiment. This process is executed to determine whether the operator of the PC 1 is visually impaired or sighted based on the characteristics of the input operations performed to display a menu bar on the LCD 17.
In S121 of the user type determination process, the CPU 11 initializes the key input count storage area 13e. In S32 the CPU 11 executes the user state acquisition process shown in Fig. 6A. In S122 the CPU 11 determines whether the condition-met flag stored in the condition-met flag storage area 13a is set to ON. If the condition-met flag is ON (S122: YES), then in S34 the CPU 11 executes the operation acquisition process shown in Fig. 6B. However, if the condition-met flag is OFF (S122: NO), the CPU 11 skips the processing in S34-S127 and advances to S128.
In S123 the CPU 11 determines whether the input operation acquired in S34 is a press of the Alt key. If the acquired input operation is the Alt key (S123: YES), then in S124 the CPU 11 increments the count value stored in the key input count storage area 13e by 1.
In S125 the CPU 11 determines whether the count value stored in the key input count storage area 13e exceeds "10". If the count exceeds "10" (S125: YES), then in S126 the CPU 11 stores, in the user type storage area 13h of the RAM 13, the user type indicating a visually impaired user ("1" in the fifth variation), and subsequently ends the user type determination process.
However, if the count value is not greater than "10" (S125: NO), the CPU 11 returns to S34 and repeats the processing in S34-S125 described above.
Further, if the CPU 11 determines in S123 that the acquired input operation is not an Alt key operation (S123: NO), then in S127 the CPU 11 determines whether the acquired input operation is a mouse click for displaying the menu bar. If the acquired input operation is such a click (S127: YES), then in S128 the CPU 11 stores, in the user type storage area 13h, the user type indicating a sighted user ("2" in the fifth variation), and subsequently ends the user type determination process.
However, if the CPU 11 determines in S127 that the acquired input operation is not a click for displaying the menu bar (S127: NO), the CPU 11 returns to S34 and repeats the processing in S34-S127 described above.
In most application programs, the user typically displays a menu bar on the LCD 17 and selects a desired process from the menu bar in order to execute that process. Hence, the operation for displaying the menu bar on the LCD 17 is performed frequently among input operations.
If the keyboard 15 is used to display the menu bar on the LCD 17, the application is likely being used by a visually impaired user. If the mouse 16 is used to display the menu bar on the LCD 17, the user is likely sighted.
Through the user type determination process shown in Fig. 12, the CPU 11 can determine whether the operator of the PC 1 is visually impaired or sighted based on the settings made for the user of the PC 1 and on whether the keyboard 15 or the mouse 16 is used to input the operation for displaying the menu bar on the LCD 17.
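The fifth-variation classifier can be sketched as follows, reusing the hypothetical event record introduced earlier: repeated use of the Alt key (the keyboard way to open a menu bar) suggests a visually impaired user, while a mouse click on the menu bar suggests a sighted user. The threshold of "10" is the value used in this variation.

```python
from collections import namedtuple

InputEvent = namedtuple("InputEvent", "device detail")   # hypothetical event record

def classify_by_menu_bar(events, alt_threshold=10):
    alt_count = 0                                             # S121: reset the key count
    for ev in events:                                         # S34: acquire an operation
        if ev.device == "keyboard" and ev.detail == "Alt":    # S123
            alt_count += 1                                    # S124
            if alt_count > alt_threshold:                     # S125
                return "visually impaired"                    # S126: user type "1"
        elif ev.device == "mouse" and ev.detail == "menu bar click":  # S127
            return "sighted"                                  # S128: user type "2"
    return None

print(classify_by_menu_bar([InputEvent("keyboard", "Alt")] * 11))  # -> visually impaired
```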
While the present invention has been described in detail with reference to the first embodiment and its variations, it will be apparent to those skilled in the art that various changes and modifications may be made therein without departing from the spirit of the invention.
For example, in S80 of the screen switching process A according to the first embodiment (see Fig. 7), the PC 1 displays the screen switching button BT7 within the display area of the UI for visually impaired users. However, the PC 1 may be configured to display the screen switching button BT7 outside the display area of the UI for visually impaired users.
Specifically, the screen switching button BT7 may be displayed at a predetermined position within the display area of the LCD 17 (for example, in the lower right corner of the display area), as shown in Fig. 13A. In this case, a sighted user can easily find the screen switching button BT7 by looking at the predetermined position, even when the display on the LCD 17 is cluttered with a plurality of operation screens. Hence, a sighted user can easily switch the UI for visually impaired users to the UI for sighted users even when the display on the LCD 17 is cluttered, making this configuration user friendly for sighted users. When the UI for sighted users is displayed on the LCD 17, the screen switching button BT7 is not displayed on the LCD 17, as shown in Fig. 13B.
In the first embodiment described above, the UI for visually impaired users displayed on the LCD 17 is switched to the UI for sighted users when the screen switching button BT7 is pressed. However, a check box CB may be provided in the display area of the UI for visually impaired users for specifying whether to switch the UI for visually impaired users to the UI for sighted users. In this case, the PC 1 may be configured to switch the operation screen to the UI for sighted users when the setting button BT1 is pressed while the check box CB is checked.
Next, a PC 101 according to a second embodiment will be described with reference to Figs. 14A-15B. The PC 101 has the same configuration as the PC 1 according to the first embodiment, except that the PC 101 also executes function execution processes A and B.
Accordingly, while the PC 1 according to the first embodiment displays the screen switching button BT7 within the display area of the UI for visually impaired users when that UI is displayed on the LCD 17, the PC 101 according to the second embodiment also displays the screen switching button BT7 on the LCD 17 within the display area of each function operation screen displayed when one of the buttons BT1-BT11 (excluding the screen switching button BT7) in the UI for visually impaired users is pressed.
As shown in Figs. 2A and 2B, the UI for visually impaired users and the UI for sighted users are respectively provided with the buttons BT1-BT11 (excluding the screen switching button BT7) and the buttons BT21-BT26 and graphic buttons GBT21-GBT24 for using the various functions of the MFP 100. When the user presses one of these buttons, an operation screen for the function corresponding to the pressed button is displayed on the LCD 17.
Each function operation screen allows the user to make settings for the corresponding function of the MFP 100 or for the application program X, and to input instructions for using that function.
Next, the function execution process A executed by the CPU 11 of the PC 101 will be described with reference to Fig. 14A. Fig. 14A is a flowchart illustrating steps in the function execution process A.
The function execution process A is executed to display, when the UI for visually impaired users is displayed on the LCD 17, the function operation screen corresponding to the button BT1-BT11 (excluding the screen switching button BT7) pressed by the user, and to display the screen switching button BT7 within the display area of that function operation screen. This process is executed repeatedly while the application program X is running. The function execution process A is stored, for example, in the application program X storage area 14c.
In S211 of the function execution process, the CPU 11 determines whether the user has pressed one of the buttons BT1-BT11 in the UI for visually impaired users. The CPU 11 waits while none of the buttons BT1-BT11 is pressed (S211: NO). When one of the buttons BT1-BT11 is pressed (S211: YES), then in S212 the CPU 11 determines whether the pressed button is the screen switching button BT7.
If the pressed button is the screen switching button BT7 (S212: YES), then in S20 the CPU 11 executes the screen switching process B of Fig. 8 described above, and returns to S211.
However, if the pressed button is not the screen switching button BT7 (S212: NO), then in S213 the CPU 11 displays on the LCD 17 the function operation screen corresponding to the pressed button BT1-BT11 (excluding the screen switching button BT7). Specifically, the CPU 11 acquires a window from the OS and displays the function operation screen corresponding to the pressed button in the acquired window.
In S214 the CPU 11 displays the screen switching button BT7 within the display area of the function operation screen, and returns to S211.
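In sketch form, S211-S214 amount to the following; the `gui` facade and its methods are hypothetical placeholders for the actual window-management calls.

```python
def function_execution_process_a(pressed_button, gui):
    """Open the function operation screen and add BT7 to it (Fig. 14A, S211-S214)."""
    if pressed_button == "BT7":                            # S212: the screen switching button
        gui.run_screen_switching_process_b()               # S20 (Fig. 8)
    else:
        window = gui.open_function_screen(pressed_button)  # S213: corresponding operation screen
        window.add_button("screen switching button BT7")   # S214: add the switching button
```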
For example, when the UI for visually impaired users is displayed on the LCD 17 and the user presses the scan button BT2 with the keyboard 15 or the mouse 16, the CPU 11 displays on the LCD 17 a scanner function operation screen for using the scanner function of the MFP 100, and subsequently displays the screen switching button BT7 within the display area of this scanner function operation screen.
Fig. 14B is a screenshot showing an example of the scanner function operation screen displayed on the LCD 17 when the user presses the scan button BT2 in the UI for visually impaired users.
The scanner function operation screen shown in Fig. 14B includes a save format selection button SB61, a resolution selection button SB62, a scan type selection button SB63, the screen switching button BT7, an OK button BT61, and a cancel button BT62.
The save format selection button SB61 enables the user to select one of a plurality of choices for the format in which to save the image file produced when an original document is scanned with the scanner function of the MFP 100. The resolution selection button SB62 enables the user to select one of a plurality of choices for the resolution of the image data produced when the original document is read with the scanner function.
The scan type selection button SB63 enables the user to select one of a plurality of choices for the number of colors in the image data produced when the original document is scanned with the scanner function. The screen switching button BT7 enables the user to switch the operation screen displayed on the LCD 17 to the UI for sighted users.
The OK button BT61 accepts the settings for the save format, resolution, and number of colors of the image data, and issues an instruction to start scanning the document. When the OK button BT61 is pressed, the original document placed on the scanning surface of the MFP 100 is scanned according to the settings selected with the selection buttons SB61-SB63, and image data is produced. The cancel button BT62 is used to close the scanner function operation screen displayed on the LCD 17 without performing a scan operation.
As described above, if the user presses the scan button BT2 while the UI for visually impaired users is displayed on the LCD 17, the CPU 11 displays the scanner function operation screen on the LCD 17 and further displays the screen switching button BT7 within the display area of the scanner function operation screen.
The above description is merely one example, in which the scanner function operation screen is displayed on the LCD 17 when the scan button BT2 is pressed. Similar operation screens for the other functions are displayed on the LCD 17 when the corresponding buttons BT1-BT11 (excluding the screen switching button BT7) are pressed, but descriptions of these operation screens are omitted.
Through the function execution process A shown in Fig. 14A, the CPU 11 can display the function operation screen corresponding to the button BT1-BT11 (excluding the screen switching button BT7) pressed by the user in the UI for visually impaired users, and can further display the screen switching button BT7 within the display area of that function operation screen.
Hence, regardless of which operation screen is displayed, a sighted user can simply select the screen switching button BT7 to switch the UI for visually impaired users displayed on the LCD 17 to the UI for sighted users, making this configuration convenient for sighted users.
Further, even if the display on the LCD 17 is cluttered with a plurality of windows, including function operation screens other than the UI for visually impaired users and operation screens for other programs, a sighted user can still easily find the screen switching button BT7 simply by finding the UI for visually impaired users or one of its function operation screens. Hence, a sighted user can easily switch the UI for visually impaired users to the UI for sighted users even when the display on the LCD 17 is cluttered, making this configuration convenient for sighted users.
Next, the function execution process B in the second embodiment will be described. Fig. 15A is a flowchart illustrating steps in the function execution process B.
The function execution process B is a process for displaying on the LCD 17, when the user selects one of the buttons BT21-BT26 or graphic buttons GBT21-GBT24 in the UI for sighted users displayed on the LCD 17, the function operation screen corresponding to that button. The function execution process B is repeated by the PC 101 while the application program X is running. The program for executing the function execution process B is stored, for example, in the application program X storage area 14c.
In S221 of the function execution process, the CPU 11 determines whether the user has pressed one of the buttons BT21-BT26 or graphic buttons GBT21-GBT24 in the UI for sighted users. The CPU 11 continues to wait while none of these buttons is pressed (S221: NO). When one of the buttons is pressed (S221: YES), then in S222 the CPU 11 displays the function operation screen corresponding to the pressed button on the LCD 17.
Specifically, the CPU 11 acquires a window from the OS and displays the function operation screen corresponding to the pressed button in the acquired window. Subsequently, the CPU 11 returns to S221 and repeats the processing in S221-S222.
For example, if the user selects the scan button BT22 with the keyboard 15 or the mouse 16 while the UI for sighted users is displayed on the LCD 17, the CPU 11 displays a scanner function operation screen on the LCD 17 for using the scanner function of the MFP 100. The screen switching button BT7 is not displayed in this operation screen.
Fig. 15B is a screenshot showing an example of the scanner function operation screen displayed on the LCD 17 when the user selects the scan button BT22 in the UI for sighted users. As in the example of Fig. 14B, the scanner function operation screen is provided with the save format selection button SB61, the resolution selection button SB62, the scan type selection button SB63, the OK button BT61, and the cancel button BT62. The selection buttons SB61-SB63 and the buttons BT61 and BT62 are identical to those in Fig. 14B, and their descriptions are therefore not repeated.
As shown in Fig. 15B, when the user selects the scan button BT22 in the UI for sighted users, only the scanner function operation screen is displayed on the LCD 17; the screen switching button BT7 is not displayed.
The above description is merely one example, in which the scanner function operation screen is displayed on the LCD 17 when the scan button BT22 is pressed. Similar operation screens for the other functions are displayed on the LCD 17 when one of the other buttons BT21-BT26 or graphic buttons GBT21-GBT24 is pressed, but descriptions of these operation screens are omitted.
Through the function execution process B shown in Fig. 15A, when the user presses one of the buttons BT21-BT26 or graphic buttons GBT21-GBT24 in the UI for sighted users displayed on the LCD 17, the CPU 11 can display the function operation screen corresponding to that button.
In the second embodiment described above, when the user presses one of the buttons BT1-BT11 (excluding the screen switching button BT7) in the UI for visually impaired users displayed on the LCD 17, the CPU 11 displays on the LCD 17 the function operation screen corresponding to the pressed button and further displays the screen switching button BT7 within the display area of that function operation screen. Hence, a sighted user can easily switch the UI for visually impaired users to the UI for sighted users by operating the screen switching button BT7 in any function operation screen. This configuration is therefore user friendly for sighted users.
Further, even if the display on the LCD 17 is cluttered with a plurality of windows, including function operation screens other than the UI for visually impaired users and operation screens for other programs, a sighted user can still easily find the screen switching button BT7 simply by finding the UI for visually impaired users or one of its function operation screens. Hence, a sighted user can easily switch the UI for visually impaired users to the UI for sighted users even when the display on the LCD 17 is cluttered, making this configuration convenient for sighted users.
Next, a PC 1001 according to a third embodiment will be described. In the third embodiment, an application program Y and a launcher program are installed on the PC 1001. The application program Y has an operation screen for visually impaired users and an operation screen for sighted users. The launcher program is used to launch the application program Y. While running, the launcher program determines whether the user of the PC 1001 is visually impaired or sighted. Then, when the user issues an instruction to launch the application program Y, the launcher program instructs the application program Y to display the operation screen best suited to the user.
For example, if a visually impaired user issues an instruction to launch the application program Y, which has a plurality of operation screens, while the launcher program is running on the PC 1001, the launcher program launches the application program Y so that the application program Y displays on the LCD 17 the operation screen most convenient for visually impaired users. This configuration is therefore user friendly for visually impaired users, for whom each operation is time-consuming.
First, the configuration of the PC 1001 according to the third embodiment will be described.
Fig. 16A is a block diagram showing the electrical configuration of a hard disk 34 provided in the PC 1001. Since the electrical configuration of the PC 1001 is identical to that of the PC 1 according to the first embodiment (see Fig. 1) except for the hard disk 34, only the configuration of the hard disk 34 is described here.
The hard disk 34 is a rewritable nonvolatile memory capable of retaining data even after the power to the PC 1001 is turned off. The hard disk 34 is provided with an OS storage area 34a, a screen reader storage area 34b, a launcher program storage area 34c, and an application program Y storage area 34d.
The OS storage area 34a has the same configuration as the OS storage area 14a described in the first embodiment, and the screen reader storage area 34b has the same configuration as the screen reader storage area 14b described in the first embodiment. Hence, descriptions of these storage areas are not repeated here.
The launcher program storage area 34c stores the launcher program, which launches a program when the user issues an instruction to start that program and instructs the program to display the operation screen best suited to the user. The launcher program storage area 34c stores the programs for executing the user type monitoring process shown in the flowchart of Fig. 16B and the application launch process shown in the flowchart of Fig. 17.
The launcher program storage area 34c is also provided with a command execution flag storage area 34c1, a command table storage area 34c2, and an argument table storage area 34c3.
The command execution flag storage area 34c1 stores a command execution flag indicating whether, when the user issues an instruction to execute a program, the launcher program should launch the program and instruct it to display the operation screen best suited to the user. When the command execution flag is set to ON ("1" in the third embodiment), the launcher program launches the program that the user has instructed to start and instructs that program to display the operation screen best suited to the user.
However, when the command execution flag is set to OFF ("0" in the third embodiment), the launcher program merely launches the program that the user has instructed to start. In this case, the program displays the operation screen designated as its default screen (for example, the UI for sighted users). When the launcher program is installed on the hard disk 34, the user may be prompted to set the command execution flag to ON or OFF, or the user may be allowed to set the flag to ON or OFF by operating the keyboard 15, the mouse 16, or the like.
The command table storage area 34c2 stores a command for launching the application program Y. The argument table storage area 34c3 stores a display command (a command option or command argument) for instructing the application program Y, when launched, to display the UI for visually impaired users on the LCD 17, and a display command for instructing the application program Y to display the UI for sighted users on the LCD 17.
When a command to which a display command has been appended is input into the OS, the OS launches the application program Y specified by that command. When the application program Y starts executing, the CPU 11 displays the UI for visually impaired users or the UI for sighted users on the LCD 17 according to the display command.
The application program Y storage area 34d stores the application program Y, which is a program that can run on the PC 1001. The application program Y may be, for example, the application program X described in the first embodiment. The application program Y storage area 34d is provided with a visually-impaired-user UI storage area 34d1 and a sighted-user UI storage area 34d2.
The visually-impaired-user UI storage area 34d1 stores the UI for visually impaired users, which is an operation screen designed to facilitate input operations by visually impaired users. For example, the visually-impaired-user UI storage area 34d1 stores the UI for visually impaired users shown in Fig. 2A. The sighted-user UI storage area 34d2 stores the UI for sighted users, which is an operation screen designed to facilitate input operations by sighted users. For example, the sighted-user UI storage area 34d2 stores the UI for sighted users shown in Fig. 2B.
Next, the user type monitoring process executed by the CPU 11 of the PC 1001 will be described with reference to Fig. 16B.
Fig. 16B is a flowchart illustrating steps in the user type monitoring process. This process is executed to monitor the operator of the PC 1001 and determine whether the user is visually impaired or sighted based on the settings made for the user of the PC 1001 and the characteristics of the user's input operations.
In S311 of the user type monitoring process, the CPU 11 initializes the RAM 13 (Fig. 1). Specifically, the CPU 11 stores, in the user type storage area 13h (Fig. 1), the user type indicating a sighted user ("2" in the third embodiment).
In S312 the CPU 11 determines whether the command execution flag stored in the command execution flag storage area 34c1 is set to ON. If the command execution flag is ON (S312: YES), then in S13 the CPU 11 executes the user type determination process shown in Fig. 5. In S313 the CPU 11 waits a prescribed time (for example, 10 minutes), and subsequently returns to S312 to repeat the processing in S312-S313 described above. If the CPU 11 determines in S312 that the command execution flag is OFF (S312: NO), the CPU 11 ends the user type monitoring process.
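The monitoring loop can be sketched as follows: while the command execution flag is ON, the launcher re-runs the user type determination at a fixed interval (10 minutes in the embodiment). The `state` dictionary and the injected `determine_user_type` callable are hypothetical stand-ins for the storage areas and the Fig. 5 process.

```python
import time

def user_type_monitoring(state, determine_user_type, interval_s=600):
    """Re-run the Fig. 5 determination every `interval_s` seconds (Fig. 16B)."""
    state["user_type"] = 2                            # S311: default to sighted ("2")
    while state.get("command_execution_flag"):        # S312
        state["user_type"] = determine_user_type()    # S13: user type determination process
        time.sleep(interval_s)                        # S313: wait, then check the flag again
```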
Further, when the user issues an instruction to launch a program, the CPU 11 of the PC 1001 can have the launched program display the operation screen best suited to the user. For example, the CPU 11 can execute the application launch process shown in Fig. 17.
Fig. 17 is a flowchart illustrating steps in the application launch process. This process is executed to launch a program when the user issues an instruction to start that program, and to instruct the program to display the operation screen best suited to the user.
In S321 the CPU 11 determines which of the applications stored on the hard disk 34 has been instructed to start. In this description, it is assumed that the application program Y has been instructed to start. In S322 the CPU 11 reads the user type stored in the user type storage area 13h (Fig. 1), and in S323 determines the type of user specified by the user type.
If the user type indicates a visually impaired user, then in S324 the CPU 11 appends, to the execution command for the application program Y, the display command (command option) for displaying the UI for visually impaired users on the LCD 17, and inputs this command into the OS to launch the application program Y. Subsequently, the CPU 11 ends the application launch process.
However, if the user type indicates a sighted user, then in S325 the CPU 11 appends, to the execution command for the application program Y, the display command (command option) for displaying the UI for sighted users on the LCD 17, and inputs this command into the OS to launch the application program Y. Subsequently, the CPU 11 ends the application launch process.
Through the application launch process shown in Fig. 17, the CPU 11 can launch a program when the user issues an instruction to start that program, and can instruct the program to display the operation screen most convenient for the user.
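The launch step of S324/S325 amounts to appending one of two command options to the execution command before handing it to the OS. The program name and option strings below are illustrative only; the real values would come from the command table storage area 34c2 and the argument table storage area 34c3.

```python
def build_launch_command(user_type, base_command="application_y"):
    """Append the display command (command option) matching the stored user type."""
    if user_type == 1:                        # S323/S324: visually impaired user ("1")
        option = "--ui=visually-impaired"
    else:                                     # S325: sighted user ("2")
        option = "--ui=sighted"
    return [base_command, option]             # this command line is then input into the OS

print(build_launch_command(1))  # -> ['application_y', '--ui=visually-impaired']
```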
As described above, while the launcher program according to the third embodiment runs on the PC 1001, the CPU 11 can determine whether the user of the PC 1001 is visually impaired or sighted based on the settings made for the user of the PC 1001 and the characteristics of the user's input operations, and, when the user issues an instruction to launch a program on the PC 1001, can launch that program while instructing it to display the operation screen best suited to the user. Hence, if a visually impaired user issues an instruction to launch a program, the UI for visually impaired users of that program is displayed on the LCD 17, making this configuration user friendly for visually impaired users. Likewise, if a sighted user issues an instruction to launch a program, the UI for sighted users of that program is displayed on the LCD 17, making this configuration user friendly for sighted users.
While the present invention has been described in detail with reference to the embodiments and their variations, it will be readily apparent to those skilled in the art that various changes and modifications may be made therein without departing from the spirit of the invention.
For example, each of the variations of the first embodiment described above (the user state acquisition process, the operation acquisition process, and the user type determination process) may also be applied to the second and third embodiments.
And; Obtain according to the user mode of first version of first embodiment handle in (Figure 10), if at definite step (S92, S94; S96 and S98) interior satisfied two conditions that surpass, then CPU11 confirms to have set up the user that the setting that is used for PC1 is beneficial to visual impairment.But if satisfy said condition one of at least, then CPU11 can confirm to have set up the setting that is used for PC1 for the user of visual impairment.In addition; In definite step of S51 (Fig. 6) and S96 (Figure 10); Whether CPU11 can determine whether to be provided with screen reader and be used for converting the text data on the LCD17 into voice data, perhaps be provided with to be used for being provided with as OS of startup screen reader and to be provided with.In the case, screen reader storage area 14b can store and be used for confirming that whether screen reader is automatically converted to text data the setting of sound, and the OS storage area can be stored the setting that is used to determine whether the startup screen reader.
In the first embodiment described above, the operation screen of application program X is switched to either the UI for visually impaired users or the UI for sighted users. However, when a plurality of application programs, each having a UI for visually impaired users and a UI for sighted users, are stored on the hard disk 14 or the like and run on the PC 1, the operation screen of each of these application programs can be switched in the same way.
Further, the process according to the third embodiment was described using the example of starting application program Y. However, if a plurality of application programs, each having a UI for visually impaired users and a UI for sighted users, are stored on the hard disk 34, then when an instruction to start one of these application programs is issued, the CPU 11 can start the corresponding application program while issuing the command for displaying the operation screen best suited to the user. In this case, commands for instructing the startup of each application program stored on the hard disk 34 are stored in the command table storage area 34c2, and display commands (command options) for commanding each application program to display the UI for visually impaired users on the LCD 17 and for commanding each application program to display the UI for sighted users on the LCD 17 are stored in the startup program storage area 34c.
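The command table described here can be pictured as a simple lookup keyed by application name, holding the startup command from the command table storage area 34c2 together with the two display commands (command options) from the startup program storage area 34c. The application name, path, and option strings below are assumptions used only to show the structure.

    # Hypothetical command table: one startup command plus one display command per UI type.
    COMMAND_TABLE = {
        "application_y": {
            "startup": "C:/apps/application_y.exe",            # startup command (assumed path)
            "ui_visually_impaired": "--ui=visually-impaired",   # display command for the visually impaired UI
            "ui_sighted": "--ui=sighted",                       # display command for the sighted UI
        },
        # ...further application programs with both UI variants would be listed here
    }

    def launch_command(app_name, visually_impaired):
        """Combine the startup command with the display command suited to the user."""
        entry = COMMAND_TABLE[app_name]
        option = entry["ui_visually_impaired"] if visually_impaired else entry["ui_sighted"]
        return "{} {}".format(entry["startup"], option)

    print(launch_command("application_y", visually_impaired=False))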
Further, although a mouse is used as the example of the pointing device in the embodiments and variations, any device that can specify the display position of the cursor may be used, such as a touch pad, a trackball, or a track pad.
Further, when the dialog box (Fig. 8) is displayed on the LCD 17 in screen switching process B, if no input operation on either the Yes button BT51 or the No button BT52 is received within a specified time after the dialog box is displayed, the CPU 11 may determine that the No button BT52 has been pressed.
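The timeout behaviour described here amounts to treating silence as a press of the No button BT52. A small sketch, assuming a polling helper that reports which button (if any) has been pressed:

    import time

    def wait_for_dialog_answer(poll_pressed_button, timeout_seconds=10.0):
        """Return "BT51" or "BT52"; fall back to "BT52" (No) if nothing is pressed in time."""
        deadline = time.monotonic() + timeout_seconds
        while time.monotonic() < deadline:
            pressed = poll_pressed_button()   # assumed to return "BT51", "BT52", or None
            if pressed in ("BT51", "BT52"):
                return pressed
            time.sleep(0.1)
        return "BT52"                         # specified time elapsed: treat as the No button

    # A stub that never reports a press -> the answer defaults to "BT52"
    print(wait_for_dialog_answer(lambda: None, timeout_seconds=0.3))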
In the user type determination process (Fig. 5), the CPU 11 counts the number of times specified keys are pressed; however, presses of any key that is commonly used in application program X and the OS may also be counted.
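As a final illustration, the counting for the user type determination can be widened from the specified keys to any key commonly used in application program X and the OS, for example as in the sketch below (the key names and the choice of which keys count are assumptions).

    from collections import Counter

    def count_presses(key_events, counted_keys=None):
        """Count presses of the given keys, or of every key when counted_keys is None."""
        counts = Counter(key_events)
        if counted_keys is None:
            return sum(counts.values())
        return sum(counts[key] for key in counted_keys)

    events = ["Tab", "Enter", "Tab", "A", "Space"]
    print(count_presses(events, counted_keys={"Tab", "Enter"}))  # 3 (specified keys only)
    print(count_presses(events))                                 # 5 (any key counts)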

Claims (10)

1. A data processing device, comprising:
a displaying unit that displays display data, the display data comprising first display data for visually impaired users and second display data for sighted users;
a user type determining unit that determines whether a user is visually impaired or sighted; and
a display controlling unit that controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted,
wherein an operating system and a screen reader are installed in the data processing device;
wherein the data processing device further comprises:
a first memory capable of storing settings of the operating system; and
a second memory capable of storing settings of the screen reader,
wherein the user type determining unit determines whether the user is visually impaired or sighted according to the settings of the operating system and characteristics of the user's input operations, or according to the settings of the screen reader,
wherein the data processing device further comprises a pointing unit having a moving member and a button, the user being able to use the moving member to move a pointer on the displaying unit and to specify a display position of the pointer, and being able to use the button to input user data corresponding to the display position of the pointer;
wherein the display data comprises text;
wherein the operating system controls a moving speed of the pointer and a size of the text displayed on the displaying unit, and
wherein the user type determining unit determines that the user is visually impaired when at least two of the following conditions are met: (a) the screen reader is installed; (b) the screen reader is being executed; (c) the moving speed of the pointer is set slower than a default moving speed of the pointer preset in the operating system; and (d) the size of the text is set larger than a default text size preset in the operating system.
2. The data processing device according to claim 1, wherein the screen reader is capable of being installed in the data processing device, and
wherein, when the screen reader is installed in the data processing device, the user type determining unit determines that the user is visually impaired.
3. The data processing device according to claim 1, further comprising:
an input unit with which the user can input user data; and
an input state determining unit that determines an input state,
wherein the user type determining unit determines whether the user is visually impaired or sighted according to the input state.
4. The data processing device according to claim 3, wherein the input unit comprises:
a pointing unit having a moving member and a button, the user being able to use the moving member to move a pointer on the displaying unit and to specify a display position of the pointer, and being able to use the button to input user data corresponding to the display position of the pointer; and
a key input unit having a plurality of keys with which the user can input user data,
wherein, when the input state determining unit determines that the plurality of keys have been pressed more than a first predetermined number of times, the user type determining unit determines that the user is visually impaired.
5. The data processing device according to claim 3, wherein the input unit comprises a pointing unit having a moving member and a button, the user being able to use the moving member to move a pointer on the displaying unit and to specify a display position of the pointer, and being able to use the button to input user data corresponding to the display position of the pointer; and
a key input unit having a plurality of keys with which the user can input user data,
wherein, when the input state determining unit determines that the button has been pressed more than a second predetermined number of times, the user type determining unit determines that the user is sighted.
6. A data processing device, comprising:
a displaying unit that displays display data, the display data comprising first display data for visually impaired users and second display data for sighted users;
a user type determining unit that determines whether a user is visually impaired or sighted; and
a display controlling unit that controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted,
wherein the data processing device further comprises:
a connector connectable to a sound outputting unit for outputting sound; and
a sound controlling unit that, when the user type determining unit determines that the user is visually impaired, controls the sound outputting unit to output a sound asking the user whether to display the first display data,
wherein the user type determining unit determines whether the user is visually impaired or sighted according to settings of the operating system and characteristics of the user's input operations, or according to settings of the screen reader,
wherein the data processing device further comprises:
a key input unit having a plurality of keys with which the user can input data; and
a pointing unit having a moving member and a button, the user being able to use the moving member to move a pointer on the displaying unit and to specify a display position of the pointer, and being able to use the button to input user data corresponding to the display position of the pointer, and
wherein, when input data instructing display of the first display data is input with the key input unit after the sound outputting unit outputs the sound, the display controlling unit controls the displaying unit to display the first display data.
7. A data processing device, comprising:
a displaying unit that displays display data, the display data comprising first display data for visually impaired users and second display data for sighted users;
a user type determining unit that determines whether a user is visually impaired or sighted; and
a display controlling unit that controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted,
wherein the data processing device further comprises:
a connector connectable to a sound outputting unit for outputting sound; and
a sound controlling unit that, when the user type determining unit determines that the user is visually impaired, controls the sound outputting unit to output a sound asking the user whether to display the first display data,
wherein the user type determining unit determines whether the user is visually impaired or sighted according to settings of the operating system and characteristics of the user's input operations, or according to settings of the screen reader,
wherein the data processing device further comprises:
a key input unit having a plurality of keys with which the user can input data; and
a pointing unit having a moving member and a button, the user being able to use the moving member to move a pointer on the displaying unit and to specify a display position of the pointer, and being able to use the button to input user data corresponding to the display position of the pointer,
wherein, when the user type determining unit determines that the user is sighted, the display controlling unit controls the displaying unit to show an inquiry display asking the user whether to display the second display data, and
wherein, when the user uses the pointing unit, in response to the inquiry display, to input data indicating that the second display data is to be displayed, the display controlling unit controls the displaying unit to display the second display data.
8. A data processing device, comprising:
a displaying unit that displays display data, the display data comprising first display data for visually impaired users and second display data for sighted users;
a user type determining unit that determines whether a user is visually impaired or sighted; and
a display controlling unit that controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted,
wherein, when the displaying unit displays the first display data, the display controlling unit also controls the displaying unit to display request data asking the user whether to switch the display data on the displaying unit from the first display data to the second display data, and
wherein, when the user instructs, in accordance with the displayed request data, that the first display data be switched to the second display data, the display controlling unit controls the displaying unit to display the second display data,
wherein the user type determining unit determines whether the user is visually impaired or sighted according to settings of the operating system and characteristics of the user's input operations, or according to settings of the screen reader,
wherein the data processing device further comprises a pointing unit with which the user can move a pointer on the displaying unit to specify a display position, and
wherein, when the user uses the pointing unit, in accordance with the request data, to specify that the first display data be switched to the second display data, the display controlling unit controls the displaying unit to display the second display data.
9. A data processing method, comprising:
determining whether a user is visually impaired or sighted; and
displaying first display data when the user is determined to be visually impaired, and displaying second display data when the user is determined to be sighted,
wherein the determining comprises determining whether the user is visually impaired or sighted according to settings of an operating system and characteristics of the user's input operations, or according to settings of a screen reader, and
wherein the user is determined to be visually impaired when at least two of the following conditions are met: (a) the screen reader is installed; (b) the screen reader is being executed; (c) a moving speed of a pointer is set slower than a default moving speed of the pointer preset in the operating system; and (d) a size of text is set larger than a default text size preset in the operating system.
10. A data processing system, comprising:
a first data processing device comprising a displaying unit that displays display data, the display data comprising first display data for visually impaired users and second display data for sighted users; and
a second data processing device comprising:
a user type determining unit that determines whether a user is visually impaired or sighted; and
a display controlling unit that controls the displaying unit to display the first display data when the user type determining unit determines that the user is visually impaired, and to display the second display data when the user type determining unit determines that the user is sighted,
wherein an operating system and a screen reader are installed in the data processing system;
wherein the data processing system further comprises:
a first memory capable of storing settings of the operating system; and
a second memory capable of storing settings of the screen reader,
wherein the user type determining unit determines whether the user is visually impaired or sighted according to the settings of the operating system and characteristics of the user's input operations, or according to the settings of the screen reader,
wherein the data processing system further comprises a pointing unit having a moving member and a button, the user being able to use the moving member to move a pointer on the displaying unit and to specify a display position of the pointer, and being able to use the button to input user data corresponding to the display position of the pointer;
wherein the display data comprises text;
wherein the operating system controls a moving speed of the pointer and a size of the text displayed on the displaying unit, and
wherein the user type determining unit determines that the user is visually impaired when at least two of the following conditions are met: (a) the screen reader is installed; (b) the screen reader is being executed; (c) the moving speed of the pointer is set slower than a default moving speed of the pointer preset in the operating system; and (d) the size of the text is set larger than a default text size preset in the operating system.
CN2008101729066A 2007-10-24 2008-10-24 Data processing device Active CN101419528B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2007-276110 2007-10-24
JP2007276110 2007-10-24
JP2007276110A JP4935620B2 (en) 2007-10-24 2007-10-24 Information processing program, information processing apparatus, and information processing system
JP2007305558A JP5092713B2 (en) 2007-11-27 2007-11-27 Information processing program and information processing apparatus
JP2007305558 2007-11-27
JP2007-305558 2007-11-27

Publications (2)

Publication Number Publication Date
CN101419528A CN101419528A (en) 2009-04-29
CN101419528B true CN101419528B (en) 2012-08-29

Family

ID=40630328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101729066A Active CN101419528B (en) 2007-10-24 2008-10-24 Data processing device

Country Status (2)

Country Link
JP (1) JP4935620B2 (en)
CN (1) CN101419528B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4935658B2 (en) 2007-12-11 2012-05-23 ブラザー工業株式会社 Browser program and information processing apparatus
JP2011238129A (en) * 2010-05-12 2011-11-24 Sony Corp Terminal device, electronic apparatus, method for allocating access key, and program
JP2013140516A (en) * 2012-01-05 2013-07-18 Sony Corp Information processing apparatus and display control method
JP2014137627A (en) * 2013-01-15 2014-07-28 Sony Corp Input apparatus, output apparatus, and storage medium
CN104020842B (en) * 2013-03-01 2018-03-27 联想(北京)有限公司 A kind of display methods and device, electronic equipment
JP2015064875A (en) * 2013-08-30 2015-04-09 キヤノンマーケティングジャパン株式会社 Information processing apparatus, information processing method, and program
CN105786430B (en) * 2016-02-25 2020-08-25 联想(北京)有限公司 Information processing method and electronic equipment
US20210084380A1 (en) * 2017-10-12 2021-03-18 Sony Corporation Information Processing Terminal, Information Processing Method, And Program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1271148A (en) * 1999-04-19 2000-10-25 李斌 Electronic blind reader
CN1758671A (en) * 2004-10-09 2006-04-12 乐金电子(中国)研究开发中心有限公司 Mobile communication terminal with function for converting shooted letters into voice and method thereof
CN1849579A (en) * 2003-07-18 2006-10-18 苹果电脑公司 Voice menu system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10171576A (en) * 1996-12-06 1998-06-26 Toshiba Corp Information input device and automatic transaction device
JP4755813B2 (en) * 2003-11-14 2011-08-24 日立公共システムエンジニアリング株式会社 Client terminal
JP2006268581A (en) * 2005-03-24 2006-10-05 Fuji Xerox Co Ltd Display controller, display control method, information processor, and information processing method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1271148A (en) * 1999-04-19 2000-10-25 李斌 Electronic blind reader
CN1849579A (en) * 2003-07-18 2006-10-18 苹果电脑公司 Voice menu system
CN1758671A (en) * 2004-10-09 2006-04-12 乐金电子(中国)研究开发中心有限公司 Mobile communication terminal with function for converting shooted letters into voice and method thereof

Also Published As

Publication number Publication date
JP4935620B2 (en) 2012-05-23
JP2009104436A (en) 2009-05-14
CN101419528A (en) 2009-04-29

Similar Documents

Publication Publication Date Title
CN101419528B (en) Data processing device
US10108584B2 (en) Host apparatus and screen capture control method thereof
US20060048069A1 (en) Display apparatus and method for displaying screen where dragging and dropping of object can be executed and program stored in computer-readable storage medium
JP5436187B2 (en) Image processing apparatus, control method therefor, and program
EP2053579A2 (en) Data processing device
CN101807141B (en) Controlling apparatus of image forming apparatus and image forming system
JP2001216251A (en) Computer switching device
JP2014215788A (en) Information processing system, information processing method, and program
US20240007571A1 (en) Image processing apparatus, method for controlling image processing apparatus, and recording medium
JP2016066258A (en) Image forming apparatus and button customization method
CN101650635A (en) Method for controlling remote display by terminal equipment and terminal equipment
EP2770420A1 (en) Data processing apparatus, content displaying method, and computer-readable recording medium encoded with content displaying program
JP2010087719A (en) Communication apparatus
JP6634732B2 (en) System, information processing method, information processing device, information terminal and program
JP6551105B2 (en) Image forming apparatus, screen display method, and computer program
JP2009151508A (en) Conference memo recording device and conference memo recording program
CN100588218C (en) Image forming apparatus and electronic mail delivery server,
US10785376B2 (en) Image processing apparatus for sending user interface data
JP5092713B2 (en) Information processing program and information processing apparatus
JP6522719B2 (en) Image display method
JP2018151820A (en) Display device and display control method
TWI419044B (en) Electronic reader and method for implementing page turning thereof
JP6950461B2 (en) Information processing equipment, information processing system and information processing method
JP2019117649A (en) Image display apparatus and image display method
JP2017041711A (en) System, information processing method, information terminal, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant