WO2009107589A1 - User interface generation device - Google Patents
User interface generation device
- Publication number
- WO2009107589A1 (PCT/JP2009/053227)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user interface
- information
- unit
- interface object
- generation
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/247—Telephone sets including user guidance or feature selection means facilitating their use
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to a user interface generation device, and more particularly to a user interface generation device that generates a user interface of a mobile terminal.
- the user interface (hereinafter abbreviated as “UI” as appropriate) of a mobile terminal typified by a mobile phone greatly affects the operability when the user operates the mobile terminal. For this reason, the UI of the mobile terminal is one of the important factors that determine the user's purchase of the mobile terminal.
- a UI based on XML (Extensible Markup Language)
- Typical XML-based UIs of this kind include UI Foundation developed by TAT (http://www.tat.se/), VIVID UI developed by Acrodea (http://www.acrodea.co.jp/), and UI One developed by Qualcomm (http://www.qualcomm.co.jp/).
- Japanese Patent Application Laid-Open No. 2001-36652 discloses a technique in which a plurality of external devices can be remotely operated (controlled) by infrared communication using a mobile phone terminal having an infrared communication unit.
- the mobile phone terminal described in Japanese Patent Laid-Open No. 2001-36652 includes an external control unit that performs communication with an external device.
- This mobile phone terminal acquires external device control information for remotely controlling an external device, either via a telephone line or by receiving it from the external device, stores it, and remotely controls the external device based on this external device control information.
- a built-in application program provides a remote control function that remotely operates a plurality of external devices on a terminal body that is normally used as a mobile phone.
- a plurality of external devices can be remotely operated by one mobile phone terminal based on the external device control information corresponding to each external device. Therefore, the complicated handling of a plurality of individual remote control terminals, one for each external device, becomes unnecessary, and convenience for the user is improved.
- there is no need to terminate each remote control application before starting the next one. That is, a plurality of applications can be activated simultaneously on one terminal, and the user can switch to the remote control application of whichever device is to be used.
- the UI of the television remote control application can be reproduced on the portable terminal 100 including the touch panel 200 while maintaining its operability.
- similarly, as shown in FIG., the UI of the air conditioner remote control application can be reproduced on the portable terminal 100 including the touch panel 200 while maintaining its operability.
- since the UI of each application is described based on XML, even if the application is ported to another manufacturer's terminal, the same UI can easily be reproduced simply by processing the XML file describing the UI.
- both the TV and the air conditioner require frequent operation, so both applications are displayed on the display unit using the multitasking and multiwindow functions.
- the display area is divided and both UIs are displayed simultaneously. In this case, since each UI is designed on the assumption that it is used alone, simply displaying the UIs side by side as they are results in the states shown in FIGS. 15A to 15D.
- FIGS. 15A to 15D are views showing a state in which UIs of two applications are simultaneously displayed on the mobile terminal 100 in which the multitasking and multiwindow functions are implemented.
- when each UI is reproduced as it is in each window obtained by dividing the display of the touch panel 200 vertically, only a part of each UI is displayed.
- a scroll bar is provided at the right end of each window so that operations can be performed even on parts that cannot be displayed on the screen.
- if the user wants to adjust the volume of the television in the state shown in FIG. 15A, the user must move the scroll bar of the television remote control UI window so as to move the display range of the television remote control UI, as shown in FIG.
- likewise, the user must move the scroll bar of the air conditioner remote control UI window so as to move the display range of the air conditioner remote control UI, as shown in FIG.
- the operation units constituting the UI of an application such as a remote controller are generally designed at close to their minimum size. Therefore, when displaying the UIs of a plurality of applications at the same time, it is not realistic to shrink each UI as a whole and display them together.
- accordingly, an object of the present invention, made in view of such circumstances, is to provide a user interface generation apparatus capable of presenting the user interfaces of a plurality of application programs simultaneously while maintaining the operability of each user interface.
- An application program execution unit for realizing various functions based on the application program;
- a user interface generation unit that generates a user interface for instructing the application program execution unit to execute a predetermined function based on the application program;
- a display unit for displaying a user interface generated by the user interface generation unit;
- a storage unit that stores a user interface definition file including user interface object definition information that defines a user interface object that is a component of the user interface, and
- the user interface generation unit selects, for each of the instructed user interfaces, the user interface object definition information included in the corresponding user interface definition file stored in the storage unit, and performs user interface composition processing for generating a composite user interface based on the selected user interface object definition information.
- the invention according to a second aspect is the user interface generation device according to the first aspect,
- the user interface object definition information includes user interface object attribute information indicating a relationship between the user interface object and another user interface object included in the user interface including the user interface object,
- the user interface generation unit selects the predetermined user interface object definition information based on a predetermined display area for displaying a user interface on the display unit and on the user interface object attribute information.
- the invention according to a third aspect is the user interface generation device according to the second aspect,
- the user interface object definition information further includes user interface object attribute information indicating whether each user interface object constituting the user interface is valid or invalid,
- the user interface generation unit selects the predetermined user interface object definition information based on a predetermined display area for displaying a user interface on the display unit and on the user interface object attribute information.
- the invention according to a fourth aspect is the user interface generation device according to any one of the first to third aspects,
- the user interface definition file includes identification information for identifying a user interface generated based on the user interface definition file,
- the user interface generation unit processes the selected user interface object definition information based on the identification information.
- the invention according to a fifth aspect is the user interface generation device according to any one of the first to fourth aspects,
- the user interface object definition information includes action information indicating an operation content to be executed when an event occurs for the user interface object.
- the action information includes information related to the change of the user interface object attribute information included in the other user interface object definition information,
- when an event occurs in the composite user interface and the action information included in the user interface object definition information of the user interface object that received the event changes the user interface object attribute information, the user interface generation unit performs the user interface composition processing based on the changed user interface object attribute information to generate a new composite user interface.
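As an illustration of this event-driven recomposition, the following sketch (not taken from the patent; the attribute name "enabled" and all object names are illustrative assumptions) shows action information of one UI object changing another object's attribute, after which the input to the composition processing is rebuilt from the changed attributes:

```python
def apply_event(ui_objects, action):
    """Sketch: an event's action information changes another UI object's
    attribute (here an 'enabled' flag); the composite UI is then rebuilt
    from the updated attribute information."""
    target, enabled = action  # e.g. ("numpad", True)
    # Copy each object, overriding the attribute of the targeted object.
    updated = [dict(o, enabled=enabled) if o["name"] == target else o
               for o in ui_objects]
    # Only enabled objects are fed back into UI composition processing.
    return [o for o in updated if o["enabled"]]

objects = [{"name": "power", "enabled": True},
           {"name": "numpad", "enabled": False}]
recomposed = apply_event(objects, ("numpad", True))
print([o["name"] for o in recomposed])
```

After the event, the previously invalid "numpad" object becomes part of the new composite UI input.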
- FIG. 4 is a flowchart for further explaining the user interface composition processing of FIG. 3.
- FIG. 4 is a flowchart for further explaining the user interface generation processing of FIG. 3.
- composition of a user interface using user interface identification information, which is user interface object attribute information, according to the second embodiment.
- the UI generation apparatus of the present invention is not limited to a mobile phone, and can be applied to, for example, an arbitrary mobile terminal such as a notebook personal computer or a PDA. Further, the present invention is not limited to portable terminals and can be applied to any apparatus for which it is useful. Note that the main object of the present invention is to synthesize a plurality of UIs so that they can be used at the same time.
- the application that receives instructions from each UI is not limited to a remote control function; the present invention can be applied to various applications.
- FIG. 1 is a block diagram showing a schematic configuration of a mobile phone which is a UI generation apparatus according to the first embodiment of the present invention.
- the mobile phone 1 includes a control unit 10 that controls the whole and a touch panel 20 that receives input from the user and displays an input result or the like according to each application.
- the touch panel 20 is configured by superimposing an input unit 22 configured with a matrix switch or the like that receives input from a user on a front surface of a display unit 24 configured with a liquid crystal display or the like.
- the display unit 24 draws and displays a UI for accepting a user operation input in the UI display area.
- the mobile phone 1 also includes a wireless communication unit 30 that transmits and receives various information, such as voice calls and e-mail data, to and from a base station (not shown), and an infrared communication unit 40 that communicates with various external devices (not shown) using infrared rays.
- using the wireless communication unit 30, the mobile phone 1 transmits and receives various data to and from the outside of the terminal via the Internet or a wireless link.
- the mobile phone 1 has a storage unit 50 that stores input information and various applications, and also functions as a work memory.
- the storage unit 50 includes an application storage area 52, a UI definition file storage area 54, an individual UI resource storage area 56, and a common UI resource storage area 58.
- the application storage area 52 stores various applications.
- the UI definition file storage area 54 stores a UI definition file that defines a series of generation rules for generating each UI as a whole.
- the individual UI resource storage area 56 stores individual UI resources such as image data and character string (text) data used for generating a UI unique to each application.
- the common UI resource storage area 58 stores common UI resources such as image data and font data shared and used by the UI used in the terminal other than the individual UI resources specific to each application.
- the control unit 10 includes an application execution unit 12, a UI acquisition unit 14, and a UI generation unit 16.
- the application execution unit 12 executes various applications stored in the application storage area 52 of the storage unit 50 and controls the execution. Further, based on the input to the UI corresponding to the various applications stored in the application storage area 52, the application execution unit 12 executes the functions of the various applications corresponding to the input.
- the UI acquisition unit 14 acquires resources (image data and the like) and UI definition files (XML file and the like) outside the terminal via the wireless communication unit 30.
- the UI generation unit 16 performs parsing processing (parsing) and DOM (Document Object Model) processing on the UI definition file, and generates a UI to be actually used.
- the UI generation unit 16 interprets UI information described in XML format by the XML engine, and displays the UI generated based on the UI definition file on the display unit 24 of the touch panel 20.
- the UI generation unit 16 includes a UI composition processing unit 18.
- the UI synthesis processing unit 18 synthesizes the UIs according to the designated UI definition files.
- the UI definition file storage area 54 stores, for each application stored in the application storage area 52, a UI definition file that defines the specifications and operations of the UI required when that application is executed. Although the same UI might be used by different applications, here, for convenience of explanation, it is assumed that different applications use different UIs and that a separate UI definition file is stored for each application.
- a TV remote control UI definition file is stored in the UI definition file storage area 54 in correspondence with a TV remote control application for remotely controlling a television receiver, which is an external device (not shown), using the mobile phone 1.
- an air conditioner remote control UI definition file is stored in the UI definition file storage area 54 in correspondence with an air conditioner remote control application that remotely controls an air conditioner of an external device (not shown) with the mobile phone 1.
- UIML (User Interface Markup Language), an XML-based language, is used to describe each UI definition file.
- in accordance with this description, the UI generation unit 16 displays a UI on the display unit 24 of the touch panel 20 of the mobile phone 1, and the application execution unit 12 performs processing according to the user's input to the input unit 22.
- each UI definition file includes UI object definition information for defining the UI objects to be drawn on the display unit 24.
- the UI object definition information includes information defining the image and text for drawing a UI object, such as a key or button, which is an element constituting the UI displayed on the display unit 24, and information defining the action to perform when an input is made to the UI object (in actuality, when an input is made to the part of the input unit 22 corresponding to the UI object).
- as information defining the operation when an input is made to the input unit 22 at the position of a UI object, the UI object definition information includes action information that indicates what the application execution unit should do when an event occurs for that UI object. For example, when an input event occurs in the input unit 22 at the position of the "power" UI object of the TV remote control UI, the action information defines that an instruction is issued to the TV remote control application to send an infrared signal that turns the external TV on or off.
- the UI object definition information includes UI object attribute information that is information about each UI object used when a plurality of UIs are combined.
- in the UI object attribute information, the priority with which each object is displayed when a plurality of UIs are combined is defined in advance based on the relationship between each UI object and the other UI objects.
- FIG. 2 is a diagram for explaining information indicating priority, which is UI object attribute information according to the present embodiment.
- FIG. 2A shows a state in which only the TV remote control application is activated on the mobile phone 1 and the TV remote control UI is displayed on the touch panel 20 based on the TV remote control UI definition file.
- a state in which the UI display area on the touch panel 20 is conceptually divided into small areas corresponding to the attribute information of each UI object is represented by alternate long and short dash lines, and the UI object attribute information corresponding to each small area is shown in the figure.
- the definition (UI identification information) that handles a plurality of UI objects as a single UI is given with an interface tag, and the value "TV1_interface" identifies this as the UI of the TV1 remote control.
- the TV receiver identified as TV1 can be remotely operated (controlled) by the UI of the TV1 remote controller.
- the priority for the UI object in each small area described above is defined in advance as a priority value.
- a UI object with a priority value of zero is an indispensable object, a UI object with a priority value of 1 is an important object, and a UI object has a lower priority as its priority value increases. For example, since the "power" key shown in small area (1) in FIG. 2A is an essential key, its priority value is set to zero.
- the channel selection keys of the small areas (2) and (3) and the volume up / down keys of the small areas (4) and (5) are important keys, so the priority value is 1.
- since the numeric keypad in small area (6) can be replaced by the channel selection keys of small areas (2) and (3), its priority value is set to 3, giving it a low priority.
- FIG. 2B shows a state in which only the air conditioner remote control UI is displayed on the touch panel 20.
- the display of "set temperature" shown in small area (1) is essential, and the "power" key shown in small area (2) is an essential key, so their priority values are zero.
- the temperature up / down tuning keys in the small areas (3) and (4) are important keys, so the priority value is 1.
- the priority value is set to 5.
- since the mode key in small area (7) and the menu key in small area (8) are not used so frequently, their priority value is set to 2.
- the value of the interface tag is "AIR1_interface", so that the plurality of objects can be identified as the UI of the remote controller of one air conditioner.
- remote operation (control) of the air conditioner identified as AIR1 can be performed with this remote controller UI.
- in this way, the priority information (priority value) defined based on the relationship between each UI object and the other UI objects is included in advance, as UI object attribute information, in the UI object definition information of each UI definition file.
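As a concrete illustration, a TV remote control UI definition file carrying such per-object priority values might look like the following UIML-style fragment. The tag and attribute names ("interface", "part", "priority") and the object ids are assumptions modeled on the description above, not markup taken from the patent; here the fragment is parsed with Python's standard ElementTree:

```python
import xml.etree.ElementTree as ET

# Hypothetical UIML-style UI definition file for the TV1 remote control.
# The interface tag's id is the UI identification information; each
# part's priority attribute is the UI object attribute information.
TV_REMOTE_UIML = """
<interface id="TV1_interface">
  <part id="power"    priority="0"/>
  <part id="ch_up"    priority="1"/>
  <part id="ch_down"  priority="1"/>
  <part id="vol_up"   priority="1"/>
  <part id="vol_down" priority="1"/>
  <part id="numpad"   priority="3"/>
</interface>
"""

root = ET.fromstring(TV_REMOTE_UIML)
print(root.get("id"))  # the UI identification information
# Indispensable objects are those whose priority value is zero.
essential = [p.get("id") for p in root if p.get("priority") == "0"]
print(essential)
```

The composition processing described below would read these priority values to decide which objects to adopt.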
- FIG. 3 is a flowchart illustrating the entire processing of the UI generation unit 16 according to the present embodiment.
- a UI definition file (XML file) is designated from an application that provides the UI (step S11).
- the operation that triggers the UI generation processing is assumed to be, for example, a user operation that starts an application that uses a UI, or a user operation that starts another application that uses a UI while one such application is already being executed.
- the UI generation unit 16 determines whether one or a plurality of UI definition files are designated (step S12). When only one file is designated, the UI generation unit 16 determines that the application does not request composition of a plurality of UIs, and performs XML parsing processing (step S13) and DOM processing (step S14) on the single designated UI definition file.
- the subsequent processing is the same as the processing for generating a conventional UI. That is, based on the UI definition file that has undergone the parsing process and the DOM process, the UI generation unit 16 performs a process of generating a UI (step S15). The UI generation process in step S15 will be described later.
- the application execution unit 12 performs a process of displaying the UI in the UI display area of the display unit 24 of the touch panel 20 (step S16).
- the mobile phone 1 can display a single UI on the display unit 24 as a single application is executed.
- for example, when the TV remote control UI definition file is designated by starting the TV remote control application, the UI is displayed as shown in FIG.
- when the air conditioner remote control UI definition file is designated by starting the air conditioner remote control application, the UI is displayed as shown in FIG.
- when it is determined in step S12 that a plurality of UI definition files are designated, the UI generation unit 16 determines that the application requests composition of a plurality of UIs, and performs XML parsing processing (step S17) and DOM processing (step S18) on each of the plurality of XML files. In step S19, the UI generation unit 16 determines whether the parsing processing and DOM processing have been completed for all UI definition files; if not, the process returns to step S17.
- when the parsing processing and DOM processing are completed for all the UI definition files in step S19, the UI composition processing unit 18 performs processing to synthesize the plurality of UIs (step S20). The UI composition processing in step S20 will also be described later.
- the UI generation unit 16 performs a process of generating a combined UI (step S15).
- the application execution unit 12 performs a process of displaying the generated composite UI in the UI display area of the display unit 24 of the touch panel 20 (Step S16), and ends the entire process of UI generation.
- the UI composition processing in step S20 in FIG. 3 will be further described with reference to the flowchart in FIG. 4.
- the UI composition processing unit 18 extracts all the UI objects constituting each DOM node from the DOM nodes of the plurality of UIs for which parsing and DOM processing have been completed, and analyzes the attribute information of each UI object (step S31).
- priority information (priority) of each UI object is analyzed as attribute information of each UI object.
- the UI objects that are indispensable for display are arranged as a temporary layout by calculation in the UI composition processing unit 18, without actually being displayed on the display unit 24.
- the UI composition processing unit 18 determines whether the display of the entire UI, with the temporarily laid-out UI objects arranged, fits within the UI display area, which is the area for displaying the UI on the display unit 24 (step S34). If it is determined that the UI display does not fit in the UI display area, the plurality of UIs cannot be displayed as a composite UI; the UI composition processing unit 18 displays a message on the display unit 24 to the effect that composition of the plurality of UIs has failed (step S35), and the UI composition processing is terminated. This is a safeguard for the case where so many UIs are combined on one screen that even the essential objects cannot all be displayed.
- if it is determined in step S34 that the display of the entire UI with the temporarily arranged essential UI objects fits within the UI display area of the display unit 24, the UI composition processing unit 18 determines that the UI objects with the highest priority (those required for display) are adopted (step S36).
- if it is determined in step S39 that the display of the temporarily laid-out UI objects fits within the UI display area of the display unit 24, the UI composition processing unit 18 confirms the adoption of these UI objects (step S40).
- the UI composition processing unit 18 determines whether or not the remaining UI objects that have not been analyzed for priority information have become zero (step S41). If the remaining UI objects whose priority information analysis has not been completed are not zero, that is, if there are still unanalyzed UI objects remaining, the process returns to step S37. In this manner, selection and determination of UI objects are repeated in descending order of priority as long as the UI display area permits and UI objects exist.
- when, in step S41, no UI objects with unanalyzed priority information remain, the UI composition processing unit 18 determines the composite UI objects at that time as the official layout (step S42). Thereafter, the UI objects of the determined layout are collected into one DOM node (step S43), and the UI composition processing is terminated.
- if, in step S39, the display of the temporarily laid-out UI objects does not fit in the UI display area of the display unit 24, the UI composition processing unit 18 selects predetermined UI objects from among the UI objects extracted so far based on predetermined conditions (step S44), and then the process proceeds to step S42.
- these predetermined conditions relate to the relationship between UI objects, the display size of UI objects, and so on, and are defined in advance as selection criteria for the case where there are multiple UI objects with the same priority and not all of them fit in the UI display area.
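The flow of steps S31 to S44 can be sketched as a simple greedy procedure. This is an illustrative simplification: the display area is abstracted to a scalar capacity, the step S44 tie-breaking among equal-priority objects is reduced to skipping objects that do not fit, and all data values are assumptions rather than values from the patent:

```python
def compose_ui(ui_objects, display_capacity):
    """Priority-based UI composition (sketch of the Fig. 4 flow).

    ui_objects: list of (name, priority_value, size) tuples; a lower
    priority value means more important (0 = indispensable).
    display_capacity abstracts the UI display area of the display unit.
    Returns the adopted objects, or None when even the essential
    objects do not fit (composition failure, step S35).
    """
    # Steps S33-S34: temporary layout of the indispensable objects.
    essential = [o for o in ui_objects if o[1] == 0]
    if sum(o[2] for o in essential) > display_capacity:
        return None  # step S35: report composition failure
    adopted = list(essential)  # step S36: essentials are adopted
    used = sum(o[2] for o in adopted)
    # Steps S37-S41: try remaining objects in ascending priority value,
    # adopting each one whose temporary layout still fits.
    for obj in sorted((o for o in ui_objects if o[1] > 0),
                      key=lambda o: o[1]):
        if used + obj[2] <= display_capacity:
            adopted.append(obj)
            used += obj[2]
    return adopted  # step S42: confirmed layout

tv = [("power", 0, 2), ("ch_keys", 1, 3), ("numpad", 3, 4)]
air = [("temp_disp", 0, 2), ("power", 0, 2), ("temp_keys", 1, 3)]
adopted = compose_ui(tv + air, display_capacity=12)
print([o[0] for o in adopted])
```

With these numbers, all essential objects and the priority-1 keys are adopted, while the low-priority numeric keypad is dropped, mirroring the composite UI of FIG. 6.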
- step S15 in FIG. 3 will be further described with reference to the flowchart in FIG.
- at the start of the UI generation processing, either a UI based on one designated UI definition file or a composite UI based on a plurality of UI definition files has been consolidated into one DOM document (or DOM node).
- the UI generation unit 16 performs display-related analysis on the DOM document (step S51).
- for a UI definition file described in the UIML format, attributes under the structure tag and the style tag are analyzed.
- the UI generation unit 16 analyzes the operation related to the DOM document (step S52).
- an attribute with a behavior tag is analyzed.
- next, the UI generation unit 16 performs processing to convert the expressions included in the DOM document into expressions dependent on each terminal (step S53). Furthermore, based on the result of the conversion processing in step S53, the UI generation unit 16 selects the UI object resources to be used and sets each attribute (property) (step S54). The UI object resources required at this time are stored in the individual UI resource storage area 56 as individual UI resources, or in the common UI resource storage area 58 as common UI resources. Thereafter, the UI generation unit 16 performs plotting processing of the UI composed of the UI objects (step S55). With this, the UI generation processing ends, and the process then proceeds to step S16 in FIG. 3 to perform the UI display processing.
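The display-related analysis (step S51, structure and style tags) and operation-related analysis (step S52, behavior tag) can be sketched with Python's ElementTree standing in for the DOM processing. The UIML fragment below is a hypothetical example; the structure/style/behavior tag names follow the description above, while the property and rule contents are assumptions:

```python
import xml.etree.ElementTree as ET

# Hypothetical UIML document after parsing and DOM processing.
DOC = """
<uiml>
  <interface id="TV1_interface">
    <structure><part id="power"/></structure>
    <style>
      <property part-name="power" name="image">power.png</property>
    </style>
    <behavior><rule><condition>power</condition></rule></behavior>
  </interface>
</uiml>
"""

dom = ET.fromstring(DOC)
structure = dom.find(".//structure")  # step S51: display-related analysis
style = dom.find(".//style")          # step S51: style attributes
behavior = dom.find(".//behavior")    # step S52: operation-related analysis
print(structure is not None, style is not None, behavior is not None)
```

After this analysis, steps S53 to S55 would map the analyzed parts onto terminal-specific resources and draw them.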
- a combined UI as shown in FIG. 6 is obtained.
- the composite UI shown in FIG. 6 divides the touch panel 20 into two upper and lower areas, and arranges the UI object of the TV remote control UI in the upper half and the UI object of the air conditioner remote control UI in the lower half.
- For the TV remote control UI, UI objects with a priority of “0” to “1” are extracted, while for the air conditioner remote control UI only UI objects with a priority of “0” are extracted, and the UI is synthesized from these.
- In this way, the two XML files are analyzed, and only UI objects having a high priority are automatically arranged to synthesize a plurality of UIs into one.
- the user can use a plurality of UIs simultaneously without switching a plurality of applications or UIs based on the applications.
- In the present embodiment, the UI object attribute information included in the UI object definition information of the first embodiment described above (that is, the information related to each UI object that is used when combining a plurality of UIs) is modified.
- the priority with which each object is displayed when a plurality of UIs are combined is defined in advance based on the relationship between each UI object and another UI object.
- In addition, UI identification information identifying the UI of which each UI object is a constituent element is defined.
- FIG. 7 is a diagram for explaining UI composition using UI identification information according to the present embodiment.
- text information to be attached to each UI object is added as UI object attribute information.
- text information (particular_name) indicating the UI is added to the UI identification information (Interface id).
- Each piece of text information describes a name or the like that can be attached on or near the display of the UI object as necessary when the UI object is displayed on the display unit 24.
- the UI composition processing in step S20 in FIG. 3 according to the present embodiment will be described with reference to the flowchart in FIG.
- The UI composition processing of the present embodiment shown in FIG. 8 adds one step to the processing of the first embodiment: after the UI object layout is determined in step S42, the UI objects are processed based on the text information identifying each UI (step S71).
- Specifically, the UI composition processing unit 18 attaches the text of the UI identification information on or near the corresponding UI object.
- a composite UI of the TV remote controller UI and the air conditioner remote controller UI is generated.
- In this example, the text information “power” exists in both UIs, while no other text information is present in both UIs. Therefore, when a “power” UI object is displayed, the text information (particular_name) added to the UI identification information of the UI to which that object belongs is attached as well.
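A minimal sketch of this disambiguation, with hypothetical labels: when the same object text appears in more than one source UI, the display label is prefixed with the text information (particular_name) of the UI it belongs to; unique texts are left unchanged.

```python
from collections import Counter

def label_objects(objs):
    """objs: list of (ui_particular_name, object_text) tuples. Returns display
    labels, prefixing the UI name only for texts duplicated across UIs."""
    counts = Counter(text for _, text in objs)
    return [f"{ui} {text}" if counts[text] > 1 else text for ui, text in objs]

objs = [("TV", "power"), ("TV", "volume+"), ("Air conditioner", "power")]
# The duplicated "power" objects get their UI's particular_name attached.
print(label_objects(objs))
```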
- Besides attaching the text added to the UI identification information on or near UI objects that share the same text, various other modes can be considered.
- For example, the UI display area may be divided per UI and distinguished by drawn lines or color coding, on the basis of the UI identification information of each UI.
- Each text added to the UI identification information may be displayed at a location where it does not get in the way of the display.
- an icon indicating each UI may be displayed in each divided UI display area instead of the text of the UI identification information.
- The UI identification information is attached as a text description in the document described in the XML format. Therefore, when the character string to be displayed is defined by an image resource, the text of the UI identification information is attached to each UI object.
- In such a case, a mode such as that shown in FIG. 9(C) is preferable.
- Next, UI composition processing by the UI generation apparatus according to the present embodiment will be described.
- In the present embodiment, the UI object attribute information included in the UI object definition information (the information related to each UI object used when combining a plurality of UIs) is modified: information indicating whether each UI object constituting the composite UI is enabled (valid) or disabled (invalid) is included in the UI object definition information.
- FIG. 10 is a diagram for explaining UI composition using information for validating or invalidating each UI object, which is UI object attribute information according to the present embodiment.
- As UI object attribute information according to the present embodiment, valid/invalid information (com_effect) for each UI object constituting the composite UI is appended.
- The valid/invalid information (com_effect) of each UI object causes the object to be treated as “valid” when its value is “YES” and as “invalid” when its value is “NO”.
- A UI object for which this information is valid is displayed in the UI display area of the display unit 24 at the time of UI composition, and input to the input unit 22 is accepted at the portion corresponding to the display.
- UI objects for which this information is invalid are not displayed in the UI display area of the display unit 24 at the time of UI composition, and input to the input unit 22 is not accepted.
- Among the UI objects (1) to (7) constituting the remote control UI of the television 1, only those whose valid/invalid information (com_effect) is valid (“YES”) are displayed.
- Likewise, for the air conditioner remote control UI, only UI objects whose valid/invalid information (com_effect) is valid (“YES”) are displayed; therefore, among the UI objects constituting the air conditioner remote control UI, only the power ON key is displayed in the UI display area.
- Further, the action information included in the UI object definition information described in the first embodiment includes, as an operation to be executed by the application execution unit 12 when an event occurs for a UI object, information for changing (updating) the validity/invalidity of each UI object. Based on this action information, the application execution unit 12 changes the valid/invalid information (com_effect) of a predetermined UI object from valid to invalid (“YES” to “NO”) or from invalid to valid (“NO” to “YES”) in response to the occurrence of an event for that UI object.
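The event-driven update of com_effect can be sketched as below. The representation of the action information (a simple mapping from an event on one object to new com_effect values for other objects) is an assumed simplification for illustration; the object names and the composition step are likewise hypothetical.

```python
# Two UI objects with com_effect and (assumed) action information: pressing
# "power_on" invalidates itself and validates "temp_up".
ui_objects = {
    "power_on": {"com_effect": "YES", "on_event": {"power_on": "NO", "temp_up": "YES"}},
    "temp_up":  {"com_effect": "NO",  "on_event": {}},
}

def visible(objects):
    # UI composition: only objects whose com_effect is "YES" are adopted.
    return sorted(name for name, o in objects.items() if o["com_effect"] == "YES")

def fire_event(objects, name):
    # Apply the action information: update com_effect of the listed objects,
    # then re-run the composition on the now-valid objects.
    for target, value in objects[name]["on_event"].items():
        objects[target]["com_effect"] = value
    return visible(objects)

print(visible(ui_objects))                 # before the event: ['power_on']
print(fire_event(ui_objects, "power_on"))  # after "power ON": ['temp_up']
```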
- Next, the UI composition processing in step S20 in FIG. 3 according to the present embodiment will be described with reference to the flowchart in FIG. 11.
- The UI composition processing of the present embodiment shown in FIG. 11 adds a step (step S91) to the processing of the second embodiment: after UI objects are extracted and their attribute information is analyzed in step S31, only the UI objects analyzed as valid are taken as targets.
- First, the UI composition processing unit 18 extracts all UI objects constituting each DOM node from the DOM nodes of the plurality of UIs for which the parsing processing and the DOM conversion processing have been completed, and analyzes the attribute information of each UI object (step S31).
- priority information (priority) of each UI object used in the first and second embodiments is analyzed as attribute information of each UI object.
- validity / invalidity information (com_effect) of each UI object is analyzed, and only the UI objects determined to be valid (value is “YES”) are all extracted as target UI objects (step S91).
- a composite UI can be generated by continuing the processing from step S32 described in the first and second embodiments on the valid UI object extracted as the target UI object.
- When an event occurs, the composite UI is updated accordingly. That is, the application execution unit 12 updates the valid/invalid information of a predetermined UI object based on the action information included in the UI object definition information, performs the UI composition processing shown in FIG. 11 again, and thereby updates the composite UI.
- For example, when an operation input event occurs for the “power ON” UI object of the air conditioner remote control UI, the application execution unit 12 first issues an instruction to transmit an infrared signal that turns on the power supply of the external air conditioner. Furthermore, based on the action information, the application execution unit 12 changes the value of com_effect, the attribute of the “power ON” UI object of the air conditioner remote controller UI in FIG. 10(A), from “YES” to “NO”, and also changes the com_effect values of other predetermined UI objects.
- FIG. 10B is a diagram illustrating a state after the value of com_effect, which is an attribute of the UI object, is changed based on the action information after the event occurs.
- The values surrounded by the alternate long and short dash lines have been changed from valid to invalid or from invalid to valid.
- As a result, the UI is updated to a composite UI composed of the currently valid UI objects.
- In this way, each UI can prevent unnecessary UI objects from being displayed before the power is turned on, so the limited UI display area can be used effectively.
- Moreover, since necessary UI objects are expanded after the power is turned on, as shown in FIG. 10(B), the necessary UI objects can be displayed only when they are needed.
- The valid UI objects are also analyzed based on the priority attribute described in the first embodiment, so the displayed UI objects change dynamically in order according to their priorities.
- For example, when the application execution unit 12 updates the valid/invalid information of a predetermined UI object constituting the TV remote control UI, UI objects with higher priority come to be displayed in the UI display area. However, in the state shown in FIG. 12(A), there is no room in the UI display area to display any more UI objects. For this reason, when the composite UI is updated, the UI composition processing unit 18 generates the composite UI without adopting UI objects with low priority, based on the priority attribute information of each UI object.
- Thus, when the power of the television is turned on, the important UI objects of the TV remote control UI are displayed, while low-importance UI objects of the air conditioner remote control UI are not displayed at the time of updating; a composite UI that uses the UI display area more effectively can therefore be provided.
- Some UI objects, such as those for manipulating the volume, are not easy to use when only one of “volume + (increase)” and “volume − (decrease)” is present; they become meaningful only when both UI objects are arranged together. Therefore, in this embodiment, when the composition of the UI display area is changed in sequence by updating the composite UI as described above, two UI objects that should form a pair are adopted or rejected together as a pair.
- Specifically, an attribute (relate_ID) indicating that multiple UI objects form a pair is further added to the UI object definition information as UI object attribute information, and UI objects sharing the same attribute value are treated as a pair.
- In the example shown in FIG. 13, for the TV remote control UI, the attribute (relate_ID) indicates that the UI objects (2) “channel selection +” and (3) “channel selection −” form a pair, and the value of the attribute is defined as “select_1”.
- Similarly, the attribute (relate_ID) indicates that the UI objects (4) “volume +” and (5) “volume −” also form a pair, and the value of the attribute is defined as “volume_1”.
- the valid / invalid attribute (com_effect) of each UI object is omitted.
- In the UI composition processing according to the present embodiment, when the UI composition processing unit 18 performs analysis based on the priority (priority) or valid/invalid (com_effect) attribute, it also analyzes the attribute (relate_ID) indicating that UI objects should be handled as a pair.
- UI objects to which the pairing attribute (relate_ID) is added are then adopted or rejected as a pair based on its value.
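The pair handling via relate_ID can be sketched as follows. Grouping by relate_ID and the capacity-based adopt/reject decision are illustrative assumptions consistent with the description above; the names and sizes are hypothetical.

```python
def adopt_with_pairs(objs, capacity):
    """objs: list of dicts with 'name', 'size', and optional 'relate_ID'.
    Objects sharing a relate_ID are adopted or rejected together as a unit."""
    groups = {}
    for o in objs:
        # Objects without relate_ID form their own single-member group.
        groups.setdefault(o.get("relate_ID") or o["name"], []).append(o)
    adopted, used = [], 0
    for members in groups.values():
        size = sum(m["size"] for m in members)
        if used + size <= capacity:  # the whole group must fit, or none of it
            adopted += [m["name"] for m in members]
            used += size
    return adopted

objs = [
    {"name": "volume+", "size": 2, "relate_ID": "volume_1"},
    {"name": "volume-", "size": 2, "relate_ID": "volume_1"},
    {"name": "mute", "size": 1},
]
print(adopt_with_pairs(objs, capacity=5))  # everything fits
print(adopt_with_pairs(objs, capacity=3))  # the volume pair is rejected whole
```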
- the present invention is not limited to the above-described embodiment, and many changes or modifications can be made.
- For example, in the embodiment described above, the UI object attribute information (relate_ID) is used to determine whether two UI objects forming a pair are adopted together as a pair; however, the number of grouped UI objects is not limited to two and may be three or more.
- Although the mobile phone 1 of each embodiment described above remotely controls external devices by performing infrared communication via the infrared communication unit 40, communication with external devices is not restricted to infrared communication; for example, Bluetooth (trademark), wireless LAN, or other short-range wireless communication may be used.
- the UI object attribute information has been described as being included in the UI object definition information.
- the UI object attribute information is not necessarily included in the UI object definition information.
- the UI object attribute information may be stored in an area different from the area where the UI object definition information is stored in the storage unit 50 in association with the corresponding UI object definition information.
- the UI object attribute information does not have to be defined with its contents fixed. That is, for example, the control unit 10 may change the content of the UI object attribute information based on a UI usage history or the like.
- the UI object attribute information may not be defined in advance. That is, for example, the control unit 10 analyzes the UI definition file stored in the storage unit 50, the contents of the UI object definition information included in the UI definition file, the relevance between the UI object definition information, and the use of the UI UI object attribute information may be generated based on a history or the like.
- As triggers for starting the UI generation processing, a user operation for starting an application that uses a UI, and a user operation for starting another UI-using application while an application is already being executed, have been described as examples; however, the operations that start the UI generation processing are not limited to these.
- For example, the mobile phone 1 serving as the UI generation device may acquire information on external devices existing in its vicinity by short-range wireless communication such as wireless LAN or Bluetooth, or by an RF tag reader, and generate a UI based on the acquired information.
- That is, by detecting the presence of an external device near the mobile phone 1, a UI related to that external device (such as a UI for a remote control application that operates the device) can be generated automatically, without depending on a user operation.
- the mobile phone 1 that is a UI generation device may be configured to select a UI to be generated based on position information and time information acquired by GPS or the like. That is, the mobile phone 1 automatically generates a UI necessary at that location based on a change in the current position or a preset schedule.
- In each of the embodiments described above, a UI definition file corresponding to each application is stored in the UI definition file storage area 54 in advance.
- However, a necessary UI definition file may also be acquired from outside as appropriate. That is, if the necessary UI definition file is not stored in the storage unit 50 when the UI is generated, it may be downloaded and obtained via the communication means of the mobile phone 1.
- In this case, the UI acquisition unit 14 of the control unit 10 acquires the necessary UI definition file from an external device or an external server (not shown) via the wireless communication unit 30.
- Furthermore, the touch panel is not an essential element.
- The present invention can be applied to any terminal having an input unit that is expected to be used with combined UIs, such as a terminal with an input unit having many mechanical keys or a terminal having an arbitrary pointing device.
- As an example of the UI definition file used in each embodiment described above, a file in the UIML format based on XML is shown below. The portions to which the UI object attribute information according to the present invention is added are underlined.
- In the UIML format, a UI object is defined by a &lt;template&gt; tag; therefore, the description between &lt;template&gt; and &lt;/template&gt; corresponds to the UI object definition information.
- Here, the TV1_interface.uiml file is shown as a UI definition file constituting the TV remote control UI.
- the present invention it is possible to synthesize a plurality of UIs, automatically select important UI objects when operating an application corresponding to each UI, and generate one synthesized UI that fits in a predetermined UI display area. Therefore, a plurality of UIs can be used simultaneously without switching. Furthermore, since there is no need to select a display area (window) or move the scroll bar within the display area as in the case of a multi-window, the user's operation load can be greatly reduced.
Abstract
Description
An application program execution unit that implements various functions based on an application program;
a user interface generation unit that generates a user interface for instructing the application program execution unit to execute a predetermined function based on the application program;
a display unit that displays the user interface generated by the user interface generation unit; and
a storage unit that stores a user interface definition file including user interface object definition information defining user interface objects, which are constituent elements of the user interface, are provided,
and when generation of a plurality of the user interfaces is instructed, the user interface generation unit selects among the user interface object definition information included in the user interface definition files stored in the storage unit corresponding to each of the instructed plurality of user interfaces, and performs user interface composition processing that generates a composite user interface based on the selected user interface object definition information.
The user interface object definition information includes user interface object attribute information indicating a relationship between the user interface object and other user interface objects included in the user interface that contains the object,
and when generation of a plurality of the user interfaces is instructed, the user interface generation unit selects the predetermined user interface object definition information based on a predetermined display area for displaying the user interface on the display unit and on the user interface object attribute information.
The user interface object definition information further includes user interface object attribute information indicating whether each user interface object constituting the user interface is valid or invalid,
and when generation of a plurality of the user interfaces is instructed, the user interface generation unit selects the predetermined user interface object definition information based on a predetermined display area for displaying the user interface on the display unit and on the user interface object attribute information.
The user interface definition file includes identification information identifying the user interface generated based on that user interface definition file,
and the user interface generation unit processes the selected user interface object definition information based on the identification information.
The user interface object definition information includes action information indicating the operation to be executed when an event occurs for the user interface object,
the action information includes information on changing the user interface object attribute information included in other user interface object definition information,
and when the event occurs in the composite user interface and the user interface object attribute information is changed based on the action information included in the user interface object definition information of the user interface object that received the event, the user interface generation unit performs user interface composition processing based on the changed user interface object attribute information to generate a new composite user interface.
10 control unit
12 application program execution unit
14 user interface acquisition unit
16 user interface generation unit
18 user interface composition processing unit
20 touch panel
22 input unit
24 display unit
30 wireless communication unit
40 infrared communication unit
50 storage unit
52 application program storage area
54 user interface definition file storage area
56 individual user interface resource storage area
58 common user interface resource storage area
FIG. 1 is a block diagram showing the schematic configuration of a mobile phone, which is a UI generation device according to the first embodiment of the present invention.
Next, UI composition processing by the UI generation device according to the second embodiment of the present invention will be described. In this embodiment, the UI object attribute information included in the UI object definition information of the first embodiment described above (information related to each UI object used when combining a plurality of UIs) is modified.
Next, UI composition processing by the UI generation device according to the third embodiment of the present invention will be described. In this embodiment, as in the second embodiment described above, the UI object attribute information included in the UI object definition information (information related to each UI object used when combining a plurality of UIs) is modified.
Claims (5)
- An application program execution unit that implements various functions based on an application program;
a user interface generation unit that generates a user interface for instructing the application program execution unit to execute a predetermined function based on the application program;
a display unit that displays the user interface generated by the user interface generation unit; and
a storage unit that stores a user interface definition file including user interface object definition information defining user interface objects, which are constituent elements of the user interface,
wherein, when generation of a plurality of the user interfaces is instructed, the user interface generation unit selects among the user interface object definition information included in the user interface definition files stored in the storage unit corresponding to each of the instructed plurality of user interfaces, and performs user interface composition processing that generates a composite user interface based on the selected user interface object definition information. - The user interface generation apparatus according to claim 1, wherein the user interface object definition information includes user interface object attribute information indicating a relationship between the user interface object and other user interface objects included in the user interface that contains the object,
and, when generation of a plurality of the user interfaces is instructed, the user interface generation unit selects the predetermined user interface object definition information based on a predetermined display area for displaying the user interface on the display unit and on the user interface object attribute information. - The user interface generation apparatus according to claim 2, wherein the user interface object definition information further includes user interface object attribute information indicating whether each user interface object constituting the user interface is valid or invalid,
and, when generation of a plurality of the user interfaces is instructed, the user interface generation unit selects the predetermined user interface object definition information based on a predetermined display area for displaying the user interface on the display unit and on the user interface object attribute information. - The user interface generation apparatus according to any one of claims 1 to 3, wherein the user interface definition file includes identification information identifying the user interface generated based on the user interface definition file,
and the user interface generation unit processes the selected user interface object definition information based on the identification information. - The user interface generation apparatus according to any one of claims 1 to 4, wherein the user interface object definition information includes action information indicating an operation to be executed when an event occurs for the user interface object,
the action information includes information on changing the user interface object attribute information included in other user interface object definition information,
and, when the event occurs in the composite user interface and the user interface object attribute information is changed based on the action information included in the user interface object definition information of the user interface object that received the event, the user interface generation unit performs user interface composition processing based on the changed user interface object attribute information to generate a new composite user interface.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010500681A JP5200095B2 (ja) | 2008-02-27 | 2009-02-23 | ユーザインタフェース生成装置 |
KR1020107018957A KR101201856B1 (ko) | 2008-02-27 | 2009-02-23 | 유저인터페이스생성장치 |
US12/919,351 US8726175B2 (en) | 2008-02-27 | 2009-02-23 | User interface generation apparatus |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-045726 | 2008-02-27 | ||
JP2008045726 | 2008-02-27 | ||
JP2008-088248 | 2008-03-28 | ||
JP2008088248 | 2008-03-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009107589A1 true WO2009107589A1 (ja) | 2009-09-03 |
Family
ID=41015983
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/053227 WO2009107589A1 (ja) | 2008-02-27 | 2009-02-23 | ユーザインタフェース生成装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8726175B2 (ja) |
JP (1) | JP5200095B2 (ja) |
KR (1) | KR101201856B1 (ja) |
WO (1) | WO2009107589A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017527930A (ja) * | 2015-07-13 | 2017-09-21 | 小米科技有限責任公司Xiaomi Inc. | スマート機器をコントロールするための方法、装置、プログラム及び記録媒体 |
WO2018029738A1 (ja) * | 2016-08-08 | 2018-02-15 | 三菱電機株式会社 | ユーザーインターフェース制御装置及びユーザーインターフェース制御方法 |
JP2019133561A (ja) * | 2018-02-02 | 2019-08-08 | コニカミノルタ株式会社 | 画像形成装置及びプログラム |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2734348T3 (es) * | 2012-11-07 | 2019-12-05 | Rain Bird Corp | Sistema de control de riego |
JP2015167102A (ja) * | 2014-03-04 | 2015-09-24 | 株式会社オートネットワーク技術研究所 | 蓄電モジュール |
EP3224705A4 (en) * | 2014-11-24 | 2018-07-11 | Hewlett-Packard Enterprise Development LP | Detection of user interface layout changes |
US10521502B2 (en) | 2016-08-10 | 2019-12-31 | International Business Machines Corporation | Generating a user interface template by combining relevant components of the different user interface templates based on the action request by the user and the user context |
JP6878934B2 (ja) * | 2017-02-10 | 2021-06-02 | オムロン株式会社 | 情報処理装置、情報処理システム、ユーザインターフェイスの作成方法、およびユーザインターフェイスの作成プログラム |
WO2018152111A1 (en) * | 2017-02-14 | 2018-08-23 | Sherman Brian Arthur | System for creating data-connected applications |
JP6915532B2 (ja) * | 2017-12-28 | 2021-08-04 | 富士通株式会社 | 情報処理装置、情報共有システムおよび同期制御方法 |
US11138288B2 (en) * | 2019-08-01 | 2021-10-05 | International Business Machines Corporation | Priority-based rendering |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005096596A (ja) * | 2003-09-25 | 2005-04-14 | Sony Corp | 車載装置及び車載装置の制御方法 |
JP2006350819A (ja) * | 2005-06-17 | 2006-12-28 | Toshiba Corp | 家電機器制御システム |
JP2007066099A (ja) * | 2005-08-31 | 2007-03-15 | Canon Inc | Gui構成システム、gui構成方法及びプログラム |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1113302C (zh) * | 1993-07-30 | 2003-07-02 | 佳能株式会社 | 通过通信线路控制设备的控制器和方法 |
US5648813A (en) * | 1993-10-20 | 1997-07-15 | Matsushita Electric Industrial Co. Ltd. | Graphical-interactive-screen display apparatus and peripheral units |
US5657221A (en) * | 1994-09-16 | 1997-08-12 | Medialink Technologies Corporation | Method and apparatus for controlling non-computer system devices by manipulating a graphical representation |
JPH11122682A (ja) * | 1997-10-16 | 1999-04-30 | Nec Corp | リモートコントロール送信装置 |
US7831930B2 (en) * | 2001-11-20 | 2010-11-09 | Universal Electronics Inc. | System and method for displaying a user interface for a remote control application |
JP2001036652A (ja) | 1999-07-23 | 2001-02-09 | Nippon Conlux Co Ltd | 携帯電話機およびこれを利用した機器遠隔制御方法 |
JP2001069580A (ja) | 1999-08-31 | 2001-03-16 | Matsushita Electric Ind Co Ltd | Av機器コントロール装置 |
JP2002278666A (ja) | 2001-03-22 | 2002-09-27 | Toyoda Mach Works Ltd | 設備制御用操作盤 |
US20050134578A1 (en) * | 2001-07-13 | 2005-06-23 | Universal Electronics Inc. | System and methods for interacting with a control environment |
US20030103075A1 (en) * | 2001-12-03 | 2003-06-05 | Rosselot Robert Charles | System and method for control of conference facilities and equipment |
EP1483653B1 (en) * | 2002-03-08 | 2006-05-31 | Revelations in Design, LP | Electric device control apparatus |
US10721087B2 (en) * | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
JP4455170B2 (ja) * | 2004-05-31 | 2010-04-21 | 株式会社東芝 | ネットワーク家電制御システム |
JP3897774B2 (ja) * | 2004-06-09 | 2007-03-28 | 株式会社ソニー・コンピュータエンタテインメント | マルチメディア再生装置およびメニュー画面表示方法 |
JP4676303B2 (ja) * | 2005-10-18 | 2011-04-27 | 株式会社日立製作所 | 端末装置 |
US9607287B2 (en) * | 2008-01-19 | 2017-03-28 | International Business Machines Corporation | Integrated view of multi-sourced information objects |
-
2009
- 2009-02-23 US US12/919,351 patent/US8726175B2/en not_active Expired - Fee Related
- 2009-02-23 WO PCT/JP2009/053227 patent/WO2009107589A1/ja active Application Filing
- 2009-02-23 KR KR1020107018957A patent/KR101201856B1/ko active IP Right Grant
- 2009-02-23 JP JP2010500681A patent/JP5200095B2/ja not_active Expired - Fee Related
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017527930A (ja) * | 2015-07-13 | 2017-09-21 | 小米科技有限責任公司Xiaomi Inc. | スマート機器をコントロールするための方法、装置、プログラム及び記録媒体 |
WO2018029738A1 (ja) * | 2016-08-08 | 2018-02-15 | 三菱電機株式会社 | ユーザーインターフェース制御装置及びユーザーインターフェース制御方法 |
JPWO2018029738A1 (ja) * | 2016-08-08 | 2018-08-16 | 三菱電機株式会社 | ユーザーインターフェース制御装置及びユーザーインターフェース制御方法 |
JP2019133561A (ja) * | 2018-02-02 | 2019-08-08 | コニカミノルタ株式会社 | 画像形成装置及びプログラム |
JP6992557B2 (ja) | 2018-02-02 | 2022-01-13 | コニカミノルタ株式会社 | 画像形成装置及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
KR101201856B1 (ko) | 2012-11-15 |
US20110022974A1 (en) | 2011-01-27 |
JP5200095B2 (ja) | 2013-05-15 |
KR20100103723A (ko) | 2010-09-27 |
US8726175B2 (en) | 2014-05-13 |
JPWO2009107589A1 (ja) | 2011-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5200095B2 (ja) | ユーザインタフェース生成装置 | |
JP5406176B2 (ja) | ユーザインタフェース生成装置 | |
JP5351165B2 (ja) | ユーザインタフェース生成装置 | |
JP5680404B2 (ja) | ユーザインタフェース生成装置 | |
JP5431321B2 (ja) | ユーザインタフェース生成装置 | |
KR101256016B1 (ko) | 유저인터페이스 생성장치 | |
KR101256014B1 (ko) | 유저인터페이스 생성장치 | |
WO2010027088A1 (ja) | 情報処理装置及びプログラム | |
JPH1023117A (ja) | 携帯電話装置 | |
JP3354549B2 (ja) | 携帯電話装置 | |
JP4672717B2 (ja) | 情報処理装置及び画面表示方法 | |
JP2012079181A (ja) | 文書閲覧装置 | |
JP2001273213A (ja) | 情報入力システムにおける入力方法及び入力装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09714440 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2010500681 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12919351 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20107018957 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09714440 Country of ref document: EP Kind code of ref document: A1 |