US11847148B2 - Information processing device and setting device
- Publication number: US11847148B2 (application US17/006,082)
- Authority: US (United States)
- Prior art keywords: words, section, control, control target, user
- Prior art date
- Legal status: Active, expires
Classifications
- G06F16/3343—Query execution using phonetics
- G06F16/3334—Selection or weighting of terms from queries, including natural language queries
- G06F16/335—Filtering based on additional data, e.g. user or group profiles
- G06F16/355—Class or cluster creation or modification
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/205—Parsing
- G10L15/08—Speech classification or search
- G10L13/00—Speech synthesis; Text to speech systems
- G10L2015/081—Search algorithms, e.g. Baum-Welch or Viterbi
- G10L2015/088—Word spotting
Definitions
- the present invention relates to an information processing device and a setting device.
- Patent Literature 1 discloses an information processing device which: assigns weights to a first voice, which is a voice of a user, and a second voice, which is a voice of a person over the phone, respectively; and determines, in accordance with the values of the weights, which of the first and second voices should be given higher priority.
- Patent Literature 1 merely assigns weights to the first voice and the second voice, respectively, and therefore it is difficult to carry out the search in a manner such that applications having similar keywords are distinguished from each other. Therefore, the information processing device is incapable of sufficiently improving the accuracy of selection of an application.
- An object of an aspect of the present invention is to improve the accuracy of selection of a control target.
- an information processing device in accordance with an aspect of the present invention includes: an identifying section configured to, by referring to one or more search keywords set for one or more control targets, identify at least one search keyword from among the one or more search keywords, the at least one search keyword matching any of one or more main words contained in input data acquired through voice input; and a selecting section configured such that: the at least one search keyword identified by the identifying section is referred to; and thereby the selecting section selects at least one control target from among the one or more control targets based on one or more numeric values obtained through calculation of one or more expressions each of which batch-converts, into numerical form, one or more of the one or more search keywords set for the one or more control targets.
- An aspect of the present invention makes it possible to improve the accuracy of selection of a control target.
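The selection step described above can be illustrated with a small sketch. All names here (`score_target`, `select_target`, the `("and"/"or", ...)` encoding) are illustrative assumptions, not from the patent: matched search keywords are batch-converted to 1 or 0, a nested expression is evaluated per control target, and the target with the highest resulting value is selected.

```python
# Illustrative sketch of the claimed selection step; score_target,
# select_target, and the ("and"/"or", ...) encoding are assumptions,
# not taken from the patent.

def score_target(expression, main_words):
    """Batch-convert keyword matches into a numeric value.

    A bare keyword scores 1 if it appears among the spoken main words;
    ("and", ...) requires all operands, ("or", ...) requires any.
    """
    if isinstance(expression, str):
        return 1 if expression in main_words else 0
    op, *operands = expression
    scores = [score_target(o, main_words) for o in operands]
    return min(scores) if op == "and" else max(scores)

def select_target(targets, main_words):
    """Select the control target whose expression yields the highest value."""
    return max(targets, key=lambda name: score_target(targets[name], main_words))

targets = {
    "tank_A_level": ("and", "tank", ("or", "remaining", "level")),
    "pump_speed": ("and", "pump", "speed"),
}
print(select_target(targets, {"tank", "level"}))  # -> tank_A_level
```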
- FIG. 1 is a block diagram illustrating configurations of a PC, a programmable display device, and an external device, in accordance with Embodiment 1 of the present invention.
- FIG. 2 is a flowchart showing steps of a process carried out by the programmable display device of FIG. 1 .
- FIG. 3 illustrates an edit screen for screen editing, which is displayed by a display included in the PC of FIG. 1 .
- FIG. 6 schematically illustrates how a graphics screen moves.
- FIG. 7 shows search conditions stored in a user memory of a programmable display device in accordance with Embodiment 2 of the present invention.
- FIG. 1 is a block diagram illustrating the configurations of the PC 1 , the programmable display device 2 , and the external device 5 , in accordance with Embodiment 1 of the present invention.
- the PC 1 includes a control section 11 , a display 12 , an interface section 13 , an operation section 14 , and a storage section 15 .
- the PC 1 is connected to the programmable display device 2 .
- the PC 1 functions as a screen creating device that executes a program for screen editing and thereby creates a graphics screen that is to be displayed by the programmable display device 2 .
- the property PP of the material tank MA is displayed in the right part of the edit screen ED.
- the property PP contains entry boxes I 1 and I 2 .
- the user can enter one or more required words, as search keywords, in the entry box I1, and one or more preferred words, as search keywords, in the entry box I2, for use during execution of a runtime program.
- the entry of preferred words in the entry box I 2 is not essential, and the entry box I 2 may be left blank.
- the property PP may contain, instead of the entry boxes I 1 and I 2 , boxes in which a list of required words (candidates) and a list of preferred words (candidates) are displayed.
- the setting section 111 stores, in the storage section 15 , settings made on the settings screen SS.
- the number display column N 1 contains numbers for respective control targets.
- the user can enter variables in the variable entry column V 1 on a per-control target basis.
- Each of the variables here is the one based on which the control section 10 of the programmable display device 2 identifies a corresponding control target. More specifically, the variable is associated with the address of an internal memory of the PLC 3 or with the address of an internal memory of the programmable display device 2 , and is information in the form that is easily understandable to the user, such as a character string representing a control target.
- the user can enter the names of control targets in the name entry column NA.
- Each of the names is for the user to easily identify a corresponding control target.
- the variables are associated with the object OB; therefore, on the settings screen SS, the name of the object OB corresponding to the variables may be displayed.
- the user can enter, as an expression for conversion of preferred words into numerical form, for example, the expression “(remaining
- the user can use, as preferred words, the words “remaining”, “level”, “current fill level”, and “quantity” entered in the entry box I 2 , and can prepare expressions using such preferred words.
- an expression for conversion of required words into numerical form can be entered directly in the entry box I 1 ; and an expression for conversion of preferred words into numerical form can be entered directly in the entry box I 2 .
- an expression for conversion of required words into numerical form and an expression for conversion of preferred words into numerical form can be entered directly in the entry columns CF 1 and CF 2 , respectively, without entering required words and preferred words in the entry boxes I 1 and I 2 , respectively. Also in such cases, search keywords are classified into required words and preferred words.
- the property PP does not need to contain the entry box I 2 .
- the entry box I 1 of the property PP may be arranged such that either (i) a required word(s) or (ii) an expression(s) for conversion of required words into numerical form is/are entered in the entry box I 1 .
- the settings screen SS does not need to contain the entry column CF 2 .
- the property PP may contain entry boxes in which variables and names can be entered.
- the simulation section 112 causes the display 12 to display at least one of (i) the course of a process in which a search keyword(s) is/are identified in a simulation manner and (ii) the course of a process in which a control target(s) is/are selected in a simulation manner.
- the simulation section 112 causes the display 12 to display the course of a process in which: the search keyword(s) set by the setting section 111 for the control target(s) is/are referred to; and thereby a search keyword(s) matching any of the extracted main word(s) is/are identified in a simulation manner.
- the simulation section 112 causes the display 12 to display the course of a process in which a control target(s) is/are selected in a simulation manner based on a numeric value(s) obtained through calculation of an expression(s) for conversion, into numerical form, of the search keyword(s) set by the setting section 111 for the control target(s).
- the user can pre-check, through the PC 1 , whether or not a process to select a control target(s) is carried out properly on the programmable display device 2 . Furthermore, since the identifying process and selecting process are displayed in a simulation manner on the PC 1 before the search keyword(s) and expression(s) are transmitted to the control section 10 , the user can easily check whether or not a process to select a control target(s) is carried out properly.
- since the simulation demonstrates the course through to the selection of a control target(s) based on the result(s) of calculation of the expression(s), the user can check whether the selected result is correct. Checking many control targets on-site in the debug mode (described later) of the programmable display device 2 would otherwise cost considerable time and effort.
- the simulation presents, before carrying out the debug mode, the process carried out in the programmable display device 2 in a simulation manner as far as possible.
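As a rough sketch of what such a pre-check could look like (the function name and the overlap-count scoring are assumptions standing in for the device's actual logic), a simulation routine might replay the identifying and selecting steps and record each intermediate result:

```python
# Illustrative only: simulate_selection and its overlap-count scoring are
# assumptions, not the patented identify/select process.

def simulate_selection(targets, main_words):
    """Replay the identify/select steps, recording each intermediate result."""
    trace = []
    for name, keywords in targets.items():
        matched = sorted(set(keywords) & set(main_words))
        trace.append(f"{name}: matched keywords {matched}")
    selected = max(targets, key=lambda n: len(set(targets[n]) & set(main_words)))
    trace.append(f"selected control target: {selected}")
    return trace

for step in simulate_selection(
        {"tank_MA_level": ["tank", "level", "remaining"],
         "pump_P1_speed": ["pump", "speed"]},
        ["tank", "level"]):
    print(step)
```

Displaying such a trace on the PC before download plays the role the simulation section 112 has in the text above.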
- the external device 5 includes a control section 31 , a display section 32 , an interface section 33 , and a touch panel 34 .
- the external device 5 is, for example, a mobile terminal that is configured to communicate with the programmable display device 2 .
- the control section 31 controls each section of the external device 5 .
- the control section 31 identifies a control action from operation on the touch panel 34 .
- the control section 31 carries out control related to display by the display section 32 , and carries out control related to communications between the external device 5 and the programmable display device 2 through the interface section 33 .
- the interface section 33 is a communication section through which the external device 5 communicates with the programmable display device 2 .
- the programmable display device 2 includes a control section 10 , a display section 20 , a touch panel 30 , a user memory 40 , a microphone 50 , a speaker 60 , and interface sections 70 , 80 , and 90 .
- the programmable display device 2 is connected to a programmable logic controller (PLC) 3 .
- the control section 10 identifies the action of displaying the status of a device 4 connected to the PLC 3 , the action of controlling the status of the device 4 in accordance with the operation on the touch panel 30 , and the like.
- the control section 10 controls each section of the programmable display device 2 .
- the control section 10 includes a storage control section 110 , a conversion-to-text section 120 , a classifying section 130 , an extracting section 140 , an identifying section 150 , an executing section 160 , a selecting section 170 , a course-of-process display control section 180 , an edit section 190 , a display switching control section 200 , and a screen display control section 230 .
- the screen display control section 230 includes a screen movement control section 210 and an emphasis control section 220 . The details of processes carried out by respective sections in the control section 10 will be described later.
- the microphone 50 is a part through which a voice issued by the user is inputted.
- the speaker 60 is a part through which a sound is outputted from the programmable display device 2 .
- the programmable display device 2 may contain an interface(s) alone instead of the microphone 50 and the speaker 60 . If the programmable display device 2 contains an interface(s) alone, the microphone 50 and the speaker 60 , which are external devices, are connected to the programmable display device 2 through the interface(s).
- the interface section 70 is a communication section through which the programmable display device 2 communicates with the PC 1 .
- the interface section 80 is a communication section through which the programmable display device 2 communicates with the PLC 3 .
- the interface section 90 is a communication section through which the programmable display device 2 communicates with the external device 5 .
- the PLC 3 is a control device which, in accordance with a sequential program prepared by a user, reads the status of the device 4 and provides a control instruction to the device 4 at predetermined scanning times.
- the device 4 may be controlled by the PLC 3 or may be configured to output detected values (e.g., a sensor). There are a plurality of such devices 4 .
- FIG. 2 is a flowchart showing steps of a process carried out by the programmable display device 2 of FIG. 1 .
- the storage control section 110 first causes the user memory 40 to store a search keyword(s) set for one or more control targets such as a screen(s), a variable(s), and/or a component(s), relating to information acquired by the programmable display device 2 from the PLC 3 .
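One possible shape, purely illustrative, for the per-control-target settings stored in the user memory 40 (the variable, name, required words, and preferred words described above; the field names and the variable "D100" are assumptions):

```python
# Purely illustrative shape for the settings information that the storage
# control section 110 writes to the user memory 40; field names and the
# example variable "D100" are assumptions.
from dataclasses import dataclass, field

@dataclass
class TargetSettings:
    variable: str                                  # associated with a memory address
    name: str                                      # human-readable control-target name
    required: list = field(default_factory=list)   # required words
    preferred: list = field(default_factory=list)  # preferred words (may be empty)

settings = {
    1: TargetSettings(variable="D100",
                      name="current fill level of material tank MA",
                      required=["tank"],
                      preferred=["remaining", "level", "quantity"]),
}
print(settings[1].name)  # -> current fill level of material tank MA
```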
- the control section 11 provides, to the interface section 13 , settings information set on the edit screen ED and/or the settings screen SS of the screen editing program.
- the interface section 13 transmits, to the interface section 70 , the settings information provided from the control section 11 .
- the interface section 70 provides, to the control section 10 , the settings information transmitted from the interface section 13 .
- the interface section 13 transmits, to the control section 10, a search keyword(s) set by the setting section 111 and an expression(s) to convert search keywords into numerical form.
- At least a portion of a process carried out by the control section 10 may be carried out by a server which is communicably connected to the programmable display device 2 .
- the storage control section 110 stores the settings information in the user memory 40 .
- the settings information contains the search keyword(s), and therefore this means that the storage control section 110 stores, in the user memory 40 , the search keyword(s) set for one or more control targets.
- the settings information may be copied from the PC 1 to the programmable display device 2 with use of a storage medium such as a memory card, e.g., a universal serial bus (USB) memory or a secure digital (SD) card.
- the storage control section 110 may store the settings information in the PLC 3 so that the settings information can be easily edited by a person who made a program of the PLC 3, and can be shared by, for example, a supervisory control and data acquisition (SCADA) system which is used as a host human machine interface (HMI), while the programmable display device 2 is in operation.
- the storage control section 110 may store the settings information in a memory included in the PLC 3 .
- the control section 31 provides, to the interface section 33 , content of the user operation on the touch panel 34 .
- the interface section 33 transmits the content of the operation to the interface section 90 , and the interface section 90 provides the content of the operation to the control section 10 .
- the edit section 190 causes content stored in the memory of the PLC 3 to reflect the received content of the operation, through the interface section 80 .
- a voice issued by the user is inputted through the microphone 50 .
- the microphone 50 converts the voice into audio data, and provides the audio data to the control section 10 .
- the control section 10 receives the audio data from the microphone 50 (S 1 ).
- the classifying section 130 may be configured to, with use of a neural network included in the control section 10 , classify the intent of speech based on the input data. Specifically, the classifying section 130 receives input data, extracts characteristics from the input data, and inputs the characteristics to the neural network. The neural network classifies the intent of speech based on the inputted characteristics. The classifying section 130 generates the intent of speech classified by the neural network.
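As a stand-in for the trained neural network (the real classifier, its characteristics, and its classes are not specified in the text), a minimal bag-of-words scorer can illustrate the flow from input text to intent class:

```python
# Bag-of-words stand-in for the neural network; the per-intent feature sets
# and the overlap scoring are illustrative assumptions, not the patented model.

INTENT_FEATURES = {
    "text_to_speech_of_variable": {"read", "tell", "value", "level"},
    "screen_switching": {"show", "switch", "screen", "open"},
    "component_search": {"find", "where", "component"},
}

def classify_intent(text):
    """Extract word features from the input text and pick the intent class
    whose feature set overlaps them the most."""
    words = set(text.lower().split())
    return max(INTENT_FEATURES,
               key=lambda intent: len(INTENT_FEATURES[intent] & words))

print(classify_intent("tell me the level of tank A"))
# -> text_to_speech_of_variable
```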
- the extracting section 140 extracts one or more main words from the input data (S 4 ). Note that the extracting section 140 may extract one or more main words obtained together with the intent of speech classified by the classifying section 130 , or may extract, from the entire text of the input data, only one or more main words that match a search keyword(s) set for one or more control targets.
- the control section 10 determines which of a plurality of intent classes programmed in the control section 10 the class of the intent classified by the classifying section 130 falls under (S 5 ). In step S 5 , for example, in a case where the class of the intent classified by the classifying section 130 is text-to-speech conversion of variable, the control section 10 carries out a process of text-to-speech conversion of variable (S 6 ).
- in a case where, in step S5, the class of the intent classified by the classifying section 130 is screen switching, the control section 10 carries out a screen switching process (S7).
- in a case where, in step S5, the class of the intent classified by the classifying section 130 is component search, value setting, or some other class, the control section 10 carries out a component search process (S8), a value setting process (S9), or some other process (S10), respectively.
- Such other process is different from any of the processes of S6 to S9.
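The branching in steps S5 to S10 can be sketched as a dispatch table; the handler names and returned strings below are illustrative placeholders for the actual processes:

```python
# Dispatch-table sketch of steps S5-S10; handler names and returned strings
# are illustrative placeholders, not the patented implementation.

def tts_of_variable(data):   return "S6: text-to-speech of variable"
def screen_switching(data):  return "S7: screen switching"
def component_search(data):  return "S8: component search"
def value_setting(data):     return "S9: value setting"
def other_process(data):     return "S10: other process"

DISPATCH = {
    "text_to_speech_of_variable": tts_of_variable,
    "screen_switching": screen_switching,
    "component_search": component_search,
    "value_setting": value_setting,
}

def handle(intent, data=None):
    """Step S5: look the classified intent up among the programmed intent
    classes and run the corresponding process; unknown classes fall through
    to the 'other' process (S10)."""
    return DISPATCH.get(intent, other_process)(data)

print(handle("screen_switching"))  # -> S7: screen switching
```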
- FIG. 5 is a flowchart showing a flow of the process of text-to-speech conversion of variable carried out by the control section 10 included in the programmable display device 2 of FIG. 1 .
- the user memory 40 stores therein settings information made on the settings screen SS illustrated in FIG. 4 .
- the identifying section 150 identifies a search keyword(s) matching any of the one or more main words extracted by the extracting section 140 (S 21 ).
- in step S22, for the name "current fill level of material tank MA", the expression (tank & (source material
- the executing section 160 reads the value of the found variable (S30). Specifically, the executing section 160 carries out a text-to-speech conversion process in which the executing section 160 generates an audio signal indicative of speech indicating the value of "level of tank A1", which is the control target selected by the selecting section 170, provides the audio signal to the speaker 60, and causes the speaker 60 to output the signal in the form of a sound (S31).
- the unit for each variable may be set in the settings information so that the unit is also converted to speech together with the value of the variable when the text-to-speech conversion process is carried out.
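A minimal sketch of steps S30 and S31 under these assumptions (the function name and wording are illustrative): the utterance is built from the variable's name and value, and the configured unit is appended when one is set in the settings information:

```python
# Illustrative sketch of steps S30-S31; build_utterance and its wording are
# assumptions. The resulting string would then be converted to an audio
# signal and played through the speaker 60.

def build_utterance(name, value, unit=None):
    """Compose the spoken phrase for a variable's value, with optional unit."""
    phrase = f"{name} is {value}"
    return f"{phrase} {unit}" if unit else phrase

print(build_utterance("level of tank A1", 72, unit="percent"))
# -> level of tank A1 is 72 percent
```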
- the selecting section 170 is configured such that: the search keyword(s) identified by the identifying section 150 is/are referred to; and thereby the selecting section 170 selects a control target(s) based on the numeric value(s) obtained through calculation of an expression(s) for batch conversion of one or more search keywords set for the control target(s) into numerical form.
- the selecting section 170 selects a control target(s) based on the following numeric values: the numeric value(s) obtained through calculation of an expression(s) for batch conversion of one or more required words set for the control target(s) into numerical form; and the numeric value(s) obtained through calculation of an expression(s) for batch conversion of one or more preferred words set for the control target(s) into numerical form.
- the user can also easily make settings for a search to select a control target(s), because the user only needs to set a search keyword(s) and an expression(s) for one or more control targets. Furthermore, the user can easily change the settings for a search to select a control target(s) so that, even if the user does not remember the correct name of a control target, the control target is more appropriately selected.
- the control section 10 is capable of carrying out search processes corresponding to various search keywords set by the user.
- the selecting section 170 selects a control target(s) in the following manner. Specifically, the selecting section 170 selects a control target(s) based on (i) a numeric value(s) (first numeric value(s)) obtained through calculation of an expression(s) for batch conversion of required words into numerical form and (ii) a numeric value(s) (second numeric value(s)) obtained through calculation of an expression(s) for batch conversion of preferred words into numerical form.
- the selecting section 170 selects a control target(s) based on the first numeric value(s) without using the second numeric value(s).
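The two-value selection described above can be sketched as follows (the names and the tie-breaking rule are assumptions): a candidate is excluded when its required-word value is 0, the preferred-word value then ranks the remaining candidates, and when no preferred words match, only the first value matters:

```python
# Illustrative two-stage selection; names and tie-breaking are assumptions,
# not the patented algorithm.

def select(targets, main_words):
    best, best_key = None, (0, -1)
    for name, (required, preferred) in targets.items():
        first = 1 if required <= main_words else 0   # all required words spoken?
        if first == 0:
            continue                                  # excluded outright
        second = len(preferred & main_words)          # preferred-word hits
        if (first, second) > best_key:
            best, best_key = name, (first, second)
    return best

targets = {
    "tank_MA_level": ({"tank"}, {"remaining", "level"}),
    "tank_MA_temp": ({"tank"}, {"temperature"}),
}
print(select(targets, {"tank", "level"}))  # -> tank_MA_level
```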
- a normal mode in which a graphics screen is displayed on the display section 20 or on the display section 32 of the external device 5 by the programmable display device 2 , the external device 5 , or the like, can transition to a debug mode.
- the control section 10 transitions from the normal mode to the debug mode in response to user operation.
- in the debug mode, the user can set whether or not an action that the user wants a control target to carry out is actually carried out.
- the course-of-process display control section 180 controls the display section 20 of the programmable display device 2 to display at least one of (i) the course of a process in which the identifying section 150 identifies a search keyword(s) matching a main word(s) and (ii) the course of a process in which the selecting section 170 selects a control target(s) based on a numeric value(s) obtained through the foregoing calculation.
- the course-of-process display control section 180 also controls the display section 20 to display, during the course of the process in which the control target(s) is/are selected, the expression(s) for conversion of search keywords into numerical form. Note that the course-of-process display control section 180 may control the display section 32 of the external device 5 so that the course of the process and the expression(s) are displayed on the display section 32 .
- the user can check the course of the process in which a search keyword(s) is/are identified and the course of the process in which a control target(s) is/are selected, on the display section 20 of the programmable display device 2 and/or on the display section 32 of the external device 5 . This allows the user to check whether settings are made so that intended actions will be carried out.
- control section 10 carries out a process of controlling the display section 32 of the external device 5 to carry out display
- the control section 10 provides the processed content to the interface section 90
- the interface section 90 transmits the processed content to the interface section 33 .
- the interface section 33 provides the processed content to the control section 31
- the control section 31 carries out a process of controlling the display section 32 to carry out display in accordance with the processed content.
- the edit section 190 edits the search keyword(s) and expression(s) displayed on the display section 20 and/or on the display section 32 of the external device 5 , in response to user operation. Specifically, upon the user operation to the touch panel 30 , the edit section 190 edits the search keyword(s) and expression(s) in accordance with the user operation.
- the user can edit the search keyword(s) and expression(s) by operating the external device 5 and thereby providing an instruction to the edit section 190 of the programmable display device 2.
- the user can download the search keyword(s) and expression(s) at a time from the programmable display device 2 and edit them on the touch panel 34 of the external device 5 , in order to improve the efficiency of editing work.
- the user uploads, to the programmable display device 2 , the search keyword(s) and expression(s) edited on the touch panel 34 .
- the edit section 190 overwrites the search keyword(s) and expression(s) stored in the user memory 40 with the edited search keyword(s) and expression(s). This configuration makes it possible for the user to edit a search keyword(s) and expression(s) displayed on the display section 20 or on the display section 32 of the external device 5 so that a more appropriate control target(s) is/are selected.
- the external device 5 includes an application program suitable for editing search keywords and expressions. This makes it possible to carry out editing more efficiently than editing on the programmable display device 2 .
- history of speeches converted into the form of text data, search results, and a list of ordered search results may be displayed on the display section 20 or on the display section 32 of the external device 5 , or may be stored in the user memory 40 or a memory (not illustrated) of the external device 5 .
- the following description will discuss step S 7 , i.e., the screen switching process, which is carried out by the control section 10 .
- the screen switching process (step S 7 ) is different from the process of text-to-speech conversion of variable (step S 6 ) in that the selecting section 170 selects a graphics screen as a control target instead of selecting a variable as a control target.
- the screen switching process (step S 7 ) is different from the process of text-to-speech conversion of variable (step S 6 ) also in that the display switching control section 200 carries out the screen switching process instead of the executing section 160 carrying out the process of text-to-speech conversion of the value(s) of a variable(s).
- the display switching control section 200 switches a graphics screen created by a user from the currently displayed graphics screen to a graphics screen selected by the selecting section 170 .
- the control section 10 may identify a to-be-displayed screen with use of the screen number or the name of the screen as a required word, without using expressions.
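As a hedged illustration of identifying a to-be-displayed screen by its screen number or name used as a required word, the following Python sketch may help; the screen names and numbers are invented for the example and are not from the patent:

```python
# Hypothetical table of user-created graphics screens (illustrative only).
screens = {
    1: "main menu",
    2: "temperature monitor",
    3: "alarm history",
}

def find_screen(required_word):
    """Return the number of the screen whose number or name matches
    the required word, or None if no screen matches (in which case
    the currently displayed screen would be kept)."""
    for number, name in screens.items():
        if required_word == str(number) or required_word == name:
            return number
    return None
```

For example, the spoken words "alarm history" would identify screen 3 as the to-be-displayed screen, without evaluating any expressions.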
- the following description will discuss step S 8 , i.e., the component search process, which is carried out by the control section 10 .
- the component search process (step S 8 ) is different from the process of text-to-speech conversion of variable (step S 6 ) in that the selecting section 170 selects a component (object OB) as a control target instead of selecting a variable as a control target.
- the component search process (step S 8 ) is different from the process of text-to-speech conversion of variable (step S 6 ) also in that the display switching control section 200 carries out the screen switching process instead of the executing section 160 carrying out the process of text-to-speech conversion of the value(s) of a variable(s).
- the display switching control section 200 switches a graphics screen created by a user from (i) a graphics screen in which a component(s) as a control target(s) selected by the selecting section 170 is/are not displayed to (ii) a graphics screen in which a component(s) as a control target(s) selected by the selecting section 170 is/are displayed.
- a graphics screen in which a selected control target(s) is/are not displayed is switched to a graphics screen in which a selected control target(s) is/are displayed. This makes it possible for the user to instantly check the selected control target(s).
- assume a case in which a graphics screen created by the user contains a component(s) as a control target(s) selected by the selecting section 170 and that graphics screen is higher in resolution than the display section 20 or the display section 32 of the external device 5 which displays the graphics screen.
- the previously displayed graphics screen has been switched by the display switching control section 200 to the graphics screen in which the component(s) selected by the selecting section 170 is/are displayed.
- the screen movement control section 210 may control the graphics screen to move to a location at which the component(s) as a control target(s) selected by the selecting section 170 is/are displayed on the display section 20 or on the display section 32 of the external device 5 .
- FIG. 6 schematically illustrates how the graphics screen moves.
- FIG. 6 shows display on the display section 20 as an example.
- assume that the graphics screen MP contains a component PA as a control target selected by the selecting section 170 and that the display section 20 displays a part of the graphics screen MP.
- the screen movement control section 210 controls the graphics screen MP to move to a location at which the component PA is displayed on the display section 20 .
- the screen movement control section 210 compares the coordinates of the component PA before the movement with the coordinates of a location (e.g., center) on the display section 20 on which the component PA is to be displayed, decides the amount by which the graphics screen MP is to move, and, based on the amount of movement, controls the graphics screen to move.
- the graphics screen MP is controlled to move so that the selected component PA is displayed on the display section 20 , and therefore the user can instantly check the selected component PA.
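The movement decision described above can be sketched as follows. This is a minimal illustration assuming simple pixel coordinates; the function names are assumptions, and the display center is used as the target location as in the example in the text:

```python
def move_offset(component_xy, target_xy):
    """Amount by which the graphics screen must move so that the
    component lands at the target location on the display section."""
    cx, cy = component_xy
    tx, ty = target_xy
    return (tx - cx, ty - cy)

def move_screen(screen_origin, component_xy, display_size):
    """New origin of the graphics screen after moving it so that the
    selected component is shown at the center of the display section."""
    center = (display_size[0] // 2, display_size[1] // 2)
    dx, dy = move_offset(component_xy, center)
    ox, oy = screen_origin
    return (ox + dx, oy + dy)
```

For instance, a component at (800, 600) on a screen shown through a 640x480 display section would require the screen to shift by (-480, -360) to center the component.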
- the emphasis control section 220 may control the display section 20 of the programmable display device 2 to display, in an emphasized manner, the component PA as a control target selected by the selecting section 170 .
- the emphasis control section 220 may control the display section 32 of the external device 5 to display, in an emphasized manner, the component PA selected by the selecting section 170 .
- Examples of a method of displaying a control target in an emphasized manner include: displaying a frame enclosing the control target; displaying a frame enclosing the control target in a blinking manner; displaying an arrow indicating the control target; displaying the control target in a different color; and displaying the control target in an enlarged manner.
- the configuration of the emphasis control section 220 makes it possible for the user to easily identify the selected control target from among one or more control targets. The configuration also makes it possible to prevent the user from recognizing some other control target falsely as the selected control target.
- the screen movement control section 210 and the emphasis control section 220 may carry out processes independently of each other or may carry out processes concurrently. For example, in a case where a graphics screen MP containing many control targets is moved by the screen movement control section 210 , even if a component PA selected by the selecting section 170 is displayed on the display section 20 , the selected component PA is difficult to distinguish from other components. To address this, the component PA selected by the selecting section 170 is displayed in an emphasized manner by the emphasis control section 220 . This makes it possible for the user to easily distinguish the component PA from the other components.
- the screen display control section 230 carries out at least one of (i) a screen moving process carried out by the screen movement control section 210 and (ii) an emphasizing process carried out by the emphasis control section 220 .
- the following description will discuss step S 9 , i.e., the value setting process, which is carried out by the control section 10 .
- the value setting process (step S 9 ) is different from the process of text-to-speech conversion of variable (step S 6 ) in that the selecting section 170 selects a control target(s) for which a value(s) is/are to be set, instead of selecting a variable(s) as a control target(s).
- the value setting process (step S 9 ) is different from the process of text-to-speech conversion of variable (step S 6 ) also in that the executing section 160 carries out the process of setting a value(s) as a control target(s) instead of carrying out the process of text-to-speech conversion of the value(s) of a variable(s).
- a confirming operation is carried out to prevent wrong operation when, for example, the value setting process (step S 9 ) is carried out or control of a control target is carried out.
- the following arrangement may be employed: after a control target(s) is/are selected by the selecting section 170 , a user can carry out the confirming operation through an input means other than voice input. Examples of such a means other than voice input include: a “Confirm” button on the touch panel 30 ; and a “Confirm” switch provided external to the programmable display device 2 .
- FIG. 7 shows search conditions stored in a user memory 40 of a programmable display device 2 in accordance with Embodiment 2 of the present invention.
- the executing section 160 determines whether or not the identifying section 150 has identified a search keyword(s) matching any of the main words extracted by the extracting section 140 . Assume a case in which the executing section 160 has determined that the identifying section 150 has identified a search keyword matching one of the main words (YES). In this case, the word “cooler” is extracted as a main word by the extracting section 140 , and the logical value of the “cooler” as a required word is a first logical value (1) indicating that the identifying section 150 has determined that the required word matches one of the main words. Therefore, the executing section 160 narrows down the list of a plurality of candidate variables to “set temperature of cooler” and “current temperature of cooler”, and the process proceeds to step S 25 .
- step S 28 the executing section 160 selects an expression(s) associated with the required word(s) matching any of the main words, and carries out calculation to convert preferred words into numerical form using the selected expression(s).
- the selecting section 170 selects, as a control target, the variable "current temperature of cooler", whose corresponding numeric value is the greatest.
- the selecting section 170 selects a control target(s) based on the identification of a required word(s) and on the numeric value(s) obtained through calculation of an expression(s) for conversion of preferred words into numerical form.
- the selecting section 170 may select a control target(s) based on the numeric value(s) obtained through calculation of an expression(s) for conversion of required words into numerical form and on the identification of a preferred word(s).
- identification of a preferred word(s) means, similarly to the identification of a required word(s) using the logical value(s) of the required word(s) described earlier, determining whether the logical value(s) of the preferred word(s) is/are the first logical value or the second logical value.
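The two-stage selection described above can be sketched in Python. The candidate names follow the "cooler" example from Embodiment 2, but the preferred words and their weights are illustrative assumptions, not the patent's actual expressions:

```python
# Required words narrow the candidate variables (logical value 1 if the
# word matches a main word); an expression then batch-converts preferred
# words into numerical form, and the greatest numeric value wins.
candidates = {
    "set temperature of cooler": {
        "required": ["cooler"],
        "preferred": {"set": 2, "setting": 1},   # assumed weights
    },
    "current temperature of cooler": {
        "required": ["cooler"],
        "preferred": {"current": 2, "now": 1},   # assumed weights
    },
}

def select_target(main_words):
    """Return the candidate whose required words all match and whose
    preferred-word score is greatest, or None if no candidate remains."""
    words = set(main_words)
    best_name, best_score = None, -1
    for name, cond in candidates.items():
        # Required-word identification: every required word must match.
        if not all(w in words for w in cond["required"]):
            continue
        # Expression converting preferred-word matches into a numeric value.
        score = sum(wt for w, wt in cond["preferred"].items() if w in words)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

With main words such as ["current", "temperature", "cooler"], both candidates survive the required-word check, and the preferred word "current" gives "current temperature of cooler" the greater numeric value.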
- Control blocks of the programmable display device 2 can be realized by a logic circuit (hardware) provided in an integrated circuit (IC chip) or the like or can be alternatively realized by software.
- the control section 10 includes a computer that executes instructions of a program that is software realizing the foregoing functions.
- the computer, for example, includes at least one processor and at least one computer-readable storage medium storing the program.
- An object of the present invention can be achieved by the processor of the computer reading and executing the program stored in the storage medium.
- examples of the processor encompass a central processing unit (CPU).
- examples of the storage medium encompass a "non-transitory tangible medium" such as a read only memory (ROM), a tape, a disk, a card, a semiconductor memory, and a programmable logic circuit.
- the computer may further include a random access memory (RAM) or the like in which the program is loaded.
- the program may be supplied to or made available to the computer via any transmission medium (such as a communication network and a broadcast wave) which allows the program to be transmitted.
- an aspect of the present invention can also be achieved in the form of a computer data signal in which the program is embodied via electronic transmission and which is embedded in a carrier wave.
- the present invention is not limited to the foregoing embodiments, but can be altered by a person skilled in the art within the scope of the claims.
- the present invention also encompasses, in its technical scope, any embodiment derived by combining technical means disclosed in differing embodiments.
- An information processing device in accordance with an aspect of the present invention includes: an identifying section configured to, by referring to one or more search keywords set for one or more control targets, identify at least one search keyword from among the one or more search keywords, the at least one search keyword matching any of one or more main words contained in input data acquired through voice input; and a selecting section configured such that: the at least one search keyword identified by the identifying section is referred to; and thereby the selecting section selects at least one control target from among the one or more control targets based on one or more numeric values obtained through calculation of one or more expressions each of which batch-converts, into numerical form, one or more of the one or more search keywords set for the one or more control targets.
- a control target(s) is/are selected based on the numeric value(s) obtained by conversion of a search keyword(s) into numerical form; therefore, it is possible to improve the accuracy of selection of a control target(s).
- the information processing device may be arranged such that: the identifying section is configured to, in order to identify the at least one search keyword matching any of the one or more main words, refer to the one or more search keywords set by a user; and the selecting section is configured to select the at least one control target based on the one or more numeric values obtained through calculation of the one or more expressions set by the user.
- the search keyword(s) and expression(s) can be set by the user; therefore, the user can make settings for searching using search keywords and expressions so that a control target(s) is/are appropriately selected.
- the user only needs to set a required word(s) and/or a preferred word(s) for one or more control targets and to set an expression(s).
- This enables the user to easily make settings for searching using a required word(s) and/or a preferred word(s) and freely change the required word(s), preferred word(s), and expression(s) as the user wishes.
- the information processing device is capable of carrying out searching processes corresponding to various search keywords with use of a combination of a required word(s) and a preferred word(s).
- the information processing device may further include a course-of-process display control section configured to: control a display section to display at least one of (i) course of a process in which the identifying section identifies the at least one search keyword matching any of the one or more main words and (ii) course of a process in which the selecting section selects the at least one control target based on the one or more numeric values obtained through calculation of the one or more expressions; and control the display section to display the one or more expressions during the course of the process in which the selecting section selects the at least one control target.
- the user can check the course of the process in which a search keyword(s) is/are identified and the course of the process in which a control target(s) is/are selected, on the display section. This allows the user to check whether settings are made so that intended actions will be carried out.
- the information processing device may further include an edit section configured to edit, in accordance with user operation, the one or more search keywords and the one or more expressions which are displayed on the display section.
- This configuration makes it possible for the user to edit a search keyword(s) and expression(s) displayed on the display section so that a more appropriate control target(s) is/are selected.
- the information processing device may further include a screen display control section configured to carry out at least one of: an emphasizing process including controlling a display section to display, in an emphasized manner, the at least one control target selected by the selecting section; and a screen moving process including, in a case where a graphics screen created by a user contains the at least one control target selected by the selecting section and where the graphics screen is higher in resolution than the display section which displays the graphics screen, moving the graphics screen to a location at which the at least one control target selected by the selecting section is displayed on the display section.
- the configuration makes it possible for the user to easily check the selected control target(s).
- the configuration also makes it possible to prevent the user from recognizing some other control target falsely as the selected control target. Furthermore, since the graphics screen is controlled to move so that the selected control target(s) is/are displayed on the display section, the user can instantly check the selected control target(s).
- the information processing device may further include a display switching control section configured to switch a graphics screen created by a user from a first screen to a second screen, the first screen being a screen in which the at least one control target selected by the selecting section is not displayed, the second screen being a screen in which the at least one control target selected by the selecting section is displayed.
- the user can easily set, through the setting device, a search keyword(s) and an expression(s) for one or more targets. This makes it possible for the user to easily achieve an improvement in accuracy of selection of a control target(s).
- the user can pre-check, through the setting device, whether or not a process to select a control target(s) is carried out properly on the information processing device. Furthermore, since the identifying process and selecting process are displayed in a simulation manner on the setting device before the search keyword(s) and expression(s) are transmitted to the information processing device, the user can easily check whether or not a process to select a control target(s) is carried out properly.
Description
- 1 PC (setting device)
- 2 programmable display device (information processing device)
- 10 control section
- 12 display
- 13 interface section (communication section)
- 20 display section
- 32 display section
- 111 setting section
- 112 simulation section
- 150 identifying section
- 170 selecting section
- 180 course-of-process display control section
- 190 edit section
- 200 display switching control section
- 230 screen display control section
Claims (10)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019-209871 | 2019-11-20 | | |
| JP2019209871A (JP7335794B2) | 2019-11-20 | 2019-11-20 | Information processing device and setting device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210149938A1 | 2021-05-20 |
| US11847148B2 | 2023-12-19 |
Family
ID=75907673
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/006,082 (US11847148B2, active, expires 2041-01-20) | Information processing device and setting device | 2019-11-20 | 2020-08-28 |
Country Status (3)
| Country | Link |
|---|---|
| US | US11847148B2 |
| JP | JP7335794B2 |
| CN | CN112825077A |
Patent Citations (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7769344B1 (en) * | 1981-11-03 | 2010-08-03 | Personalized Media Communications, Llc | Signal processing apparatus and methods |
JPH08212242A (en) | 1995-02-03 | 1996-08-20 | Fujitsu Ltd | Screen generation system |
US6697879B1 (en) * | 1996-09-06 | 2004-02-24 | J. Bryan Blundell | Computer implemented virtual sensor object and tangible medium utilizing same |
JP4221838B2 (en) * | 1998-09-30 | 2009-02-12 | 株式会社セガ | Game device, hit determination method, and information recording medium |
US20040153992A1 (en) * | 2000-04-04 | 2004-08-05 | Pedro Juan Molina-Moreno | Method and apparatus for automatic generation of information system user interfaces |
US20020174147A1 (en) * | 2000-05-19 | 2002-11-21 | Zhi Wang | System and method for transcoding information for an audio or limited display user interface |
US20080104571A1 (en) * | 2001-02-15 | 2008-05-01 | Denny Jaeger | Graphical object programming methods using graphical directional indicators |
US8589271B2 (en) * | 2002-02-04 | 2013-11-19 | Alexander William EVANS | System and method for verification, authentication, and notification of transactions |
US20150135206A1 (en) * | 2002-05-10 | 2015-05-14 | Convergent Media Solutions Llc | Method and apparatus for browsing using alternative linkbases |
JP2004038867A (en) * | 2002-07-08 | 2004-02-05 | Sharp Corp | Adaption information outputting method and device, adaption information outputting program, storage medium storing it |
US7234094B2 (en) * | 2003-09-30 | 2007-06-19 | Sudhir Dattaram Kadkade | Automaton synchronization during system verification |
US20120166372A1 (en) * | 2005-03-30 | 2012-06-28 | Primal Fusion Inc. | Systems and methods for applying statistical inference techniques to knowledge representations |
US20120166373A1 (en) * | 2005-03-30 | 2012-06-28 | Primal Fusion Inc. | Knowledge representation systems and methods incorporating inference rules |
- US20070072299A1 (en) * | 2005-09-12 | 2007-03-29 | Toshihide Orihashi | Automatic analyzer |
US20070239837A1 (en) * | 2006-04-05 | 2007-10-11 | Yap, Inc. | Hosted voice recognition system for wireless devices |
US20150363478A1 (en) * | 2008-07-11 | 2015-12-17 | Michael N. Haynes | Systems, Devices, and/or Methods for Managing Data |
US20110182283A1 (en) * | 2010-01-27 | 2011-07-28 | Terry Lynn Van Buren | Web-based, hosted, self-service outbound contact center utilizing speaker-independent interactive voice response and including enhanced IP telephony |
US20110302551A1 (en) * | 2010-06-02 | 2011-12-08 | Hummel Jr David Martin | System and method for analytic process design |
US9190054B1 (en) * | 2012-03-31 | 2015-11-17 | Google Inc. | Natural language refinement of voice and text entry |
US20150142448A1 (en) | 2012-06-19 | 2015-05-21 | Ntt Docomo, Inc. | Function execution instruction system, function execution instruction method, and function execution instruction program |
JP2014002586A (en) | 2012-06-19 | 2014-01-09 | Ntt Docomo Inc | Function execution instruction system, function execution instruction method, and function execution instruction program |
US20140025660A1 (en) * | 2012-07-20 | 2014-01-23 | Intertrust Technologies Corporation | Information Targeting Systems and Methods |
CN103892801A (en) * | 2012-12-26 | 2014-07-02 | 飞比特公司 | Biometric monitoring device with wrist-motion triggered display |
US20140180798A1 (en) * | 2012-12-26 | 2014-06-26 | Richrelevance, Inc. | Contextual selection and display of information |
US20160048274A1 (en) * | 2014-03-26 | 2016-02-18 | Unanimous A.I., Inc. | Multi-phase multi-group selection methods for real-time collaborative intelligence systems |
JP2016019070A (en) | 2014-07-07 | 2016-02-01 | キヤノン株式会社 | Information processing device, display control method, computer program, and recording medium |
US20160006854A1 (en) | 2014-07-07 | 2016-01-07 | Canon Kabushiki Kaisha | Information processing apparatus, display control method and recording medium |
US20160133246A1 (en) * | 2014-11-10 | 2016-05-12 | Yamaha Corporation | Voice synthesis device, voice synthesis method, and recording medium having a voice synthesis program recorded thereon |
US20170068986A1 (en) * | 2015-09-03 | 2017-03-09 | Duolingo, Inc. | Interactive sponsored exercises |
US10089965B1 (en) * | 2015-09-15 | 2018-10-02 | Simbulus, lnc. | User-controlled movement of graphical objects |
US20170230472A1 (en) * | 2016-02-09 | 2017-08-10 | Takashi Hasegawa | Server apparatus and transmission system |
US20170315849A1 (en) * | 2016-04-29 | 2017-11-02 | Microsoft Technology Licensing, Llc | Application target event synthesis |
US20170328733A1 (en) * | 2016-05-11 | 2017-11-16 | Yuuto GOTOH | Apparatus, system, and method of information sharing, and recording medium |
US9558265B1 (en) * | 2016-05-12 | 2017-01-31 | Quid, Inc. | Facilitating targeted analysis via graph generation based on an influencing parameter |
US9710544B1 (en) * | 2016-05-19 | 2017-07-18 | Quid, Inc. | Pivoting from a graph of semantic similarity of documents to a derivative graph of relationships between entities mentioned in the documents |
JP2017215671A (en) | 2016-05-30 | 2017-12-07 | 株式会社トラス | Building material selection device and program |
US20180222056A1 (en) * | 2017-02-09 | 2018-08-09 | Canon Kabushiki Kaisha | Method of teaching robot and robot system |
JP2018180030A (en) | 2017-04-03 | 2018-11-15 | 三菱電機株式会社 | Operation device and operation method |
US20200107891A1 (en) * | 2018-10-06 | 2020-04-09 | Sysmex Corporation | Method of remotely supporting surgery assistant robot and remote support system |
US20200118548A1 (en) * | 2018-10-15 | 2020-04-16 | Midea Group Co., Ltd. | System and method for customizing portable natural language processing interface for appliances |
US10779085B1 (en) * | 2019-05-31 | 2020-09-15 | Apple Inc. | User interfaces for managing controllable external devices |
Non-Patent Citations (1)
Title |
---|
Office Action issued for JP Application No. 2019-209871 dated Mar. 22, 2023, with English translation, 5 pages. |
Also Published As
Publication number | Publication date |
---|---|
US20210149938A1 (en) | 2021-05-20 |
CN112825077A (en) | 2021-05-21 |
JP7335794B2 (en) | 2023-08-30 |
JP2021081622A (en) | 2021-05-27 |
Similar Documents
Publication | Title |
---|---|
TWI510965B (en) | Input method editor integration |
CN105931644A (en) | Voice recognition method and mobile terminal |
JPS6091450A (en) | Table type language interpreter |
US20160125037A1 (en) | Information processing apparatus, information processing method, information processing program, and storage medium |
CN114116441A (en) | UI (user interface) testing method and device, electronic equipment and storage medium |
US11847148B2 (en) | Information processing device and setting device |
CN115964115B (en) | Numerical control machine tool interaction method based on pre-training reinforcement learning and related equipment |
CN105867645A (en) | Code input method for digital control system and code format arrangement method |
US12032937B2 (en) | Programming support program for preventing work efficiency from lowering as a result of conversion |
KR101565499B1 (en) | Data processing apparatus, data processing program, recording medium |
JP2005173999A (en) | Device, system and method for searching electronic file, program, and recording media |
CN115455961A (en) | Text processing method, device, equipment and medium |
US20160292237A1 (en) | Numerical controller with ambiguous search function in program |
US20190317664A1 (en) | Display control apparatus, non-transitory recording medium and display controlling method |
CN112654940A (en) | Program generation device, control method for program generation device, control program, and recording medium |
KR20210029463A (en) | Apparatus and method for automatically creating a list of materials on drawings for national defense development item |
JP7357030B2 (en) | Communication terminal, program, and display method |
CN112596475B (en) | System safety analysis system based on process control |
JP7541172B1 (en) | Information generating device, information generating method, and program |
JP6364786B2 (en) | Design document management program, design document management method, and design document management apparatus |
JP7474295B2 (en) | Information processing system, information processing method, and program |
US10650105B2 (en) | Method and system for automatically translating process instructions |
US20230325205A1 (en) | System and computer-implemented method to generate a configuration for external datapoint access |
JP2010157166A (en) | Device, system and method for lot tracing, and program |
CN117311668A (en) | Method, device, equipment and storage medium for generating technical requirements of intelligent instrument |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: SCHNEIDER ELECTRIC JAPAN HOLDINGS LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TERADA, TORU;REEL/FRAME:053630/0598. Effective date: 20200708 |
FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant | PATENTED CASE |