WO2000019307A1 - Procede et dispositif d'interaction de traitement (Dialogue processing method and device) - Google Patents


Publication number
WO2000019307A1
WO2000019307A1 PCT/JP1998/004295
Authority
WO
WIPO (PCT)
Prior art keywords
output
information
data processing
input
output information
Prior art date
Application number
PCT/JP1998/004295
Other languages
English (en)
Japanese (ja)
Inventor
Yasuharu Nanba
Tomohiro Murata
Hirokazu Aoshima
Original Assignee
Hitachi, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd. filed Critical Hitachi, Ltd.
Priority to PCT/JP1998/004295 priority Critical patent/WO2000019307A1/fr
Publication of WO2000019307A1 publication Critical patent/WO2000019307A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display

Definitions

  • The present invention relates to a technology for the user interface of a personal computer, portable information device, or the like. More specifically, the present invention uses a microphone, camera, pen, mouse, keyboard, and the like as input devices, and controls computers and application software on behalf of the user based on instructions given through multiple input modalities such as voice, handwritten and printed characters, images, and pen gestures. It uses a display, speaker, robot arm, carrier, and the like as output devices, and responds through multiple output modalities such as synthesized voice, text, still images, moving images, and sound effects. The present invention relates to such a dialogue processing method and dialogue processing device. Background art
  • An object of the present invention is to provide a plurality of data processing means so that an output result requiring only simple calculation is returned to the user in a short time, while an output result requiring complicated calculation is returned after a correspondingly longer time. In this way, output results commensurate with their processing times can be obtained from a single data/instruction input by the user, increasing the efficiency of the interaction. It is an object of the present invention to provide a dialogue processing method and dialogue processing apparatus that let the user easily select among these output results in real time.
  • Another object of the present invention is to sequentially report intermediate results to the user, thereby shortening the no-response time and reducing the anxiety felt by the user, making the behavior of the dialogue processing itself understandable, and allowing the user to learn an efficient data input method without conscious effort. It is to provide a dialogue processing method and dialogue processing device that achieve this.
  • A plurality of data processing means capable of responding to a single piece of data are prepared and operated in parallel. Each time output information from one of these data processes is obtained, a response is output to the user. A quick response is therefore returned from data processing means consisting only of sub-means that require little work, while sub-means that perform complicated calculations may respond after a longer time. Since each response is returned in a time commensurate with its content, the no-response time is minimized; the user's feeling of irritation and inconvenience is therefore reduced, and problem (1) is solved.
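The parallel-response idea above can be sketched in Python (a minimal illustration, not part of the patent; all function names are invented): several data processing means run concurrently on the same input, and each one's response is emitted as soon as it is ready, so the cheap echo arrives before the slower recognition result.

```python
import threading
import time

def echo(data, emit):
    emit("echo", data)                       # Example-1-style response: immediate

def recognize(data, emit):
    time.sleep(0.05)                         # stands in for slower recognition work
    emit("recognition", data.upper())

def run_in_parallel(data, processors):
    """Run each data processing means in its own thread; collect responses
    in the order they become available."""
    results = []
    lock = threading.Lock()
    def emit(kind, payload):
        with lock:
            results.append((kind, payload))  # respond as soon as ready
    threads = [threading.Thread(target=p, args=(data, emit)) for p in processors]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

print(run_in_parallel("send mail", [echo, recognize]))
```

In a real system each processor would keep emitting and the user interface would render responses incrementally rather than joining all threads first.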
  • FIG. 1 is a diagram showing Embodiment 1 of the present invention
  • FIG. 2 is a diagram showing Embodiment 2 of the present invention
  • FIG. 3 is a diagram showing Embodiment 3 of the present invention
  • FIG. 4 is a diagram showing Embodiment 4 of the present invention.
  • FIG. 1 shows an embodiment of the present invention.
  • It comprises a plurality of input means (101 to 103), a plurality of data processing means (104 to 106) that process input information from the plurality of input means (101 to 103), and a plurality of output means (107 to 109) for outputting output information from the plurality of data processing means (104 to 106).
  • Each of these means (101 to 109) is realized by combining hardware and software in a computer system. Some means may be realized by sharing the resources of one computer system.
  • Each of these means (101 to 109) may be provided with a processor, storage means for temporarily storing data to be processed by the processor (such as a memory or an external storage device), and communication means for transferring data to and from other means.
  • Since the processors, storage means, and communication means may use existing technologies found in ordinary computer systems, their details will not be described here. Incidentally, to make effective use of the present invention, it is desirable that each of these means (101 to 109) be processed independently, in parallel or concurrently. Techniques for processing multiple means in parallel or concurrently are known as parallel processing and distributed processing; since such techniques may be used, their details will not be described here.
  • The input means (101 to 103) in the present embodiment include, in addition to the input means of a general computer system (keyboard, mouse, etc.), means for inputting the user's voice, such as a microphone or handset; means for inputting the user's movements, such as a touch panel or data glove; and means for inputting what the user sees, whether information (e.g., printed matter) or the user himself (e.g., facial expressions, gestures), such as an image scanner or video camera.
  • However, the effects of the present invention are not limited to the input means enumerated here. For example, even if a sensor for detecting the user's body temperature, pulse, brain waves, or the like is used as an input means, those skilled in the art will readily understand from this specification that the present invention can be implemented.
  • The data processing means (104 to 106) in this embodiment receive the input information accepted by the input means (101 to 103). What is desired here is that the input information received by a given input means 101 is normally delivered to at least two data processing means. The present invention can be implemented even when it is delivered to only one data processing means, but this is not desirable in terms of the intended effects. Each data processing means then processes independently, in parallel or concurrently. Examples of the data processing means (104 to 106) are described below. Data may be passed between the sub-means that make up each data processing means via storage means or communication means of the computer system.
  • (Example 1 of data processing means) Input information from input means having an input device such as a microphone, camera, pen, or keyboard is stored in a reception buffer, and the input information is passed through a send buffer as output information to output means having an output device such as a speaker or display. For example, audio input from a microphone is output from a speaker; image data input from a camera is displayed on the display; stroke data entered with a pen is displayed in ink format as a trajectory (handwriting).
  • (Example 2 of data processing means) A recognition process is performed on input information received in the same manner as in Example 1 of the data processing means, and the recognition result is output in the same manner as in Example 1.
  • For speech, a speech recognition result is output.
  • For image data, an image recognition result is output; in particular, an OCR character recognition result is output for character images captured in the image data.
  • For stroke data, a handwritten character recognition result is output.
  • These recognition results are first converted into codes (character codes, etc.) and can then be output in another modality.
  • For example, voice input is displayed as characters, or speech is synthesized and output from image input.
  • The term "modality" in the present application means "the type of exchange channel used by a person and a computer for communication". Specifically, for input: voice, handwriting strokes, images, keyboard typing, gestures, and so on; for output: synthesized voice, character display, graphic display, responses through application program operation, and so on.
  • (Example 3 of data processing means) A semantic analysis process is performed on the recognition result obtained in the same manner as in Example 2 of the data processing means, and a command is generated based on the semantic analysis result.
  • Based on this, the output devices and application programs are operated and a response is made in the same manner as in Example 2 of the data processing means. For example, if you say "Send mail" by voice, a message "Starting mail system" appears in a window, and the mail system, which is an application system, starts.
  • (Example 4 of data processing means) An intention analysis is performed on the semantic analysis result obtained in the same manner as in Example 3 of the data processing means, and a response strategy is determined based on the intention analysis result. Then, based on the determined response strategy, output is performed in the same manner as in Example 3 of the data processing means. For example, if the character "Return" is input by hand and at the same time the voice input "How do I do this?" is given, the response strategy becomes "Describe the method of returning", and an animated icon is displayed together with a help message output in synthesized speech synchronized with it.
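The four example depths of data processing can be sketched as plain Python functions (hypothetical names and toy logic; the patent does not prescribe an implementation): each level reuses the previous one's result, which is why shallower levels naturally answer sooner.

```python
def level1_echo(text):
    return text                                   # Example 1: echo the input back as-is

def level2_recognize(text):
    return text.strip().lower()                   # Example 2: "recognition" into a code string

def level3_command(text):
    recognized = level2_recognize(text)           # Example 3: semantic analysis -> command
    if "mail" in recognized:
        return "start_mail_system"
    return "noop"

def level4_strategy(text):
    command = level3_command(text)                # Example 4: intention analysis -> strategy
    if command == "start_mail_system":
        return "explain: starting the mail system"
    return "ask: what would you like to do?"

print(level4_strategy("Send mail"))
```

Run in parallel, level 1 would respond almost instantly while level 4 responds last, matching the behavior described above.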
  • The output means (107 to 109) in the present embodiment include, for example, a speech synthesizer, a robot arm, and the like in addition to the output means of a normal computer system (display, speaker, etc.).
  • When displaying on a display, there are various formats such as text, still images, moving images, icons, and combinations thereof.
  • When using a speaker for output, there are likewise modes such as BGM, sound effects, voice, and combinations thereof. In the present invention, these may be divided among different output means, or several of them may be combined and implemented as one output means. Further, the effects of the present invention are not limited to the output means enumerated here.
  • A response may also be output indirectly by controlling an application system or issuing a command to a database or database system. In this sense, those skilled in the art will readily understand from this specification that the present invention can be implemented even if a television or air conditioner is used as an output means.
  • The same effect as described above can be obtained when a data processing means performs processing for outputting intermediate results of its sub-means.
  • Various interpretation processes are performed on the computer system side. By notifying the user of the results of these processes one after another, the no-response time can be shortened to reduce the anxiety felt by the user, and the behavior of the dialogue processing itself can be understood. Further, by receiving such responses, the user can learn which input means can be used to obtain the desired computer system response.
  • each means can be implemented by an independent program (or device).
  • Each means can be individually distributed as a computer system.
  • Such implementations can usually directly use techniques such as distributed processing and agent-oriented programming. These techniques are well known to those skilled in the art and will not be described further.
  • FIG. 2 shows another embodiment of the present invention.
  • This embodiment comprises a plurality of input means (101 to 103), input information management means (201) for receiving input information from the plurality of input means (101 to 103), a plurality of data processing means (104 to 106) for processing the input information from the input information management means (201), and output information management means (202) for receiving output information from the plurality of data processing means (104 to 106) and delivering it to a plurality of output means (107 to 109).
  • The main difference from Embodiment 1 is that the input information management means (201) and the output information management means (202) are provided. The other means are the same as those described in the first embodiment and will not be described again here.
  • Each of these means may be provided with a processor, storage means for temporarily storing data to be processed by the processor (such as a memory or an external storage device), and communication means for transferring data to and from other means.
  • the input information management means (201) in this embodiment receives all input information from the plurality of input means (101 to 103).
  • The input information is passed to the appropriate (one or more) data processing means based on classification information determined in advance for each input means, or classification information derived from the input information itself.
  • The classification information determined in advance for each input means specifies the data processing means to which information from that input means should be delivered.
  • For example, for an input means such as a microphone, data processing means that include speech recognition or speaker recognition are associated in advance from among the data processing means (104 to 106), and it is determined in advance that there is no correspondence with data processing means that do not include these (for example, a data processing means comprising only character recognition means).
  • The classification information that can be derived from the input information specifies, for particular input information, the data processing means to which it should be delivered. For example, input information with an unambiguous intention, such as an emergency stop, is determined in advance to be delivered preferentially to a specific one of the data processing means (104 to 106).
  • In this way, the input information is delivered to the appropriate data processing means for each input means. That is, malfunction of the data processing means can be avoided, and the load of error check processing and error processing can be reduced.
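The routing performed by the input information management means can be sketched as a small lookup table (a hypothetical illustration; the table contents and names are invented): predetermined classification information maps each input means to its data processing means, while classification information derived from the input itself can override that mapping.

```python
# Predetermined classification: which data processing means each input means feeds.
ROUTES = {
    "microphone": ["speech_recognizer", "speaker_recognizer"],
    "pen":        ["character_recognizer"],
}

def route(input_means, info):
    """Return the data processing means that should receive this input."""
    if info == "emergency stop":            # classification derived from the input itself
        return ["emergency_handler"]        # preferential delivery overrides the table
    return ROUTES.get(input_means, [])      # classification predetermined per input means

print(route("microphone", "send mail"))
```

A real input information management means would also carry timestamps and source metadata with each piece of input information.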
  • the output information management means (202) in the present embodiment receives all the output information from the plurality of data processing means (104 to 106).
  • The output information is passed to the appropriate output means based on classification information determined in advance for each output means, or classification information derived from the output information itself.
  • The classification information determined in advance for each output means specifies the data processing means whose output should be delivered to that output means. For example, in the case of an output means such as a speaker, data processing means that include voice synthesis or sound effect synthesis are associated in advance from among the data processing means (104 to 106), and it is determined in advance that there is no correspondence with data processing means that do not include these (for example, a data processing means consisting only of moving image generation means).
  • The classification information that can be derived from the output information specifies, for particular output information, the output means to which it should be delivered. For example, output information with an unambiguous control content, such as a reduction in volume, is determined in advance to be delivered preferentially to a specific one of the output means (107 to 109).
  • In this way, the output information is delivered to the appropriate output means. That is, malfunction of the output means can be avoided, and the load of error check processing and error processing can be reduced.
  • (Example 1 of data processing means) A process of playing back the input voice as it is. (Example 2 of data processing means) A process of recognizing the voice and responding with the result of its conversion into the character string "mail".
  • (Example 3 of data processing means) A process that interprets the content of the voice as a command to the application system and responds, for example, by activating the mail system.
  • (Example 4 of data processing means) A process that interprets the intention of the voice and responds with a help suggestion such as "Who will you send it to?"
  • With such a plurality of data processing means, the output information management means (202) can sequentially transfer the data processing results, as they arrive, to the appropriate output means.
  • While the output information management means (202) is sequentially transferring data processing results to the output means, it may interrupt a transfer based on priority information determined in advance for each data processing means. For example, assume that priorities are determined in ascending order from (Example 1 of data processing means) to (Example 4 of data processing means).
  • If the data processing result of (Example 4 of the data processing means) arrives while a lower-priority result is being delivered, the delivery of that result to the output means is interrupted, and the delivery of the data processing result of (Example 4 of the data processing means) is started.
  • Conversely, a result whose priority is lower than that of the data processing result of (Example 4 of data processing means) does not interrupt it. By determining a priority for each data processing means in this way, data currently being output can be interrupted so that a more appropriate data processing result obtained later is output, even though the earlier data arrived first in time. In the above example, for simplicity, the priority was determined in advance for each data processing means, but the priority information may be changed depending on the work situation, a designation from the user, the content of the input data, and so on.
  • If predetermined confirmation instruction information is input (for example, the "OK" button is pressed) while the output information management means (202) is sequentially delivering data processing results to the output means, the output information management means (202) may interrupt the delivery of subsequent data processing results. At this time, information accompanying the confirmation instruction, such as the time of occurrence, the place of occurrence, and the person who issued it, may be used to determine whether or not to actually interrupt. In this way, the output information management means (202) can be prevented from inadvertently continuing to deliver data processing results to the output means indefinitely.
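The priority-based interruption described above can be sketched as follows (an illustrative assumption consistent with the text: Example 1 has the lowest priority and Example 4 the highest; class and variable names are invented): a result already being delivered is replaced only when a later result comes from a higher-priority data processing means.

```python
# Assumed priority numbering: higher value = higher priority.
PRIORITY = {"example1": 1, "example2": 2, "example3": 3, "example4": 4}

class OutputManager:
    """Toy stand-in for the output information management means (202)."""
    def __init__(self):
        self.current = None                      # (source, result) now being delivered

    def offer(self, source, result):
        """Accept a newly arrived data processing result; interrupt the
        current delivery only if the newcomer has higher priority."""
        if self.current is None or PRIORITY[source] > PRIORITY[self.current[0]]:
            self.current = (source, result)      # interrupt and replace
            return True
        return False                             # lower priority: do not interrupt

m = OutputManager()
m.offer("example1", "echo")
m.offer("example4", "help message")
m.offer("example2", "recognition result")        # arrives later but lower priority
print(m.current)
```

A confirmation instruction ("OK") would simply freeze `current` and stop further `offer` calls from taking effect.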
  • The output information management means (202) includes storage means (a memory or an external storage device) for storing the output information from the plurality of data processing means (104 to 106).
  • Option information corresponding to the stored output information is passed to the output means (107 to 109), and predetermined selection instruction information corresponding to the option information is then input through the input means.
  • For example, the output information management means (202) has, as option information:
  • Option 1 Voice playback
  • Option 2 Voice recognition result output
  • Option 3 Interpretation as command
  • Option 4 Interpretation of intention
  • When one of these is selected, the output information management means (202) outputs the corresponding data processing result. That is, for example, a process of outputting a start command to the mail system, which is the result of interpreting the voice content "mail" as a command, is executed.
  • By outputting the option information together and letting the user select, the user can select the data processing result that was originally intended, and can have the appropriate data processing carried out according to the task at hand.
  • The options output in this way also have the side effect of letting the user clearly understand the results and behavior of the dialogue processing along the way (for example, that speech recognition succeeded but the meaning was not understood), so that the user comes to understand which data input methods suit the system.
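The option mechanism can be sketched like this (hypothetical names; the four entries mirror Options 1 to 4 above): stored data processing results are presented as a numbered menu, and the user's selection instruction picks which stored result is actually delivered to the output means.

```python
def present_options(stored_results):
    """Build the numbered option menu from the stored results."""
    return {i + 1: label for i, (label, _) in enumerate(stored_results)}

def select(stored_results, choice):
    """Deliver the result corresponding to the user's selection instruction."""
    return stored_results[choice - 1][1]

stored = [
    ("Voice playback",               "<raw audio>"),
    ("Voice recognition result",     "mail"),
    ("Interpretation as command",    "start mail system"),
    ("Interpretation of intention",  "Who will you send it to?"),
]
print(present_options(stored))
print(select(stored, 3))
```

Because every intermediate result stays in the storage means, the user can back off to a shallower interpretation at any time.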
  • FIG. 3 shows another embodiment of the present invention.
  • This embodiment is an example in which, when there are sub-means that can be commonly used in the data processing means (104 to 106) in FIGS. 1 and 2, these are combined into one.
  • The input means (101 to 103), input information management means (201), output information management means (202), and output means (107 to 109) are almost the same as those described so far, so their detailed description is omitted.
  • the data processing means 1 (301) is composed of input information management means (201) and output information management means (202) as sub-means.
  • the input information management means (201) receives input information from the plurality of input means (101 to 103) and passes the input information to the input information recognition means (303) and the output information management means (202).
  • the output information management means (202) transfers the data processing result transferred from the input information management means (201) or the output information synthesis means (304) to the plurality of output means (107-109).
  • Data processing means 2 (302) comprises, in addition to the sub-means of data processing means 1 (301) (that is, the input information management means (201) and output information management means (202)), input information recognition means (303) and output information synthesizing means (304) as sub-means.
  • The input information recognition means (303) receives the input information from the input information management means (201), performs a recognition process, and delivers the recognition result of the input information to the semantic analysis means (306) and the output information synthesizing means (304).
  • The output information synthesizing means (304) receives the data processing result from the input information recognition means (303) or the command generation means (307), performs a synthesizing process, and delivers it to the output information management means (202).
  • Data processing means 3 (305) comprises, in addition to the sub-means of data processing means 2 (302) (i.e., input information management means (201), output information management means (202), input information recognition means (303), and output information synthesizing means (304)), semantic analysis means (306) and command generation means (307) as sub-means.
  • the semantic analysis means (306) receives the recognition result of the input information from the input information recognition means (303), performs a semantic analysis process, and delivers the semantic analysis result to the intention analysis means (309) and the command generation means (307).
  • the command generating means (307) receives the data processing result from the semantic analyzing means (306) or the response planning means (310), performs a command generating process, and delivers the data processing result to the output information synthesizing means (304).
  • Data processing means 4 (308) comprises, in addition to the sub-means of data processing means 3 (305) (i.e., input information management means (201), output information management means (202), input information recognition means (303), output information synthesizing means (304), semantic analysis means (306), and command generation means (307)), intention analysis means (309) and response planning means (310) as sub-means.
  • the intention analysis means (309) receives the result of the semantic analysis of the input information from the semantic analysis means (306), analyzes the intention, and delivers the result to the response planning means (310).
  • The response planning means (310) receives the data processing result from the intention analysis means (309), drafts a response, and outputs the data processing result to the command generation means (307).
  • Each of these means and sub-means may have a processor, storage means (memory, external storage device, etc.) for temporarily storing data processed by the processor, and communication means for passing data to other means.
  • The processing routes of the data processing means in this embodiment are substantially equivalent to the following four routes. That is, (Processing path 1 of data processing means) a path from the input information management means (201) directly to the output information management means (202);
  • (Processing path 2 of data processing means) a path from the input information management means (201), through the input information recognition means (303) and the output information synthesizing means (304), to the output information management means (202);
  • (Processing path 3 of data processing means) a path from the input information management means (201), through the input information recognition means (303), the semantic analysis means (306), the command generation means (307), and the output information synthesizing means (304), to the output information management means (202); and (Processing path 4 of data processing means) a path from the input information management means (201), through the input information recognition means (303), the semantic analysis means (306), the intention analysis means (309), the response planning means (310), the command generation means (307), and the output information synthesizing means (304), to the output information management means (202).
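The nesting of the processing paths can be sketched as function composition (invented names; each tag such as `rec(...)` merely records which sub-means touched the data): each deeper path reuses the shallower sub-means and inserts its own stages between recognition and synthesis.

```python
# Toy stand-ins for the sub-means; each wraps its input with a tag.
def manage_input(x):      return x                 # input information management (201)
def recognize(x):         return f"rec({x})"       # input information recognition (303)
def analyze(x):           return f"sem({x})"       # semantic analysis (306)
def analyze_intent(x):    return f"int({x})"       # intention analysis (309)
def plan_response(x):     return f"plan({x})"      # response planning (310)
def generate_command(x):  return f"cmd({x})"       # command generation (307)
def synthesize(x):        return f"syn({x})"       # output information synthesis (304)
def manage_output(x):     return x                 # output information management (202)

def path1(x): return manage_output(manage_input(x))
def path2(x): return manage_output(synthesize(recognize(manage_input(x))))
def path3(x): return manage_output(synthesize(generate_command(analyze(recognize(manage_input(x))))))
def path4(x): return manage_output(synthesize(generate_command(plan_response(analyze_intent(analyze(recognize(manage_input(x))))))))

print(path4("hello"))
```

Running all four paths on the same input and emitting each result as it completes reproduces the staged responses of the earlier embodiments while sharing the common sub-means.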
  • FIG. 4 shows another embodiment of the present invention.
  • This embodiment is an implementation example, called multimodal dialogue processing, that handles a plurality of data processing means simultaneously while sharing common sub-means as shown in FIG. 3.
  • The voice input device (401) is one of the input means, and is a device for inputting the user's voice via a microphone, handset, or the like.
  • The stroke input device (402) is one of the input means, and is a device for inputting strokes made by the user's hand via a pen, tablet, touch panel, or the like.
  • The image input device (403) is one of the input means, and is a device for inputting printed matter or the like viewed by the user via an image scanner, CCD camera, or the like.
  • The text input device (404) is one of the input means, and is a device through which the user inputs text via a keyboard or the like.
  • Even if input means such as a mouse, data glove, or line-of-sight recognition device are connected in the same manner as the input devices described above, those skilled in the art will readily understand from reading this specification that the present invention can be implemented.
  • Data processing means 1 (405) includes input media control means (406) and output media control means (407) as sub-means.
  • The input media control means (406) controls the input devices (that is, the voice input device (401), stroke input device (402), image input device (403), and text input device (404)), receives raw input information from each input device, such as voice data, stroke data (usually a sequence of coordinates), image data, and character codes, formats the data, and hands it over to the individual modality recognition means (409) and the output media control means (407).
  • it may be executed as a separate processing program for each input device.
  • The output media control means (407) controls the output devices (that is, the voice synthesizer (418), icon control device (419), anthropomorphic agent control device (420), and application control device (421)), and delivers to each output device output information such as the raw data passed from the input media control means (406), the control sequences for the speech synthesizer passed from the individual modality generation means (410), event sequences for the window system, commands for the anthropomorphic agent, and application commands.
  • Data processing means 2 (408) includes, in addition to the sub-means of data processing means 1 (405) (that is, the input media control means (406) and output media control means (407)), individual modality recognition means (409) and individual modality generation means (410) as sub-means.
  • The individual modality recognition means (409) receives the voice data, stroke data, image data, and character codes as input information from the input media control means (406), recognizes them as spoken language, handwritten characters, printed characters, and character codes, respectively, and hands the results over to the semantic analysis means (415) and the inter-modality recognition adjustment means (412).
  • Since the preferred recognition algorithm and recognition unit differ for each modality, different implementation forms may be used as programs. These recognition processes may use existing technologies.
  • The individual modality generation means (410) converts the recognition results passed from the individual modality recognition means (409) and the output information passed from the inter-modality response adjustment means (413) into output information such as control sequences for the speech synthesizer, event sequences for the window system, commands for the anthropomorphic agent, and commands for the application, and transfers it to the output media control means (407).
  • Data processing means 3 (411) includes, in addition to the sub-means of data processing means 2 (408) (i.e., input media control means (406), individual modality recognition means (409), individual modality generation means (410), and output media control means (407)), inter-modality recognition adjustment means (412) and inter-modality response adjustment means (413) as sub-means.
  • The inter-modality recognition adjustment means (412) determines an appropriate combination based on the plurality of recognition results from the individual modality recognition means (409), and hands this processing result over to the semantic analysis means (415) and the inter-modality response adjustment means (413).
  • The process of determining an appropriate combination is done, for example, by converting the results into a data structure independent of each modality (for example, a recognition lattice structure), evaluating the recognition results against a collection of components for each modality called a "dictionary" and a set of rules on the temporal and positional arrangement of the components called a "grammar", and determining a candidate combination of the multiple recognition results.
  • a grammar a set of rules for the temporal and positional arrangement of the components
  • other processing methods may be used.
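As a purely illustrative sketch of such a combination step (the dictionary entries, scores, and grammar rule below are invented assumptions, not part of the specification), admissible cross-modality combinations can be enumerated and scored:

```python
from itertools import product

# "Dictionary": candidate interpretations per modality, with recognition scores
# (hypothetical values for illustration only).
speech_hyps = [("delete", 0.9), ("repeat", 0.4)]
gesture_hyps = [("point:file_a", 0.8), ("point:file_b", 0.3)]

def grammar_ok(verb, gesture):
    # "Grammar": a rule on the arrangement of components -- here, only a verb
    # that takes an object may combine with a pointing gesture.
    return verb in {"delete"} and gesture.startswith("point:")

def best_combination(speech, gestures):
    # Evaluate every admissible pairing of recognition results and keep
    # the highest-scoring candidate combination.
    candidates = [
        ((verb, gesture), s_verb * s_gesture)
        for (verb, s_verb), (gesture, s_gesture) in product(speech, gestures)
        if grammar_ok(verb, gesture)
    ]
    return max(candidates, key=lambda c: c[1]) if candidates else None

combo = best_combination(speech_hyps, gesture_hyps)  # (("delete", "point:file_a"), 0.72)
```

A real system would evaluate a full recognition lattice rather than flat lists, but the principle of scoring combinations against dictionary and grammar constraints is the same.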
  • The inter-modality response adjustment means (413) receives the appropriately combined recognition result passed from the inter-modality recognition adjustment means (412) and the response information from the strategy determination means (417), and passes the resulting output information to the individual modality generation means (410).
  • The data processing means 4 (414) comprises the sub-means of the data processing means 3 (411) (i.e., the input media control means (406), individual modality recognition means (409), inter-modality recognition adjustment means (412), inter-modality response adjustment means (413), individual modality generation means (410), and output media control means (407)) together with the semantic analysis means (415), purpose estimation means (416), and strategy determination means (417).
  • the semantic analyzing means (415) receives the processing result from the individual modality recognizing means (409), analyzes the semantics, and transfers the result to the purpose estimating means (416).
  • The purpose estimation means (416) receives the result of the semantic analysis from the semantic analysis means (415), estimates the user's purpose, and transfers the result to the strategy determination means (417).
  • The strategy determination means (417) receives the purpose estimation result from the purpose estimation means (416), determines a response strategy, derives the information to be returned to the user, and delivers it to the inter-modality response adjustment means (413).
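The chain from semantic analysis through purpose estimation to strategy determination might be sketched as follows. This is an illustration only: the function names, the goal table, and the confirmation rule are all invented, not taken from the specification.

```python
def semantic_analysis(recognized):
    # Turn a recognized utterance into a shallow predicate structure.
    verb, _, obj = recognized.partition(" ")
    return {"act": verb, "object": obj}

def purpose_estimation(semantics):
    # Map the analyzed meaning onto an assumed user goal
    # (hypothetical goal table for illustration).
    goals = {"delete": "remove_item", "open": "view_item"}
    return goals.get(semantics["act"], "unknown")

def strategy_determination(goal, semantics):
    # Decide what to respond to the user; here a destructive goal
    # triggers a confirmation strategy as an example policy.
    if goal == "remove_item":
        return f"confirm: really delete {semantics['object']}?"
    return f"performing {goal} on {semantics['object']}"

sem = semantic_analysis("delete report.txt")
resp = strategy_determination(purpose_estimation(sem), sem)
# resp -> "confirm: really delete report.txt?"
```

In the described architecture the derived response would then be handed to the inter-modality response adjustment means rather than returned directly.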
  • the voice synthesizing device (418) is one of the output means, and performs voice synthesis based on the control sequence received from the output media control means (407).
  • the icon control device (419) is one of the output means, and performs a display control of icons and the like in the window system based on the event sequence received from the output media control means (407).
  • the anthropomorphic agent control device (420) is one of the output means, and is a device for controlling the anthropomorphic agent based on an instruction received from the output media control means (407).
  • The application control device (421) is one of the output means, and is a device for controlling an application based on an application command or the like received from the output media control means (407).
  • It will be readily understood by those skilled in the art from this specification that the present invention can be practiced even if other output devices, such as projectors, transport vehicles, and robot arms, are connected in the same manner as the output devices described above.
  • Single or plural input data are processed by plural data processing means in parallel. That is, when one or a plurality of input data passes through a data processing means constituted only by sub-means that require little computational effort (a low-level processing path, such as data processing means 1 (405)), a response is returned quickly; when it passes through a data processing means that includes sub-means performing complex, time-consuming calculations (a high-level processing path, such as data processing means 4 (414)), a response is returned after some time.
  • As a result, the non-response time of the application becomes very short, which has the remarkable effect of reducing the user's irritation and inconvenience.
  • An output result requiring only simple calculation can be obtained in a short time, while an output result requiring complicated calculation is obtained later.
  • Thus, output results commensurate with the processing time can be obtained from a single data/command input by the user, making it possible to provide an interaction processing method and apparatus that increase the efficiency of the interaction and allow the user to easily select, in real time, the output result that matches his or her needs.
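The parallel fast/slow behaviour described above can be sketched as follows. This is a minimal illustration under invented assumptions: the path functions, their contents, and the timing are not taken from the specification, which describes the idea at the level of data processing means.

```python
import concurrent.futures
import time

def low_level_path(data):
    # Low-level processing path (cf. data processing means 1): little
    # computation, so the user sees feedback almost immediately.
    return f"echo: {data}"

def high_level_path(data):
    # High-level processing path (cf. data processing means 4): semantic
    # analysis, purpose estimation, strategy determination -- simulated
    # here by a short delay.
    time.sleep(0.2)
    return f"interpreted response to: {data}"

def interact(data):
    """Feed one input to every processing path in parallel and yield
    each output to the user as soon as it becomes available."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(path, data)
                   for path in (low_level_path, high_level_path)]
        for fut in concurrent.futures.as_completed(futures):
            yield fut.result()

responses = list(interact("open the file"))
```

The quick echo arrives first, and the slower interpreted response follows, so the user is never left without feedback while the expensive path runs.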

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns a method in which a plurality of data processing means, each processing a single piece of data, are used in parallel, and a response is sent to the user each time output information is obtained as a result of this plurality of processes. The processing results can be output to the user through various paths, all at once or successively, so as to allow a selection among the data processing results.
PCT/JP1998/004295 1998-09-25 1998-09-25 Procede et dispositif d'interaction de traitement WO2000019307A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP1998/004295 WO2000019307A1 (fr) 1998-09-25 1998-09-25 Procede et dispositif d'interaction de traitement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP1998/004295 WO2000019307A1 (fr) 1998-09-25 1998-09-25 Procede et dispositif d'interaction de traitement

Publications (1)

Publication Number Publication Date
WO2000019307A1 true WO2000019307A1 (fr) 2000-04-06

Family

ID=14209063

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1998/004295 WO2000019307A1 (fr) 1998-09-25 1998-09-25 Procede et dispositif d'interaction de traitement

Country Status (1)

Country Link
WO (1) WO2000019307A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005321730A (ja) * 2004-05-11 2005-11-17 Fujitsu Ltd Dialogue system, dialogue system execution method, and computer program
JP2016512364A (ja) * 2013-03-15 2016-04-25 クアルコム,インコーポレイテッド Systems and methods for switching processing modes using gestures
JP2020507165A (ja) * 2017-11-21 2020-03-05 ジョンアン インフォメーション テクノロジー サービシズ カンパニー リミテッド Information processing method and apparatus for data visualization

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58219643A (ja) * 1982-06-16 1983-12-21 Hitachi Ltd Simulation execution management system
JPS62213949A (ja) * 1986-03-14 1987-09-19 Hitachi Ltd High-speed processing device for process monitoring
JPH08263258A (ja) * 1995-03-23 1996-10-11 Hitachi Ltd Input device, input method, information processing system, and method for managing input information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YASUHARU NANBA, SHUNICHI TANO, HIROYAKI KINUKAWA: "Semantic Analysis Utilizing Fusibility of Specific Attribute of Multimodal Data (in Japanese)", THE TRANSACTION OF INFORMATION PROCESSING SOCIETY OF JAPAN, vol. 38, no. 7, 15 July 1997 (1997-07-15), pages 1441 - 1453, XP002927046 *

Similar Documents

Publication Publication Date Title
US6384829B1 (en) Streamlined architecture for embodied conversational characters with reduced message traffic
US6513011B1 (en) Multi modal interactive system, method, and medium
US20030009330A1 (en) Communication terminal controlled through touch screen or voice recognition and instruction executing method thereof
Hasegawa et al. Active agent oriented multimodal interface system
CN109003608A (zh) 庭审控制方法、系统、计算机设备及存储介质
EP0962014B1 (fr) Dispositif de reconnaissance vocale utilisant un lexique de commandes
GB2378776A (en) Apparatus and method for managing a multi-modal interface in which the inputs feedback on each other
JPH11249773A (ja) マルチモーダルインタフェース装置およびマルチモーダルインタフェース方法
JP3753882B2 (ja) マルチモーダルインターフェース装置及びマルチモーダルインターフェース方法
JP2001100878A (ja) マルチモーダル入出力装置
US20100223548A1 (en) Method for introducing interaction pattern and application functionalities
CN102473047A (zh) 能够允许独立触摸输入的辅助触摸显示系统和用于辅助触摸显示器的独立触摸输入方法
JP2004192653A (ja) マルチモーダルインタフェース装置およびマルチモーダルインタフェース方法
JP3822357B2 (ja) マルチモーダル入出力装置のインタフェース装置及びその方法
JPH06131108A (ja) 情報入力装置
WO2000019307A1 (fr) Procede et dispositif d'interaction de traitement
US20080109227A1 (en) Voice Control System and Method for Controlling Computers
CN107770253A (zh) 远程控制方法及系统
CN110955331A (zh) 一种基于计算机虚拟界面的人机交互系统
CN117971154A (zh) 多模态响应
US20090125640A1 (en) Ultrasmall portable computer apparatus and computing system using the same
JP2003228449A (ja) 対話装置及び対話処理プログラムを記録した記録媒体
JP2985785B2 (ja) 人物動作対話システム
TW200844769A (en) Manipulation device using natural language entry
JP2002108388A (ja) 対話装置及び対話処理プログラムを記録した記録媒体

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 572747

Kind code of ref document: A

Format of ref document f/p: F

122 Ep: pct application non-entry in european phase