CN110727772A - Method for realizing dynamic interaction of robot through condition judgment - Google Patents


Info

Publication number
CN110727772A
Authority
CN
China
Prior art keywords
module, information, robot, text, user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910954866.9A
Other languages
Chinese (zh)
Inventor
王磊 (Wang Lei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Baiying Technology Co Ltd
Original Assignee
Zhejiang Baiying Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Baiying Technology Co Ltd filed Critical Zhejiang Baiying Technology Co Ltd
Priority to CN201910954866.9A priority Critical patent/CN110727772A/en
Publication of CN110727772A publication Critical patent/CN110727772A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30: Information retrieval of unstructured textual data
    • G06F 16/33: Querying
    • G06F 16/332: Query formulation
    • G06F 16/3329: Natural language query formulation or dialogue systems
    • G06F 16/3331: Query processing
    • G06F 16/334: Query execution
    • G06F 16/3343: Query execution using phonetics
    • G06F 16/3344: Query execution using natural language analysis
    • G06F 16/335: Filtering based on additional data, e.g. user or group profiles

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a method by which a robot realizes dynamic interaction through condition judgment. It addresses the problem that existing AI response schemes are not flexible enough to handle complex scenarios. The method persists key information from the called party, giving the operator finer-grained control; the extracted and stored key information supports subsequent analysis and statistics on user data and behavior, provides data support for later script optimization, allows the robot's behavior to be controlled more flexibly and efficiently, and creates a more intelligent, more lifelike conversational context. The system comprises a user information acquisition module, a search module, a database, a number import module, a robot, a communication module, a client, a data processor, an information extraction module, an information comparison module, a feedback module and a storage module; the user information acquisition module is connected with the number import module, the number import module is connected with the robot, the robot is connected with the communication module, and the communication module is connected with the client.

Description

Method for realizing dynamic interaction of robot through condition judgment
Technical Field
The invention relates to the technical field of software, in particular to a method for realizing dynamic interaction by a robot through condition judgment.
Background
As artificial intelligence technologies such as ASR (automatic speech recognition) and NLP (natural language processing) have matured, robots have become better able to understand human expression and meaning. These technologies are now used in scenarios such as customer service and sales, where robots replace human agents to a certain extent.
In the existing process, the system converts user audio into text and matches it against preset answers or questions through semantic analysis, keyword matching and similar techniques. Although this allows the robot to understand and respond to user questions reasonably well, the operator cannot exercise fine-grained control during a real conversation, and complex scenarios such as information review and verification cannot be handled. In short, the existing AI response scheme is not flexible enough for complex scenarios, so the prior art needs improvement.
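The conventional flow described above, where recognized text is matched against preset answers by keyword matching, can be sketched as follows. This is a minimal illustration only: the keyword table, function name, and canned responses are all invented, not taken from the patent.

```python
# Hypothetical keyword-to-answer table for the conventional reply flow.
PRESET_ANSWERS = {
    "price": "Our basic plan starts at 99 yuan per month.",
    "refund": "Refunds are processed within 7 working days.",
}

def reply(user_text: str) -> str:
    """Return the first preset answer whose keyword appears in the text."""
    lowered = user_text.lower()
    for keyword, answer in PRESET_ANSWERS.items():
        if keyword in lowered:
            return answer
    # no keyword matched: fall back to a clarification prompt
    return "Sorry, could you rephrase that?"
```

This table-lookup style is exactly what the background calls insufficiently flexible: the operator cannot branch on customer-specific data mid-conversation.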
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a method for realizing dynamic interaction by a robot through condition judgment, which solves the problem that the existing AI response scheme is not flexible enough to handle complex scenarios.
In order to achieve this purpose, the invention is realized by the following technical scheme. A system for realizing dynamic interaction by a robot through condition judgment comprises a user information acquisition module, a search module, a database, a number import module, a robot, a communication module, a client, a data processor, an information extraction module, an information comparison module, a feedback module and a storage module. The user information acquisition module is connected with the number import module, which is connected with the robot; the robot is connected with the communication module, and the communication module is connected with the client. The number import module is also connected with the communication module. The robot is connected with the data processor, the data processor with the information extraction module, the information extraction module with the information comparison module, and the information comparison module with the storage module. The information comparison module is further connected with the feedback module, the feedback module with the robot, and the information comparison module with the user information acquisition module.
Preferably, the user information acquisition module comprises a date acquisition module, a certificate acquisition module, a number acquisition module, an address acquisition module and a gender acquisition module.
Preferably, the user information acquisition module is connected with the search module, the user information acquisition module is connected with the database, and the search module is connected with the database.
Preferably, the robot comprises a voice recognition module, an information conversion module and a text recognition module, wherein the voice recognition module is connected with the information conversion module, and the information conversion module is connected with the text recognition module.
Preferably, the data processor comprises a data receiving module, a data analyzing module and a data sending module, wherein the data receiving module is connected with the data analyzing module, and the data analyzing module is connected with the data sending module.
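As a reading aid for the module list above, here is a minimal sketch of how the acquisition, search, and database modules might compose. Every class, method, and field name here is an assumption for illustration, not part of the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Database:                      # database
    records: dict = field(default_factory=dict)

@dataclass
class SearchModule:                  # search module
    db: Database
    def find(self, user_id: str) -> dict:
        # search the database for the user's record (step one)
        return self.db.records.get(user_id, {})

@dataclass
class UserInfoCollector:             # user information acquisition module
    search: SearchModule
    # the date/certificate/number/address/gender sub-modules each pick
    # one field out of the record (step two); field names are invented
    FIELDS = ("expiry_date", "certificate_no", "phone", "address", "gender")

    def collect(self, user_id: str) -> dict:
        record = self.search.find(user_id)
        return {k: record.get(k) for k in self.FIELDS}
```

A missing record or field simply yields `None`, which a downstream comparison module can treat as "information not available".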
A method for realizing dynamic interaction by a robot through condition judgment specifically comprises the following steps:
Step one: the user information acquisition module transmits an instruction to the search module, and on receiving the instruction the search module starts to search the database for the user's information.
Step two: after the search module finishes searching, the user information acquisition module starts to collect customer information: the date acquisition module collects the expiration date from the database, the certificate acquisition module collects the customer's certificate number, the number acquisition module collects the user's phone number, the address acquisition module collects the user's address, and the gender acquisition module collects the user's gender.
Step three: after the user information acquisition module finishes collection, the information is transmitted to the information comparison module, which forwards it to the storage module for storage.
Step four: meanwhile, the user information acquisition module transmits the number information to the number import module, which imports the number to be dialed.
Step five: the robot calls the client through the communication module using the number imported by the number import module. If the call is not answered, the communication module reports this to the number import module for recording, and step four is repeated.
Step six: if the call is answered, the robot starts a normal conversation with the client through the communication module. The content spoken by the client is recognized by the voice recognition module; after recognition, the voice recognition module passes the recognized information to the information conversion module, which converts the received voice information into text.
Step seven: after conversion, the information conversion module transmits the text to the data processor. The data receiving module passes the received text to the data analysis module, which analyzes it and then hands the result to the data sending module.
Step eight: the data processor transmits the information to the information extraction module through the data sending module. On receiving the information, the information extraction module extracts the keywords it contains and sends them to the information comparison module.
Step nine: on receiving the keyword information, the information comparison module retrieves the customer information pre-stored in the storage module and compares and matches the two. Once matching is complete, the information comparison module transmits the result, in text form, to the robot through the feedback module.
Step ten: after the robot receives the text, the text recognition module recognizes it and passes it to the information conversion module, which converts the text into voice.
Step eleven: after the information conversion module completes the conversion, the robot transmits the voice information to the client through the communication module.
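Steps six through eleven can be condensed into a short text-processing sketch. The ASR and TTS stages are stubbed out here, and every function name and matching rule below is an assumption rather than the patent's actual implementation:

```python
def extract_keywords(text, vocabulary):
    """Information extraction (step eight): keep only known keywords."""
    return {w.strip(",.?!") for w in text.lower().split()} & vocabulary

def compare(keywords, stored):
    """Information comparison (step nine): check the extracted keywords
    against the customer record pre-stored by steps two and three."""
    return any(v.lower() in keywords for v in stored.values())

def handle_turn(asr_text, stored, vocabulary):
    """Steps seven to eleven condensed: recognized text in, feedback
    text out (the real system would synthesize this back to speech)."""
    keywords = extract_keywords(asr_text, vocabulary)
    return "verified" if compare(keywords, stored) else "mismatch"
```

The point of the pipeline is that the feedback text depends on the stored customer record, so each call produces a customer-specific, condition-driven response rather than a fixed script.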
Advantageous effects
The invention provides a method for realizing dynamic interaction by a robot through condition judgment, with the following beneficial effects. The method can extract key information in real time from the current conversation content, apply condition judgment to the extracted key information against content preset before the conversation, and determine the robot's specific response from the judgment result. Persisting the called party's key information in this way achieves finer-grained control.
The scheme extracts and persists the key information in the called party's speech, which facilitates subsequent analysis and statistics on user data and behavior, and provides data support for later script optimization. By extracting key information and comparing it with preset content, the robot's behavior can be controlled more flexibly and efficiently, creating a more intelligent and more lifelike conversational context.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention;
FIG. 2 is a flow chart of the method performed by the system of FIG. 1.
In the figure: 1. a user information acquisition module; 2. a search module; 3. a database; 4. a number import module; 5. a robot; 6. a communication module; 7. a client; 8. a data processor; 9. an information extraction module; 10. an information comparison module; 11. a feedback module; 12. a storage module; 101. a date collection module; 102. a certificate acquisition module; 103. a number acquisition module; 104. an address acquisition module; 105. a gender collection module; 51. a voice recognition module; 52. an information conversion module; 53. a text recognition module; 81. a data receiving module; 82. a data analysis module; 83. and a data sending module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to figs. 1-2, the present invention provides the following technical solution. A system for realizing dynamic interaction by a robot through condition judgment comprises a user information acquisition module 1, a search module 2, a database 3, a number import module 4, a robot 5, a communication module 6, a client 7, a data processor 8, an information extraction module 9, an information comparison module 10, a feedback module 11 and a storage module 12. The user information acquisition module 1 is connected with the number import module 4, which imports the number to be dialed; the number import module 4 is connected with the robot 5; the robot 5 is connected with the communication module 6, which handles voice communication; the communication module 6 is connected with the client 7; and the number import module 4 is also connected with the communication module 6. The robot 5 is connected with the data processor 8 for data processing; the data processor 8 is connected with the information extraction module 9, which extracts the keywords contained in the information; the information extraction module 9 is connected with the information comparison module 10 for comparison and matching; and the information comparison module 10 is connected with the storage module 12, which facilitates subsequent analysis and statistics on user data and behavior. The information comparison module 10 is further connected with the feedback module 11, the feedback module 11 with the robot 5, and the information comparison module 10 with the user information acquisition module 1.
The user information acquisition module 1 comprises a date acquisition module 101, a certificate acquisition module 102, a number acquisition module 103, an address acquisition module 104 and a gender acquisition module 105, and is used to collect customer information. The user information acquisition module 1 is connected with the search module 2, which searches the database 3 for user information; the user information acquisition module 1 is connected with the database 3, and the search module 2 is connected with the database 3. The robot 5 comprises a voice recognition module 51, an information conversion module 52 and a text recognition module 53; the voice recognition module 51 is connected with the information conversion module 52, recognizing the content spoken by the client 7 and converting it between formats, and the information conversion module 52 is connected with the text recognition module 53. The data processor 8 comprises a data receiving module 81, a data analysis module 82 and a data sending module 83; the data receiving module 81 is connected with the data analysis module 82, which analyzes the text information, and the data analysis module 82 is connected with the data sending module 83.
A method for realizing dynamic interaction by a robot through condition judgment specifically comprises the following steps:
the method comprises the following steps: the user information acquisition module 1 transmits the instruction to the search module 2, and the search module 2 starts to search the database 3 for the user information after receiving the instruction.
Step two: after the search module 2 finishes searching, the user collection module 1 starts to collect customer information, the date collection module 101 collects expiration dates from the database 3, the certificate collection module 102 collects customer certificate number information from the database 3, the number collection module 103 collects user number information from the database 3, the address collection module 104 collects user address information from the database 3, and the gender collection module 105 collects user gender information from the database 3.
Step three: after the user acquisition module 1 finishes acquisition, the information is transmitted to the information comparison module 10, and the information comparison module 10 transmits the information to the storage module 12 for storage.
Step four: meanwhile, the user acquisition module 1 transmits the number information to the number import module 4 for number import, and imports the number to be dialed.
Step five: and the robot 5 starts to call the client 7 through the communication module 6 according to the number to be dialed, which is imported by the number import module 4, if the communication module 6 is not connected, the communication module 6 transmits the information to the number import module 4 for recording, and the fourth step is repeated.
Step six: if the robot 5 is connected, the robot starts normal conversation with the client 7 through the communication module 6, the content input by the client 7 is recognized through the voice recognition module 51, after the recognition is completed, the voice recognition module 51 transmits the recognized information to the information conversion module 52, and the information conversion module 52 converts the received voice information into a text format.
Step seven: the information conversion module 52 transmits the text information to the data processor 8 after the voice information is converted, the data receiving module 81 receives the transmitted text information and transmits the text information to the data analysis module 82, the data analysis module 82 receives and analyzes the text information, and the data analysis module 82 transmits the information to the data sending module 83 after the text information is analyzed.
Step eight: the data processor 8 transmits the information to the information extraction module 9 through the data sending module 83, the information extraction module 9 extracts the keywords contained in the information after receiving the information, and the information extraction module 9 sends the keyword information to the information comparison module 10 after extracting the keywords.
Step nine: after the information comparison module 10 receives the keyword information, the client information pre-stored in the storage module 12 is called for comparison and matching, and after the information comparison and matching are completed, the information comparison module 10 transmits the result to the robot 5 through the feedback module 11 in a text form.
Step ten: after the robot 5 acquires the text information, the text recognition module 53 starts recognizing the text information, after the text recognition module 53 recognizes the text information, the information is transmitted to the information conversion module 52, and after the information conversion module 52 receives the text information, the text information is converted into a voice format.
Step eleven: after the information conversion module 52 completes the conversion of the text information, the robot 5 transmits the voice information to the client 7 through the communication module 6.
The invention has the following beneficial effects. With this method of dynamic interaction through condition judgment, the robot 5 can extract key information in real time through the information extraction module 9 based on the current conversation content, apply condition judgment through the information comparison module 10 to the extracted key information against content preset before the conversation, and determine its specific response from the judgment result. Persisting the called party's key information in this way achieves finer-grained control.
The scheme extracts and persists the key information in the called party's speech, which facilitates subsequent analysis and statistics on user data and behavior, and provides data support for later script optimization. By extracting key information and comparing it with preset content, the behavior of the robot 5 can be controlled more flexibly and efficiently, creating a more intelligent and more lifelike conversational context.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (6)

1. A system for realizing dynamic interaction by a robot through condition judgment, comprising a user information acquisition module (1), a search module (2), a database (3), a number import module (4), a robot (5), a communication module (6), a client (7), a data processor (8), an information extraction module (9), an information comparison module (10), a feedback module (11) and a storage module (12), characterized in that: the user information acquisition module (1) is connected with the number import module (4), the number import module (4) is connected with the robot (5), the robot (5) is connected with the communication module (6), the communication module (6) is connected with the client (7), the number import module (4) is connected with the communication module (6), the robot (5) is connected with the data processor (8), the data processor (8) is connected with the information extraction module (9), the information extraction module (9) is connected with the information comparison module (10), the information comparison module (10) is connected with the storage module (12), the information comparison module (10) is connected with the feedback module (11), the feedback module (11) is connected with the robot (5), and the information comparison module (10) is connected with the user information acquisition module (1).
2. The system for realizing dynamic interaction by a robot through condition judgment as claimed in claim 1, wherein: the user information acquisition module (1) comprises a date acquisition module (101), a certificate acquisition module (102), a number acquisition module (103), an address acquisition module (104) and a gender acquisition module (105).
3. The system for realizing dynamic interaction by a robot through condition judgment as claimed in claim 1, wherein: the user information acquisition module (1) is connected with the search module (2), the user information acquisition module (1) is connected with the database (3), and the search module (2) is connected with the database (3).
4. The system for realizing dynamic interaction by a robot through condition judgment as claimed in claim 1, wherein: the robot (5) comprises a voice recognition module (51), an information conversion module (52) and a text recognition module (53), the voice recognition module (51) is connected with the information conversion module (52), and the information conversion module (52) is connected with the text recognition module (53).
5. The system for realizing dynamic interaction by a robot through condition judgment as claimed in claim 1, wherein: the data processor (8) comprises a data receiving module (81), a data analysis module (82) and a data sending module (83), the data receiving module (81) is connected with the data analysis module (82), and the data analysis module (82) is connected with the data sending module (83).
6. A method for realizing dynamic interaction by a robot through condition judgment, characterized by comprising the following specific steps:
Step one: the user information acquisition module (1) transmits an instruction to the search module (2), and on receiving the instruction the search module (2) starts to search the database (3) for the user's information.
Step two: after the search module (2) finishes searching, the user information acquisition module (1) starts to collect customer information: the date acquisition module (101) collects the expiration date from the database (3), the certificate acquisition module (102) collects the customer's certificate number, the number acquisition module (103) collects the user's phone number, the address acquisition module (104) collects the user's address, and the gender acquisition module (105) collects the user's gender.
Step three: after the user information acquisition module (1) finishes collection, the information is transmitted to the information comparison module (10), which forwards it to the storage module (12) for storage.
Step four: meanwhile, the user information acquisition module (1) transmits the number information to the number import module (4), which imports the number to be dialed.
Step five: the robot (5) calls the client (7) through the communication module (6) using the number imported by the number import module (4); if the call is not answered, the communication module (6) reports this to the number import module (4) for recording, and step four is repeated.
Step six: if the call is answered, the robot (5) starts a normal conversation with the client (7) through the communication module (6); the content spoken by the client (7) is recognized by the voice recognition module (51), which passes the recognized information to the information conversion module (52), and the information conversion module (52) converts the received voice information into text.
Step seven: after conversion, the information conversion module (52) transmits the text to the data processor (8); the data receiving module (81) passes the received text to the data analysis module (82), which analyzes it and hands the result to the data sending module (83).
Step eight: the data processor (8) transmits the information to the information extraction module (9) through the data sending module (83); on receiving the information, the information extraction module (9) extracts the keywords it contains and sends them to the information comparison module (10).
Step nine: on receiving the keyword information, the information comparison module (10) retrieves the customer information pre-stored in the storage module (12) and compares and matches the two; once matching is complete, the information comparison module (10) transmits the result, in text form, to the robot (5) through the feedback module (11).
Step ten: after the robot (5) receives the text, the text recognition module (53) recognizes it and passes it to the information conversion module (52), which converts the text into voice.
Step eleven: after the information conversion module (52) completes the conversion, the robot (5) transmits the voice information to the client (7) through the communication module (6).
CN201910954866.9A 2019-10-09 2019-10-09 Method for realizing dynamic interaction of robot through condition judgment Pending CN110727772A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910954866.9A CN110727772A (en) 2019-10-09 2019-10-09 Method for realizing dynamic interaction of robot through condition judgment

Publications (1)

Publication Number Publication Date
CN110727772A true CN110727772A (en) 2020-01-24

Family

ID=69219845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910954866.9A Pending CN110727772A (en) 2019-10-09 2019-10-09 Method for realizing dynamic interaction of robot through condition judgment

Country Status (1)

Country Link
CN (1) CN110727772A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006171719A * 2004-12-01 2006-06-29 Honda Motor Co Ltd Interactive information system
CN107146622A * 2017-06-16 2017-09-08 Hefei Midea Intelligent Technology Co Ltd Refrigerator, voice interaction system and method, computer equipment, and computer-readable storage medium
CN107618034A * 2016-07-15 2018-01-23 Zhejiang Xingxing Cold Chain Integration Co Ltd Deep learning method for a robot
CN109767791A * 2019-03-21 2019-05-17 China-ASEAN Information Harbor Co Ltd Voice emotion recognition and application system for call-center conversations
CN109949805A * 2019-02-21 2019-06-28 Jiangsu Suning Bank Co Ltd Intelligent debt-collection robot and collection method based on intention recognition and finite-state automata


Similar Documents

Publication Publication Date Title
CN111739516A (en) Speech recognition system for intelligent customer service call
CN110266899B (en) Client intention identification method and customer service system
CN106357942A (en) Intelligent response method and system based on context dialogue semantic recognition
CN105261356A (en) Voice recognition system and method
CN104462600A (en) Method and device for achieving automatic classification of calling reasons
CN106294774A (en) User individual data processing method based on dialogue service and device
CN111178081B (en) Semantic recognition method, server, electronic device and computer storage medium
CN109065052A A kind of speech robot
CN109922213A Data processing method and device, storage medium, and terminal device for voice consultation
CN111159375A (en) Text processing method and device
CN109325737A (en) A kind of enterprise intelligent virtual assistant system and its method
CN114818649A (en) Service consultation processing method and device based on intelligent voice interaction technology
CN111046148A (en) Intelligent interaction system and intelligent customer service robot
CN112800743A (en) Voice scoring model construction system and method based on specific field
CN110196897B (en) Case identification method based on question and answer template
CN109961789B (en) Service equipment based on video and voice interaction
CN110688473A (en) Method for robot to dynamically acquire information
CN117456995A (en) Interactive method and system of pension service robot
CN113037934A (en) Hot word analysis system based on call recording of call center
CN210516214U (en) Service equipment based on video and voice interaction
CN116883888A (en) Bank counter service problem tracing system and method based on multi-mode feature fusion
CN116129903A (en) Call audio processing method and device
CN115934918A (en) Multi-turn conversation method of electric charge payment prompting robot based on intelligent voice technology
CN110727772A (en) Method for realizing dynamic interaction of robot through condition judgment
CN113314103B (en) Illegal information identification method and device based on real-time speech emotion analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200124