CN116757882A - Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium - Google Patents

Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium

Info

Publication number
CN116757882A
Authority
CN
China
Prior art keywords
interaction
hotel
target
appearance
related information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310817647.2A
Other languages
Chinese (zh)
Inventor
林庭锐
支涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunji Technology Co Ltd
Original Assignee
Beijing Yunji Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yunji Technology Co Ltd filed Critical Beijing Yunji Technology Co Ltd
Priority to CN202310817647.2A
Publication of CN116757882A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/12 Hotels or restaurants

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Toys (AREA)

Abstract

The disclosure relates to the technical field of hotel management, and provides a hotel interaction method, a hotel interaction device, electronic equipment and a hotel interaction medium. The method comprises the following steps: determining a target interaction object; acquiring appearance related information of the target interaction object; generating an interaction scheme based on the appearance related information; and controlling the interaction robot to interact based on the interaction scheme. By having the interaction robot engage the target interaction object, the user is given enough time to complete check-in quickly, and the safety of the target interaction object is protected.

Description

Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium
Technical Field
The disclosure relates to the technical field of hotel management, and in particular relates to a hotel interaction method, a hotel interaction device, electronic equipment and a hotel interaction medium.
Background
With the progress of science and technology, artificial intelligence is widely applied in everyday life. When a customer accompanied by children handles check-in in a hotel lobby, the children's safety is easily overlooked, or the check-in is disrupted because the children cry. How to interact with children while the customer transacts business is therefore a problem to be solved.
Disclosure of Invention
In view of the above, the embodiments of the present disclosure provide a hotel interaction method, apparatus, electronic device, and medium, so as to solve the problem in the prior art of how to interact with a child while a customer handles check-in.
In a first aspect of an embodiment of the present disclosure, a hotel interaction method is provided, including: determining a target interaction object; acquiring appearance related information of the target interactive object; generating an interaction scheme based on the appearance related information; and controlling the interaction robot to interact based on the interaction scheme.
In a second aspect of the disclosed embodiments, a hotel interaction device is provided, including: a determining unit configured to determine a target interactive object; an acquisition unit configured to acquire appearance related information of the target interactive object; a generation unit configured to generate an interaction scheme based on the appearance-related information; and the interaction unit is configured to control the interaction robot to interact based on the interaction scheme.
In a third aspect of the disclosed embodiments, a computer device is provided, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above method when the computer program is executed.
In a fourth aspect of the disclosed embodiments, a computer-readable storage medium is provided, which stores a computer program which, when executed by a processor, implements the steps of the above-described method.
Compared with the prior art, the embodiments of the present disclosure have the following beneficial effects. First, a target interaction object is determined; second, appearance related information of the target interaction object is acquired; then, an interaction scheme is generated based on the appearance related information; and finally, the interaction robot is controlled to interact based on the interaction scheme. The target interaction object of the present disclosure is an object that needs to be supervised and interacted with, such as a child, and that may otherwise interfere with the customer's check-in. An interactable object is first preliminarily determined based on a preset screening condition, and whether it is the target interaction object is then confirmed based on the user's response. Next, the appearance related information of the target interaction object is acquired: the sex and wear of the target interaction object are determined from this information, and a suitable interaction scheme is then selected from a preset interaction scheme library based on the sex and wear together with additionally acquired age and preference information. Finally, the interaction robot serves as the carrier that interacts with the target interaction object according to the selected scheme, for example by playing a cartoon and asking related questions, chatting, or learning together, so as to attract the target interaction object's attention and temporarily supervise it on the user's behalf. The user thus has enough time to complete check-in quickly, while the robot also safeguards the personal safety of the target interaction object and keeps strangers from approaching it.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments or for the description of the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure, and other drawings may be obtained from them by a person of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic illustration of one application scenario of a hotel interaction method according to some embodiments of the present disclosure;
Fig. 2 is a flow chart of some embodiments of a hotel interaction method according to the present disclosure;
Fig. 3 is a schematic structural diagram of some embodiments of a hotel interaction device according to the present disclosure;
Fig. 4 is a schematic structural diagram of an electronic device suitable for implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings. Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" or "a plurality of" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of one application scenario of a hotel interaction method according to some embodiments of the present disclosure.
In the application scenario of fig. 1, first, the computing device 101 may determine the target interactive object 102. Second, the computing device 101 may obtain the appearance related information 103 of the target interactive object 102. The computing device 101 may then generate an interaction scheme 104 based on the appearance-related information 103. Finally, the computing device 101 may control the interactive robot to interact based on the interaction scheme 104 described above, as indicated by reference numeral 105.
The computing device 101 may be hardware or software. When the computing device 101 is hardware, it may be implemented as a distributed cluster of multiple servers or terminal devices, or as a single server or single terminal device. When the computing device 101 is embodied as software, it may be installed in the hardware devices listed above. It may be implemented as a plurality of software or software modules, for example, for providing distributed services, or as a single software or software module. The present application is not particularly limited herein.
It should be understood that the number of computing devices in fig. 1 is merely illustrative. There may be any number of computing devices, as desired for an implementation.
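As a non-limiting illustration of the flow in fig. 1, the following Python sketch strings the four steps together. The `robot` interface and the helper functions `determine_target_object`, `acquire_appearance_info` and `generate_interaction_scheme` are hypothetical names introduced here for illustration; they are sketched further under the individual steps below and are not part of the disclosed implementation.

```python
# Minimal orchestration sketch of the four-step flow (assumed names throughout).
def run_hotel_interaction(robot):
    # Step S201: determine the target interaction object (e.g. an unattended child).
    target = determine_target_object(robot)
    if target is None:
        return
    # Step S202: acquire appearance related information (sex, wear) plus
    # supplementary information (age, preference) gathered by voice dialogue.
    profile = acquire_appearance_info(robot, target)
    # Step S203: generate an interaction scheme by keyword matching against a
    # preset interaction scheme library.
    scheme = generate_interaction_scheme(profile)
    # Step S204: control the interaction robot to carry out the scheme
    # (play a cartoon and ask questions, chat, learn, etc.).
    robot.execute(scheme)
```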
Figure 2 is a flow chart of some embodiments of hotel interaction methods according to the present disclosure. The hotel interaction method of fig. 2 may be performed by the computing device 101 of fig. 1. As shown in fig. 2, the hotel interaction method includes:
in step S201, a target interactive object is determined.
In some embodiments, the target interaction object may be an object to be supervised, such as a child that needs interaction and that may otherwise interfere with the customer's business. Specifically, the execution subject of the hotel interaction method may determine the target interaction object through the following steps:
Step one, determining an interactable object based on a preset screening condition. Here, the preset screening condition may be height; as an example, with a preset screening condition of 0.7 meters or less, any object whose height is at most 0.7 meters is determined to be an interactable object.
Step two, if a target voice input is detected, determining, based on the target voice input, whether the interactable object is the target interaction object. The target voice input is a calling word or an interaction-related word of the interaction robot; here, the calling word may be the name of the robot (e.g., "housekeeper"), and the interaction-related word may be "attend to", etc. If the user is detected uttering the calling word or an interaction-related word, it is determined that the user wants the interaction robot to interact with the interactable object, and the interactable object is determined to be the target interaction object.
In some optional implementations of some embodiments, after determining the interactable object based on the preset screening condition, the method further comprises: first, if no target voice input is detected, generating an inquiry sentence based on a preset word stock; and second, determining, based on the inquiry sentence, whether the interactable object is the target interaction object. Here, if neither the calling word nor an interaction-related word is detected, the robot actively asks the user, through a multi-round voice dialogue, whether the interactable object needs to be looked after; if so, the interactable object is determined to be the target interaction object, and otherwise it is determined not to be the target interaction object.
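As a hedged illustration of step S201, the following Python sketch combines the height-based screening with the voice-input check described above. The robot API used here (detected_persons, latest_utterance, ask, is_affirmative) is an assumption made for the example only.

```python
# Sketch of step S201 under an assumed robot API.
CALLING_WORDS = {"housekeeper"}                    # example calling word from the description
INTERACTION_WORDS = {"attend to", "look after"}    # example interaction-related words
HEIGHT_THRESHOLD_M = 0.7                           # preset screening condition (height)

def determine_target_object(robot):
    # Preliminary screening: objects at or below 0.7 meters are interactable objects.
    candidates = [p for p in robot.detected_persons() if p.height_m <= HEIGHT_THRESHOLD_M]
    if not candidates:
        return None
    interactable = candidates[0]

    # If a target voice input (calling word or interaction-related word) is detected,
    # the interactable object is taken as the target interaction object.
    utterance = (robot.latest_utterance() or "").lower()
    if any(word in utterance for word in CALLING_WORDS | INTERACTION_WORDS):
        return interactable

    # Otherwise, actively ask the user with an inquiry sentence generated from a
    # preset word stock, and decide according to the reply.
    answer = robot.ask("Would you like me to look after your child while you check in?")
    return interactable if robot.is_affirmative(answer) else None
```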
Step S202, obtaining the appearance related information of the target interactive object.
In some embodiments, the appearance related information includes at least one of the following: sex information, wear information. Specifically, the execution subject of the hotel interaction method may acquire the appearance related information of the target interaction object through the following steps:
Step one, acquiring an appearance image of the target interaction object based on a camera built into the interaction robot. Here, the interaction robot has a built-in camera, and the appearance image of the target interaction object is obtained with this camera.
Step two, extracting the appearance related information from the appearance image, wherein the appearance related information comprises at least one of the following: sex information, wear information. After the appearance image of the target interaction object is obtained, feature extraction is performed on the appearance image to obtain the appearance related information; the sex and wear it describes serve as reference information from which subsequent keywords can be generated.
In some optional implementations of some embodiments, after the extracting of the appearance related information from the appearance image, the method further includes: acquiring supplementary information of the interaction object based on multiple rounds of voice interaction, wherein the supplementary information includes at least one of the following: age information, preference information. Here, in order to generate the interaction scheme more accurately, age information and preference information are further acquired. Age and preference are difficult to obtain directly from the appearance image, so this information is acquired through multiple rounds of voice interaction.
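A sketch of step S202 under the same assumed robot API follows; the perception helpers (capture_image, classify_sex, classify_wear) stand in for whatever vision models an actual deployment would use and are illustrative assumptions.

```python
# Sketch of step S202: appearance related information plus supplementary information.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Profile:
    sex: Optional[str] = None          # appearance related information
    wear: Optional[str] = None         # appearance related information
    age: Optional[str] = None          # supplementary information (from voice dialogue)
    preference: Optional[str] = None   # supplementary information (from voice dialogue)

def acquire_appearance_info(robot, target) -> Profile:
    # Obtain an appearance image with the robot's built-in camera and extract
    # sex and wear information from it by feature extraction.
    image = robot.capture_image(target)
    profile = Profile(sex=robot.classify_sex(image), wear=robot.classify_wear(image))
    # Age and preference are hard to read from the image, so they are collected
    # through multiple rounds of voice interaction.
    profile.age = robot.ask("How old are you?")
    profile.preference = robot.ask("What do you like to do?")
    return profile
```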
Step S203, generating an interaction scheme based on the appearance related information.
In some embodiments, the execution subject of the hotel interaction method may generate the interaction scheme by:
Step one, extracting at least one keyword from the appearance related information and the supplementary information. Here, keywords are extracted from the above appearance related information and supplementary information; as an example, the keywords may be: female, sheep, 5 years old, drawing.
Step two, matching the at least one keyword against the keywords of each preset interaction scheme in a preset interaction scheme library to obtain a matching degree set. Here, each preset interaction scheme in the preset interaction scheme library has at least one keyword, and the at least one keyword extracted from the appearance related information and the supplementary information is matched against the keywords of each preset interaction scheme to obtain the matching degree set.
Step three, taking the preset interaction scheme with the highest matching degree in the matching degree set as the interaction scheme. Here, the matching degree set contains the matching degree between the extracted keywords and each preset interaction scheme, and the preset interaction scheme with the highest matching degree, i.e. the most suitable scheme, is selected as the interaction scheme.
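The keyword matching of step S203 can be illustrated with the small sketch below. The scheme library contents and the overlap-count used as the matching degree are assumptions for the example; the disclosure only requires that each preset interaction scheme carry keywords and that the scheme with the highest matching degree be chosen.

```python
# Sketch of step S203: match extracted keywords against a preset interaction scheme library.
SCHEME_LIBRARY = {
    "play a sheep cartoon and ask questions about it": {"female", "sheep", "5 years old"},
    "guided drawing game": {"female", "drawing"},
    "simple chat and word learning": {"male", "7 years old"},
}

def generate_interaction_scheme(profile):
    # Extract keywords from the appearance related information and the supplementary
    # information, e.g. {"female", "sheep", "5 years old", "drawing"}.
    keywords = set()
    if profile.sex:
        keywords.add(profile.sex)
    if profile.wear:
        keywords.add(profile.wear)
    if profile.age:
        keywords.add(f"{profile.age} years old")
    if profile.preference:
        keywords.add(profile.preference)

    # Matching degree set: here simply the number of shared keywords per scheme.
    match_degrees = {
        name: len(keywords & scheme_keywords)
        for name, scheme_keywords in SCHEME_LIBRARY.items()
    }
    # Take the preset interaction scheme with the highest matching degree.
    return max(match_degrees, key=match_degrees.get)
```

Here `profile` is assumed to be the object produced by the step S202 sketch above.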
In step S204, the interactive robot is controlled to interact based on the interaction scheme.
In some embodiments, the interactive robot is controlled to interact based on the above-described interaction scheme. Here, the interaction scheme may be playing an animation and asking related questions, chatting, learning, etc.
In some optional implementations of some embodiments, the method further includes: first, detecting the distance between the target interaction object and a preset danger point; and second, sending out an alarm when the distance is smaller than a preset distance. Here, in addition to interacting with the target interaction object, the robot can monitor the safety of the target interaction object and prevent it from approaching danger points such as high places or sharp objects.
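The optional safety monitoring can be sketched as a simple distance check against preset danger points; the coordinates, the preset distance and the alarm call are assumptions for illustration.

```python
# Sketch of the optional safety monitoring around preset danger points.
import math

DANGER_POINTS = [(2.0, 5.5), (8.0, 1.0)]   # e.g. a staircase, a sharp-edged counter
PRESET_DISTANCE_M = 1.0                     # preset distance threshold

def monitor_safety(robot, target):
    # Detect the distance between the target interaction object and each danger point.
    x, y = robot.position_of(target)
    for dx, dy in DANGER_POINTS:
        distance = math.hypot(x - dx, y - dy)
        # When the distance is smaller than the preset distance, send out an alarm.
        if distance < PRESET_DISTANCE_M:
            robot.sound_alarm()
            break
```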
Compared with the prior art, the embodiments of the present disclosure have the following beneficial effects. First, a target interaction object is determined; second, appearance related information of the target interaction object is acquired; then, an interaction scheme is generated based on the appearance related information; and finally, the interaction robot is controlled to interact based on the interaction scheme. The target interaction object of the present disclosure is an object that needs to be supervised and interacted with, such as a child, and that may otherwise interfere with the customer's check-in. An interactable object is first preliminarily determined based on a preset screening condition, and whether it is the target interaction object is then confirmed based on the user's response. Next, the appearance related information of the target interaction object is acquired: the sex and wear of the target interaction object are determined from this information, and a suitable interaction scheme is then selected from a preset interaction scheme library based on the sex and wear together with additionally acquired age and preference information. Finally, the interaction robot serves as the carrier that interacts with the target interaction object according to the selected scheme, for example by playing a cartoon and asking related questions, chatting, or learning together, so as to attract the target interaction object's attention and temporarily supervise it on the user's behalf. The user thus has enough time to complete check-in quickly, while the robot also safeguards the personal safety of the target interaction object and keeps strangers from approaching it.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
The following are device embodiments of the present disclosure that may be used to perform method embodiments of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the method of the present disclosure.
Fig. 3 is a schematic structural view of some embodiments of hotel interaction devices according to the present disclosure. As shown in fig. 3, the hotel interaction device includes: a determining unit 301, an acquiring unit 302, a generating unit 303, and an interacting unit 304. Wherein the determining unit 301 is configured to determine a target interactive object; an obtaining unit 302 configured to obtain appearance related information of the target interactive object; a generating unit 303 configured to generate an interaction scheme based on the above-described appearance-related information; the interaction unit 304 is configured to control the interaction robot to interact based on the interaction scheme.
In some optional implementations of some embodiments, the determining unit 301 of the hotel interaction device is further configured to: determining an interactable object based on a preset screening condition; if a target voice input is detected, determining whether the interactable object is a target interaction object based on the target voice input; the target voice input is a calling word or an interactive related word of the interactive robot.
In some optional implementations of some embodiments, the determining unit 301 of the hotel interaction device is further configured to: if the target voice input is not detected, generating an inquiry sentence based on a preset word stock; based on the query sentence, it is determined whether the interactable object is a target interaction object.
In some optional implementations of some embodiments, the acquisition unit 302 of the hotel interaction device is further configured to: acquiring an appearance image of a target interactive object based on a camera built in the interactive robot; extracting appearance related information in the appearance image; wherein the appearance related information at least comprises one of the following: sex information, wear information.
In some optional implementations of some embodiments, the acquisition unit 302 of the hotel interaction device is further configured to: acquiring supplementary information of the interactive object based on multiple rounds of voice interaction; wherein the supplemental information includes at least one of the following: age information, preference information.
In some optional implementations of some embodiments, the generating unit 303 of the hotel interaction device is further configured to: extracting at least one keyword from the appearance related information and the supplementary information; matching is carried out on the basis of the at least one keyword and the keywords of each preset interaction scheme in the preset interaction scheme library, so as to obtain a matching degree set; and taking the preset interaction scheme with the highest matching degree in the matching degree set as an interaction scheme.
In some alternative implementations of some embodiments, the hotel interaction device is further configured to: detecting the distance between the target interactive object and a preset dangerous point; and when the distance is smaller than the preset distance, an alarm is sent out.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present disclosure.
Referring now to FIG. 4, a schematic diagram of an electronic device 400 (e.g., computing device 101 of FIG. 1) suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device illustrated in fig. 4 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 4, the electronic device 400 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 401, which may perform various suitable actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage means 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic device 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
In general, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, magnetic tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate with other devices wirelessly or by wire to exchange data. While fig. 4 shows an electronic device 400 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 4 may represent one device or a plurality of devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via communications device 409, or from storage 408, or from ROM 402. The above-described functions defined in the methods of some embodiments of the present disclosure are performed when the computer program is executed by the processing device 401.
It should be noted that, in some embodiments of the present disclosure, the computer readable medium may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be embodied in the apparatus; or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: determining a target interaction object; acquiring appearance related information of the target interactive object; generating an interaction scheme based on the appearance related information; and controlling the interaction robot to interact based on the interaction scheme.
Computer program code for carrying out operations for some embodiments of the present disclosure may be written in one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The described units may also be provided in a processor, for example, described as: a processor includes a determination unit, an acquisition unit, a generation unit, and an interaction unit. The names of these units do not constitute a limitation on the unit itself in some cases, and the determining unit may also be described as "a unit that determines a target interactive object", for example.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The foregoing description is only of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the application in the embodiments of the present disclosure is not limited to the specific combination of the above technical features, and also encompasses other technical solutions formed by any combination of the above technical features or their equivalents without departing from the spirit of the application, for example technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A hotel interaction method, comprising:
determining a target interaction object;
acquiring appearance related information of the target interactive object;
generating an interaction scheme based on the appearance related information;
and controlling the interaction robot to interact based on the interaction scheme.
2. The hotel interaction method of claim 1, wherein the determining the target interaction object comprises:
determining an interactable object based on a preset screening condition;
if a target voice input is detected, determining whether the interactable object is a target interaction object based on the target voice input; the target voice input is a calling word or an interactive related word of the interactive robot.
3. The hotel interaction method of claim 2, wherein after determining the interactable object based upon the preset screening criteria, the method further comprises:
if the target voice input is not detected, generating an inquiry sentence based on a preset word stock;
determining, based on the inquiry sentence, whether the interactable object is a target interaction object.
4. The hotel interaction method of claim 1, wherein the obtaining the appearance related information of the target interactive object comprises:
acquiring an appearance image of a target interactive object based on a camera built in the interactive robot;
extracting appearance related information in the appearance image; wherein the appearance related information at least comprises one of the following: sex information, wear information.
5. The hotel interaction method of claim 4, wherein after said extracting appearance-related information in said appearance image, said method further comprises:
acquiring supplementary information of the interactive object based on multiple rounds of voice interaction; wherein the supplemental information includes at least one of the following: age information, preference information.
6. The hotel interaction method of claim 5, wherein the generating an interaction scheme based on the appearance-related information comprises:
extracting at least one keyword from the appearance related information and the supplementary information;
matching is carried out on the basis of the at least one keyword and the keywords of each preset interaction scheme in a preset interaction scheme library, so that a matching degree set is obtained;
and taking the preset interaction scheme with the highest matching degree in the matching degree set as an interaction scheme.
7. The hotel interaction method of claim 1, wherein the method further comprises:
detecting the distance between the target interactive object and a preset dangerous point;
and when the distance is smaller than the preset distance, an alarm is sent out.
8. A hotel interactive apparatus, comprising:
a determining unit configured to determine a target interactive object;
an acquisition unit configured to acquire appearance related information of the target interactive object;
a generation unit configured to generate an interaction scheme based on the appearance-related information;
and the interaction unit is configured to control the interaction robot to interact based on the interaction scheme.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 7.
CN202310817647.2A 2023-07-05 2023-07-05 Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium Pending CN116757882A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310817647.2A CN116757882A (en) 2023-07-05 2023-07-05 Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310817647.2A CN116757882A (en) 2023-07-05 2023-07-05 Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium

Publications (1)

Publication Number Publication Date
CN116757882A true CN116757882A (en) 2023-09-15

Family

ID=87955093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310817647.2A Pending CN116757882A (en) 2023-07-05 2023-07-05 Hotel interaction method, hotel interaction device, electronic equipment and hotel interaction medium

Country Status (1)

Country Link
CN (1) CN116757882A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination