WO2023033256A1 - System and method for a communication service using facial expressions learned from images of a companion animal - Google Patents


Info

Publication number
WO2023033256A1
Authority
WO
WIPO (PCT)
Prior art keywords
companion animal
image
communication
guardian
companion
Prior art date
Application number
PCT/KR2021/017671
Other languages
English (en)
Korean (ko)
Inventor
박정훈
김민석
Original Assignee
박정훈
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 박정훈 filed Critical 박정훈
Publication of WO2023033256A1 publication Critical patent/WO2023033256A1/fr

Classifications

    • G06Q50/01 Social networking
    • A01K29/00 Other apparatus for animal husbandry
    • A01K29/005 Monitoring or measuring activity, e.g. detecting heat or mating
    • G06F40/58 Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
    • G06N20/00 Machine learning
    • G06Q50/10 Services
    • G06Q50/40 Business processes related to the transportation industry
    • G06T11/60 Editing figures and text; combining figures or text
    • G06T13/80 2D [two-dimensional] animation, e.g. using sprites
    • G06T7/11 Region-based segmentation
    • G10L13/00 Speech synthesis; text-to-speech systems
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/10 Multimedia information in user-to-user messages
    • H04L65/40 Support for services or applications in real-time packet communication
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04R3/00 Circuits for transducers, loudspeakers or microphones
    • H04W4/80 Services using short-range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low-energy communication
    • G06T2207/20221 Image fusion; image merging

Definitions

  • the present invention relates to a communication service system and method using expressions learned from companion animal images.
  • In a survey on the status of raising companion animals, more than half of companion animal owners cited 'emotional stability and happiness as a family member' as the main reason for raising a companion animal. It was also found that companion animals have positive effects, such as increased responsibility and reduced loneliness, for vulnerable people. In light of this, even when the guardian goes to work or goes out and is separated from the companion animal, a concrete way to increase the bond and intimacy with the companion animal is needed.
  • the present invention was derived from the above background, and its purpose is to provide a communication service system and method that improve the quality of communication with companion animals, and the exchanges between companion animal guardians, using facial expressions learned from images of companion animals collected by the guardians.
  • a communication service system includes an input device for a companion animal generating a signal related to a companion animal's desire according to an input of the companion animal; a signal relaying device that receives the signal through short-range communication with the companion animal input device and transmits the signal to a communication server through a network; a communication server generating an interactive message of the companion animal based on the signal and providing the image of the companion animal together with the interactive message to a companion animal guardian terminal described below; and a companion animal guardian terminal equipped with a communication platform that provides a chatting screen between the companion animal and the guardian of the companion animal and outputs the interactive message and image on the chatting screen.
  • the communication server includes a communication unit for transmitting and receiving signals and data to and from the signal mediation device and the communication platform; a controller that selects an interactive message of the companion animal and an image of the companion animal to be transmitted to the communication platform based on the signal related to the desire of the companion animal; and a storage unit that stores an interactive message of the companion animal and an image of the companion animal that can be selected by the control unit.
  • the communication server may further include a learning unit generating a companion animal expression model using the companion animal image.
  • the companion animal expression model refers to a model that infers the expression of the companion animal shown in the image from the image of the companion animal.
  • the learning unit includes a message learning module that machine-learns the conversation contents of the companion animal guardian in the chatting app and generates a deep-learning model enabling signals related to the companion animal's desire to be expressed as colloquial messages; a facial expression learning module that generates the companion animal expression model using images of the companion animal; and a model manager that obtains inference results by executing the deep-learning model and the companion animal expression model.
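The learning unit described above can be sketched as follows. This is a minimal, illustrative stand-in, not the patented implementation: the trained deep-learning models are replaced by lookup/rule stubs, and all names (`MessageModule`, `ExpressionModule`, `ModelManager`, the signal codes, and the template strings) are assumptions for demonstration only.

```python
class MessageModule:
    """Maps a desire-signal code to a colloquial chat message.
    In the described system this is a deep-learning model trained on the
    guardian's chat history; a lookup table stands in for it here."""
    TEMPLATES = {
        "FOOD": "I'm hungry! Can I have a meal?",
        "PLAY": "Let's play with the ball!",
        "CALL": "I miss you. Video call?",
    }

    def infer(self, signal: str) -> str:
        return self.TEMPLATES.get(signal, "I pressed a button!")


class ExpressionModule:
    """Infers the expression shown in a companion-animal image.
    The trained image classifier is replaced by a trivial filename rule."""

    def infer(self, image_name: str) -> str:
        return "happy" if "happy" in image_name else "neutral"


class ModelManager:
    """Executes both models and combines their inference results,
    mirroring the model-manager role in the learning unit."""

    def __init__(self):
        self.message = MessageModule()
        self.expression = ExpressionModule()

    def build_chat_entry(self, signal: str, image_name: str) -> dict:
        return {
            "message": self.message.infer(signal),
            "expression": self.expression.infer(image_name),
        }
```

Dependency injection of the two modules into a manager keeps the inference entry point stable even if the underlying models are retrained or swapped.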
  • the communication server may further include an emoticon generating unit generating an emoticon using an image of a companion animal.
  • the communication platform includes a message interface that provides a chatting screen between the companion animal and a guardian of the companion animal and outputs the interactive message and image to the chatting screen; and a facial expression analysis interface providing a screen displaying an image of the companion animal and a predetermined number of options.
  • each option describes the name of a facial expression that the companion animal in the image may have.
  • a communication service system includes a companion animal guardian terminal equipped with a communication platform for receiving a remote control signal from a companion animal guardian and transmitting the remote control signal to the following communication server; a communication server that transmits the remote control signal to the following signal relaying device; and a signal relaying device that outputs voice information of the companion animal guardian through a built-in speaker according to the remote control signal.
  • a communication service system includes a companion animal guardian terminal equipped with a communication platform for receiving a message to be sent to the companion animal from a companion animal guardian and transmitting the message to the following communication server; A communication server that synthesizes the companion animal guardian's voice based on the message and transmits the synthesized voice information to the following signal relaying device; and a signal relaying device that outputs the synthesized voice information through a built-in speaker.
  • the communication server includes a communication unit that receives a signal related to a companion animal's desire and transmits an interactive message of the companion animal and an image of the companion animal to a terminal of the guardian of the companion animal; a control unit that selects an interactive message of the companion animal and an image of the companion animal to be transmitted to the guardian terminal based on the signal related to the desire of the companion animal; and a storage unit that stores an interactive message of the companion animal and an image of the companion animal that can be selected by the control unit.
  • the communication server may further include a learning unit generating a companion animal expression model using the companion animal image.
  • the companion animal expression model refers to a model that infers the expression of the companion animal shown in the image from the image of the companion animal.
  • a method of operating a messenger interface in a communication platform dedicated to companion animals includes: generating a specific event that causes at least one of an interactive message of the companion animal or an image of the companion animal to be displayed in a chat window with the companion animal; determining a situation through information collection upon occurrence of the event; selecting at least one of a message and an image to be output to the chat window according to the determined situation; and outputting the selection result to the chat window.
  • the image of the companion animal may be an expression of the companion animal mapped to the determined situation.
  • the specific event may be at least one of: reception of a companion animal input signal; elapse of a certain time without guardian feedback after the companion animal input signal; reception of an advertisement; reception of weather information; and arrival of a specific time.
  • the information collected to determine the situation may be at least one of: the type of companion animal input signal; the elapsed time after a message is output according to the companion animal input signal; the type of advertisement; a change in the weather forecast; the location information of the companion animal guardian terminal; and a message sent by the companion animal guardian to the chat window.
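The event-to-output decision described above can be illustrated with a small dispatcher. The event names follow the patent's list, but the mapping rules, field names, and image filenames are illustrative assumptions rather than the claimed implementation.

```python
def handle_event(event: dict) -> dict:
    """Decide what to post in the chat window for a given event.
    Returns a dict with the message and the companion-animal image
    (mapped to the determined situation), either of which may be None."""
    kind = event["type"]
    if kind == "input_signal":
        # A button press arrived: echo the request with an expectant image.
        return {"message": f"Request: {event['signal']}", "image": "expectant.jpg"}
    if kind == "no_feedback_timeout":
        # The guardian has not responded for a while.
        return {"message": "Still waiting for a reply...", "image": "sad.jpg"}
    if kind == "weather_update" and event.get("forecast") == "rain":
        return {"message": "It's going to rain - walk me early?", "image": "curious.jpg"}
    if kind == "advertisement":
        return {"message": event.get("ad_text", ""), "image": None}
    # Unknown events produce no chat output.
    return {"message": None, "image": None}
```

A production platform would layer the learned models on top of this dispatch, but the event-driven structure stays the same.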
  • a method for executing facial expression analysis of a communication platform dedicated to companion animals includes selecting a companion animal to be subjected to facial expression analysis; selecting an image of the companion animal; Presenting a plurality of options regarding the expression of the companion animal shown in the image of the companion animal; and outputting, on the screen of the terminal of the user, whether the facial expression described in the option selected by the user of the facial expression analysis service matches the facial expression label previously assigned to the image of the companion animal.
  • a step of executing a facial expression model for the companion animal may be further included prior to the step of presenting the plurality of options.
  • an execution result of the facial expression model may be reflected on the options.
  • the expression model refers to a model that infers the expression of the companion animal shown in the image from the image of the companion animal.
  • the step of aggregating the selection result of the expression of one or more users for the companion animal image and outputting the aggregated result to the screen may be further included.
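The grading and aggregation steps of the facial expression analysis above can be sketched as follows; the function names and the percentage-based presentation are assumptions, chosen only to make the flow concrete.

```python
from collections import Counter


def grade_choice(chosen: str, true_label: str) -> bool:
    """Check whether the expression the user selected matches the
    facial-expression label previously assigned to the image."""
    return chosen == true_label


def aggregate(selections: list) -> dict:
    """Tally the expression choices of one or more users for a single
    companion-animal image and return per-label percentages for display."""
    counts = Counter(selections)
    total = len(selections)
    return {label: round(100 * n / total) for label, n in counts.items()}
```

For five users choosing `["happy", "happy", "happy", "sleepy", "sleepy"]`, the aggregate shown on screen would be 60% "happy" and 40% "sleepy".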
  • the companion animal emoticon generation method includes selecting a companion animal image to be converted into an emoticon from a gallery in which the companion animal image is uploaded; generating an emoticon by processing the selected image; and assigning a tag to the emoticon.
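The three steps of the emoticon generation method (select, process, tag) can be sketched as below. The `Emoticon` type and the placeholder "processing" transform are assumptions; actual processing (cropping, stylizing, etc.) is not specified here.

```python
from dataclasses import dataclass, field


@dataclass
class Emoticon:
    source_image: str                       # gallery image the emoticon came from
    processed: str                          # stand-in for the processed image
    tags: list = field(default_factory=list)


def create_emoticon(gallery: list, index: int, tags: list) -> Emoticon:
    """Select an image from the uploaded gallery, process it (placeholder
    transform here), and assign the given tags to the result."""
    image = gallery[index]                         # 1. select from gallery
    processed = f"emoticon({image})"               # 2. process the image
    return Emoticon(image, processed, list(tags))  # 3. assign tags
```

Tags make the emoticon searchable later, e.g. when the messenger interface picks an image matching a determined situation.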
  • an image containing the expression of the companion animal is inserted into the chat, improving the bond and intimacy felt by the guardian with the companion animal.
  • an interface capable of analyzing a companion animal's expression in a companion animal SNS account using a companion animal image is configured, thereby increasing exchanges between companion animal guardians.
  • FIG. 1 is a block diagram showing the configuration of a communication service system according to an embodiment of the present invention
  • FIG. 2 is a block diagram showing the configuration of a communication server of a communication service system according to an embodiment of the present invention
  • FIG. 3 is a block diagram showing the configuration of a communication platform installed in a companion animal guardian terminal according to an embodiment of the present invention.
  • FIG. 4 is a reference diagram illustrating an example of a screen provided by a messenger interface according to an embodiment of the present invention
  • FIG. 5 is a reference diagram illustrating an example of a screen provided by a messenger interface according to an embodiment of the present invention
  • FIG. 6 is a reference diagram illustrating an example of a screen provided by a facial expression analysis interface according to an embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating a method of using an image of a companion animal in a messenger interface of a communication platform according to an embodiment of the present invention.
  • FIG. 8 is a flowchart for explaining a method of performing facial expression analysis in a communication platform according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a method of tagging after converting a companion animal image into an emoticon according to an embodiment of the present invention.
  • expressions such as “A or B,” “at least one of A and/and B,” or “one or more of A or/and B” may include all possible combinations of the items listed together.
  • the term 'at least one' includes both the singular and the plural, and even where the term 'at least one' is not used, each component may be provided in the singular or in the plural; it will be self-evident that either may be meant. In addition, each component may be provided in singular or plural numbers depending on the embodiment.
  • FIG. 1 is a block diagram showing the configuration of a communication service system according to an embodiment of the present invention.
  • the communication service system using facial expressions learned from companion animal images includes a companion animal input device 100, a signal relay device 200, a communication server 300, and a companion animal guardian terminal 400.
  • the communication service system receives a companion animal's desire through a button of the companion animal input device 100 and transmits it to a communication platform dedicated to companion animals (hereinafter 'communication platform'; 410, see FIG. 5) in the companion animal guardian terminal 400, or to an existing messenger application, so that the guardian can communicate with the companion animal.
  • Companion animals can be trained to express their desires using the buttons of the companion animal input device 100. Trained companion animals express their needs with a push button or touch button, and Internet of Things technology is introduced so that the guardian can check and respond even when absent from the house.
  • the input device 100 for a companion animal may be implemented as a button in a different form to express the desire of a companion animal through training.
  • the companion animal's desire information may be collected through the companion animal input device 100 .
  • the collected desire information can be transmitted to the companion animal guardian, and the companion animal guardian can recognize this and create a communication window through which the companion animal can be remotely fed back to the companion animal by voice or photo.
  • the communication server 300 converts the button input into data, refines and processes it, so that the communication platform 410 of the companion animal guardian terminal 400 can communicate as if the companion animal were having a conversation in a real language.
  • the button of the companion animal input device 100 may be a touch button or a press button.
  • the button is composed of a plurality of buttons, each of which can be matched with different request contents.
  • each button can be matched with content that a companion animal can request from its guardian, such as meal service, play equipment operation, video call, owner's voice, and operation of home appliances (TV, audio, air conditioner, fan).
  • the buttons are preferably implemented in a number, shape, and color that can be selected by a companion animal through training. The number, shape and color of the buttons can be variously modified.
  • for example, the ⁇ button may be pressed to request food, the ⁇ button to operate play equipment, the ⁇ button to make a video call with the owner, and the ⁇ button to turn on the air conditioner or TV; in this way, buttons can be matched to situations the companion animal can recognize.
  • words used in real life, such as 'mom', 'owner', 'let's do', 'please', and 'give me', may also be matched.
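The button-to-request matching above amounts to a small lookup; this sketch makes it concrete. The button identifiers and request names are illustrative assumptions, since the text notes that the number, shape, and color of the buttons can be varied through training.

```python
# Illustrative pairing of buttons to request contents a companion animal
# can make of its guardian (one entry per physical button).
BUTTON_REQUESTS = {
    1: "meal service",
    2: "play equipment operation",
    3: "video call",
    4: "appliance on/off",
}


def decode_press(button_id: int) -> str:
    """Translate a pressed button into its matched request content.
    Unknown buttons are reported rather than silently dropped."""
    return BUTTON_REQUESTS.get(button_id, "unrecognized button")
```

Keeping the mapping in data (rather than code) lets the guardian add or remap buttons from the communication platform without firmware changes.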
  • a signal (hereinafter abbreviated as 'companion animal input signal') is generated from the companion animal input device 100.
  • the mediation device 200 transmits a companion animal input signal to the communication server 300 .
  • the companion animal input device 100 transmits a companion animal input signal to the companion animal guardian terminal 400 via the signal mediation device 200 and the communication server 300 .
  • the communication server 300 refines and delivers content according to the companion animal input signal to the communication platform 410 of the companion animal guardian terminal 400 .
  • the communication platform 410 receives the above contents and provides the companion animal guardian with a messenger screen enabling interactive communication between the companion animal guardian and the companion animal through the messenger interface 411 .
  • interactive communication may be implemented by interlocking with an existing chatting-only app operated in the companion animal guardian terminal 400 .
  • a speaker may be included so that recorded contents or stored special sounds are reproduced when a button of the companion animal input device 100 is pressed.
  • a light emitting device such as an LED may be embedded in the button so that the corresponding light emitting device is turned on when the button is pressed.
  • the companion animal input device 100 is connected to the signal relay device 200 through a local area network.
  • the companion animal input device 100 includes a short-range wireless communication module such as a Bluetooth module to transmit a signal detecting that the corresponding button is pressed to the signal intermediary device 200 when the corresponding button is pressed.
  • the signal broker 200 transfers the input signal transmitted from the companion animal input device 100 to the communication server 300 .
  • the signal broker 200 transfers the input signal transmitted from the companion animal input device 100 to the companion animal guardian terminal 400 via the communication server 300 .
  • the signal broker 200 includes a camera and a communication module.
  • the signal broker 200 may include at least one or more of various sensor modules capable of checking the state of a companion animal, such as an ultrasonic sensor, a sound sensor, an NFC sensor, and an RFID sensor.
  • the communication module of the signal relaying device 200 performs a short-distance wireless communication function such as a wireless LAN, Bluetooth, or an infrared sensor.
  • the communication module also includes a technical configuration capable of transmitting a remote-controller signal to home appliances (peripheral devices) that include an IR sensor, such as a TV or an air conditioner.
  • the signal mediation device 200 may be connected to the companion animal input device 100 by wire (eg, LAN).
  • the signal broker 200 may transmit an image captured by a camera to the communication server 300 or the companion animal guardian terminal 400.
  • the companion animal guardian terminal 400 may communicate with a plurality of signal brokers 200 .
  • the signal broker 200 includes a communication module that performs network communication with the communication server 300 .
  • the signal relay device 200 transmits the input signal of the companion animal input device 100 to the communication server 300, and transmits the remote control signal received through the communication server 300 to home appliances, that is, peripheral devices capable of satisfying appetite or play.
  • the 'remote control signal' of the present invention is a signal for controlling the operation of a peripheral device matched to a companion animal input signal, and is used in the same meaning thereafter.
  • the remote control signal includes a control signal capable of controlling the operation of various peripheral devices, such as a control signal for feeding the feeder, a signal for controlling the snack feeder to dispense snacks, and an operation signal for a ball game machine.
  • the signal broker 200 controls the operation of peripheral devices according to the remote control signal.
  • Peripheral devices include various devices that can be operated by remote control signals and can satisfy the needs of companion animals.
  • Peripheral devices include a feed feeder or snack feeder to satisfy the companion animal's appetite, a laser toy or ball machine to satisfy its need for play, and a video call device to satisfy its need to find its guardian.
  • it is not limited thereto, and is interpreted to cover various devices and devices that operate by remote control signals and can satisfy the needs of companion animals.
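The relay step, resolving a desire signal into a remote control command for a matched peripheral, can be sketched as follows. The signal codes, device names, and actions are illustrative assumptions; `send` stands in for the relay device's IR/RF transmitter.

```python
# Hypothetical mapping from a desire signal to a (device, action) remote
# control command for the matched peripheral device.
REMOTE_COMMANDS = {
    "FOOD": ("feed_feeder", "dispense"),
    "SNACK": ("snack_feeder", "dispense"),
    "PLAY": ("ball_machine", "start"),
    "CALL": ("video_call_device", "ring"),
}


def dispatch(signal: str, send):
    """Resolve a companion animal input signal into a remote control
    command and hand it to `send` (the transmitter). Returns the command
    that was sent, or None if the signal has no matched peripheral."""
    command = REMOTE_COMMANDS.get(signal)
    if command is None:
        return None
    send(*command)
    return command
```

Injecting `send` keeps the mapping testable and lets the same logic drive IR remotes, IoT APIs, or a speaker-based feedback path.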
  • the peripheral device may be implemented to provide feedback on the companion animal's desire by outputting the companion animal guardian's voice through a speaker mounted in the signal broker 200 or a physically separate speaker.
  • the video call device can be physically implemented as a function mounted in the signal brokering device 200.
  • the communication service system may further include a wearable biometric module (for example, in the form of a necklace); the wearable biometric module is worn so as to come into contact with the companion animal's torso and includes at least one bioelectrode sensor for measuring a biosignal.
  • the biosignal may be one of signals related to electromyography, bone conduction, pulse rate (heart rate), body temperature, and respiratory rate, but is not limited thereto.
  • the wearable biometric module may include a microphone for collecting the companion animal's voice in real time.
  • the wearable biometric module can transmit the biosignal and voice information of the companion animal to the signal intermediary device 200 using a local area network.
  • the signal relay device 200 may transmit the received biosignal and voice information to the communication server 300, or to the companion animal guardian terminal 400 via the communication server 300.
  • the signal broker 200 may include a speaker for outputting voice information of a companion animal guardian.
  • the signal broker 200 may output voice information of a companion animal guardian through a speaker according to a remote control signal received from the communication server 300 .
  • the voice information of the companion animal guardian may be positive language such as 'good job', 'good', and 'nice', or negative language such as 'no' and 'don't do it'.
  • the voice information of the companion animal guardian may be voice information synthesized according to a message input by the companion animal guardian through the messenger interface 411 of the communication platform 410 installed in the companion animal guardian terminal 400 .
  • the communication server 300 receives the message, synthesizes the companion animal guardian's voice based on the message, and transmits the synthesized voice information to the signal relay device 200, which may output the synthesized voice through its speaker.
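The message-to-voice flow above can be sketched with injected stand-ins. `synthesize` and `play` are assumptions representing a TTS engine on the communication server and the speaker of the signal relay device; no real TTS API is implied.

```python
def relay_guardian_message(message: str, synthesize, play) -> bytes:
    """Server-side flow from the text: synthesize the guardian's voice
    from a chat message, forward the audio to the signal relay device,
    and play it through the device's built-in speaker."""
    audio = synthesize(message)   # communication server: voice synthesis
    play(audio)                   # signal relay device: speaker output
    return audio


def fake_tts(text: str) -> bytes:
    """Trivial fake TTS for demonstration; a deployed system would call a
    real speech-synthesis engine trained on the guardian's voice."""
    return f"<voice:{text}>".encode()
```

With this shape, positive or negative feedback phrases ('good job', 'no') and free-form chat messages travel the same path.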
  • the signal broker 200 communicates with the communication server 300 through a network.
  • the network refers to a connection structure capable of exchanging information between nodes such as a plurality of terminals and servers. Examples of such networks include RF, 3GPP (3rd Generation Partnership Project) networks, LTE (Long Term Evolution) networks, 5GPP (5th Generation Partnership Project) networks, WIMAX (Worldwide Interoperability for Microwave Access) networks, the Internet, LAN (Local Area Network), Wireless LAN (Wireless Local Area Network), WAN (Wide Area Network), PAN (Personal Area Network), Bluetooth networks, NFC networks, satellite broadcasting networks, analog broadcasting networks, and DMB (Digital Multimedia Broadcasting) networks, but are not limited thereto.
  • the signal broker 200 may turn on/off the operation of peripheral devices according to the input signal transmitted from the companion animal input device 100 .
  • the peripheral devices may be IoT devices capable of communication control such as electric fans, air conditioners, TVs, food feeders, snack feeders, and play equipment.
  • the signal broker 200 and peripheral devices are connected through a home network.
  • after registering the peripheral device information input from the communication platform 410 of the companion animal guardian terminal 400, the signal broker 200 can set, modify, and add the peripherals operable by input signals of the companion animal input device 100, based on the peripheral device operation setting information specified in the communication platform 410.
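As a sketch of this set/modify/add flow, the button-to-peripheral mapping could be held in a small registry on the signal broker side. This is an illustrative sketch only; the class and method names are assumptions, not part of the disclosure.

```python
class PeripheralRegistry:
    """Hypothetical registry mapping companion-animal input buttons to the
    peripheral devices operable by the signal broker 200."""

    def __init__(self):
        self.mapping = {}  # button id -> peripheral device name

    def set_peripheral(self, button: str, device: str) -> None:
        # Setting an existing button modifies its mapping; a new button adds one.
        self.mapping[button] = device

    def device_for(self, button: str):
        return self.mapping.get(button)


registry = PeripheralRegistry()
registry.set_peripheral("meal", "food feeder")   # add
registry.set_peripheral("play", "play equipment")
registry.set_peripheral("meal", "snack feeder")  # modify
print(registry.device_for("meal"))  # snack feeder
```

In this sketch, a guardian-side settings change on the communication platform 410 would simply translate into `set_peripheral` calls against the broker's registry.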
  • the signal broker 200 may further include a camera, a speaker, and a recorder, and may provide a camera-captured image/video or recorded sound to the communication server 300 in the form of a file or streaming.
  • the signal broker 200 may transmit, to the communication unit 310 of the communication server 300, an image captured at the moment the companion animal presses a button of the companion animal input device 100, or a video of a predetermined duration immediately before or after the button press.
  • the companion animal input device 100 or the signal relaying device 200 may include a chip recognition unit capable of recognizing an internal chip or an external chip for recognizing companion animals.
  • the companion animal input device 100 transmits an input signal and a chip recognition signal to the signal relaying device 200 .
  • the communication server 300 may receive the chip recognition signal together with the input signal from the signal broker 200 to identify which companion animal generated the input signal.
  • the communication server 300 may transmit the input signal and identification information of the companion animal to the companion animal guardian terminal 400 as needed.
  • RFID or NFC technology may be used as the chip recognition technology for companion animal identification, but is not limited thereto.
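The chip-based identification flow above can be sketched as a lookup that attaches an animal identity to each incoming input signal. The registry contents, field names, and chip code format below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical registry of chips registered by the guardian
# (chip code -> companion animal profile).
CHIP_REGISTRY = {
    "RFID-0001": {"animal_id": "dog-coco", "name": "Coco"},
    "RFID-0002": {"animal_id": "cat-nabi", "name": "Nabi"},
}


def identify_companion_animal(input_signal: dict) -> dict:
    """Attach identification information to an input signal that arrives
    together with a chip recognition signal, as described for the
    communication server 300."""
    profile = CHIP_REGISTRY.get(input_signal.get("chip_signal"))
    return {
        "button": input_signal["button"],
        "animal_id": profile["animal_id"] if profile else "unknown",
        "animal_name": profile["name"] if profile else "unregistered",
    }


signal = {"button": "meal", "chip_signal": "RFID-0001"}
print(identify_companion_animal(signal))
```

The identified record is what the server could then forward to the companion animal guardian terminal 400 as "input signal plus identification information".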
  • the companion animal input device 100 or the signal relaying device 200 may include a nose-print recognition unit capable of recognizing the nose print (muzzle pattern) of a companion animal.
  • the nose-print recognition unit may generate an identification signal for a companion animal by using artificial intelligence, such as deep learning, based on an image of the companion animal captured by a camera. If the companion animal input device 100 includes the nose-print recognition unit, the companion animal input device 100 transmits the identification signal to the signal broker 200.
  • the signal broker 200 transmits the identification signal transmitted from the companion animal input device 100 or the identification signal generated by itself to the communication server 300 .
  • the communication server 300 may receive the identification signal along with the input signal from the signal broker 200 to identify which companion animal generated the input signal.
  • the communication server 300 may transmit the input signal and identification information of the companion animal to the companion animal guardian terminal 400 .
  • the communication server 300 is implemented as a service server including a wireless LAN communication module for exchanging data with the signal broker 200 and the companion animal guardian terminal 400.
  • the communication server 300 performs machine learning based on the time at which a companion animal input signal is received, the number of transmissions of the input signal over a predetermined time, and the language content of the companion animal guardian, and may extract a natural-language call and request message and provide it to the companion animal guardian terminal 400.
  • the language content of the companion animal guardian may include, but is not limited to, chat content in messengers such as KakaoTalk, content posted on SNS (e.g., Twitter, Facebook, or Instagram), content entered in the chat window with the companion animal, and input voice information.
  • the communication server 300 transmits a video (image, video) or text (message) to the companion animal guardian terminal 400 in a chatting format.
  • when the communication server 300 receives a plurality of identical companion animal input signals through the communication unit 310 within a predetermined time (i.e., when a specific button of the companion animal input device 100 is repeatedly pressed), contents may be selectively sent to the companion animal guardian terminal 400 based on the current time information and the previous execution data of the peripheral devices.
  • the companion animal guardian can set the number of uses and the duration of use of peripheral devices related to relieving the companion animal's desires, and can disable the audio output and LED lighting so that they do not operate even if the desire button is pressed. For example, when the desire button is repeatedly pressed within a predetermined time, the operation may be deactivated.
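The repeated-press deactivation described above amounts to rate limiting the desire button. A minimal sketch, assuming a sliding time window and press limit chosen by the guardian (both values below are illustrative):

```python
class DesireButtonLimiter:
    """Deactivate a desire button's peripheral action when it is pressed
    repeatedly within a set window. The window length and press limit are
    guardian-configurable assumptions, not values from the text."""

    def __init__(self, max_presses: int, window_sec: float):
        self.max_presses = max_presses
        self.window_sec = window_sec
        self.presses = []  # timestamps of recent presses

    def register_press(self, now: float) -> bool:
        """Return True if the peripheral should operate, False if the
        repeated-press limit deactivates the operation."""
        # Keep only presses still inside the sliding window.
        self.presses = [t for t in self.presses if now - t < self.window_sec]
        self.presses.append(now)
        return len(self.presses) <= self.max_presses


limiter = DesireButtonLimiter(max_presses=2, window_sec=60.0)
print(limiter.register_press(0.0))   # True  (1st press operates)
print(limiter.register_press(10.0))  # True  (2nd press operates)
print(limiter.register_press(20.0))  # False (3rd press within 60 s: deactivated)
```

The same structure could gate the audio output and LED lighting independently of the peripheral action.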
  • the communication server 300 may be implemented to receive feedback about the companion animal's desire relief from a peripheral device and transmit the feedback to the companion animal guardian terminal 400 .
  • the communication server 300 may generate companion animal health care information based on the companion animal's biometric information, behavior pattern information, and input information of the input pad, and provide it to the companion animal guardian terminal 400 .
  • the communication server 300 may receive peripheral device operation information by time zone transmitted from the companion animal guardian terminal 400 and provide statistical information on the companion animal's peripheral device usage rate by time zone and daily to the companion animal guardian terminal 400. there is.
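The per-time-zone usage statistic described above can be sketched as a simple aggregation of the operation log into hourly buckets. The log format (device name plus ISO timestamp) is an assumption for illustration.

```python
from collections import Counter
from datetime import datetime


def usage_rate_by_hour(operation_log):
    """Aggregate peripheral-device operations into per-(device, hour)
    usage rates, a minimal sketch of the 'usage rate by time zone'
    statistic provided to the guardian terminal."""
    counts = Counter()
    for device, iso_ts in operation_log:
        hour = datetime.fromisoformat(iso_ts).hour
        counts[(device, hour)] += 1
    total = sum(counts.values())
    return {key: n / total for key, n in counts.items()}


log = [
    ("feeder", "2021-11-26T08:05:00"),
    ("feeder", "2021-11-26T08:40:00"),
    ("fan", "2021-11-26T14:10:00"),
    ("feeder", "2021-11-27T08:12:00"),
]
rates = usage_rate_by_hour(log)
print(rates[("feeder", 8)])  # 0.75
```

Daily statistics would follow the same pattern with `datetime.fromisoformat(iso_ts).date()` as the bucket key.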
  • the companion animal guardian terminal 400 is owned by a companion animal guardian.
  • the companion animal-specific communication platform 410, mounted on and executed in the companion animal guardian terminal 400, provides a chat function with the companion animal, a function to analyze the companion animal's expression, and a function to transmit voice to, or perform a video call through, the signal broker 200. Also, the communication platform 410 may provide a screen for selecting an emoticon created based on an image of the companion animal.
  • the companion animal guardian transmits a remote control signal corresponding to a companion animal input signal (a signal expressing the companion animal's desire) through the communication platform 410 executed in the companion animal guardian terminal 400.
  • the communication platform 410 transmits the remote control signal to the communication unit 310 of the communication server 300.
  • the communication unit 310 receives the remote control signal and transmits the remote control signal to the signal relaying device 200 .
  • the communication platform 410 may generate and transmit a remote control signal to a peripheral device provided in the home.
  • the companion animal guardian may input a remote control signal not only through the communication platform 410 running on the companion animal guardian terminal 400 but also through various other applications capable of transmitting remote control signals to peripheral devices.
  • in the case of peripheral devices that are not connected to the signal broker 200, the companion animal guardian may also control, through the communication platform 410 executed in the companion animal guardian terminal 400, the peripheral devices that correspond to the needs of the companion animal.
  • the communication platform 410 may include applications, programs, web pages, and the like that are installed or driven in the companion animal guardian terminal 400 through a network.
  • the communication server 300 is integrally mounted on the companion animal guardian terminal 400 .
  • the function of the communication server 300 may be performed through the communication platform 410 installed in the companion animal guardian terminal 400. Accordingly, the data usage costs associated with physically operating a separate service server can be saved.
  • the companion animal guardian terminal 400 may be any of various terminals, including a smart phone, a portable terminal, a mobile terminal, a foldable terminal, a personal computer, a laptop computer, a slate PC, a tablet PC, an ultrabook, a wearable device (e.g., a smartwatch), a WiBro terminal, an IPTV (Internet Protocol Television) terminal, a smart TV, a digital broadcasting terminal, an AVN (Audio Video Navigation) terminal, an A/V (Audio/Video) system, and a flexible terminal.
  • FIG. 2 is a block diagram showing the configuration of a communication server of a communication service system according to an embodiment of the present invention.
  • the communication server 300 of the communication service system includes a communication unit 310, a learning unit 320, a control unit 330, and a storage unit 340. do. And the communication server 300 may further include an emoticon generating unit 350 . In addition, the communication server 300 may further include a member management unit, a peripheral device management unit, an advertisement information providing unit, and a platform providing unit.
  • the communication unit 310 transmits/receives data with the signal broker 200. Also, the communication unit 310 transmits/receives data with the communication platform 410. Data exchanged between the communication unit 310 and the signal broker 200 and the communication platform 410 includes all types of data such as images, videos, and text.
  • the communication unit 310 receives a companion animal input signal from the signal broker 200 and transmits the companion animal input signal to the communication platform 410 operated in the companion animal guardian terminal 400 .
  • the communication platform 410 converts the input signal of the companion animal into spoken language through the messenger interface 411 and delivers it to the companion animal guardian.
  • the companion animal guardian can grasp the needs of the companion animal through the messenger interface 411 provided by the communication platform 410 .
  • the communication unit 310 may transmit a remote control signal for a peripheral device that matches the companion animal input signal to the signal relaying device 200 .
  • the communication unit 310 receives the chip identification signal together with the companion animal input signal from the signal broker 200 .
  • the communication unit 310 may pass the chip recognition signal, as an identification code of the companion animal, to the interior of the communication server 300, or may convert the chip recognition signal into a separate companion animal identification code and pass the code internally.
  • the learning unit 320, the control unit 330, and the storage unit 340 classify models or data based on the identification code of the companion animal. For example, when creating/updating expression models/behavior models, generating/extracting messages, or storing/extracting images/videos, data is classified and processed using the identification code of the companion animal.
  • the communication unit 310 checks whether a control signal can be transmitted directly to the peripheral device. If direct transmission is possible, the communication unit 310 transfers the remote control signal to the peripheral device. If direct transmission is not possible, an application that performs the remote control function of the peripheral device, separately installed in the companion animal guardian terminal 400, is first started, and the remote control signal is then transmitted to the peripheral device through that application.
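The direct-control-or-fallback decision above can be sketched as follows. The capability flag, app name field, and launcher callable are hypothetical names introduced for illustration.

```python
def send_remote_control(peripheral: dict, signal: str, launch_app) -> tuple:
    """Sketch of the fallback path: try direct control first; otherwise
    start the peripheral's own remote-control application and relay the
    signal through it. `launch_app` stands in for starting that app."""
    if peripheral.get("supports_direct_control"):
        # Direct path: the remote control signal goes straight to the device.
        return ("direct", signal)
    # Fallback path: start the separately installed remote-control app first.
    launch_app(peripheral["app_name"])
    return ("via_app", signal)


started_apps = []
fan = {"supports_direct_control": False, "app_name": "FanRemote"}
print(send_remote_control(fan, "ON", started_apps.append))  # ('via_app', 'ON')
```

A direct-controllable device (`supports_direct_control: True`) would return `("direct", signal)` without launching any app.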
  • the communication unit 310 may receive a message from the companion animal guardian from the companion animal guardian terminal 400 .
  • the communication unit 310 may receive commands for executing the expression models and behavior models of the communication platform 410, together with the images necessary for model execution (or image identification codes for extracting images from the image DB 342) and the videos necessary for model execution (or video identification codes for extracting videos from the video DB 343).
  • the communication unit 310 transfers the received contents to the model manager 324 .
  • the communication unit 310 may deliver, to the communication platform 410 of the companion animal guardian terminal 400, messages generated by the control unit 330, messages extracted from the message DB 341, images and videos of companion animals, emoticons generated by converting images of companion animals, and data for the analysis of companion animal expressions.
  • the communication platform 410 displays the message, image or video received from the communication unit 310 through the messenger interface 411.
  • the communication platform 410 may display the image of the companion animal transmitted from the communication unit 310 and the companion animal expression analysis data (name of expression) through the expression analysis interface 412 .
  • the communication platform 410 may display a video of the companion animal and companion animal behavior analysis data (action name) transmitted from the communication unit 310 through the behavior analysis interface 413 .
  • the communication platform 410 may display the emoticon received from the communication unit 310 through the emoticon interface 414 .
  • the communication unit 310 may receive an image or video of a companion animal or voice information of the companion animal from the signal broker 200 .
  • the communication unit 310 transfers the image/video/voice information to the controller 330.
  • the communication unit 310 may receive external advertisement information. When external advertisement information is received, the communication unit 310 transfers the information to the advertisement information providing unit or control unit 330 .
  • the communication unit 310 may receive the biosignal and voice information of the companion animal transmitted by the wearable biometric module via the signal intermediary device 200 .
  • the communication unit 310 transmits the biosignal and voice information to the peripheral device management unit and the control unit 330 .
  • the communication unit 310 may transmit companion animal identification information to the companion animal guardian terminal 400 .
  • the companion animal guardian may check which companion animal has pressed the button of the companion animal input device 100 according to the companion animal identification information.
  • the communication method supported by the communication unit 310 is not limited; data can be transmitted and received through short-range wireless communication between devices as well as through a communication network (e.g., a mobile communication network, the wired Internet, the wireless Internet, or a broadcasting network) that the network may include.
  • the learning unit 320 generates and stores a companion animal expression model (hereinafter, an expression model) or a companion animal behavior model (hereinafter, a behavior model) through artificial intelligence (machine learning), using the companion animal images selected by the companion animal guardian for model learning.
  • the facial expression model outputs the inference result of the companion animal's expression shown in the companion animal's image.
  • the behavior model outputs the inference result of the companion animal's behavior shown in the video about the companion animal. Details are described later.
  • the control unit 330 determines the situation of the companion animal or the companion animal guardian based on information such as the type of input signal, the time elapsed after the companion animal input signal, the type of received advertisement or weather information, the current time, schedule information stored in the companion animal guardian terminal 400, and the current location of the companion animal guardian.
  • the event refers to an event that causes the message interface 411 to display at least one of a message, an image, and an emoticon in a chat window with a companion animal.
  • Examples of the above events include reception of a companion animal input signal, elapse of a certain time without guardian feedback after a companion animal input signal, reception of an advertisement, reception of weather information, arrival of a specific time, a change in the biometric information of the companion animal, and detection of a specific behavior of the companion animal.
  • the event may occur when a companion animal input signal is received by the communication unit 310 and there is no feedback from the companion animal guardian on the messenger interface 411 for a predetermined period of time regarding the message expressing the companion animal's desire, or when weather or advertisement information is received by the communication unit 310 from an external server. However, it is not limited thereto.
  • the controller 330 determines at least one of a message, an image, and an emoticon (hereinafter referred to as 'message, etc.') to be output on the messenger interface 411 according to the determined situation. That is, the controller 330 generates a message according to the determined situation, extracts a message from the message DB 341, extracts an image of a companion animal from the image DB 342, or extracts an emoticon from the emoticon DB 344. can be extracted.
  • the control unit 330 may select a specific keyword as a result of the situation determination and extract a message corresponding to the keyword from the message DB 341 . If the generated or extracted message is singular, the control unit 330 selects the corresponding message as a message to be delivered to the communication platform 410 . When there are a plurality of generated or extracted messages, the control unit 330 may select a message to be delivered to the communication platform 410 at random or according to a predetermined rule.
  • the controller 330 may select an expression of a specific companion animal mapped to a specific keyword as a result of the situation determination, and extract an image of the companion animal corresponding to the expression from the image DB 342 . If the extracted image is singular, the controller 330 selects the corresponding image as an image to be transmitted to the communication platform 410 . When the number of extracted images is plural, the controller 330 may select an image to be transmitted to the communication platform 410 randomly or according to a predetermined rule.
  • the controller 330 may select an emoticon to be transmitted to the communication platform 410 along with or instead of the image according to settings.
  • the control unit 330 selects a specific tag mapped to a specific keyword as a result of the situation determination, and can extract an emoticon corresponding to the tag from the emoticon DB 344 according to a predetermined criterion.
  • a situation judgment logic according to event occurrence and a mapping table of keywords and expression names (or tags) derived as a result of situation judgment are stored in the internal storage of the controller 330 or the storage unit 340 .
  • the logic and mapping table may be updated periodically or randomly.
  • for example, the controller may generate a message such as "I am hungry", or may select the keyword 'meal' and extract the message "I am hungry" corresponding to that keyword from the message DB 341, and may extract from the image DB 342 an image of the companion animal corresponding to 'eagerness', the facial expression mapped to the keyword 'meal'.
  • for example, in the case of receiving an advertisement from outside that a dog cafe has opened near the companion animal guardian's home, the control unit 330 may generate a message such as "Hyung~ There's a dog cafe near my house! Let's go together!!", or may select the keyword 'dog cafe' and extract a message corresponding to that keyword from the message DB 341, and may extract from the image DB 342 an image of the companion animal corresponding to 'excited', the facial expression mapped to the keyword 'dog cafe'.
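The keyword-driven lookups above can be sketched as two mapping tables plus the random/single selection rule described for the control unit 330. The table contents are illustrative placeholders; the actual mapping tables are not disclosed in the text.

```python
import random

# Illustrative keyword-to-expression mapping table and message DB.
EXPRESSION_MAP = {"meal": "eagerness", "dog cafe": "excited"}
MESSAGE_DB = {"meal": ["I am hungry", "Feed me please"]}


def select_message(keyword: str, rng=random):
    """Pick one message for a keyword: return the single candidate as-is,
    or choose randomly among multiple candidates, mirroring the selection
    rule described for the control unit 330."""
    candidates = MESSAGE_DB.get(keyword, [])
    if not candidates:
        return None
    if len(candidates) == 1:
        return candidates[0]
    return rng.choice(candidates)


def expression_for(keyword: str):
    """Look up the facial expression mapped to a keyword, used to extract
    a matching companion-animal image from the image DB 342."""
    return EXPRESSION_MAP.get(keyword)


print(expression_for("meal"))  # eagerness
```

A "predetermined rule" variant would simply replace `rng.choice` with, say, a least-recently-sent selection.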
  • the control unit 330 transmits the message and the like determined in this way to the communication platform 410 of the companion animal guardian terminal 400 through the communication unit 310 .
  • the communication platform 410 displays the received message on the messenger interface 411.
  • the control unit 330 transmits the captured image, video, or audio information of the companion animal sent by the signal broker 200, through the communication unit 310, to the communication platform 410 of the companion animal guardian terminal 400.
  • the communication platform 410 may display at least one of the received image/video/voice information on the messenger interface 411 .
  • control unit 330 converts the companion animal input signal into a colloquial message and transmits it to the communication platform 410 through the communication unit 310 .
  • the communication platform 410 delivers the colloquial message to the companion animal guardian through the chatting function of the messenger interface 411 .
  • the controller 330 identifies keywords according to the input signal of the companion animal, and selects sentences including the keywords from conversational sentences pre-stored in the message DB 341.
  • the communication unit 310 transmits the sentence to the communication platform 410 .
  • the communication platform 410 delivers the sentence to the companion animal guardian through the chatting function of the messenger interface 411 .
  • the control unit 330 converts the input signal of the companion animal into a spoken message based on the captured image of the companion animal received from the camera.
  • the communication unit 310 delivers the colloquial message to the communication platform 410 .
  • the communication platform 410 delivers the colloquial message to the companion animal guardian through the chatting function of the messenger interface 411 .
  • for example, a colloquial message indicating a situation, such as "I'm done", can be delivered.
  • the communication platform 410 may convert the companion animal's input signal into spoken language and deliver it, or may transmit the status notification message without converting it into spoken language according to the setting of the companion animal guardian. That is, it is also possible to simply deliver a notification message in the form of a keyword or sentence that can recognize the motion state and desire state of the companion animal.
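The guardian-configurable choice between colloquial conversion and a plain keyword notification can be sketched as below. The phrasings and signal names are illustrative assumptions.

```python
# Illustrative colloquial phrasings per desire-signal type.
COLLOQUIAL = {
    "meal": "I'm hungry! Feed me!",
    "play": "Let's play together!",
}


def render_notification(signal_type: str, colloquial_mode: bool) -> str:
    """Render a companion-animal input signal either as a colloquial
    message or as a plain keyword notification, according to the
    guardian's setting on the communication platform 410."""
    if colloquial_mode and signal_type in COLLOQUIAL:
        return COLLOQUIAL[signal_type]
    # Plain form: a keyword notification of the desire state.
    return f"[notification] desire: {signal_type}"


print(render_notification("meal", colloquial_mode=True))   # I'm hungry! Feed me!
print(render_notification("meal", colloquial_mode=False))  # [notification] desire: meal
```

Either rendering would then be displayed through the chat function of the messenger interface 411.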
  • the control unit 330 performs a function of collecting the images, videos, and data necessary for the analysis, to be transmitted to the communication platform 410 through the communication unit 310.
  • the control unit 330 may randomly extract images or videos to be used for the analysis from the image DB 342 or the video DB 343 and transmit them to the communication platform 410 through the communication unit 310.
  • the control unit 330 may extract an image or video designated by the companion animal guardian to be used for the analysis from the image DB 342 or the video DB 343 and transmit it to the communication platform 410 through the communication unit 310.
  • control unit 330 transmits the image or video to be used for the analysis to the model manager 324 and receives the inference result of the expression model or behavior model from the model manager 324 .
  • the controller 330 selects facial expressions/actions inferred to be highly probable in the model within a predetermined number range, and creates options by adding arbitrary facial expressions/actions.
  • the number of options is not limited.
  • for example, when the expression model's inference on the image of the companion animal to be used for analysis yields 'embarrassment (60%)' as the most probable expression and 'fear (30%)' as the next most probable, the control unit 330 may randomly add the expression 'happy' to create three options.
  • the control unit 330 transmits the created options to the communication platform 410 through the communication unit 310 .
  • the facial expression analysis interface 412 or the behavior analysis interface 413 outputs the options on the screen so that the companion animal guardian or user can select one. If the expression/behavior selected by the companion animal guardian is inconsistent with the inference result of the expression model/behavior model, the control unit 330 takes this as feedback and delivers it to the expression learning module 322 or the behavior learning module 323 so that the expression model or behavior model can be retrained.
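The option-building step above (top-N inferred expressions plus an arbitrary distractor) can be sketched as follows, using the 'embarrassment/fear + happy' example. The label set and function names are illustrative.

```python
import random


def build_options(model_probs: dict, top_n: int, all_labels, rng=random):
    """Build the guardian-facing choice set: take the top-N expressions by
    inferred probability, then append one arbitrary distractor label not
    already among them."""
    ranked = sorted(model_probs, key=model_probs.get, reverse=True)[:top_n]
    distractors = [label for label in all_labels if label not in ranked]
    if distractors:
        ranked.append(rng.choice(distractors))
    return ranked


probs = {"embarrassment": 0.60, "fear": 0.30, "anger": 0.05}
labels = ["embarrassment", "fear", "anger", "happy"]
print(build_options(probs, top_n=2, all_labels=labels))
```

When the guardian's selection disagrees with `ranked[0]`, that (image, selected label) pair is exactly the feedback sample the learning modules would use for retraining.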
  • the control unit 330 performs an operation to identify the companion animal that pressed a button based on a captured video of the companion animal, or provides the captured video to the companion animal guardian terminal 400 so that the companion animal guardian can check it directly.
  • the companion animal input device 100 or the signal relaying device 200 may include a chip recognition unit capable of recognizing an internal chip or an external chip for recognizing companion animals.
  • when the chip recognition signal is received, the communication unit 310 transmits the signal to the control unit 330.
  • the control unit 330 may generate companion animal identification information from the chip recognition signal and transmit it to the companion animal guardian terminal 400 through the communication unit 310 .
  • the storage unit 340 stores a message that can be displayed on a chat screen provided by the messenger interface 411 and an image of a companion animal.
  • the storage unit 340 may store a video of a companion animal that may be displayed on the messenger interface 411 .
  • the storage unit 340 may classify data to be stored for each identification code of the companion animal.
  • the storage unit 340 stores images or videos of companion animals uploaded to the gallery of the communication platform 410 by companion animal guardians.
  • An image or video of a companion animal stored in the storage unit 340 may be labeled with a corresponding expression or behavior of the companion animal.
  • the label may be generated by a facial expression model stored by the learning unit 320 or directly input by a companion animal guardian.
  • the storage unit 340 stores the emoticons generated by the emoticon generating unit 350 .
  • the emoticon is an emoticon created based on an image of a companion animal.
  • the storage unit 340 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), magnetic memory, a magnetic disk, an optical disk, RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and PROM (Programmable Read-Only Memory).
  • the storage unit 340 stores a program for grasping a companion animal's desire according to a companion animal input signal, a program necessary for driving the communication platform 410 operated in the companion animal guardian terminal 400, a program necessary for transferring the remote control signal received from the companion animal guardian terminal 400 to the signal broker 200 matched with the companion animal guardian, and the data necessary to drive these programs.
  • the emoticon generating unit 350 creates an emoticon using an image of a companion animal stored in the image DB 342 .
  • the image of the companion animal may be an image designated by the companion animal guardian through the emoticon interface 414 .
  • the emoticon generator 350 may reduce the image of the companion animal or extract a specific area to generate an emoticon to have a predetermined size suitable for use on a chatting screen or an SNS screen.
  • the emoticon generating unit 350 may generate an emoticon by converting an image of a companion animal using a deep learning technique.
  • a generative adversarial network (GAN) may be exemplified as such a deep learning technique, but the technique is not limited thereto.
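The resize step of emoticon generation (reducing the image to a predetermined size suitable for a chat or SNS screen) can be sketched without any imaging library as nearest-neighbour downscaling over a 2D pixel grid. A real implementation would use an imaging library, and possibly a GAN for style conversion as the text notes; this is a minimal sketch only.

```python
def make_emoticon(pixels, out_w: int, out_h: int):
    """Nearest-neighbour downscale of an image given as a 2D list of
    pixel values, standing in for the reduce-to-emoticon-size step."""
    in_h, in_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]


# 4x4 test "image" whose pixel values encode their position.
image = [[r * 4 + c for c in range(4)] for r in range(4)]
print(make_emoticon(image, 2, 2))  # [[0, 2], [8, 10]]
```

Extracting a specific area first (e.g., the face region) would just slice `pixels` before downscaling.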
  • when a facial expression label has been given to the companion animal image extracted from the image DB 342, the emoticon generating unit 350 transfers the label, as an emoticon tag, to the emoticon DB 344 together with the emoticon.
  • the emoticon DB 344 matches emoticon tags and emoticons and stores them together.
  • otherwise, the emoticon generating unit 350 causes the model manager 324 to determine the expression of the companion animal shown in the image, tags the determined expression as the emoticon tag, and delivers it to the emoticon DB 344 together with the emoticon.
  • the user may arbitrarily assign emoticon tags through the emoticon interface 414, regardless of whether a facial expression label has been assigned to the image of the companion animal.
  • the member management unit of the communication server 300 registers and manages member information using the IoT-based companion animal management service.
  • the member information includes essential information (name, contact information) required for membership registration.
  • the member information may include the ID of the companion animal input device 100, the ID of the signal broker 200 used by the member, and information on peripheral devices that are interlocked.
  • the peripheral device management unit of the communication server 300 monitors information and control information collected by the signal broker 200 registered by the member.
  • the information collected by the signal broker 200 includes the input signal transmitted from the companion animal input device 100 to the signal broker 200.
  • the peripheral device management unit may monitor the change information obtained by changing the peripheral device linkage setting value (control target) of the signal mediation device 200 in the companion animal guardian terminal 400 .
  • the peripheral device management unit monitors operation information of peripheral devices controlled by the signal broker 200 and control information for controlling the peripheral devices, and records (registers) and stores the monitored information.
  • the operation information of the peripheral devices may include the operation time of the peripheral devices, power consumption, discharge amount (in the case of a food feeder), and the like.
  • the operation information includes control information for controlling peripheral devices interworking with the signal broker 200 .
  • the peripheral device management unit analyzes the companion animal's health condition based on the biometric information collected over a predetermined period, the amount of food eaten, the time spent playing, and the time spent sleeping. If the peripheral device management unit determines that there is an abnormality in the companion animal's health, it transmits a message generation event to the control unit 330.
  • the control unit 330 generates a message about the abnormal health symptoms and transmits it to the communication platform 410 through the communication unit 310, so that the message about the companion animal's abnormal health symptoms is output on the messenger interface 411.
  • For example, if the companion animal's respiratory rate per minute exceeds the normal range, if its pulse rate per minute exceeds the normal pulse rate for its breed/age, or if its body temperature exceeds 39 degrees Celsius, the peripheral device management unit transmits the corresponding content to the control unit 330.
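  • The threshold checks above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: the function name and the normal respiratory/pulse ranges are assumptions; only the 39-degrees-Celsius body temperature threshold comes from the description.

```python
# Illustrative sketch of the peripheral device management unit's health check.
# Only the 39 degrees Celsius threshold is from the description; the function
# name and the normal ranges are assumptions.

def check_health(pulse_range_for_breed_age, resp_rate, pulse_rate, body_temp_c,
                 normal_resp_range=(15, 30)):
    """Return the abnormal-health events to forward to the control unit 330."""
    events = []
    if not (normal_resp_range[0] <= resp_rate <= normal_resp_range[1]):
        events.append("respiratory rate out of normal range")
    lo, hi = pulse_range_for_breed_age
    if not (lo <= pulse_rate <= hi):
        events.append("pulse rate out of normal range for breed/age")
    if body_temp_c > 39.0:
        events.append("body temperature above 39 degrees Celsius")
    return events
```

  • Each returned event would correspond to one message generation event transmitted to the control unit 330.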
  • the advertisement information provider of the communication server 300 may be a component that provides product advertisement information from a companion animal product company.
  • the learning unit 320 of the communication server includes a message learning module 321, a facial expression learning module 322, and a model manager 324.
  • the learning unit 320 may further include a behavioral learning module 323 .
  • the message learning module 321 machine-learns the conversation contents of the companion animal guardian through the chat app, creates a deep learning model, and stores it in the internal storage of the model manager 324.
  • the message learning module 321 updates the deep learning model by machine-learning the keywords contained in the questions, conversations, and sentences exchanged between the companion animal guardian and the other party in the chat app, so that the deep learning model can be optimized to express the companion animal's desire or intention as a colloquial message through the chat function of the communication platform 410. Accordingly, realistic chatting between the companion animal guardian and the companion animal is possible.
  • the facial expression learning module 322 learns an image of a companion animal to generate or update a companion animal expression model (hereinafter, a facial expression model).
  • the expression model is used to infer the expression of companion animals.
  • the companion animal guardian may upload an image or video of the companion animal to the gallery of the communication platform 410 .
  • the uploaded image is stored in the image DB 342 of the storage unit 340, and the uploaded video is stored in the video DB 343 of the storage unit 340.
  • a companion animal guardian who wants to create a facial expression model selects an image for model learning among companion animal images uploaded to the gallery through a learning screen provided by the facial expression analysis interface 412 .
  • An image for model training is preferably an image containing a companion animal's expression.
  • the companion animal guardian labels the facial expression for each companion animal image while browsing the learning images by swiping left/right or up/down.
  • the facial expression analysis interface 412 may present and select facial expressions that can be labeled by companion animal guardians on the learning screen.
  • the facial expression analysis interface 412 may present companion animal expressions shown in an image, such as happy, good, excited, dislike, pouting, annoyed, sad, curious, embarrassed, afraid, worried, disappointed, eager, angry, tired, sleepy, sullen, or pitiful, so that the companion animal guardian can select a label.
  • the facial expression analysis interface 412 may select and present several facial expressions in order of high probability according to the result of executing the facial expression model stored in the communication server 300 .
  • the companion animal guardian may directly input a facial expression label, or the companion animal guardian may select an image for model training by selecting a companion animal image stored in the image DB 342 and to which a facial expression label is assigned.
  • the facial expression learning module 322 transfers the labeled companion animal image to the image DB 342, and the image DB 342 stores it.
  • the facial expression learning module 322 creates/updates a facial expression model through learning of an image of a companion animal selected for model learning.
  • the facial expression learning module 322 may create/update a separate facial expression model for each breed of dog or companion animal.
  • the facial expression learning module 322 preprocesses the model training image to remove noise, and recognizes the face of the companion animal in the preprocessed image.
  • the expression learning module 322 extracts features for determining the expression from the companion animal's face using a deep learning technique.
  • as a deep learning technique for feature extraction, an autoencoder, a deep autoencoder, a CNN model, etc. may be used, but the technique is not limited thereto.
  • the facial expression learning module 322 uses the extracted features to generate a facial expression model by training any one of the models (e.g., SVM, KSVM, or DNN) stored in the model manager 324. In addition, the facial expression learning module 322 may retrain and update the previously generated facial expression model.
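  • The learning step above (deep features fed into an SVM-type classifier) can be sketched as follows. This is an illustrative sketch under stated assumptions: the feature extractor is a stub standing in for a CNN/autoencoder, scikit-learn's `SVC` stands in for the SVM (the description names no library), and all names and data are hypothetical.

```python
# Illustrative sketch of the facial expression learning flow: features are
# extracted from a face image and an SVM is trained on them. The extractor is
# a stub (a real system would run a CNN or autoencoder on the face crop).
import numpy as np
from sklearn.svm import SVC

def extract_features(face_image):
    # Stub standing in for a deep feature extractor.
    return face_image.reshape(-1)[:16]

def train_expression_model(face_images, expression_labels):
    X = np.stack([extract_features(img) for img in face_images])
    return SVC(probability=True).fit(X, expression_labels)

# Synthetic 4x4 "face crops" for two expression labels.
rng = np.random.default_rng(0)
happy = [rng.normal(1.0, 0.1, (4, 4)) for _ in range(10)]
sad = [rng.normal(-1.0, 0.1, (4, 4)) for _ in range(10)]
model = train_expression_model(happy + sad, ["happy"] * 10 + ["sad"] * 10)
pred = model.predict(np.stack([extract_features(sad[0])]))[0]
```

  • `probability=True` is used so the model can later rank expressions by probability, as the option-generation steps below require.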
  • the companion animal guardian may test the expression model on the learning screen of the expression analysis interface 412 .
  • the companion animal guardian can test the facial expression model by selecting an arbitrary companion animal image from the gallery. If the companion animal guardian is not satisfied with the facial expression inference result, he or she may provide feedback on the learning screen of the facial expression analysis interface 412. According to the feedback, images for learning are additionally selected and learning is performed again to update the expression model, or the learning model is changed (for example, from SVM to KSVM or DNN) and learning is performed again to generate another expression model.
  • the facial expression learning module 322 transfers the facial expression model to the model manager 324, and the model manager 324 stores the facial expression model in an internal storage.
  • an image corresponding to the target breed of dog or companion animal is extracted from the image DB 342 and used.
  • the communication platform 410 transmits to the communication unit 310 a facial expression model learning/test request message, the fields for extracting the companion animal image from the DB (dog breed, companion animal identification code), feedback on the inference result, and the selected label (name of the expression), and the communication unit 310 transmits the companion animal image and the inference result of the facial expression model to the communication platform 410.
  • the communication unit 310 transmits a request for learning a facial expression model from the communication platform 410 to the facial expression learning module 322 so that learning is performed.
  • the messenger interface 411 outputs an interactive message and facial expression options to the screen in the form of a chatbot, so that the companion animal guardian can label an unlabeled image with an expression by checking the companion animal image and selecting one of the expression-related options. An option requesting that other options be presented, or an option to skip without selecting this time, may be added to the options.
  • the control unit 330 of the communication server 300 selects an image without a facial expression label among companion animal images stored in the image DB 342 at random or according to a predetermined rule.
  • the controller 330 creates a message using the companion animal's name corresponding to the companion animal's image.
  • the controller 330 may input the selected companion animal image into a facial expression model to execute it, and extract a predetermined number of facial expressions in descending order of the probabilities that appear as a result.
  • the controller 330 may generate (select) an option by randomly extracting a facial expression name from a facial expression list stored in the internal storage.
  • the facial expression list may be previously stored in the communication platform 410 or may be stored in the internal storage of the controller 330 .
  • the control unit 330 transmits the image of the selected companion animal, the created message, and options to the communication platform 410 through the communication unit 310 .
  • the communication platform 410 outputs the received image, message, and options to the chatting window of the messenger interface 411.
  • the communication platform 410 transmits the selection result of the companion animal guardian to the communication unit 310, and the control unit 330 matches the facial expression selected by the companion animal guardian with the corresponding companion animal image and transmits the result to the image DB 342.
  • the image DB 342 matches the corresponding expression label with the corresponding companion animal image and stores them. If the option selected by the companion animal guardian requests that other options be presented, a predetermined number of facial expressions not previously presented as options are extracted from the expression model execution results, or expression names are randomly extracted from the facial expression list stored in the internal storage, to regenerate the options. The companion animal image, the message, and the regenerated options are then transmitted to the communication platform 410 through the communication unit 310. The subsequent process is the same as described above.
  • in this way, expression labels for companion animal images can be continuously accumulated in the image DB.
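  • The option-building and regeneration flow above can be sketched as follows. This is a hedged sketch: the option count, the expression names, and the "show other options"/"skip" entries are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch of how the control unit 330 could build, and later
# regenerate, the chatbot's expression options from model probabilities.
import random

def build_options(model_probs, expression_list, n=3, exclude=()):
    """Pick up to n expressions by descending model probability, skipping
    already-shown ones; fill any shortfall from the expression list."""
    ranked = sorted(model_probs, key=model_probs.get, reverse=True)
    options = [e for e in ranked if e not in exclude][:n]
    pool = [e for e in expression_list if e not in options and e not in exclude]
    random.shuffle(pool)
    options += pool[:n - len(options)]
    return options + ["show other options", "skip"]

probs = {"happy": 0.6, "curious": 0.25, "sleepy": 0.1, "angry": 0.05}
first = build_options(probs, ["happy", "curious", "sleepy", "angry", "sad"])
# After the guardian picks "show other options", exclude what was shown:
second = build_options(probs, ["happy", "curious", "sleepy", "angry", "sad"],
                       exclude=first[:3])
```

  • The second call mirrors the regeneration step: expressions already presented are excluded and the shortfall is filled from the stored facial expression list.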
  • the behavioral learning module 323 learns videos of companion animals to create or update companion animal behavior models (hereinafter, behavior models).
  • the behavior model is used to infer the motion/state (hereinafter, behavior) of the companion animal.
  • a companion animal guardian who wants to create a behavior model selects a video for model learning among companion animal videos uploaded to the gallery through a learning screen provided by the behavior analysis interface 413 .
  • a video for model training is preferably a video containing the behavior or condition of a companion animal.
  • the companion animal guardian labels the behavior of the companion animal for each video while browsing the learning videos by swiping left/right or up/down, playing the corresponding video as needed.
  • the behavior analysis interface 413 may present and select behaviors that can be labeled by companion animal guardians on the learning screen.
  • the behavior analysis interface 413 may present (1) motions such as sitting, standing, lying prone, lying on one's side, walking, running, jumping, shaking the body, wagging the tail, barking, or mounting, or (2) states such as hunger, vigilance, feeling sick, wanting to go for a walk, wanting attention, wanting to defecate (wanting to go to the bathroom), excited, angry, or in heat, so that the companion animal guardian can select a label.
  • the behavior analysis interface 413 may select and present several behaviors in order of high probability according to a result of executing an existing behavior model stored in the communication server 300 .
  • the companion animal guardian may select a video for model learning by directly inputting a behavior label or by selecting a companion animal video stored in the video DB 343 and to which a behavior label is assigned.
  • the action learning module 323 transfers the labeled companion animal video to the video DB 343, and the video DB 343 stores it.
  • the behavioral learning module 323 creates/updates a behavioral model through learning of a video of a companion animal selected for model learning.
  • the behavioral learning module 323 may create/update a separate behavioral model for each breed of dog or companion animal.
  • the action learning module 323 pre-processes the video for model learning. Noise removal or image/voice extraction may be performed in the preprocessing process.
  • the action learning module 323 detects an object (companion animal) in a preprocessed image (static data) or video (dynamic data). You Only Look Once (YOLO) and Region-based CNN (R-CNN) models may be used for object detection, but are not limited thereto.
  • the behavioral learning module 323 designates or extracts characteristics for determining the companion animal's behavior. Dense Trajectories or Histogram of Oriented Gradients (HOG) can be used as hand-crafted features.
  • a recurrent neural network (RNN) model or a CNN model may be used as a deep learning technique for feature extraction. However, techniques for specifying or extracting features are not limited to the above.
  • the behavioral learning module 323 generates a behavior model by training any one of the models (e.g., SVM or AdaBoost) stored in the model manager 324 using the extracted features. In addition, the behavioral learning module 323 may retrain and update the previously created behavior model.
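  • The behavior learning step above can be sketched as follows. This is an illustrative sketch under stated assumptions: the feature function is a stub standing in for Dense Trajectories/HOG, scikit-learn's `AdaBoostClassifier` stands in for the AdaBoost model (no library is named in the description), and the clips and labels are synthetic.

```python
# Illustrative sketch: per-clip motion features feed an AdaBoost classifier.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def motion_features(frames):
    # Stub: mean absolute frame-to-frame difference per pixel, flattened.
    return np.abs(np.diff(frames, axis=0)).mean(axis=0).reshape(-1)

rng = np.random.default_rng(1)
# Synthetic clips (8 frames of 3x3 "pixels"): high frame-to-frame variation
# for "running", almost none for "sleeping".
running = [rng.normal(0.0, 1.0, (8, 3, 3)) for _ in range(12)]
sleeping = [rng.normal(0.0, 0.05, (8, 3, 3)) for _ in range(12)]
X = np.stack([motion_features(c) for c in running + sleeping])
y = ["running"] * 12 + ["sleeping"] * 12
behavior_model = AdaBoostClassifier(n_estimators=20).fit(X, y)
```

  • A real pipeline would first run the object detection step (e.g., YOLO or R-CNN, as named above) to crop the companion animal before computing features.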
  • the companion animal guardian can test the behavior model on the learning screen of the behavior analysis interface 413 .
  • the companion animal guardian can test the behavior model by selecting an arbitrary companion animal video from the gallery. If the companion animal guardian is not satisfied with the behavior inference result, he or she may provide feedback on the learning screen of the behavior analysis interface 413. According to the feedback, learning videos are additionally selected and learning is performed again to update the behavior model, or the learning model is changed (for example, from SVM to an ensemble of SVM and AdaBoost) and learning is performed again to generate another behavior model.
  • the action learning module 323 transfers the action model to the model manager 324, and the model manager 324 stores the action model in an internal storage.
  • a video corresponding to the target dog species or companion animal is extracted from the video DB 343 and used.
  • the communication platform 410 transmits to the communication unit 310 a behavior model learning/test request message, the fields for extracting the companion animal video from the DB (dog breed, companion animal identification code), feedback on the inference result, and the selected label (name of the behavior), and the communication unit 310 transmits the companion animal video and the inference result of the behavior model to the communication platform 410.
  • the communication unit 310 transmits a behavior model learning request from the communication platform 410 to the behavior learning module 323 so that learning is performed.
  • the model manager 324 stores in an internal storage a pre-training model (e.g., SVM) that is the subject of facial expression learning by the facial expression learning module 322 or behavior learning by the behavioral learning module 323.
  • the model manager 324 stores an object detection model or a deep learning model (for example, CNN) used for feature extraction, which is required in the model learning process, in an internal storage.
  • the model manager 324 stores the facial expression model created or updated by the facial expression learning module 322 and the behavior model created or updated by the behavioral learning module 323 in an internal storage.
  • the model manager 324 obtains an inference result by executing a facial expression model or a behavior model according to the progress of a test, expression analysis, or behavior analysis in the expression analysis interface 412 or the behavior analysis interface 413 .
  • the model manager 324 transmits the reasoning result to the communication platform 410 through the communication unit 310 .
  • the model manager 324 may transmit the inference result to the controller 330 so that the controller 330 may use the inference result when generating an option of facial expression analysis or behavior analysis.
  • the storage unit 340 of the communication server includes a message DB 341, an image DB 342, and a video DB 343. And the storage unit 340 may further include an emoticon DB 344 .
  • the message DB 341 stores messages that can be displayed on the chatting screen provided by the messenger interface 411 . Each message stored in the message DB 341 is given a keyword.
  • a keyword mapped according to the situation is extracted from the message DB 341 and then a message to be displayed on the chatting screen is selected.
  • the image DB 342 stores an image of a companion animal uploaded to the gallery of the communication platform 410 by a companion animal guardian. Also, the image DB 342 stores an image of a companion animal that can be displayed on a chatting screen provided by the messenger interface 411 .
  • An expression label may be assigned to an image of a companion animal stored in the image DB 342 . The label may be generated as a result of the model manager 324 executing the facial expression model, or may be directly input by a companion animal guardian.
  • the video DB 343 stores videos of companion animals that can be displayed on the messenger interface 411 . Also, the video DB 343 stores videos of companion animals uploaded to the gallery of the communication platform 410 by companion animal guardians.
  • a behavior label may be assigned to an image of a companion animal stored in the video DB 343 . The label may be generated as a result of the model manager 324 executing the behavior model, or may be directly input by a companion animal guardian.
  • the emoticon DB 344 stores emoticons generated by the emoticon generating unit 350 .
  • the emoticon is an emoticon created based on an image of a companion animal.
  • the emoticon can be used in a chatting screen of the messenger interface 411, a chatting app installed in the companion animal guardian terminal 400, and SNS.
  • the companion animal guardian may use the emoticon himself or herself, and may register and sell the emoticon on an external emoticon purchasing platform.
  • FIG. 3 is a block diagram showing the configuration of a communication platform installed in a companion animal guardian terminal according to an embodiment of the present invention.
  • the communication platform 410 includes a messenger interface 411 , a facial expression analysis interface 412 , and a behavior analysis interface 413 .
  • the communication platform 410 according to an embodiment of the present invention may further include an emoticon interface 414.
  • the communication platform 410 may further include a companion animal SNS interface.
  • the communication platform 410 may further include a peripheral device management interface and a companion animal management interface.
  • the communication platform 410 displays a message, image or video received from the communication unit 310 through the messenger interface 411 .
  • the communication platform 410 displays the image of the companion animal transmitted from the communication unit 310 and the companion animal expression analysis data (name of expression) through the expression analysis interface 412 .
  • the communication platform 410 displays the video of the companion animal and companion animal behavior analysis data (action name) transmitted from the communication unit 310 through the behavior analysis interface 413 .
  • the communication platform 410 displays the emoticon received from the communication unit 310 through the emoticon interface 414 .
  • the messenger interface 411 provides an environment (hereinafter referred to as a companion animal chatting window) in which guardians of companion animals chat with each other.
  • the messenger interface 411 conveys the companion animal's desire as if a conversation were taking place, using the chatting function. That is, the messenger interface 411 provides a communication method in the form of chatting with a personified companion animal.
  • the companion animal's desire may be determined based on the companion animal input signal.
  • the control unit 330 of the communication server 300 determines the situation, selects a message to be output on the messenger interface 411, and transmits the message to the communication platform 410 through the communication unit 310. Also, the controller 330 may transmit an image of the companion animal together with the message.
  • the communication platform 410 outputs the message to the companion animal chatting window as if it were a message from the companion animal (an interactive message).
  • the messenger interface 411 utilizes information such as the location information of the companion animal guardian terminal 400, weather, traffic congestion, the current time, and news to provide companion-animal-related products/services and products/services customized for the companion animal guardian.
  • a function of displaying an advertisement message written as an interactive (conversational) sentence may be provided.
  • as examples of interactive messages: (1) if the companion animal guardian does not arrive home within a certain time, judged from the current location, time, and so on, the companion animal guardian terminal 400 can display an interactive message such as "Hyung, you're a little late today? Then should I eat first?"; (2) if the interactive message takes the form of an advertisement, interactive advertising messages such as "Hyung, 000 is newly aired on Netflix today. See you on the way home from work later." or "Hyung, there's a dog cafe near our house. Take me with you!!" can be provided.
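  • The situational selection above, combined with the keyword-tagged message DB, can be sketched as follows. This is an illustrative sketch: the keyword keys, function name, and fallback rule are assumptions; the message strings are the examples from the description.

```python
# Illustrative sketch of picking an interactive message by situation.
from datetime import time

MESSAGE_DB = {
    "guardian_late": "Hyung, you're a little late today? Then should I eat first?",
    "ad_nearby_cafe": "Hyung, there's a dog cafe near our house. Take me with you!!",
}

def pick_message(now, usual_home_time, guardian_home):
    # If the guardian is not yet home past the usual time, nudge them;
    # otherwise fall back to an interactive advertisement message.
    if not guardian_home and now > usual_home_time:
        return MESSAGE_DB["guardian_late"]
    return MESSAGE_DB["ad_nearby_cafe"]

msg = pick_message(time(20, 30), time(19, 0), guardian_home=False)
```

  • In the described system the situational judgment is made by the control unit 330, which maps the situation to a keyword and extracts the matching message from the message DB 341.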
  • the messenger interface 411 may output an image containing a companion animal's expression (hereinafter referred to as a companion animal expression image) or a video of the companion animal transmitted from the communication server 300 along with the interactive message to a chat window.
  • as the companion animal expression image, an image corresponding to 'eagerness' can be output in case (1), and an image corresponding to 'excitement' in case (2).
  • these images may be selected by the control unit 330 in the communication server 300, which extracts from the image DB 342 a companion animal image corresponding to the facial expression mapped to a keyword according to its situational judgment, or they may be selected in the communication platform 410.
  • the facial expression analysis interface 412 provides a screen displaying an image of a companion animal and a predetermined number of options.
  • the facial expression analysis interface 412 may provide a screen in the form of a multiple-choice quiz asking what the companion animal's expression contained in the companion animal's image is.
  • the option is the name of a facial expression that the companion animal in the image may have.
  • the expression analysis interface 412 displays whether the expression label assigned to the image matches the user's selected option.
  • a user of the facial expression analysis interface 412 may be a guardian of a companion animal as well as another person having access authority.
  • the facial expression analysis interface 412 may allow the user to select which companion animal among the plurality of companion animals to perform facial expression analysis. If a facial expression analysis service is used through an SNS account of a specific companion animal provided by the companion animal SNS interface of the communication platform 410, a facial expression analysis screen is provided for the companion animal.
  • the expression analysis interface 412 randomly selects an image of the corresponding companion animal or performs expression analysis based on the corresponding companion animal image designated by the user.
  • the facial expression analysis interface 412 may display a thumbnail of the companion animal image on the screen so that the user can designate the image for facial expression analysis.
  • the expression analysis interface 412 displays the corresponding image and a predetermined number of options on the screen.
  • the option is the name of a facial expression that the companion animal in the image may have.
  • the names of facial expressions to be displayed in the plurality of options may be determined based on the result of inferring facial expressions from the facial expression analysis target image by the facial expression model in the communication server 300 together with the facial expression labels assigned to the images.
  • the names of facial expressions to be displayed in the plurality of options may be configured by mixing the inference result and the facial expression names randomly extracted from the facial expression list.
  • the facial expression list may be previously stored in the communication platform 410 or may be stored in the internal storage of the controller 330 .
  • for example, suppose the number of facial expression options to be presented by the facial expression analysis interface 412 is set to 4, the facial expression label assigned to the facial expression analysis target image is 'embarrassed', and the result of the corresponding companion animal's facial expression model inferring from the image is 'embarrassed (60%)', 'fear (30%)', 'worry (8%)', and 'curiosity (2%)'. The controller 330 can then add 'happy', a randomly extracted facial expression, to the three options 'embarrassed', 'fear', and 'worry' to configure four options and transmit them to the communication platform 410 through the communication unit 310.
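  • The four-option construction in this example can be sketched as follows: the top model inferences are mixed with one randomly drawn filler expression. The function name and the filler list are illustrative assumptions.

```python
# Sketch of the quiz option construction: top (n-1) inferred expressions
# plus one random filler, shuffled so the answer position varies.
import random

def quiz_options(inference, expression_list, n=4, seed=0):
    rng = random.Random(seed)
    ranked = sorted(inference, key=inference.get, reverse=True)[:n - 1]
    fillers = [e for e in expression_list if e not in ranked]
    options = ranked + [rng.choice(fillers)]
    rng.shuffle(options)  # so the labeled answer is not always listed first
    return options

inference = {"embarrassed": 0.60, "fear": 0.30, "worry": 0.08, "curiosity": 0.02}
opts = quiz_options(inference, ["happy", "sleepy", "angry"])
```

  • With the example probabilities above, the three highest-probability expressions are kept and one filler is drawn from the stored facial expression list.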
  • the facial expression analysis interface 412 may display on the screen whether the facial expression selected by the user matches the facial expression label assigned to the facial expression analysis target image. In addition, when no facial expression label is assigned to the facial expression analysis target image, the facial expression analysis interface 412 may display on the screen whether the facial expression selected by the user matches the facial expression with the highest probability among the inference results of the facial expression model.
  • for example, if the facial expression label assigned to the facial expression analysis target image is 'embarrassed', and 'embarrassed', 'fear', 'worry', and 'happy' are presented as options on the screen of the facial expression analysis interface 412, then when the user selects 'fear', the facial expression analysis interface 412 may display on the screen that the facial expression selected by the user does not match the facial expression label.
  • the communication platform 410 may treat this mismatch as feedback and transmit it to the communication unit 310 so that the facial expression learning module 322 retrains the facial expression model.
  • the facial expression analysis interface 412 may aggregate the facial expression selection results of other people who access the facial expression analysis interface 412 for a specific companion animal image designated by the companion animal guardian, and output the aggregated results to the screen.
  • the facial expression analysis interface 412 may display the result of analyzing other people's facial expressions in the form of a text or a graph such as a pie chart or a histogram.
  • the aggregated result may take the form of comparing the number (ratio) of correct selections with the number (ratio) of incorrect selections, or the form of indicating the number or ratio of selections for each facial expression.
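  • The aggregation above can be sketched as follows. This is a minimal sketch: the function name, return shape, and sample selections are illustrative assumptions.

```python
# Minimal sketch of aggregating other users' expression selections into the
# correct/incorrect comparison and per-expression ratios described above.
from collections import Counter

def aggregate(selections, expression_label):
    counts = Counter(selections)
    total = len(selections)
    correct = counts.get(expression_label, 0)
    return {
        "correct": correct,
        "incorrect": total - correct,
        "ratios": {expr: cnt / total for expr, cnt in counts.items()},
    }

result = aggregate(["fear", "embarrassed", "embarrassed", "worry"], "embarrassed")
```

  • Either field of the result maps directly onto the two display forms mentioned: the correct/incorrect comparison, or the per-expression counts/ratios rendered as text, a pie chart, or a histogram.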
  • the facial expression analysis interface 412 provides a screen for generating or updating a facial expression model.
  • a companion animal guardian who wants to create/update a facial expression model selects an image for model learning among companion animal images uploaded to the gallery through a learning screen provided by the facial expression analysis interface 412 .
  • An image for model training is preferably an image containing a companion animal's expression.
  • the companion animal guardian labels the facial expression for each companion animal image while browsing the learning images by swiping left/right or up/down.
  • the facial expression analysis interface 412 may present and select facial expressions that can be labeled by companion animal guardians on the learning screen.
  • the facial expression analysis interface 412 may present companion animal expressions shown in an image, such as happy, good, excited, dislike, pouting, annoyed, sad, curious, embarrassed, afraid, worried, disappointed, eager, angry, tired, sleepy, sullen, or pitiful, so that the companion animal guardian can select a label.
  • the facial expression analysis interface 412 may select and present several facial expressions in order of high probability according to a result of executing an existing facial expression model stored in the communication server 300 .
  • the companion animal guardian may directly input a facial expression label, or the companion animal guardian may select an image for model training by selecting a companion animal image stored in the image DB 342 and to which a facial expression label is assigned.
  • the facial expression learning module 322 transfers the labeled companion animal image to the image DB 342, and the image DB 342 stores it.
  • the facial expression learning module 322 creates/updates a facial expression model through learning of an image of a companion animal selected for model learning.
  • the facial expression learning module 322 may create/update a separate facial expression model for each breed of dog or companion animal.
  • the behavior analysis interface 413 provides a screen displaying a video of a companion animal and a predetermined number of options.
  • the behavior analysis interface 413 may provide a screen in the form of a multiple-choice quiz asking what the companion animal's behavior contained in the video of the companion animal is. It is preferable that the option be the name of an action that the companion animal in the video can have.
  • the behavior analysis interface 413 displays whether the behavior label assigned to the video matches the user's selection.
  • a user of the behavior analysis interface 413 may be a guardian of a companion animal as well as a third party having access authority.
  • the behavior analysis interface 413 may allow a user to select which companion animal among the plurality of companion animals to perform behavior analysis. If a behavior analysis service is used through an SNS account of a specific companion animal provided by the companion animal SNS interface of the communication platform 410, a behavior analysis screen is provided for the companion animal.
  • the behavior analysis interface 413 randomly selects a video of the companion animal or conducts behavior analysis based on the video of the companion animal designated by the user.
  • the behavior analysis interface 413 may display a thumbnail of the companion animal video on the screen so that the user can designate the video for behavior analysis.
  • the behavioral analysis interface 413 displays the corresponding video and a predetermined number of options on the screen. It is preferable that the option be the name of an action that the companion animal in the video can have.
  • the names of the actions to be displayed in the plurality of options may be determined based on the result of inferring the actions from the action-analyzed object video by the action model in the communication server 300 together with the action labels assigned to the video. In addition, the names of actions to be displayed in the plurality of options may be configured by mixing the inference results and action names randomly extracted from the action list.
  • the action list may be previously stored in the communication platform 410 or may be stored in the internal storage of the controller 330 .
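The option-composition rule above (assigned label plus top inference results, padded with randomly extracted names from the action list) can be sketched as follows; function and variable names are illustrative assumptions.

```python
import random

# Sketch: compose the multiple-choice options from the assigned label, the
# model's inferred behaviors, and random fillers drawn from the action list.
def build_options(label, inferred, action_list, n, rng=random):
    options = []
    for name in [label, *inferred]:
        if name and name not in options:  # de-duplicate, keep order
            options.append(name)
    options = options[:n]
    fillers = [a for a in action_list if a not in options]
    while len(options) < n and fillers:
        options.append(fillers.pop(rng.randrange(len(fillers))))
    rng.shuffle(options)  # so the correct answer is not always listed first
    return options

options = build_options(
    "I want to go for a walk",
    ["I want to attract attention", "I want to go to the bathroom"],
    ["I want to sleep", "I want to eat", "I am hungry"],
    n=4)
print(len(options))  # → 4
```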
  • For example, suppose the behavior label assigned to the behavior analysis target video is 'I want to go for a walk', and the behavior model of the corresponding companion animal infers 'I want to attract attention' and 'I want to go to the bathroom' from the video. The controller 330 may then add a randomly extracted action to the three options 'I want to go for a walk', 'I want to attract attention', and 'I want to go to the bathroom' to form four options, and transmit them to the communication platform 410 through the communication unit 310.
  • the behavior analysis interface 413 may display on the screen whether the behavior selected by the user matches the behavior label assigned to the target video for behavior analysis. In addition, when no behavior label is assigned to the video, the behavior analysis interface 413 may display on the screen whether the behavior selected by the user matches the behavior with the highest probability among the inference results of the behavior model.
  • For example, if the behavior label assigned to the video subject to behavior analysis is 'I want to go for a walk', the options 'I want to go for a walk', 'I want to draw attention', and 'I want to go to the bathroom' are presented on the screen of the behavior analysis interface 413, and the user selects 'I want to attract attention', the behavior analysis interface 413 may display on the screen that the behavior selected by the user does not match the behavior label.
  • the communication platform 410 may transmit the content of the mismatch to the communication unit 310 as feedback so that the action learning module 323 retrains the action model.
  • the behavior analysis interface 413 may aggregate behavioral selection results of others who have access to the behavior analysis interface 413 for a specific video of a companion animal designated by a companion animal guardian, and output the result on the screen.
  • For example, suppose a companion animal video designated by a companion animal guardian is labeled 'I want to go for a walk', 10 people access the behavior analysis interface 413 during a certain period, 4 of them select 'I want to go for a walk', 3 select 'I want to go to the bathroom', and the remaining 3 select other actions; the behavior analysis interface 413 aggregates these selections.
  • the behavior analysis interface 413 may display the aggregated results of other people's behavior selections in the form of text or a graph such as a pie chart or histogram.
  • the counting result may be in the form of comparing correct numbers (rate) with incorrect numbers (rate), or may be in the form of indicating the number or ratio of selections for each action.
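The counting described above (per-action selection counts plus a correct-versus-incorrect comparison against the assigned label) can be sketched as follows; the function name and result layout are illustrative assumptions.

```python
from collections import Counter

# Sketch: tally other users' behavior selections for one video and compare
# them against the behavior label assigned to that video.
def tally_selections(selections, label):
    counts = Counter(selections)
    correct = counts.get(label, 0)
    total = len(selections)
    return {"counts": dict(counts),
            "correct": correct,
            "incorrect": total - correct,
            "correct_rate": correct / total if total else 0.0}

# The 10-viewer example from the text: 4 correct, 3 'bathroom', 3 other.
result = tally_selections(["walk"] * 4 + ["bathroom"] * 3 + ["other"] * 3,
                          label="walk")
print(result["correct"], result["incorrect"])  # → 4 6
```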
  • the behavior analysis interface 413 provides a screen for generating or updating a behavior model.
  • a companion animal guardian who wants to create/update a behavior model selects a video for model learning among companion animal videos uploaded to the gallery through a learning screen provided by the behavior analysis interface 413.
  • Videos for model training are preferably videos containing behaviors of companion animals.
  • Companion animal guardians browse the learning videos by swiping left/right or up/down, play each video as needed, and label the behavior of the companion animal shown in it.
  • the behavior analysis interface 413 may present and select behaviors that can be labeled by companion animal guardians on the learning screen.
  • the behavior analysis interface 413 may present selectable labels to the companion animal guardian, such as (1) motions, e.g., sitting, standing, lying down, walking, running, jumping, shaking the body, wagging the tail, barking, or mounting, or (2) conditions, e.g., hunger, vigilance, feeling sick, wanting to go for a walk, wanting attention, wanting to defecate (wanting to go to the bathroom), being excited, angry, or in heat.
  • the behavior analysis interface 413 may select and present several behaviors in order of high probability according to a result of executing an existing behavior model stored in the communication server 300 .
  • the companion animal guardian may select a video for model learning by directly inputting a behavior label or by selecting a companion animal video stored in the video DB 343 and to which a behavior label is assigned.
  • the action learning module 323 transfers the labeled companion animal video to the video DB 343, and the video DB 343 stores it.
  • the behavioral learning module 323 creates/updates a behavioral model through learning of a video of a companion animal selected for model learning.
  • the behavioral learning module 323 may create/update a separate behavioral model for each breed of dog or companion animal.
  • the emoticon interface 414 provides a screen for generating an emoticon using an image of a companion animal stored in the gallery of the communication platform 410 (hereinafter referred to as 'emoticon creation screen').
  • the emoticon interface 414 provides a screen for using the created emoticon on a chatting app, Internet screen, or SNS screen of the companion animal guardian terminal 400 .
  • the emoticon interface 414 may provide a screen that supports uploading an emoticon created in conjunction with an external emoticon purchasing platform to the emoticon purchasing platform. In this way, companion animal guardians can generate revenue by selling emoticons in which images of their companion animals are converted.
  • when a companion animal guardian selects an image of a companion animal from the gallery on the emoticon creation screen of the emoticon interface 414, the emoticon generator 350 reduces the image or extracts a specific area from it so that the resulting emoticon has a predetermined size suitable for use on a chat screen or SNS screen.
  • when a facial expression label is assigned to the companion animal image extracted from the image DB 342, the emoticon generating unit 350 transfers the label, as an emoticon tag, to the emoticon DB 344 together with the emoticon.
  • the emoticon DB 344 matches emoticon tags and emoticons and stores them together.
  • the guardian of the companion animal may directly assign a tag to an emoticon during the emoticon creation process through the emoticon interface 414, or may modify a tag assigned to an emoticon stored in the emoticon DB 344.
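The sizing step described above (reduce the image or extract a specific area to a predetermined emoticon size) can be sketched as pure geometry; the 128-pixel target, the centered-square choice, and the function name are assumptions for illustration, since the document does not fix these details.

```python
# Sketch: compute a centered square crop box for a companion animal image and
# the scale factor needed to shrink that crop to the emoticon target size.
def emoticon_crop(width, height, target=128):
    side = min(width, height)            # largest centered square that fits
    left = (width - side) // 2
    top = (height - side) // 2
    scale = target / side                # shrink factor applied after cropping
    return (left, top, left + side, top + side), scale

box, scale = emoticon_crop(640, 480)
print(box)  # → (80, 0, 560, 480)
```

An image library such as Pillow would then apply the box with a crop followed by a resize; only the box/scale arithmetic is shown here to stay self-contained.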
  • the companion animal SNS interface of the communication platform 410 provides an environment in which an SNS account (hereinafter referred to as 'companion animal SNS account') can be opened for each companion animal.
  • the companion animal-only SNS interface provides interfaces through which a companion animal guardian can post articles about the companion animal (for example, a 'bulletin board') or images and videos of the companion animal (for example, a 'gallery') on the companion animal SNS account.
  • the companion animal SNS interface grants the guardian of the companion animal the right to modify or delete the posts, images, or videos posted on their companion animal SNS account.
  • the SNS interface dedicated to companion animals provides an environment in which anyone, or others granted access permission, can visit the SNS account, comment on articles about the companion animal, or leave ratings, recommendations, and comments on images or videos of the companion animal.
  • companion animal guardians can provide links to other SNS accounts (eg, Instagram, Facebook), their own blog or YouTube account that they have opened through the companion animal SNS interface.
  • the companion animal-specific SNS interface may provide a button or link through which others who visit the companion animal SNS account can access services provided by the facial expression analysis interface 412 or the behavior analysis interface 413 .
  • the peripheral device management interface of the communication platform 410 is an interface for registering at least one of the companion animal input device 100, the wearable biometric module, the signal mediation device 200, and peripheral devices (air conditioner, fan, smart TV, etc.) controlled in conjunction with the signal mediation device 200 through a home network.
  • the peripheral device management interface supports a function of setting, modifying, and adding peripheral devices operable with an input signal transmitted from the companion animal input device 100 based on the designated peripheral device operation setting information.
  • the peripheral device management interface may support a function of limiting the number of operations of registered peripheral devices.
  • the peripheral device management interface supports a function of monitoring and displaying operation results of peripheral devices controlled by the signal broker 200 according to an input signal of the companion animal input device 100 .
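The operation-count limit mentioned above can be sketched as a small per-device counter; the class name and default-deny behavior for unregistered devices are illustrative assumptions.

```python
# Sketch: cap how many times a companion animal input signal may operate each
# registered peripheral; unregistered devices are denied by default here.
class OperationLimiter:
    def __init__(self, limits):
        self.limits = limits   # device name -> maximum allowed operations
        self.counts = {}       # device name -> operations performed so far

    def allow(self, device):
        used = self.counts.get(device, 0)
        if used >= self.limits.get(device, 0):
            return False       # over the cap (or unregistered): block operation
        self.counts[device] = used + 1
        return True

limiter = OperationLimiter({"feeder": 2})
print([limiter.allow("feeder") for _ in range(3)])  # → [True, True, False]
```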
  • the companion animal management interface of the communication platform 410 monitors the companion animal input signal generated from the companion animal input device 100 and displays the input frequency and input pattern of a specific button of the companion animal input device 100. supports
  • the companion animal management interface supports a function of setting and changing feedback information on the behavior of the companion animal.
  • the feedback information is voice information of the companion animal guardian, which includes negative language such as 'no' and 'don't do it' and positive language such as 'good job', 'good', and 'nice'.
  • the feedback information may be provided according to a set value limiting the number of operations of the peripheral devices set by the companion animal guardian in the peripheral device management interface.
  • the communication unit 310 receives the corresponding input signal from the signal broker 200 and transmits it to the peripheral device management unit. When the input time of the companion animal input signal differs from the time set by the companion animal guardian, the peripheral device management unit provides the companion animal guardian's voice information containing the above-mentioned negative language to the signal mediation device 200 through the communication unit 310, and the signal mediation device 200 outputs the received voice information through the speaker.
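The time-based rule above can be sketched as follows: if the companion animal presses a button outside the window the guardian configured, the negative voice clip is chosen instead of operating the peripheral. Function and return-value names are illustrative assumptions.

```python
from datetime import time

# Sketch: decide between operating the peripheral and playing the guardian's
# negative voice clip, based on the configured allowed time window.
def choose_feedback(pressed_at, allowed_start, allowed_end):
    if allowed_start <= pressed_at <= allowed_end:
        return "operate"         # forward the signal to the peripheral device
    return "negative_voice"      # e.g. the guardian's recorded "no"

# A 22:30 press against a 07:00-21:00 window triggers negative feedback.
print(choose_feedback(time(22, 30), time(7, 0), time(21, 0)))  # → negative_voice
```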
  • FIG. 4 is a reference diagram illustrating an example of a screen provided by a messenger interface according to an embodiment of the present invention.
  • the messenger interface 411 of the communication platform 410 conveys the companion animal's desire as if having a conversation using a chatting function.
  • the control unit 330 transmits information for starting a conversation to the communication platform 410 of the companion animal guardian terminal 400 through the communication unit 310.
  • the controller 330 generates an appropriate message suited to the situation based on time information, schedule information stored in the companion animal guardian terminal 400, and location information, or extracts/selects a message from the message DB 341 using keywords.
  • the communication unit 310 transfers this message to the communication platform 410 of the companion animal guardian terminal 400 .
  • the control unit 330 can be implemented to randomly generate a message based on time information, schedule information stored in the companion animal guardian terminal 400, and the location of the companion animal guardian.
  • the generated message is transmitted to the communication platform 410 of the companion animal guardian terminal 400 by the communication unit 310 .
  • the control unit 330 may generate a conversation message based on learning content in the message learning module 321 .
  • the communication platform 410 transmits the chat content exchanged with others in the chatting app running on the companion animal guardian terminal 400 to the communication unit 310, and the message learning module 321 of the learning unit 320 receives the chat content from the communication unit 310 and creates or updates a deep learning model through machine learning.
  • as a result of the situation determination, the control unit 330 selects a specific keyword and either extracts a message corresponding to the keyword from the message DB 341 or executes the deep learning model with the selected keyword to generate a conversation message based on the result.
  • the extracted or created message is transmitted to the communication platform 410 through the communication unit 310 and stored in the message DB 341. Messages stored in the message DB 341 can then be extracted and used again.
  • the communication platform 410 generates a conversation message based on learning contents in the message learning module 321 .
  • it is preferable that the companion animal guardian chooses whether to interwork with chat contents exchanged with others.
  • the communication platform 410 displays the generated or delivered message on the messenger interface 411 (a).
  • the companion animal guardian may input feedback by responding through the messenger interface 411 of the communication platform 410 (b).
  • the controller 330 determines the companion animal's situation according to the companion animal's input signal and generates an appropriate colloquial message.
  • the communication unit 310 transfers the generated colloquial message to the communication platform 410 .
  • the controller 330 determines that the companion animal wants to eat food and generates a colloquial message requesting the companion animal to eat food.
  • the communication platform 410 displays the received message on the messenger interface 411 (c).
  • the companion animal guardian can immediately select the “feed cycle” menu through the communication platform 410 .
  • the remote control signal is transmitted to the peripheral device corresponding to the companion animal input signal through the signal broker 200 provided in the house.
  • the signal intermediary device 200 having an IoT function transmits a control signal instructing the feed dispenser to supply feed, and a feed supply notification and images taken around the feed dispenser can be uploaded to the chat window of the communication platform 410.
  • the signal intermediary device 200 obtains a message informing that the feed is completed after the feed feeder completes the feed, and transmits the message to the communication server 300 .
  • the communication unit 310 delivers this message to the communication platform 410 .
  • the companion animal guardian can check a message informing that food has been supplied to the companion animal by the food dispenser on the messenger interface 411 (d).
  • the companion animal guardian can check the video of the companion animal eating food in real time in the chat window with the companion animal. If a predetermined time elapses after the food is supplied, or if the companion animal object disappears from the image taken around the feeder, a message about relieving the need may be further output through the messenger interface 411 (e).
  • FIG. 5 is a reference diagram illustrating an example of a screen provided by a messenger interface according to an embodiment of the present invention.
  • the communication unit 310 when the communication unit 310 receives an advertisement from the outside, the communication unit 310 transmits the advertisement information to the advertisement information providing unit or the control unit 330, and the control unit 330 determines the situation according to the result.
  • An interactive message containing advertisement content may be generated and transmitted to the communication platform 410 through the communication unit 310 .
  • as a result of the situation determination, the control unit 330 may generate a message such as "Hey! A dog cafe opened near my house! Let's go together!!".
  • the control unit 330 may utilize information such as location information of the companion animal guardian terminal 400, weather, traffic congestion, current time, and news.
  • the control unit 330 transmits the message and the like determined in this way to the communication platform 410 of the companion animal guardian terminal 400 through the communication unit 310 .
  • the communication platform 410 displays the received message on the messenger interface 411.
  • FIG. 6 is a reference diagram illustrating an example of a screen provided by a facial expression analysis interface according to an embodiment of the present invention.
  • the facial expression analysis interface 412 provides a screen displaying an image of a companion animal and a predetermined number of options.
  • the option is the name of a facial expression that the companion animal in the image may have.
  • the expression analysis interface 412 displays whether the expression label assigned to the image matches the user's selected option.
  • a user of the facial expression analysis interface 412 may be a guardian of a companion animal as well as another person having access authority.
  • the expression analysis interface 412 randomly selects an image of the corresponding companion animal or performs expression analysis based on the corresponding companion animal image designated by the user.
  • the facial expression analysis interface 412 displays an image of a selected or designated companion animal and a predetermined number of options (three in the example).
  • the option is the name of a facial expression that the companion animal in the image may have.
  • the names of facial expressions to be displayed in the plurality of options may be determined based on the result of inferring facial expressions from the facial expression analysis target image by the facial expression model in the communication server 300 together with the facial expression labels assigned to the images.
  • the names of facial expressions to be displayed in the plurality of options may be configured by mixing the inference result and the facial expression names randomly extracted from the facial expression list.
  • the facial expression list may be previously stored in the communication platform 410 or may be stored in the internal storage of the controller 330 .
  • For example, suppose the number of facial expression options presented by the facial expression analysis interface 412 is set to 3, the facial expression label assigned to the analysis target image is 'embarrassed', and the inference result of the companion animal's facial expression model for the image is 'embarrassment (60%)', 'fear (30%)', 'concern (8%)', and 'curiosity (2%)'. The controller 330 can then add the randomly extracted expression 'happy' to the two options 'embarrassed' and 'fearful' to form three options and transmit them to the communication platform 410 through the communication unit 310.
  • the facial expression analysis interface 412 displays the expression analysis target image and the three options ('embarrassed', 'fearful', 'happy') received from the communication platform 410 on the screen.
  • the facial expression analysis interface 412 may display on the screen whether the facial expression selected by the user matches the facial expression label assigned to the expression analysis target image (in the example, 'embarrassed').
  • if the user selects 'embarrassed', the facial expression analysis interface 412 can output the message 'Yes!'; if 'happy' is selected, the message 'No!' can be displayed on the screen because the facial expression selected by the user does not match the facial expression label ('embarrassed').
  • the communication platform 410 may transmit the content of the mismatch to the communication unit 310 as feedback so that the facial expression learning module 322 retrains the facial expression model.
  • FIG. 7 is a flowchart illustrating a method of using a companion animal image in a messenger interface of a communication platform according to an embodiment of the present invention.
  • S510 is an event generating step.
  • the event refers to an event that causes the message interface 411 to display at least one of a message or an image in a chat window with a companion animal. Examples of the event include reception of a companion animal input signal, elapse of a certain time without guardian feedback after the companion animal input signal, reception of an advertisement, reception of weather information, and the arrival of a specific time. However, it is not limited thereto.
  • S520 is a situation judgment step.
  • the control unit 330 collects information upon the occurrence of an event, determines a specific situation based on it, and derives a specific keyword. That is, the control unit 330 derives a specific keyword based on information such as the type of companion animal input signal, the elapsed time after a message was output in response to the companion animal input signal, the type of advertisement, changes in the weather forecast, the current location information of the companion animal guardian terminal 400, and whether a specific word is included in a message the companion animal guardian sent to the chat window. However, the information used for deriving keywords is not limited thereto.
  • S530 is a message and image extraction step.
  • the control unit 330 selects at least one of a message, image, and emoticon (hereinafter referred to as 'message, etc.') to be output in the chat window with the companion animal of the messenger interface 411 according to the determined situation, that is, the derived keyword. Specifically, the control unit 330 generates a message or extracts one from the message DB 341 according to the derived keyword. In addition, the controller 330 may extract, from the image DB 342, an image of the companion animal corresponding to the facial expression name mapped to the derived keyword. If multiple messages or images are extracted, the ones to be displayed on the screen of the messenger interface 411 are determined according to a predetermined rule or at random. At least one of a message and an image may be selected according to the settings.
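Step S530 can be sketched as a keyword lookup against the two DBs; the dictionary-based DB structures, names, and random tie-breaking are illustrative assumptions rather than the document's specified implementation.

```python
import random

# Sketch of step S530: look up candidate messages and expression-mapped images
# for the derived keyword, then pick one of each (at random when several match).
def select_output(keyword, message_db, image_db, rng=random):
    messages = message_db.get(keyword, [])
    images = image_db.get(keyword, [])
    return (rng.choice(messages) if messages else None,
            rng.choice(images) if images else None)

message_db = {"hungry": ["Feed me, please!"]}
image_db = {"hungry": ["dog_begging.jpg"]}
print(select_output("hungry", message_db, image_db))
# → ('Feed me, please!', 'dog_begging.jpg')
```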
  • S540 is a step of outputting the selected message and image on the messenger interface 411 screen (chat window).
  • FIG. 8 is a flowchart illustrating a method of executing expression analysis in a communication platform according to an embodiment of the present invention.
  • S610 is a step of selecting a companion animal to be analyzed for expression. If the companion animal guardian has a plurality of companion animals, the facial expression analysis interface 412 may allow the user to select which companion animal among the plurality of companion animals to perform facial expression analysis. If an expression analysis service is used through an SNS account of a specific companion animal provided by the companion animal SNS interface of the communication platform 410, the corresponding companion animal is targeted.
  • S620 is a step of selecting an image of the corresponding companion animal.
  • the expression analysis interface 412 randomly selects an image of the corresponding companion animal or performs expression analysis based on the corresponding companion animal image designated by the user.
  • the facial expression analysis interface 412 may display a thumbnail of the companion animal image on the screen so that the user can designate the image for facial expression analysis.
  • S630 is a step of executing a facial expression model.
  • the communication platform 410 transmits an expression model execution command to the communication unit 310 of the communication server 300
  • the communication unit 310 transfers it to the model manager 324.
  • the model manager 324 obtains an inference result by executing a facial expression model or a behavior model according to the progress of a test, expression analysis, or behavior analysis in the expression analysis interface 412 or the behavior analysis interface 413 .
  • S640 is a step of presenting a plurality of options for the facial expressions of the companion animal shown in the image of the companion animal to be analyzed.
  • the number of options is determined according to settings. For example, you can set the number of choices to three.
  • the names of facial expressions to be displayed in the plurality of options may be determined based on the result of inferring facial expressions from the facial expression analysis target image by the facial expression model in the communication server 300 together with the facial expression labels assigned to the images. If there is no expression label assigned to the image, it may be determined based only on the inference result.
  • the plurality of options may be configured by mixing the facial expression label and/or the reasoning result with a facial expression name randomly extracted from a facial expression list.
  • the facial expression list may be previously stored in the communication platform 410 or may be stored in the internal storage of the controller 330 .
  • the options are configured in this way, the options are presented along with the image of the companion animal to be analyzed on the facial expression analysis interface 412 .
  • S650 is a step of determining, when the user selects a companion animal's expression, whether the selected expression matches the facial expression label assigned to the companion animal image or the inference result of the expression model (the expression with the highest probability), and outputting a message about the match on the screen of the user terminal.
  • after step S650, a step of aggregating the facial expression selections of others who accessed the facial expression analysis interface 412 for a specific image designated by the companion animal guardian and displaying the result on the screen may be added.
  • FIG. 9 is a flowchart illustrating a method of tagging after converting a companion animal image into an emoticon according to an embodiment of the present invention.
  • S710 is a step of selecting a companion animal image to be converted into an emoticon.
  • the companion animal guardian selects an image of the companion animal from the gallery.
  • the companion animal guardian may upload an image of the companion animal to the gallery of the communication platform 410 prior to step S710.
  • S720 is a step of generating an emoticon by processing the selected image.
  • the emoticon generator 350 reduces the image of the companion animal or extracts a specific area to generate an emoticon to have a predetermined size suitable for use on a chatting screen or an SNS screen.
  • S730 is a step of assigning a tag to the emoticon. If a facial expression label is assigned to the image of a companion animal extracted from the image DB 342, the emoticon generator 350 may set this label as an emoticon tag. In addition, the user of the emoticon interface 414 (the guardian of the companion animal) may directly assign a tag to the emoticon regardless of whether a facial expression label is assigned to the image of the companion animal.
  • S740 is a step of storing the tagged emoticon in an emoticon DB.
  • the emoticon generator 350 transmits the emoticon tag and the emoticon together to the emoticon DB 344, and the emoticon DB 344 matches the emoticon tag with the emoticon and stores them.
  • the above method may be implemented as an application or implemented in the form of program instructions that can be executed through various computer components and recorded on a computer readable recording medium.
  • the computer readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • components according to an embodiment of the present invention may be implemented in software or hardware form such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), and may perform predetermined roles.
  • '~unit' used in this embodiment means software or a hardware component such as an FPGA or ASIC, and a '~unit' performs certain roles.
  • '~unit' is not limited to software or hardware.
  • a '~unit' may be configured to reside in an addressable storage medium and to execute on one or more processors. Thus, as an example, '~unit' encompasses components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
  • components and '~units' may be combined into a smaller number of components and '~units' or further separated into additional components and '~units'.
  • components and '~units' may be implemented to execute on one or more CPUs in a device or in a secure multimedia card.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Environmental Sciences (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Animal Husbandry (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Computational Linguistics (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • Telephonic Communication Services (AREA)
  • Operations Research (AREA)

Abstract

The present invention relates to a communication service system and method using facial expressions learned from images of a companion animal. The communication service system according to the present invention comprises: a companion animal input device that generates a signal associated with a desire of the companion animal in response to an input from the companion animal; a signal mediation device that receives the signal through short-range communication with the companion animal input device and transmits it over a network to the communication server described below; the communication server, which generates interactive messages of the companion animal based on the signal and provides images of the companion animal together with the interactive messages to the companion animal guardian terminal described below; and the companion animal guardian terminal, which provides a conversation screen between the companion animal and its guardian and is equipped with a communication platform for presenting the interactive messages and images on the conversation screen.
PCT/KR2021/017671 2021-08-30 2021-11-26 System and method for communication service using facial expressions learned from images of a companion animal WO2023033256A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0115013 2021-08-30
KR1020210115013A KR20230033208A (ko) 2021-08-30 2021-08-30 Communication service system and method using expressions learned from companion animal images

Publications (1)

Publication Number Publication Date
WO2023033256A1 true WO2023033256A1 (fr) 2023-03-09

Family

ID=85411451

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/017671 WO2023033256A1 (fr) 2021-08-30 2021-11-26 Système et procédé de service de communication utilisant des expressions faciales apprises à partir d'images d'un animal de compagnie

Country Status (2)

Country Link
KR (1) KR20230033208A (fr)
WO (1) WO2023033256A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005177129A * 2003-12-19 2005-07-07 Nippon Telegr & Teleph Corp <Ntt> Pet communication device
KR101622035B1 * 2014-08-20 2016-05-19 Hanyang University ERICA Industry-Academic Cooperation Foundation Game service providing apparatus and game service providing method
KR20180025121A * 2016-08-30 2018-03-08 Beijing Baidu Netcom Science and Technology Co., Ltd. Message input method and apparatus
KR20200071837A * 2018-12-03 2020-06-22 정진해 Companion animal emotion bot device using artificial intelligence and communion method using the same
KR102174198B1 * 2020-06-12 2020-11-04 박정훈 Internet-of-things-based apparatus and method for providing a companion animal communication function, and user terminal
KR20210005094A * 2018-08-24 2021-01-13 Tencent Technology (Shenzhen) Company Limited Virtual pet breeding method, apparatus, device, and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101905134B1 2017-02-03 2018-10-05 주식회사 창의산업 System for communication with a cat

Also Published As

Publication number Publication date
KR20230033208A (ko) 2023-03-08

Similar Documents

Publication Publication Date Title
WO2020235712A1 Artificial intelligence device for generating text or speech with a content-based style, and method therefor
WO2020213750A1 Artificial intelligence device for recognizing an object, and method therefor
WO2018128362A1 Electronic apparatus and method of operating the same
WO2019182265A1 Artificial intelligence device and method for operating the same
WO2020235696A1 Artificial intelligence apparatus for interconverting text and speech while considering style, and method therefor
WO2018194243A1 Video communication device, video communication method, and video communication mediation method
WO2018117685A1 System and method for providing a user's to-do list
EP3545436A1 Electronic apparatus and method of operating the same
EP3345379A1 Method for controlling an object by an electronic device, and electronic device
WO2020204221A1 Air conditioning device
WO2019225961A1 Electronic device for generating a response to a voice input using an application, and operating method thereof
WO2020230933A1 Artificial intelligence device for recognizing a user's voice, and method therefor
WO2014061872A1 Recording medium for a messenger control method, and apparatus and system therefor
EP3552163A1 System and method for providing a user's to-do list
WO2016108660A1 Method and device for controlling a home device
WO2015072619A1 Pet protection system and method using real-time two-way communication
WO2021251711A1 Internet-of-things-based device and method for providing a companion animal communication function, and user terminal
WO2020184748A1 Artificial intelligence device and method for controlling an auto-stop system based on traffic information
WO2021006405A1 Artificial intelligence server
WO2020184746A1 Artificial intelligence apparatus for controlling an auto-stop system based on driving information, and method therefor
WO2019240562A1 Electronic device and operating method thereof for outputting a response to a user input using an application
WO2020184753A1 Artificial intelligence apparatus for performing voice control using a voice extraction filter, and method therefor
WO2021215804A1 Device and method for providing an interactive audience simulation
WO2020032564A1 Electronic device and method for providing one or more items in response to a user's voice
WO2023033256A1 System and method for communication service using facial expressions learned from images of a companion animal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21956183

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE