WO2021080133A1 - Method for controlling automatic speaker utterance using human body communication and wearable device for performing the same - Google Patents

Method for controlling automatic speaker utterance using human body communication and wearable device for performing the same

Info

Publication number
WO2021080133A1
Authority
WO
WIPO (PCT)
Prior art keywords
human body
speaker
utterance
body communication
information
Prior art date
Application number
PCT/KR2020/009878
Other languages
English (en)
Inventor
Eun Kyeong KWON
Original Assignee
Dnx Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dnx Co., Ltd. filed Critical Dnx Co., Ltd.
Priority to JP2022524116A, published as JP2022553405A (ja)
Publication of WO2021080133A1 (fr)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0026Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the transmission medium
    • A61B5/0028Body tissue as transmission medium, i.e. transmission systems where the medium is the human body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7405Details of notification to user or communication with user or patient ; user input means using sound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7465Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/067Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components
    • G06K19/07Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips
    • G06K19/077Constructional details, e.g. mounting of circuits in the carrier
    • G06K19/07749Constructional details, e.g. mounting of circuits in the carrier the record carrier being capable of non-contact communication, e.g. constructional details of the antenna of a non-contact smart card
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/04Segmentation; Word boundary detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827Portable transceivers
    • H04B1/385Transceivers carried on the body, e.g. in helmets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B13/00Transmission systems characterised by the medium used for transmission, not provided for in groups H04B3/00 - H04B11/00
    • H04B13/005Transmission systems in which the medium consists of the human body
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/226Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics
    • G10L2015/227Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics of the speaker; Human-factor methodology
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827Portable transceivers
    • H04B1/385Transceivers carried on the body, e.g. in helmets
    • H04B2001/3861Transceivers carried on the body, e.g. in helmets carried in a hand or on fingers

Definitions

  • the present invention relates to a method of controlling the automatic utterance of a speaker provided in a wearable device or portable terminal or a speaker connected to a network by using human body communication.
  • the action of naturally coming into contact with an object in everyday life may be used as a method of identifying an object in real time or acquiring desired information.
  • spatial information, action information, desire information, traffic information, location information, and information about the use of an object may be acquired by coming into contact with a surrounding object.
  • human body communication technology may be applied.
  • human body communication technology transmits information via the human body through contact with it. Because communication is performed through contact with the human body, it is an intuitive communication technology that offers a high level of security and requires no complicated connection procedures.
  • human body communication can transmit and receive data only by contact with a human body.
  • the reader may receive tag information by using the human body as a communication medium.
  • the reader needs to be constructed in a form including a human body communication reception function, and may be implemented in the form of a wearable device that can be worn or carried on the human body, such as a bracelet, a watch, and a smartphone.
  • the tag needs to be constructed in a form including a human body communication transmission function.
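
To make this division of roles concrete, the following minimal Python sketch models the tag-to-reader exchange. The frame layout, field names, and class names are illustrative assumptions, not a format defined in this document.

```python
from dataclasses import dataclass, field

@dataclass
class TagFrame:
    """Hypothetical frame a human body communication tag transmits on contact."""
    tag_id: str                                      # identification data about the tagged object
    sensor_data: dict = field(default_factory=dict)  # e.g. pressure or temperature readings

class BodyChannelReader:
    """Stand-in for the human body communication reception function of a wearable reader."""
    def receive(self, frame: TagFrame) -> None:
        # In a real device the data would arrive through electrodes in contact
        # with the skin; here it is simply passed in and printed.
        print(f"action information from tag {frame.tag_id}: {frame.sensor_data}")

reader = BodyChannelReader()
reader.receive(TagFrame(tag_id="refrigerator-001", sensor_data={"pressure": 1.0}))
```
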
  • a conventional speaker starts to operate and responds to a preset specific voice when a user utters the preset specific voice.
  • accordingly, there is a need for a speaker capable of uttering in response to a user's actions, for example for a care service for the elderly.
  • the present invention has been conceived to overcome the above-described problems, and an object of the present invention is to provide a method of controlling the automatic utterance of a speaker using human body communication that can adaptively respond according to a user's action, surrounding situation, time, and/or the like and a wearable device that performs the method of controlling the automatic utterance of a speaker.
  • a method of controlling the automatic utterance of a speaker using human body communication in a wearable device including: receiving action information including identification data about an external device from the external device, with which a user comes into contact, by using human body communication; acquiring context information including data on a surrounding situation in response to the reception of the action information using the human body communication; determining whether to allow the automatic utterance of a speaker and an utterance message based on the received action information and the acquired context information; and processing the determined utterance message to be uttered via the speaker.
  • the determining may include: determining the type of external device, with which the user comes into contact, by using the received identification data; and reading a corresponding utterance message from memory or receiving a corresponding utterance message from an external server based on the determined type of external device and time or weather data included in the context information.
  • the steps of the method of controlling the automatic utterance of a speaker may be implemented as a computer program so that they can be performed in a portable terminal according to an embodiment of the present invention, and the corresponding computer program may be stored in a computer-readable storage medium.
  • a wearable device including: an information receiver configured to receive action information including identification data about an external device from the external device, with which a user comes into contact, by using human body communication; an information acquirer configured to acquire context information including data about at least one of current time, a location, and weather; an utterance determiner configured to determine whether to allow the automatic utterance of a speaker and an utterance message based on the received action information and the acquired context information; and a controller configured to process the determined utterance message to be uttered via the speaker.
  • action information is received from an external device, to which a tag is attached, via human body communication, context information is acquired from the inside or outside, and then whether to allow the automatic utterance of a speaker and an utterance message are determined based on the combination of the acquired pieces of information, so that adaptive speaker operation based on a user's specific situation and action may be implemented and used for a care service for the elderly, etc.
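
Read as pseudocode, this receive-acquire-determine-utter sequence could look like the sketch below. The helper names (`acquire_context`, `determine_utterance`, `on_action_information`) and the single time-based rule are assumptions for illustration only.

```python
import datetime

def acquire_context() -> dict:
    # Context information: only the current time here; location and weather could be added.
    return {"time": datetime.datetime.now().time()}

def determine_utterance(device_type: str, context: dict):
    # Returns the message to utter, or None to suppress automatic utterance.
    if device_type == "refrigerator" and context["time"] >= datetime.time(22, 0):
        return "Your meal is late. Please do not overeat!"
    return None

def on_action_information(device_type: str) -> None:
    # Called when action information arrives via human body communication.
    message = determine_utterance(device_type, acquire_context())
    if message is not None:
        print(f"[speaker] {message}")  # stand-in for utterance via the speaker

on_action_information("refrigerator")
```
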
  • FIG. 1 is a block diagram showing the overall configuration of a system for controlling the automatic utterance of a speaker using human body communication according to an embodiment of the present invention
  • FIG. 2 is a flowchart showing an embodiment of a method of controlling the automatic utterance of a speaker using human body communication according to the present invention
  • FIG. 3 is a view illustrating an embodiment of a method of automatically uttering a message via a speaker in a wearable device when a user comes into contact with an external device;
  • FIG. 4 is a table illustrating embodiments of a method of determining an utterance message based on action information and context information.
  • FIG. 5 is a diagram illustrating an embodiment of a method of providing information about a wearable device in a portable terminal.
  • a human body communication object-attachment tag device includes a human body communication tag body, a case, an object attachment portion, and a ground extension.
  • the human body communication tag body may transmit the identification information of a contacted object to a human body communication reception device through contact with a human body.
  • the human body communication tag body includes a sensor, a processor, storage, a human body communication transmitter, a battery, and a clock generator. These components are mounted on a printed circuit board (PCB) and electrically connected to each other.
  • the human body communication tag body includes a sensor extension and human body communication electrodes. These components may be disposed in parallel with each other on the object attachment portion, be exposed to the outside through the case, and be attached to an existing object so that part of a human body can come into contact with the object attachment portion.
  • the sensor may detect the contact state of a human body.
  • the sensor may be implemented as various sensors, such as a pressure sensor, a temperature sensor, a motion sensor, a proximity sensor, etc., in order to detect contact with a human body by sensing pressure, temperature, motion, or the like.
  • the sensor may detect contact with a human body by sensing a person's approach via a proximity sensor that detects proximity or contact when the person approaches within a predetermined distance.
  • the sensor may be connected to the sensor extension to extend a sensing portion for detecting contact with a human body.
  • the processor may perform processing appropriate for a contact situation when a contact with a human body is detected by the sensor.
  • the processor may estimate a user's situation based on context including at least one of contact or non-contact, the amount of motion, pressure, temperature, humidity, and illuminance, as detectable by the type of sensor used, and may determine, according to the estimated situation, at least one available condition under which information about the contacted object may be provided to the user.
  • the storage may store additional information together with object identification information or a unique number that can identify a contacted corresponding object.
  • the additional information may refer to a service profile including at least one piece of service content for each available condition that can be provided to a user by the object.
  • the object identification information may include information about an object to which a corresponding tag is attached, information about the manufacturer and manufacturing time of the tag, and the serial number of the tag for each type.
  • the additional information may refer to a count indicative of the number of times a user comes into contact without wearing a wearable device, i.e., a human body communication reception device, or may refer to a value sensed by a temperature or humidity sensor.
  • the identification information may include the class of object, the type of object, etc.
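
As a rough data model of what such storage might hold, the sketch below groups the identification information and additional information described above into one record. All field names and sample values are assumptions, not definitions from this document.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class TagRecord:
    # Object identification information.
    object_class: str
    object_type: str
    serial_number: str
    manufacturer: str
    manufactured_at: str
    # Additional information: a service profile mapping an available condition
    # to service content, a contact counter, and the last sensed values.
    service_profile: Dict[str, str] = field(default_factory=dict)
    contacts_without_wearable: int = 0
    last_sensed: Dict[str, float] = field(default_factory=dict)

record = TagRecord("appliance", "refrigerator", "SN-0001", "ACME", "2019-10")
record.service_profile["after_22:00"] = "late-meal reminder"
record.last_sensed["temperature"] = 21.5
print(record)
```
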
  • the human body communication transmitter may transmit the object identification information and the additional information to the human body communication reception device via the human body communication electrodes.
  • the battery supplies operating power to the human body communication tag body.
  • the clock generator receives operating voltage from the battery, and generates and provides the operating clock signals of respective components.
  • the sensor extension may be connected to the sensor and extend along the longitudinal direction of the object attachment portion, thereby extending the sensing range for detecting contact with a human body.
  • the sensor extension may sense contact with a human body and transfer a sensing signal to the sensor.
  • the human body communication electrodes may be connected to the human body communication transmitter, may form a human body communication channel with the human body communication reception device, and may output the object identification information and the additional information, transmitted by the human body communication transmitter, to the human body.
  • the human body communication electrodes may be disposed in parallel with the sensor extension on the object attachment portion. Accordingly, in the process of contact with a human body, the detection of contact with the human body and the formation of a human body communication channel may be simultaneously performed.
  • the case may include an upper case and a lower case, and may serve to protect the human body communication tag body by being installed to surround the overall human body communication tag body.
  • the case may be made of polycarbonate (PC), acrylonitrile-butadiene-styrene (ABS), silicone, rubber, etc., and may be easily attached to the surface of an existing object.
  • part or all of the case may be made of or coated (painted) with a conductive material, and thus the case may be electrically connected to the PCB of the human body communication tag body.
  • the object attachment portion is exposed to the outside through the case and extends an attachment portion for an object.
  • the sensor extension and the human body communication electrodes are disposed in parallel with each other.
  • the object attachment portion may be attached along the surface of an object regardless of the size, shape and area of an existing object, and may be implemented in the form of flexible printed circuit boards (FPCBs) or thin film strips having various lengths, various areas, and various designs according to the surface shape of an attachment target object.
  • the ground extension may extend the ground (GND) of the PCB of the human body communication tag body according to the transmission performance of human body communication.
  • the human body communication object-attachment tag device may include the human body communication tag body, the case configured to surround and protect the human body communication tag body, the object attachment portion, and the ground extension.
  • the human body communication object-attachment tag device may be attached to the surface of an external device, i.e., a specific object, by using double-sided tape or the like.
  • the human body communication object-attachment tag device may be attached to a portion of each of various types of objects with which a user frequently comes into contact.
  • for example, the human body communication object-attachment tag device may be attached to the grip of a refrigerator.
  • the case is attached to the surface of an object and the human body communication tag body is contained in the case.
  • the battery may be mounted on one surface of a PCB, and the sensor, the processor, the storage, the human body communication transmitter, and the clock generator may be mounted on the other surface of the PCB.
  • the object attachment portion may be electrically connected to the human body communication tag body, extended up to a portion with which an object comes into contact, and then attached.
  • the sensor extension and the human body communication electrodes may be disposed in parallel with each other.
  • the ground extension may be electrically connected to one side of a PCB via a connector, and the sensor extension and the human body communication electrodes may be electrically connected to the other side of the PCB.
  • the sensor extension may be selectively included depending on the type of sensor.
  • the ground extension, the sensor extension, and the human body communication electrodes may each be implemented in the form of a thin-film flexible printed circuit board (FPCB).
  • the ground extension may be attached to the surface of an object or disposed on the top, bottom or side surface of the case, and may serve to extend the ground GND of a PCB.
  • the sensor extension and the human body communication electrodes may be extended along a surface or line in contact with a human body, and may then be attached.
  • the human body communication tag body is formed by mounting the sensor, the processor, the storage, the human body communication transmitter, the human body communication electrodes, the battery, the clock generator, and the sensor extension on a PCB.
  • the sensor may detect the proximity or contact state of a human body.
  • the sensor may transfer a contact detection signal to the processor when a human body is in a proximity or contact state.
  • the contact detection signal may wake up the processor.
  • the sensor extension disposed to be exposed to the outside through the case may be electrically connected to the sensor.
  • the sensor extension may be implemented using an FPCB-type electric conductor or conductive fiber, and may extend the sensing range of the sensor and facilitate a user's contact with an object.
  • the processor is activated upon receiving a contact detection signal, estimates the contact situation from the signal, and either stores information acquired during the estimation in the storage or reads from the storage the identification information of the corresponding object together with additional information based on the estimated situation, transferring the read information to the human body communication transmitter.
  • the processor may process a contact detection signal as the value of the amount of motion occurring when a user moves an object in a process in which a human body comes into contact with the object.
  • the processor may process a contact detection signal as the value of the pressure that is applied to an object by a user in a process in which a human body comes into contact with the object.
  • the processor may be implemented as a central processing unit (CPU), in which case the storage and the human body communication transmitter may be included in the CPU.
  • the storage may store information that is transmitted to the human body communication reception device in a process in which a human body comes into contact with an object.
  • the storage may store the identification information of an object including the type, manufacturing history, number of contacts and contact time of the corresponding object.
  • the human body communication transmitter may output the sensor data processed from a contact detection signal by the processor together with the identification information of an object via the human body communication electrodes.
  • the sensor data may include values of the amount of motion, brightness, temperature, and humidity.
  • the human body communication electrodes may be electrically connected to the human body of a user, may form a communication channel, and may output the identification information of a corresponding object and sensor data transmitted by the human body communication transmitter to a wearable device, i.e., the human body communication reception device including a human body communication reception function and worn by a user.
  • the clock generator may receive operating power from the battery, may generate clocks, and may provide the clocks to the CPU including the processor, the storage, and the human body communication transmitter.
  • the components constituting the human body communication tag body may be mounted on a PCB and fabricated in a small size.
  • the sensor extension and the human body communication electrodes may be connected to the PCB at their first ends, may be exposed to the outside through the case, and may be extended in FPCB form, thereby forming an object attachment portion that a human body can easily approach or come into contact with and that can be easily attached to an object.
  • the PCB of the human body communication tag body may be manufactured in a circular or square shape, but is not limited thereto.
  • the PCB may be fabricated in a shape corresponding to the size or shape of an object.
  • the human body communication object-attachment tag device may be attached to the surfaces of objects such as a grip of a door, a grip of a refrigerator, a toilet seat, etc., may detect a situation in which a human body comes into proximity or contact via the sensor, may use a contact detection signal as a wake-up signal to save the power of the battery, and may automatically transmit identification information unique to an object and sensor data via human body communication in a process in which a human body comes into contact with an object.
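
The power-saving wake-and-transmit behavior summarized above might be structured like the following sketch, with a polling loop standing in for a hardware interrupt; the sensor and transmitter functions are simulated assumptions, not firmware from this document.

```python
import random
import time

def contact_detected() -> bool:
    # Simulated pressure/motion/proximity reading; a real tag would instead be
    # woken by the sensor's contact detection signal (an interrupt).
    return random.random() < 0.3

def transmit_via_electrodes(payload: dict) -> None:
    # Stand-in for the human body communication transmitter and electrodes.
    print(f"[body channel] {payload}")

def tag_main_loop(tag_id: str) -> None:
    for _ in range(10):  # bounded so the sketch terminates
        if contact_detected():
            # On contact, send the object's identification information and sensor data.
            transmit_via_electrodes({"tag_id": tag_id, "motion": 0.0})
        time.sleep(0.05)  # stay idle between checks to conserve the battery

tag_main_loop("door-grip-001")
```
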
  • FIG. 1 is a block diagram showing the overall configuration of a system for controlling the automatic utterance of a speaker using human body communication according to an embodiment of the present invention.
  • the shown system may include a wearable device 10 and a human body communication tag device 20.
  • the wearable device 10 may include an information receiver 110, an information acquirer 120, an utterance determiner 130, memory 140, a speaker 150, and a controller 160.
  • the information receiver 110 receives action information including identification data about an external device from the external device, with which a user comes into contact, by using human body communication.
  • the human body communication tag device 20 is attached to an external device. Since the configuration and operation of the human body communication tag device 20 may be the same as those of the above-described human body communication object-attachment tag device, detailed descriptions thereof will be omitted.
  • the human body communication tag device 20 may be attached to various external devices such as a refrigerator, a medicine container, a boiled-rice container, a toilet door, a front door, a remote control, etc., and may transmit identification information about a corresponding external device, etc. via human body communication when a user comes into contact with the external device.
  • the wearable device 10 may be an electronic device that may be worn by a user or attached to the body of a user, and may be an electronic device in the form of a smart watch.
  • however, the wearable device 10 is not limited thereto, and may be implemented in various forms such as glasses, a bracelet, a ring, shoes, and clothing.
  • the information receiver 110 of the wearable device 10 may include a reader module in order to enable data transmission and reception to and from the tag device 20, attached to an external device, via human body communication.
  • the information acquirer 120 acquires context information indicative of the current situation of a user.
  • the context information may include data about the current time, a location, weather, etc., but is not limited thereto.
  • the context information may be data stored in the wearable device 10, data received from an external server, or data acquired using at least part of data stored in the wearable device 10 and data received from an external server.
  • the information acquirer 120 receives context information from an external device or server via a separate portable terminal or a wired/wireless network.
  • the utterance determiner 130 determines whether to allow the automatic utterance of the speaker 150 and an utterance message based on the received action information and the acquired context information.
  • the utterance determiner 130 may determine the type of external device, with which a user comes into contact, by using identification data included in the action information, and may read or receive a corresponding utterance message from the memory 140 or external server based on the type of external device and time or weather data included in the context information.
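
This memory-or-server lookup is essentially a local-store-with-fallback pattern. A minimal sketch follows, with a hypothetical `fetch_from_server` callable in place of a real server request:

```python
def get_utterance_message(key, local_memory: dict, fetch_from_server):
    # Prefer the message stored in local memory; otherwise fall back to the server.
    message = local_memory.get(key)
    if message is None:
        message = fetch_from_server(key)  # hypothetical network call
    return message

memory = {("refrigerator", "after_22:00"): "Your meal is late. Please do not overeat!"}
print(get_utterance_message(("refrigerator", "after_22:00"), memory,
                            lambda key: "(server-provided message)"))
```
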
  • the controller 160 may process the utterance message determined by the utterance determiner 130 to be uttered via the speaker 150, and may serve to control the overall operation of the wearable device 10.
  • although the present invention has been described with reference to FIG. 1 using an example in which the speaker 150 is provided in the wearable device 10, the present invention is not limited thereto.
  • the speaker 150 may be located outside the wearable device 10 and transmit and receive data to and from the wearable device 10 via wired/wireless communication.
  • the speaker 150 may be a speaker provided in a portable terminal wirelessly connected to the wearable device 10, or a separate artificial intelligence (AI) speaker connected over a wired/wireless network.
  • Embodiments of a method of controlling the automatic utterance of a speaker using human body communication according to the present invention will be described in detail below with reference to FIGS. 2 to 5.
  • FIG. 2 is a flowchart showing an embodiment of a method of controlling the automatic utterance of a speaker using human body communication according to the present invention. The shown control method will be described in conjunction with the block diagram showing the overall configuration of the system for controlling the automatic utterance of a speaker using human body communication according to an embodiment of the present invention shown in FIG. 1.
  • the information receiver 110 of the wearable device 10 receives action information including identification data about an external device from the corresponding external device, with which a user comes into contact, by using human body communication at step S200.
  • the external device may be an external device to which the human body communication tag device 20 is attached.
  • action information may be received from the tag device 20 having a human body communication transmission function that is attached to the corresponding external device.
  • the external device may be any one of a refrigerator, a medicine container, a boiled-rice container, a toilet door, a front door, and a remote control.
  • the information acquirer 120 acquires context information including data on a surrounding situation in response to the reception of the action information using human body communication at step S210.
  • the context information acquired at step S210 includes data on at least one of current time, a location, and weather.
  • the information stored in the internal memory 140 may be used as the context information, or the context information may be acquired by receiving the information from an external server.
  • the utterance determiner 130 determines whether to allow the automatic utterance of the speaker 150 based on the action information received at step S200 and the context information acquired at step S210.
  • the speaker 150 may be provided in the wearable device 10, or may be provided in a portable terminal (not shown) wirelessly connected to the wearable device 10.
  • the utterance determiner 130 determines a message to be automatically uttered by the speaker 150 at step S240 based on the action information received at step S200 and the context information acquired at step S210.
  • the controller 160 processes the utterance message, determined at step S240, to be uttered via the speaker 150 at step S250.
  • the utterance determiner 130 determines the type of external device, with which a user comes into contact, by using the identification data about the external device received from the tag device 20 via human body communication.
  • the utterance determiner 130 may read a corresponding utterance message from the memory 140 or receive a corresponding utterance message from an external server based on the determined type of external device and the time or weather data included in the context information acquired via the information acquirer 120.
  • the controller 160 allows a corresponding specific utterance message to be automatically uttered via the speaker 150.
  • the "touch tag,” i.e., the human body communication tag device 20 may transmit action information including the identifier of the corresponding refrigerator to the "touch band," i.e., the wearable device 10, via human body communication.
  • the "touch band,” i.e., the wearable device 10 may determine that the external device with which the user comes into contact is a "refrigerator" by using the identifier included in the action information received via human body communication, and may acquire context information including current time.
  • the "touch band,” i.e., the wearable device 10 may check whether the current time is after preset 22:00 (10 p.m.), and, if the current time is after 22:00, may read an utterance message (e.g., "Your meal is late. Please do not overeat ⁇ !”) corresponding to the action information (contact with the refrigerator) and the context information (after 22:00) from the memory 140 and output the utterance message via the speaker 150.
  • FIG. 4 illustrates embodiments of a method of determining an utterance message based on action information and context information. This drawing shows examples of an utterance message that is stored in the memory 140 to correspond to action information and context information.
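
Reading FIG. 4 as a lookup table keyed by (user action, context condition), a corresponding in-memory structure might look like the sketch below. Only the refrigerator entry comes from the example in the text; the other entries and the condition labels are invented placeholders.

```python
# (action, condition) -> utterance message.
UTTERANCE_TABLE = {
    ("refrigerator", "after_22:00"): "Your meal is late. Please do not overeat!",
    ("medicine container", "morning"): "Good morning. Please take your medicine with a glass of water.",
    ("front door", "first_touch_after_returning_home"): "Welcome home.",
}

def lookup_message(action: str, condition: str):
    # Returns None when no entry matches, i.e. no automatic utterance.
    return UTTERANCE_TABLE.get((action, condition))

print(lookup_message("refrigerator", "after_22:00"))
```
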
  • a user's actions may be determined via the wearable device 10 by using the human body communication tag devices 20 attached to various external devices, respectively.
  • a user's action, such as a "first touch after waking up," a contact with a "refrigerator," a contact with a "rice cooker," a contact with a "medicine container," a contact with a "water container," a contact with a "remote control," or a "first touch after returning home," may be recognized based on the action information received by the wearable device 10 from the tag device 20 via human body communication.
  • a message to be automatically uttered via the speaker 150 of the wearable device 10 may be determined by considering not only the above-identified user's action but also the context information acquired via the wearable device 10.
  • the utterance determiner 130 may select an utterance message appropriate for the user's action information and the surrounding context information from among a plurality of utterance messages stored in the memory 140.
  • an utterance message may be determined by additionally considering user information.
  • the information acquirer 120 of the wearable device 10 may acquire user information including data on at least one of a user's name, gender, age, health-related numerical value, and disease history.
  • the utterance determiner 130 may determine whether to allow the automatic utterance of the speaker 150 and an utterance message based on the acquired user information together with the action information and the context information.
  • for example, the message "Ma'am, it is better to drink hot tea than cold water during the change of season" may be automatically uttered via the speaker 150.
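
A small sketch of how such user information might adjust the wording follows; the profile fields and the honorific rule are assumptions for illustration, not logic defined in this document.

```python
def personalize(template: str, user: dict) -> str:
    # Chooses an honorific from an assumed 'gender' field in the user information.
    honorific = "Ma'am" if user.get("gender") == "female" else "Sir"
    return template.format(honorific=honorific)

print(personalize(
    "{honorific}, it is better to drink hot tea than cold water during the change of season.",
    {"name": "Kim", "gender": "female", "age": 72},
))
```
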
  • At least part of the action information received by the wearable device 10 by using human body communication and the context information acquired by the wearable device 10 as described above, or the automatic utterance message determined by the wearable device 10 may be transmitted to an external portable terminal (not shown) via wireless communication.
  • the portable terminal may display the information or message transmitted from the wearable device 10 on a screen or output the information or message via the speaker in the form of voice, and may analyze and process the transmitted information and provide the results of the analysis and the processing to the user.
  • FIG. 5 is a diagram illustrating an embodiment of a method of providing information about a wearable device in a portable terminal.
  • the portable terminal may be connected to the wearable device 10 via a wireless communication method such as Bluetooth, and may use the information received from the wearable device 10 to display, in various forms, data about the user's object-contact actions that trigger automatic utterance.
  • the methods according to the embodiments of the present invention described above may each be manufactured as a program to be executed on a computer and stored in a computer-readable storage medium.
  • examples of the computer-readable storage medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage.
  • the computer-readable storage medium may be distributed over computer systems connected over a network so that computer-readable code is stored and executed in a distributed fashion.
  • Functional programs, codes, and code segments for implementing the method may be easily inferred by programmers in the technical field to which the present invention pertains.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Nursing (AREA)
  • Computational Linguistics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • User Interface Of Digital Computer (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention relates to a method for controlling the automatic utterance of a speaker by using human body communication in a wearable device, the method comprising: receiving action information, which includes identification data about an external device, from the external device with which a user comes into contact, by using human body communication; acquiring context information, which includes data on a surrounding situation, in response to the reception of the action information using human body communication; determining whether to allow the automatic utterance of a speaker, and an utterance message, on the basis of the received action information and the acquired context information; and processing the determined utterance message so that it is uttered via the speaker.
PCT/KR2020/009878 2019-10-22 2020-07-27 Method for controlling automatic speaker utterance using human body communication and wearable device for performing the same WO2021080133A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022524116A 2019-10-22 2020-07-27 Method for controlling automatic speaker utterance using human body communication and wearable device for performing the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0131200 2019-10-22
KR1020190131200A 2019-10-22 2020-07-27 Method for controlling automatic speaker utterance using human body communication and wearable device for performing the same

Publications (1)

Publication Number Publication Date
WO2021080133A1 (fr) 2021-04-29

Family

ID=75620148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/009878 WO2021080133A1 (fr) 2019-10-22 2020-07-27 Method for controlling automatic speaker utterance using human body communication and wearable device for performing the same

Country Status (3)

Country Link
JP (1) JP2022553405A (fr)
KR (1) KR20210047510A (fr)
WO (1) WO2021080133A1 (fr)

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
KR20230111623 (ko) 2022-01-17 2023-07-26 광운대학교 산학협력단 Non-contact recognizable BT recognition tag, system having the same, and recognition method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090111618A * 2008-04-22 2009-10-27 조병수 Patient management system and service method
US20130086056A1 (en) * 2011-09-30 2013-04-04 Matthew G. Dyor Gesture based context menus
US20150006670A1 (en) * 2013-06-27 2015-01-01 Samsung Electronics Co., Ltd. Electronic device and method for exchanging information using the same
KR20160001958A * 2014-06-30 2016-01-07 주식회사 만민엔터프라이즈 Exercise training method using wearable device, and system and wearable device therefor
KR20190024513A * 2017-08-31 2019-03-08 한국과학기술원 Environment recognition method using visible light communication in an AR environment

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR100942706B1 (ko) 2008-08-22 2010-02-16 한국전자통신연구원 Radio frequency identification system using human body communication


Also Published As

Publication number Publication date
JP2022553405A (ja) 2022-12-22
KR20210047510A (ko) 2021-04-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20879487

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022524116

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/09/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20879487

Country of ref document: EP

Kind code of ref document: A1