EP2984546A1 - Device and method for audible and tactile interaction between objects - Google Patents

Device and method for audible and tactile interaction between objects

Info

Publication number
EP2984546A1
Authority
EP
European Patent Office
Prior art keywords
communicating
event
message
entity
communicating entity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14716315.8A
Other languages
English (en)
French (fr)
Inventor
Jean-Marc Alexandre
Xavier APOLINARSKI
Christian Bolzmacher
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Commissariat à l'Energie Atomique et aux Energies Alternatives CEA
Original Assignee
Commissariat à l'Energie Atomique et aux Energies Alternatives CEA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Commissariat à l'Energie Atomique et aux Energies Alternatives CEA
Publication of EP2984546A1
Current legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1698Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G06F3/0436Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which generating transducers and detecting transducers are attached to a single acoustic waves transmission substrate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R17/00Piezoelectric transducers; Electrostrictive transducers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/02Casings; Cabinets ; Supports therefor; Mountings therein
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R17/00Piezoelectric transducers; Electrostrictive transducers
    • H04R17/02Microphones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2201/00Details of transducers, loudspeakers or microphones covered by H04R1/00 but not provided for in any of its subgroups
    • H04R2201/02Details casings, cabinets or mounting therein for transducers covered by H04R1/02 but not provided for in any of its subgroups
    • H04R2201/021Transducers or their casings adapted for mounting in or to a wall or ceiling
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2420/00Details of connection covered by H04R, not provided for in its groups
    • H04R2420/07Applications of wireless loudspeakers or wireless microphones

Definitions

  • the invention relates to the field of interactive systems and the web of objects, and addresses more particularly the sound and tactile interaction between objects.
  • non-communicating or passive objects, such as furniture or decorative objects, do not have the same ability to communicate and interact.
  • Nikolovski's patent FR 2 825 882 A1 proposes an interactive glazing unit having microphone and loudspeaker functions, based on adapting a plate into a device for emitting acoustic waves into the air by means of a piezoelectric transducer glued to the plate.
  • One of the limitations of this system is that it cannot operate simultaneously in loudspeaker mode and microphone mode, i.e. it cannot communicate in "full duplex" mode.
  • An object of the present invention is to provide a hardware and software architecture that increases the interaction and communication capabilities of everyday objects by transferring to them all or part of the capabilities of so-called communicating systems.
  • Another object of the present invention is to provide an intuitive and configurable system that spatially extends all or part of the functions of a cellular phone, a PC or any other system with a processor.
  • An advantage of the present invention is to offer a highly integrated, low-cost embodiment, by adding to the non-communicating objects a module comprising piezoelectric elements, without altering the primary use of the objects.
  • the system which is the subject of the present invention is an integrated mechatronic and software device making it possible to transform a larger object or system, such as a house or a car, into a sound and tactile capture system, and making it possible to remotely control these objects or systems via a cell phone or a PC.
  • the system of the invention offers new consumer applications such as leisure, surveillance or telepresence, by providing a vehicle or a home with sound capabilities.
  • the present invention provides new services made possible by the sonification of non-communicating objects.
  • the invention relates to a method of interaction between a communicating entity and at least one non-communicating entity, the non-communicating entity being equipped with an interaction device comprising piezoelectric means affixed to the surface of the non-communicating entity and interaction means, the device being able to perform the steps of the method.
  • an identification and location technology is used.
  • RFID technology or NFC technology is used.
  • the detection step consists in detecting an event in a user interface zone of the interaction device, in particular detecting a press or an impact on the functionalized surface, or detecting a voice message.
  • the message sent to the communicating entity comprises a header giving information on the communication protocol between the interaction device and the communicating entity and a message body containing the characterization parameters.
  • the message received from the communicating entity comprises a header giving information on the communication protocol between the interaction device and the communicating entity and a message body containing information for triggering an action associated with said event.
  • the protocol is, in a preferred implementation, the Bluetooth or WiFi protocol.
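  • To make this message structure concrete, the following minimal Python sketch shows how such a header/body message could be assembled; the field names (protocol, device_id, event_type, parameters) are illustrative assumptions and are not specified by the patent.

```python
import json

# Hypothetical field names: the description only requires a header carrying
# protocol information and a body carrying the characterization parameters.
def build_event_message(device_id: str, event_type: str, parameters: dict) -> bytes:
    """Serialize an event detected on a non-communicating object."""
    message = {
        "header": {
            "protocol": "bluetooth",   # or "wifi", per the preferred implementation
            "device_id": device_id,    # identifies the interaction device
        },
        "body": {
            "event_type": event_type,  # e.g. "impact", "press", "voice"
            "parameters": parameters,  # characterization parameters of the event
        },
    }
    return json.dumps(message).encode("utf-8")

# Example: two impacts detected on the functionalized surface of a door
payload = build_event_message("door-patch-01", "impact", {"count": 2, "strength": 0.7})
```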
  • the electrical signal resulting from the conversion is amplified.
  • by means of the piezoelectric elements, structures or non-communicating surfaces can be vibrated, making it possible to transform the vibration into an emitted acoustic wave.
  • the message generated by the communicating entity is sent to a plurality of non-communicating entities.
  • the non-communicating objects are chosen from the group of objects such as a wall, a door, a glass surface or other passive surface, or glasses.
  • the communicating entity can be a cell phone.
  • the invention also relates to a system for interacting between a communicating entity and a plurality of non-communicating entities, each non-communicating entity being equipped with an interaction device comprising piezoelectric means affixed to the surface of the non-communicating entity and means of interaction, the interaction device being able to perform the steps of the method according to any one of the claims.
  • the communicating entity comprises means for identifying the non-communicating entity, classifying an event, generating an execution message of an action associated with the event and sending the message to the interaction device of the identified non-communicating entity.
  • the communicating entity also comprises means for defining and network-configuring the interaction devices of the plurality of non-communicating entities.
  • the invention also relates to an interaction device for making a non-communicating object communicate, the device comprising:
  • a processing module comprising:
  • a user interface deported onto the passive surface of the non-communicating object, adapted to detect an event on an area of said user interface;
  • a control module for characterizing the detected event;
  • a communication module to generate a message containing information characterizing the detected event and identifying the non-communicating object, and to receive a digital message containing information to trigger on the non-communicating object an action associated with said event; and piezoelectric means affixed to the non-communicating object and coupled to the processing module, adapted to generate a mechanical vibration of the area, which converts it into an acoustic wave in sound emission mode, or to produce a mechanical vibration from an acoustic wave in sound reception mode.
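  • Purely as an illustration, the modules enumerated above could be organized as in the following Python sketch; the class and method names are assumptions chosen for readability, not part of the claimed device.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlModule:
    """Characterizes a detected event (type, strength, duration, ...)."""
    def characterize(self, samples: List[float]) -> dict:
        return {"strength": max(samples, default=0.0), "length": len(samples)}

@dataclass
class CommunicationModule:
    """Sends characterization messages and receives action messages (Bluetooth/WiFi)."""
    def send(self, message: bytes) -> None: ...
    def receive(self) -> bytes: ...

@dataclass
class PiezoelectricMeans:
    """Drives the passive surface (sound emission) or senses it (sound reception)."""
    def emit(self, samples: List[float]) -> None: ...
    def capture(self) -> List[float]: ...

@dataclass
class InteractionDevice:
    """Processing module grouping the sub-modules described in the text."""
    control: ControlModule = field(default_factory=ControlModule)
    radio: CommunicationModule = field(default_factory=CommunicationModule)
    piezo: PiezoelectricMeans = field(default_factory=PiezoelectricMeans)
```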
  • FIG. 1: a block diagram of the architecture of the system of the invention;
  • FIG. 2: a first implementation of the interaction device of the invention;
  • Figure 3: a variant of the interaction device of the invention;
  • Figure 4: a sequence of the main steps of the interaction between objects according to the principle of the invention.
  • Figure 1 shows a general architecture of the system of the invention comprising a plurality of non-communicating objects (102-1 to 102-n) and at least one communicating object (110).
  • Non-communicating objects can be arranged in the same place, such as an enclosed space like a house or a car.
  • These non-communicating objects may be for example a wall, a door, a glass surface, but also devices worn close to the body such as glasses or any other passive surface for affixing or coupling an interaction module (103-1 to 103-n). It should be noted that the principle that is described can be applied to a single non-communicating object.
  • Each non-communicating entity (102-i) is equipped with an interaction device (103-i) which will interact with the communicating entity (110) according to the principle of the invention.
  • the interaction device (103-i) generally comprises two modules (104-i, 106-i) that are operatively coupled and which are described in more detail with reference to FIG. 2.
  • the non-communicating objects will establish via the interaction device a communication with the communicating entity (110).
  • the identification, localization and communication of non-communicating objects can equally rely on protocols such as RFID, NFC, GSM, Bluetooth, WiFi or any other wireless protocol allowing the transfer of data.
  • a non-communicating object may be equipped with an interaction module for establishing a wired communication with the communicating entity.
  • the communicating entity (110) may be a remote or local device, fixed or mobile, of the cell phone, personal computer or smartphone type, or more generally any device comprising information processing circuits in data transmission mode and data reception mode.
  • the communicating entity, in addition to standard information-processing circuits, comprises a configuration module (112) and an application module (114).
  • the configuration module (112) comprises means for defining, configuring and identifying each interaction device (103-i) that functionalizes the object on which it is positioned.
  • the configuration module includes means for defining and configuring several interaction devices in a network.
  • the application module (114) is coupled to the configuration module and comprises means of exchange with the existing applications of the communicating entity in order to obtain or retrieve general and/or personal data specific to the holder of the communicating entity.
  • the configuration and application modules make it possible to manage a whole set of interaction devices and to create application services by means of either general data (weather, web, etc.) or personal data (agenda, music, etc.).
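  • Purely as an illustration, the bookkeeping performed by the configuration module could look like the following sketch; the identifiers and fields are assumed for the example and are not specified by the patent.

```python
# Hypothetical registry kept by the configuration module (112) on the communicating entity.
interaction_devices: dict = {}

def register_device(device_id: str, location: str, protocol: str) -> None:
    """Declare an interaction device (103-i) and the object it functionalizes."""
    interaction_devices[device_id] = {"location": location, "protocol": protocol}

register_device("patch-door-01", "front door", "bluetooth")
register_device("patch-window-02", "kitchen window", "wifi")
```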
  • FIG. 2 represents a first implementation of the interaction device (103) intended to be affixed to a non-communicating object in order to functionalize it.
  • the device includes a sound module (104) coupled to a processing module (106) for classifying a sensed event and transmitting it to the communicating entity.
  • the sound module preferably consists of one or more piezoelectric pellets connected to the processing module by an electrical wire and connector (108).
  • the processing module (106) includes a user interface (202), a power source (204) and an electronics block (206).
  • the user interface (202) consists of a touch zone around the interaction module that makes it possible to initiate an event by pressure or impact in the zone.
  • the interface may also include a standard microphone or use the functionalized surface as the acoustic antenna with the help of the piezoelectric pellet to receive an audible signal from the user.
  • the voice command is transferred to the communicating entity and processed thereon.
  • the power source (204) provides power to the interaction module and can be either a rechargeable battery, disposable batteries or an external source (220 V-50 Hz mains supply, external battery, photovoltaic cell, etc.).
  • the electronics unit (206) includes a communication module (208), a control module (210), an audio amplifier (212), an antenna (214) and a power supply (216).
  • the communication module (208) is coupled to the control module and maintains the communication via a radio communication link that will transmit or receive a signal.
  • the communication link can be a WiFi or Bluetooth type link.
  • the control module (210) integrates a processing unit, a memory module and a digital input / output management module.
  • a low power microcontroller can alternatively provide all of these functions.
  • the audio amplifier (212) amplifies the acoustic wave emitted from the sound module.
  • the audio amplifier comprises at least one amplifier stage, a low-pass filter stage and a high-voltage amplifier stage for supplying the piezoelectric pellet.
  • the antenna (214) captures or radiates electromagnetic waves. It is a device for transforming an electrical signal in a conductor into an electromagnetic signal in space. Its size and geometry are adapted to the frequency band to transmit.
  • the processing module (106) may have attachment means (not shown) on the object to be functionalized, such as glue or self-gripping tape for example.
  • the attachment means make it possible to recharge the battery on a suitable docking station.
  • the user interface (202) can also be equipped with an alphanumeric display screen.
  • the piezoelectric pellet(s) of the sound module (104) are affixed to the object to be functionalized. They can be glued or mechanically attached.
  • the one or more piezoelectric pellets are connected to the audio amplifier (212) of the electronic block.
  • each piezoelectric element will generate or impose a mechanical vibration of the rigid element or passive surface on which it is placed and this surface will transform this mechanical vibration into an acoustic wave (sound emission mode) or conversely produce a mechanical vibration from an acoustic wave (sound reception mode).
  • the piezoelectric pellets typically have a thickness between 0.1 mm and 1 mm with a rectangular or circular geometry. Their size may vary depending on the surface to be functionalized between 1mm and 50mm.
  • the sound module (104) combines loudspeaker and microphone functions with the capture of a vibration or an impact, which can for example be generated by a fingernail acting as a "click" function.
  • the processing module (106) may be a housing 5 to 10 cm long and of the order of 5 mm thick.
  • the implementation where the user interface includes buttons is particularly suitable for use on a communicating door.
  • the piezoelectric pellet transforms the impact on the door into an electrical signal that can be sent to the communicating entity. After processing on the communicating entity, a signal is transmitted back to the non-communicating entity. This signal is amplified and supplied to the piezoelectric pellet. The piezoelectric pellet then sets the door into vibration.
  • An advantage of the system of the invention is that it makes it possible, on the one hand, to replace the bell button and the chime of a door with a single integrated object and, on the other hand, to broadcast contextualized and personalized messages.
  • the configuration / management module integrated with the communicating entity makes it possible to define the message to be transmitted as a function of time and context.
  • several options can be considered:
  • the melody or message can be programmed according to the time (or other parameters if the communicating entity is not connected) or in relation to the weather, by establishing a relationship with a weather application installed on the communicating entity.
  • User information mode for distributing contextualized information for the user.
  • an audible message is broadcast, for example indicating agenda tasks to be performed.
  • the tasks are retrieved via the communication module from the user's "to-do" list in the agenda available on the communicating entity.
  • Visitor information mode, which allows the user to define the message to be broadcast to the visitor based on user parameters, such as his or her schedule.
  • the message may, for example, indicate a meeting place.
  • other examples are messages broadcasting store opening hours, messages indicating the office door of a given person or, for the door of a teenager's room, a spoken message adapted to his or her mood or sounds such as cell phone ringtones.
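  • A minimal sketch, with purely illustrative rules and hypothetical data, of how the configuration module could pick a contextualized doorbell message from the time of day, the weather and the user's agenda:

```python
from datetime import datetime
from typing import List, Optional

def select_door_message(now: datetime, agenda: List[str], weather: Optional[str]) -> str:
    """Return the message to broadcast when a visitor knocks (illustrative rules only)."""
    if agenda:
        return f"I am busy at the moment; next appointment: {agenda[0]}."
    if weather == "rain":
        return "Please step under the porch, it is raining."
    if now.hour >= 21 or now.hour < 8:
        return "It is late; please leave a message after the chime."
    return "Welcome, someone will be with you shortly."

print(select_door_message(datetime.now(), [], "rain"))
```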
  • the device of Figure 3 comprises a sound module (104) coupled to a processing module (106).
  • the implementation of FIG. 3 presents a variant of implementation of the interaction device of the invention, particularly adapted to a use on a glass surface.
  • the user interface is composed of piezoelectric pellets which functionalize the passive surface in order to detect a pressure variation, an impact or speech.
  • a touch language makes it possible to invoke different functions, such as launching music, reading a "Short Message Service" (SMS) message or changing the music volume. For example, one impact at a specific location selects the music mode (302), while two impacts switch to the SMS mode (304). Another location can be used to change the volume (312, 314), read the next SMS, change the track, etc.
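  • Such a touch language can be thought of as counting impacts that fall within a short time window; the following sketch is one possible way of classifying tap sequences and is an assumption, not the classification logic disclosed in the patent.

```python
from typing import List

def classify_taps(tap_times: List[float], window: float = 0.6) -> str:
    """Map a burst of impact timestamps (in seconds) to a mode by counting taps.

    Taps separated by more than `window` start a new burst; only the last burst counts.
    """
    count, last = 0, None
    for t in tap_times:
        count = count + 1 if last is not None and t - last <= window else 1
        last = t
    return {1: "music_mode", 2: "sms_mode"}.get(count, "unknown")

print(classify_taps([0.00, 0.35]))  # -> "sms_mode"
```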
  • the sound module (104) and the processing module (106) are arranged on the glass surface.
  • a user can select one of the proposed applications by pressing, touching or impacting the surface. If the user chooses the music function, a listing of the contents of the music library available on the communicating entity is proposed and displayed on the alphanumeric screen (316). An impact at a specific location or a pressure on the passive surface makes it possible to choose the track or to adjust the sound level in the room.
  • In another mode of operation, when a call is received on the communicating entity, the window emits the telephone ringing sound. By tapping on the window, it is possible to take the call.
  • the modules can be configured and networked on the same communicating entity.
  • In another mode of operation, when an SMS message is received, the glass emits the "SMS" notification sound of the communicating entity.
  • By tapping on the passive surface, either the alphanumeric screen displays the SMS, or the electromechanical conversion system delivers a sound corresponding to the message being read aloud.
  • a specific touch language can be programmed to access the previous and next messages.
  • each interaction module installed on a non-communicating object makes it possible to reproduce, in all the rooms of a house, all or part of the applications of a mobile phone, with a highly integrated, low-cost system.
  • An advantage of the present invention is to allow multi-user use, for example by positioning interaction modules on the windows of each room of a house and configuring each module relative to a communicating entity.
  • the present invention makes a kitchen worktop interactive.
  • By arranging at least one piezoelectric pellet on the worktop, connected to an interaction module that is itself placed in communication with a personal computer acting as the communicating entity, the piezoelectric pellet sonifies the worktop and the steps of a recipe can be read out.
  • if the interaction module includes a microphone, the user can request, with the words "next" or "previous", that the system broadcast the message of the corresponding step.
  • the user can state keywords such as "cooking time” or "temperature” for additional information.
  • the system can be enriched by the addition of a small, ultra-flat LCD display showing an image at each key step.
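  • A short sketch of how the recognized keywords ("next", "previous", "cooking time", "temperature") could be dispatched to recipe steps; the data structure and handler are assumptions made for the example.

```python
# Hypothetical recipe representation; each step may carry optional metadata.
recipe = [
    {"text": "Chop the onions.", "cooking time": None, "temperature": None},
    {"text": "Bake for 40 minutes.", "cooking time": "40 minutes", "temperature": "180 C"},
]
step = 0

def handle_keyword(word: str) -> str:
    """Return the phrase to be spoken through the worktop for a recognized keyword."""
    global step
    if word == "next":
        step = min(step + 1, len(recipe) - 1)
    elif word == "previous":
        step = max(step - 1, 0)
    elif word in ("cooking time", "temperature"):
        return recipe[step][word] or "No information for this step."
    return recipe[step]["text"]

print(handle_keyword("next"))          # -> "Bake for 40 minutes."
print(handle_keyword("cooking time"))  # -> "40 minutes"
```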
  • Figure 4 shows a sequence of the main steps (400) of the implementation of the interaction between a communicating entity and a non-communicating object.
  • a communicating entity, for example a mobile phone, identifies and locates the non-communicating entity or entities via a wireless network (WiFi, Bluetooth, etc.).
  • the method begins in step (402) by detecting an event on a user interface area of an interaction module on a non-communicating entity.
  • the event can be a touch, a pressure or an impact exerted on the normally passive surface, or a voice message captured by this same surface using piezoelectric pellets or electromechanical converters.
  • the detected event can also be a pressure of a button, a voice message to a microphone, a sound wave emitted by the person who wishes to interact with the system or a message received on a digital screen.
  • the method makes it possible to characterize the event by identifiers of the type of event (impact, touch, pressure, voice message, etc.) and identifiers of the interaction module.
  • the device, which can be a patch stuck on a door, detects impacts without any specific logic.
  • the input event can be one or more impacts of different strength and duration.
  • the electronics embedded on the patch filter and classify this information.
  • the next step (406) is to generate a message that contains the identification parameters and to send the message to the communicating entity.
  • This message is divided into two fields.
  • a header, which contains information about the communication protocol, and a message body, which contains the information to be transmitted.
  • the information to be transmitted is a pre-recorded command of the type "change of music" or "someone knocked on the door". Alternatively, it may be a voice message that will be processed by the communicating entity.
  • in a preferred implementation, the communication protocol is Bluetooth or WiFi.
  • the next step performed by the communicating entity is to classify the event and generate another message that contains an action associated with the event type.
  • the communicating entity selects the non-communicating entity or entities concerned according to the event and sends the message to them.
  • a message containing execution information of an action associated with the detected event is received by the non-communicating entity.
  • This message is divided into two fields: a header, which contains information about the communication protocol, and a message body, which contains the information to be transmitted.
  • in response, the message from the central unit contains the type of message to be emitted by the non-communicating entity, for example a voice message about the weather, an announcement of the receipt of a text message, or a music frame to play.
  • the next step (410) is to convert the digital data of the received message into an electrical signal, and amplify the signal.
  • the electrical signal makes it possible to vibrate the piezoelectric elements coupled to the interaction module, generating (412) an acoustic wave by transformation of the vibration.
  • in a next step (414), the corresponding function is executed via the passive surface of the non-communicating object.
  • This procedure can be repeated continuously on detecting a new event.
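  • To summarize the sequence of FIG. 4, the following Python sketch strings the numbered steps together; `device` and `radio` are hypothetical stand-ins for the piezoelectric front-end and the Bluetooth/WiFi link (for instance objects like those sketched earlier), not an API defined by the patent.

```python
def interaction_loop(device, radio) -> None:
    """Illustrative rendering of the main steps (400) of the interaction method."""
    while True:
        event = device.detect_event()         # (402) press, impact or voice message
        message = device.characterize(event)  # event-type and module identifiers
        radio.send(message)                   # (406) header + body sent to the communicating entity
        reply = radio.receive()               # action message associated with the event
        signal = device.to_electrical(reply)  # (410) digital data converted and amplified
        device.vibrate(signal)                # (412) vibration transformed into an acoustic wave
        # (414) the corresponding function is thus executed via the passive surface
```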
  • the position of the communicating entity can be used to adapt a feature such as music playback. If the user moves, another, closer non-communicating entity is activated to play the music.
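  • A small sketch, under the assumption that each interaction module reports a position, of how the communicating entity might hand playback over to the nearest non-communicating entity when the user moves:

```python
import math
from typing import Dict, Tuple

def nearest_module(user_pos: Tuple[float, float],
                   modules: Dict[str, Tuple[float, float]]) -> str:
    """Return the identifier of the interaction module closest to the user."""
    return min(modules, key=lambda name: math.dist(user_pos, modules[name]))

positions = {"kitchen-window": (0.0, 0.0), "living-room-door": (6.0, 2.5)}
print(nearest_module((5.0, 2.0), positions))  # -> "living-room-door"
```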
  • the usage functions that can be deported to non-communicating objects can be:
  • the message of the step (406) received by the communicating entity contains the information relating to the event, the data requested and the identity of the non-communicating entity to be functionalized.
  • the communicating entity collects the required data, which may be available in memory (such as a music library or a prerecorded message) or retrieved from websites or from a calendar. The data are then grouped and sent in a return message to the interaction module.
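  • On the communicating-entity side, the handling just described could be sketched as follows; the event names and the helper functions standing in for the phone's music library and agenda are hypothetical.

```python
def lookup_music() -> str:
    return "track-01.mp3"                  # stand-in for the phone's music library

def read_agenda() -> str:
    return "Next task: water the plants."  # stand-in for the agenda application

def handle_event_message(message: dict) -> dict:
    """Classify the event and build the return message for the interaction module."""
    event = message["body"]["event_type"]
    if event == "knock":
        body = {"action": "play_chime", "text": "Someone knocked on the door."}
    elif event == "change_music":
        body = {"action": "play_track", "track": lookup_music()}
    elif event == "voice":
        body = {"action": "speak", "text": read_agenda()}
    else:
        body = {"action": "none"}
    return {"header": {"protocol": message["header"]["protocol"]}, "body": body}
```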
  • the content and the receiving units of the message depend on the information to be transmitted.
  • a message to inform that a person is knocking at the front door can be sent to all users.
  • the communicating entity can be chosen by the user with the help of a button to trigger an action.
  • the present invention may be implemented from hardware and / or software elements. It may be available as a computer program product on a computer readable medium.
  • the medium can be electronic, magnetic, optical or electromagnetic, or can be a broadcast medium of the infrared type. Such media are, for example, semiconductor memories (Random Access Memory (RAM), Read Only Memory (ROM)), magnetic or optical tapes, diskettes or disks (Compact Disk - Read Only Memory (CD-ROM), Compact Disk - Read/Write (CD-R/W) and DVD).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)
  • Mobile Radio Communication Systems (AREA)
EP14716315.8A 2013-04-10 2014-04-09 Device and method for audible and tactile interaction between objects Withdrawn EP2984546A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1353224A FR3004550B1 (fr) 2013-04-10 2013-04-10 Device and method for sound and tactile interaction between objects
PCT/EP2014/057145 WO2014166990A1 (fr) 2013-04-10 2014-04-09 Device and method for sound and tactile interaction between objects

Publications (1)

Publication Number Publication Date
EP2984546A1 true EP2984546A1 (de) 2016-02-17

Family

ID=48771657

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14716315.8A Withdrawn EP2984546A1 (de) 2013-04-10 2014-04-09 Vorrichtung und verfahren zur hörbaren und taktilen interaktion zwischen objekten

Country Status (4)

Country Link
US (1) US20160070378A1 (de)
EP (1) EP2984546A1 (de)
FR (1) FR3004550B1 (de)
WO (1) WO2014166990A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116233588B (zh) * 2023-05-10 2023-07-14 江西科技学院 Smart glasses interaction system and method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2825882B1 (fr) * 2001-06-12 2003-08-15 Intelligent Vibrations Sa Interactive glazing with microphone and loudspeaker functions
US7800595B2 (en) * 2003-12-18 2010-09-21 3M Innovative Properties Company Piezoelectric transducer
EP2247999A1 (de) * 2008-01-25 2010-11-10 Sensitive Object Berührungsempfindlicher schirm
US20100053087A1 (en) * 2008-08-26 2010-03-04 Motorola, Inc. Touch sensors with tactile feedback
US8312392B2 (en) * 2009-10-02 2012-11-13 Qualcomm Incorporated User interface gestures and methods for providing file sharing functionality
US9110509B2 (en) * 2010-07-28 2015-08-18 VIZIO Inc. System, method and apparatus for controlling presentation of content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2014166990A1 *

Also Published As

Publication number Publication date
WO2014166990A1 (fr) 2014-10-16
US20160070378A1 (en) 2016-03-10
FR3004550A1 (fr) 2014-10-17
FR3004550B1 (fr) 2016-12-09

Similar Documents

Publication Publication Date Title
CN108428452B (zh) Terminal holder and far-field voice interaction system
US11900941B1 (en) Remote initiation of commands for user devices
CN108538320B (zh) Recording control method and device, readable storage medium, and terminal
US9685926B2 (en) Intelligent audio output devices
US10553098B2 (en) Appliance device integration with alarm systems
CN108521621B (zh) Signal processing method and device, terminal, earphone and readable storage medium
US8081765B2 (en) Volume adjusting system and method
US20140045463A1 (en) Wearable Communication Device
CN108763978B (zh) Information prompting method and device, terminal, earphone and readable storage medium
US11645469B2 (en) Context-based action recommendation based on a purchase transaction correlated with a monetary deposit or user biometric signs in an incoming communication
KR20170033641A (ko) Electronic device and method for controlling operation of electronic device
CN109062535B (zh) Sound emission control method and device, electronic device and computer-readable medium
CN102422623B (zh) Method and system for communication between devices based on device-to-device physical contact
CN109643548A (zh) System and method for routing content to an associated output device
CN101253755A (zh) Audio data stream synchronization
CN108540660B (zh) Voice signal processing method and device, readable storage medium, and terminal
US11126398B2 (en) Smart speaker
KR20170033025A (ko) Electronic device and method for controlling operation of electronic device
US20150010181A1 (en) Apparatus and method for outputting sound in mobile terminal
CN104767860A (zh) Incoming call prompting method, device and terminal
WO2023001195A1 (zh) Smart glasses and control method and system therefor
WO2006026254A2 (en) Methods and apparatus for aurally presenting notification messages in an auditory canal
CN113301544B (zh) Method and device for voice intercommunication between audio devices
US20230418546A1 (en) Electronic Devices for Focused Listening
WO2014166990A1 (fr) Device and method for sound and tactile interaction between objects

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150828

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20171103