WO2015194807A1 - Method and apparatus for extracting and transferring emotion information based on a camera photographing effect - Google Patents

Method and apparatus for extracting and transferring emotion information based on a camera photographing effect Download PDF

Info

Publication number
WO2015194807A1
WO2015194807A1 (PCT/KR2015/005987)
Authority
WO
WIPO (PCT)
Prior art keywords
emotion information
user
vibration
extracting
transmitting
Prior art date
Application number
PCT/KR2015/005987
Other languages
English (en)
Korean (ko)
Inventor
김지은
최경현
류호경
Original Assignee
한양대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한양대학교 산학협력단 filed Critical 한양대학교 산학협력단
Publication of WO2015194807A1

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42201Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards

Definitions

  • the present invention relates to a method and apparatus for extracting and transmitting the emotion information of dialogue-free section images in broadcasting.
  • DVS: Descriptive Video Service
  • Screen commentary broadcasting is a service in which voice actors narrate changing scenes, the expressions and gestures of characters, and scenes without dialogue, in order to enhance the understanding and enjoyment of visually impaired viewers who follow TV through sound alone.
  • However, screen commentary broadcasts are interpreted subjectively by the voice actors without fixed rules, and it is almost impossible for them to continuously provide information such as the characters' emotions and the mood of each scene.
  • In addition, because screen commentary broadcasting is expensive, most broadcast producers and distributors have not provided it, which hinders TV immersion for the visually impaired even where laws requiring screen commentary for 44% of broadcasts are in force. A remedy for this high-cost, low-efficiency screen commentary broadcasting is therefore needed.
  • An object of the present invention is to provide a method and apparatus for overcoming the limited access of the visually impaired to video material that consists mainly of visual information.
  • The Descriptive Video Service (DVS) of the prior art often relies on the voice actors' subjective interpretation without fixed rules and cannot continuously provide information such as the characters' emotions and the ever-changing mood of a scene; the present invention is intended to provide a method and apparatus that ameliorate this problem.
  • the method for extracting and transmitting emotion information based on the camera photographing effect proposed by the present invention includes analyzing a dialogue-free section image according to a predetermined camera grammar based on the camera photographing effect of the image, extracting and coding the emotion information of the dialogue-free section image according to the analysis result, receiving the coded emotion information through a user terminal, converting it into a corresponding vibration according to the received emotion information, and transmitting the vibration to the user through the user terminal.
  • the predetermined camera grammar may map the camera photographing effect of the dialogue-free section image and the editing effect of the image to emotion information.
  • in the step of converting the received emotion information into a corresponding vibration, the amplitude and duration of the vibration may be matched differently according to the emotion information.
  • in transmitting the vibration to the user through the user terminal, the vibration may be transmitted at a predetermined cycle, with a predetermined rest period between cycles.
  • by repeatedly transmitting the vibration corresponding to each piece of emotion information, the user may learn to recognize which vibration corresponds to which emotion information.
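  • The matching described above can be read as a lookup from coded emotion to vibration parameters. In the sketch below, the joy and surprise values come from FIG. 3 as described later in this document; the fear and sadness rows, the function name, and the data layout are illustrative assumptions, not the patent's actual mapping.

```python
# Hypothetical sketch: match each coded emotion to vibration amplitude
# and duration. Joy and surprise values follow FIG. 3; the fear and
# sadness rows are assumed placeholders, not the patent's mapping.
VIBRATION_TABLE = {
    "joy":      {"amplitude_v": [1.9],      "duration_ms": [100]},
    "surprise": {"amplitude_v": [1.9, 2.8], "duration_ms": [150, 650]},
    "fear":     {"amplitude_v": [2.5],      "duration_ms": [80]},   # assumed
    "sadness":  {"amplitude_v": [1.2],      "duration_ms": [400]},  # assumed
}

def emotion_to_vibration(emotion_code: str) -> dict:
    """Return the vibration parameters matched to a coded emotion."""
    if emotion_code not in VIBRATION_TABLE:
        raise ValueError(f"unknown emotion code: {emotion_code}")
    return VIBRATION_TABLE[emotion_code]
```

Repeated transmission of the same pattern, as the claim describes, is what would let the user learn the code over time.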
  • the method for extracting and transferring emotion information between users proposed by the present invention may include extracting and coding the emotion information of a first user through a sensor of a first user terminal, transmitting the coded emotion information to a second user terminal through an emotion information transmitter of the first user terminal, converting the received emotion information into a corresponding vibration, and transmitting the vibration to a second user.
  • in extracting and coding the emotion information of the first user through the sensor of the first user terminal, signals of the user's heart rate, heart rate variability, balance, skin conductance, arousal, and excitement may be extracted.
  • in the step of converting the received emotion information into a corresponding vibration, the amplitude and duration of the vibration may be matched differently according to the received emotion information.
  • the apparatus for extracting and transmitting emotion information proposed by the present invention analyzes a dialogue-free section image according to a predetermined camera grammar based on the camera photographing effect of the image, and extracts and codes the emotion information of the image according to the analysis result.
  • the apparatus may include an emotion information receiver that receives the coded emotion information, a sensor that extracts the emotion information of a first user, and a coding unit that codes the extracted emotion information;
  • it may further include an emotion information transmitter that transmits the emotion information of the first user to the emotion information receiver of a second user, and a vibration generator that converts the emotion information received by the emotion information receiver into a corresponding vibration.
  • the vibration generator may match the amplitude and duration of the corresponding vibration differently according to the emotion information.
  • the vibration generator may generate the vibration at a predetermined cycle, with a predetermined rest period between cycles.
  • the vibration generator may include a haptic motor and transmit the vibration corresponding to the emotion information to the user through the haptic motor.
  • the sensor may extract signals of the user's heart rate, heart rate variability, balance, skin conductance, arousal, and excitement.
  • the problem of high-cost, low-efficiency screen commentary broadcasting can be solved through camera-effect grammar and emotional haptic pattern technology.
  • information such as the characters' emotions and the ever-changing mood of a scene may be transmitted to the user through vibration.
  • FIG. 1 is a flowchart illustrating a method of extracting and transmitting emotion information based on a camera photographing effect according to an embodiment of the present invention.
  • FIG. 2 is an illustration for explaining emotion information extracted according to a camera grammar according to an embodiment of the present invention.
  • FIG. 3 is an illustration for explaining a process of switching to a corresponding vibration according to the emotion information according to an embodiment of the present invention.
  • FIG. 4 is an exemplary view of a device for extracting and transmitting emotion information according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a method of extracting and transferring emotion information between users according to an embodiment of the present invention.
  • FIG. 6 is a view for explaining an operation of transmitting emotion information between users of the emotion information extraction and delivery apparatus according to an embodiment of the present invention.
  • FIG. 7 is a view for explaining the configuration of the emotion information extraction and delivery apparatus according to an embodiment of the present invention.
  • the proposed method and apparatus for extracting and transmitting emotion information based on a camera photographing effect is a technology for extracting the emotion information of dialogue-free section images in broadcasting and transmitting it as a non-verbal haptic pattern.
  • the method of extracting emotion information from an image uses 'affective cinematography', the camera techniques that filmmakers mainly use to express the actors' emotions.
  • a plurality of emotion information (joy, surprise, disgust, fear, tenderness, sadness, etc.) may be extracted according to the camera's 'motion', 'framing', and 'angle'.
  • the emotion information extracted by analyzing the various camera grammars is added to the broadcast signal, and a plurality of emotions can be conveyed through combinations of vibration amplitude, duration, and the like by an 'emotion vibrator' in a terminal worn by the viewer, using Bluetooth communication.
  • the 'emotion vibrator' technology also allows a second user terminal to receive, over Bluetooth, a plurality of emotion information of a first user extracted through physiological signal analysis by a nearby first user terminal, and to transmit it to the second user as vibration.
  • FIG. 1 is a flowchart illustrating a method of extracting and transmitting emotion information based on a camera photographing effect according to an embodiment of the present invention.
  • the proposed method for extracting and transmitting emotion information based on the camera photographing effect includes analyzing a dialogue-free section image according to a predetermined camera grammar based on the camera photographing effect of the image, extracting and coding the emotion information of the image according to the analysis result, and receiving the coded emotion information through a user terminal (110); converting it into a corresponding vibration according to the emotion information received through the user terminal (120); and transmitting the vibration to the user through the user terminal (130).
  • the dialogue-free section image may be analyzed according to a predetermined camera grammar based on the camera photographing effect of the image.
  • the emotion information of the dialogue-free section image may be extracted and coded according to the analysis result, and the coded emotion information may be received through the user terminal.
  • the predetermined camera grammar may map the camera photographing effect of the dialogue-free section image and the editing effect of the image to emotion information.
  • emotion information may be extracted from an image using the camera photographing and editing effects of 'affective cinematography', which image producers mainly use to express an actor's emotion.
  • seven kinds of emotion information may be extracted according to the camera's 'motion', 'framing', and 'angle'.
  • the seven kinds of emotion information may be joy, surprise, disgust, fear, anger, sadness, and tenderness.
  • such emotion information is not limited to this embodiment. For example, a shaking screen (motion: tilting), multiple shots passing quickly (framing: short shots), and a tilted screen (canted angle) can serve as a basic photographic grammar for conveying the feeling of 'fear'.
  • FIG. 2 is an illustration for explaining emotion information extracted according to a camera grammar according to an embodiment of the present invention.
  • seven emotions (220) may be extracted according to the motion 211, framing 212, and angle 213 of the camera effects 210.
  • the extracted emotion information may be joy 221.
  • when the motion 211 is Vertigo, POV Object, or Discover, the framing 212 is Fast, and the angle 213 is a canted angle, the extracted emotion information is surprise 222.
  • the extracted emotion information may be disgust 223.
  • the extracted emotion information may be fear 224.
  • when the motion 211 is a character dolly or expand dolly, the framing 212 is a close-up shot, and the angle 213 is a high angle, the extracted emotion information is sadness 226.
  • the extracted emotion information may be tenderness 227.
  • in this way, the camera photographing and editing effects that an image producer mainly uses to express an actor's emotion can be used to extract the emotion information of the image.
  • Emotion information extracted according to such a camera grammar is not limited only to the embodiment.
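  • Read as data, the FIG. 2 grammar is a lookup table from (motion, framing, angle) triples to one of the seven emotions. The sketch below encodes only the surprise and sadness rows that the text spells out; the key normalization and the function name are assumptions, and the remaining five rows of the table are omitted.

```python
from typing import Optional

# Hypothetical sketch of the FIG. 2 camera-grammar lookup. Only the
# surprise and sadness rows are given explicitly in the text.
CAMERA_GRAMMAR = {
    # (motion 211, framing 212, angle 213) -> emotion
    ("vertigo/pov-object/discover",  "fast",     "canted"):     "surprise",
    ("character-dolly/expand-dolly", "close-up", "high-angle"): "sadness",
}

def extract_emotion(motion: str, framing: str, angle: str) -> Optional[str]:
    """Return the emotion matched by a camera-grammar triple, if any."""
    return CAMERA_GRAMMAR.get((motion, framing, angle))
```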
  • in step 120, the emotion information received through the user terminal may be converted into the corresponding vibration.
  • the amplitude and duration of the corresponding vibration may be matched differently according to the emotion information.
  • the vibration corresponding to each piece of emotion information may be transmitted at a predetermined cycle, with a predetermined rest period between cycles. If the vibration were transmitted continuously without a sufficient rest period, the vibrations corresponding to different emotion information could not be distinguished accurately.
  • FIG. 3 is an illustration for explaining a process of switching to a corresponding vibration according to the emotion information according to an embodiment of the present invention.
  • the coded emotion information of the dialogue-free section image may be received through the user terminal and converted into the corresponding vibration according to the emotion information.
  • when emotion information of joy is extracted, it may be converted into a vibration having the joy waveform 310 of FIG. 3. In this case, the amplitude 311 and duration 312 of the corresponding vibration are matched according to the emotion information.
  • for example, the joy waveform 310 has an amplitude 311 of 1.9 V and a duration 312, in other words a period, of 100 ms.
  • the surprise waveform 320 has amplitudes 311 of 1.9 V and 2.8 V, and the vibration of each amplitude can be transmitted sequentially for 5 seconds with durations 312 of 150 ms and 650 ms.
  • the square-wave vibration shown in FIG. 3 is not limited to this embodiment; vibration patterns of other shapes, such as linear, sine, or triangular waves, are also possible.
  • as noted above, the vibration corresponding to each piece of emotion information may be transmitted at a predetermined cycle, with a predetermined rest period between cycles, since vibrations transmitted continuously without a sufficient rest period cannot be distinguished accurately.
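  • Using the FIG. 3 figures (joy: 1.9 V pulses at a 100 ms period; surprise: 1.9 V for 150 ms then 2.8 V for 650 ms, repeated over 5 seconds), a pulse train with rest periods between cycles could be generated roughly as follows. The sample rate and the rest-period length are assumptions; the patent does not specify them.

```python
# Hypothetical sketch: build a square-wave amplitude sequence for one
# emotion, inserting a rest gap between cycles so that patterns remain
# distinguishable. Sample rate and rest duration are assumed values.
def pulse_train(amplitudes_v, durations_ms, total_ms=5000, rest_ms=300,
                sample_rate_hz=1000):
    """Return per-sample amplitudes: pulse cycles separated by rest gaps."""
    cycle = []
    for amp, dur in zip(amplitudes_v, durations_ms):
        cycle += [amp] * (dur * sample_rate_hz // 1000)
    rest = [0.0] * (rest_ms * sample_rate_hz // 1000)
    total = total_ms * sample_rate_hz // 1000
    samples = []
    while len(samples) + len(cycle) <= total:
        samples += cycle + rest
    return samples[:total]

joy = pulse_train([1.9], [100])                 # FIG. 3 joy waveform 310
surprise = pulse_train([1.9, 2.8], [150, 650])  # FIG. 3 surprise waveform 320
```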
  • the vibration may be transmitted to the user through the user terminal.
  • the user can recognize the vibration corresponding to the emotion information through learning.
  • the emotion information extracted by analyzing the various camera grammars is added to the broadcast signal, and a plurality of emotions can be transmitted through combinations of the 'emotion vibration' amplitude, duration, and the like, via the Bluetooth communication of a terminal worn by the viewer.
  • a more immersive TV viewing experience can be provided to the visually impaired.
  • FIG. 4 is an exemplary view of a device for extracting and transmitting emotion information according to an embodiment of the present invention.
  • the apparatus for extracting and transmitting emotion information may include a sensor 410, a haptic motor 420, a Bluetooth module 430, an on/off & pairing button 440, and a battery 450.
  • the sensor 410 may include a photoplethysmography (PPG) sensor 411 and an electrodermal activity (EDA) sensor 412.
  • the photoplethysmography (PPG) sensor 411 may sense heart rate, heart rate variability (HRV), balance, and the like.
  • the electrodermal activity (EDA) sensor 412 can sense skin conductance, arousal, excitement, and the like.
  • the apparatus for extracting and transmitting emotion information may use the sensor to extract and transmit emotion information between users.
  • emotion information may be transmitted and received using the Bluetooth module 430, and the received emotion information may be delivered to the user as vibration using the haptic motor 420.
  • the on/off & pairing button 440 may be used to control power and pairing, and the battery 450 supplies power. Another embodiment of the apparatus for extracting and transmitting emotion information is described in more detail with reference to FIGS. 5 and 6.
  • FIG. 5 is a flowchart illustrating a method of extracting and transferring emotion information between users according to an embodiment of the present invention.
  • the method for extracting and transferring emotion information between users may include extracting and coding the emotion information of a first user through a sensor of a first user terminal (510), transmitting the coded emotion information to a second user terminal through an emotion information transmitter of the first user terminal (520), converting the received emotion information into a corresponding vibration (530), and transmitting the vibration to the second user (540).
  • in step 510, the emotion information of the first user may be extracted through the sensor of the first user terminal and coded.
  • signals of the user's heart rate, heart rate variability, balance, skin conductance, arousal, and excitement may be extracted through the sensor.
  • the sensor may include a photoplethysmography (PPG) sensor and an electrodermal activity (EDA) sensor.
  • the photoplethysmography (PPG) sensor may sense heart rate, heart rate variability (HRV), balance, and the like.
  • the electrodermal activity (EDA) sensor can sense skin conductance, arousal, excitement, and the like.
  • the coded emotion information may be transmitted to the second user terminal through the emotion information transmitter of the first user terminal.
  • the amplitude and duration of the corresponding vibration may be matched differently according to the coded emotion information.
  • in step 530, the received emotion information may be converted into the corresponding vibration.
  • in step 540, the vibration may be transmitted to the second user.
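  • A minimal sketch of steps 510 through 530, assuming a simple threshold rule for turning PPG/EDA readings into a coded emotion; the threshold values, emotion labels, and one-byte codes are purely illustrative, since the patent does not specify the coding scheme.

```python
# Hypothetical sketch of steps 510-530: code physiological readings into
# emotion information on the first terminal, decode on the second.
# Threshold values and byte codes are illustrative assumptions.
CODES = {"excited": b"\x01", "calm": b"\x02", "stressed": b"\x03"}
DECODE = {v: k for k, v in CODES.items()}

def code_emotion(heart_rate_bpm: float, skin_conductance_us: float) -> bytes:
    """Step 510: reduce PPG/EDA readings to a coded emotion (assumed rule)."""
    if heart_rate_bpm > 100 and skin_conductance_us > 5.0:
        return CODES["excited"]
    if heart_rate_bpm > 100:
        return CODES["stressed"]
    return CODES["calm"]

def decode_emotion(payload: bytes) -> str:
    """Step 530: recover the emotion label from the received code."""
    return DECODE[payload]
```

In the actual apparatus the payload would travel over the Bluetooth link between the two terminals; here the two functions simply share the code table.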
  • the proposed method and apparatus for extracting and transmitting emotion information based on the camera photographing effect can be used not only as a technology for the visually impaired, but also by people who have difficulty conveying their emotions accurately, allowing them to convey their emotions easily.
  • FIG. 6 is a view for explaining an operation of transmitting emotion information between users of the emotion information extraction and delivery apparatus according to an embodiment of the present invention.
  • the first user may wear the first user terminal 610, which extracts and transmits emotion information, and the emotion vibration of the first user may be transmitted to the second user terminal 620 through Bluetooth 630 as described above.
  • for example, an infant may wear the emotion information extraction and delivery device and transmit emotion vibrations to its mother. It is difficult for most mothers to know in advance emotions their children find hard to express, such as whether a young child is doing well in kindergarten or is isolated, depressed, or sad. Many accidents could be prevented through the proposed method and device for extracting and transmitting emotion information.
  • the emotion information extraction and delivery device may measure the child's biosignals, analyze the child's current emotional state, stress level, and so on, and provide this information to a mother or guardian wearing the same device.
  • this technology can also be used as a future technology for dementia and elderly-care services, and as a new family-strengthening service for busy children.
  • FIG. 7 is a view for explaining the configuration of the emotion information extraction and delivery apparatus according to an embodiment of the present invention.
  • the apparatus for extracting and transmitting emotion information may include an emotion information receiver 710, a vibration generator 720, an emotion information transmitter 730, a sensor 740, and a coding unit 750.
  • the emotion information receiver 710 analyzes the dialogue-free section image according to a predetermined camera grammar based on the camera photographing effect of the image, extracts and codes the emotion information of the image according to the analysis result, and receives the coded emotion information.
  • the emotion information receiver 710 of the second user terminal may also receive the emotion information of the first user from the first user terminal.
  • the vibration generator 720 may convert the emotion information received by the emotion information receiver 710 into the corresponding vibration. At this time, the amplitude and duration of the vibration may be matched differently according to the emotion information.
  • the vibration may be generated at a predetermined cycle according to the emotion information, with a predetermined rest period between cycles.
  • the vibration generator 720 may include a haptic motor and transmit the vibration corresponding to the emotion information to the user through the haptic motor.
  • the emotion information transmitter 730 may transmit the coded emotion information of the first user to the second user. For example, by transmitting the emotion information of the first user to the emotion information receiver of the second user through Bluetooth, it is possible to extract and transmit emotion information between users.
  • the sensor 740 may extract emotion information of the first user.
  • the sensor 740 may extract signals of the first user's heart rate, heart rate variability, balance, skin conductance, arousal, excitement, and the like.
  • the sensor may include a photoplethysmography (PPG) sensor and an electrodermal activity (EDA) sensor.
  • the photoplethysmography (PPG) sensor may sense heart rate, heart rate variability (HRV), balance, and the like.
  • the electrodermal activity (EDA) sensor can sense skin conductance, arousal, excitement, and the like.
  • in addition to the emotion information extraction and transmission based on the camera photographing effect described above, the device may use the sensor to extract and transmit emotion information between users.
  • the coding unit 750 may code the emotion information extracted by the sensor 740. For example, after the emotion information of the first user is coded by the coding unit 750, the coded emotion information may be transmitted to the second user terminal through the emotion information transmitter 730.
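  • The FIG. 7 components could be wired together along the following lines. The class and method names are assumptions, and the Bluetooth hop between the transmitter 730 and the receiver 710 is replaced by a direct call for illustration.

```python
# Hypothetical sketch of the FIG. 7 pipeline:
# coding unit 750 -> transmitter 730 -> receiver 710 -> vibration generator 720.
class EmotionReceiver:                 # 710
    def __init__(self):
        self.last = None
    def receive(self, coded):
        self.last = coded

class VibrationGenerator:              # 720
    # amplitude (V), duration (ms); joy values follow FIG. 3, others omitted
    PATTERNS = {"joy": (1.9, 100)}
    def vibrate(self, coded):
        return self.PATTERNS.get(coded, (0.0, 0))

class EmotionTransmitter:              # 730, stands in for the Bluetooth link
    def __init__(self, receiver):
        self.receiver = receiver
    def send(self, coded):
        self.receiver.receive(coded)

receiver = EmotionReceiver()
EmotionTransmitter(receiver).send("joy")       # first user's coded emotion
amplitude_v, duration_ms = VibrationGenerator().vibrate(receiver.last)
```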
  • the proposed method and device for extracting and transmitting emotion information based on the camera photographing effect improve on the limitations of the screen commentary broadcasting provided for the visually impaired: the image is analyzed, emotion information is extracted, and the emotion is conveyed to the visually impaired. In addition, emotions can be extracted and transferred between users based on PPG and EDA physiological signals.
  • the proposed device may take a wearable form, and the emotion information may be reinterpreted through haptic parameter transformation and transmitted in the form of vibration.
  • in the future, a 4DX-like viewing experience enjoyed at home may become possible.
  • Many visually impaired people frequently seek out movie theaters that offer 4DX despite the difficulty of viewing.
  • haptics deliver an immersive viewing experience by conveying information that dialogue or audio alone cannot provide.
  • the proposed method can be used to improve on this situation and to enable such viewing of TV at home.
  • haptic broadcasting that analyzes video after the fact based on rules according to an embodiment of the present invention is thus possible.
  • the elderly and children in other places can exchange emotion information.
  • the apparatus described above may be implemented as a hardware component, a software component, and / or a combination of hardware components and software components.
  • the devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, controller, arithmetic logic unit (ALU), digital signal processor, microcomputer, field programmable gate array (FPGA), programmable logic unit (PLU), microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to the execution of the software.
  • although the processing device is sometimes described as being used singly, one of ordinary skill in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing device may include a plurality of processors or one processor and one controller.
  • other processing configurations are possible, such as parallel processors.
  • the software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or command the processing device independently or collectively.
  • Software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to it.
  • the software may be distributed over networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on one or more computer readable recording media.
  • the method according to the embodiment may be embodied in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the media may be those specially designed and constructed for the purposes of the embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; and magneto-optical media such as floptical disks.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurosurgery (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Disclosed are a method and apparatus for extracting and transferring emotion information based on a camera photographing effect. The method for extracting and transferring emotion information based on a camera photographing effect according to the present invention comprises the steps of: analyzing a dialogue-free section image according to a predetermined camera grammar based on a camera photographing effect of the dialogue-free section image, extracting and coding emotion information of the dialogue-free section image according to the analysis result, and receiving the coded emotion information via a user terminal; converting the coded emotion information into a corresponding vibration according to the emotion information received via the user terminal; and transferring the vibration to a user via the user terminal.
PCT/KR2015/005987 2014-06-18 2015-06-15 Method and apparatus for extracting and transmitting emotion information based on a camera shooting effect WO2015194807A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2014-0074097 2014-06-18
KR1020140074097A KR101719868B1 (ko) 2014-06-18 2014-06-18 Method and apparatus for extracting and transmitting emotion information based on camera shooting effects

Publications (1)

Publication Number Publication Date
WO2015194807A1 (fr) 2015-12-23

Family

ID=54935738

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/005987 WO2015194807A1 (fr) 2014-06-18 2015-06-15 Method and apparatus for extracting and transmitting emotion information based on a camera shooting effect

Country Status (2)

Country Link
KR (1) KR101719868B1 (fr)
WO (1) WO2015194807A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114008566A (zh) * 2019-06-28 2022-02-01 Sony Group Corporation Information processing device, information processing method, and program

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102464944B1 (ko) * 2018-10-19 2022-11-09 Korea Advanced Institute of Science and Technology Method and apparatus for reproducing camera work

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003324402A (ja) * 2002-05-07 2003-11-14 Nippon Hoso Kyokai &lt;Nhk&gt; External-device-linked content generation apparatus, method and program therefor, and external-device-linked content reproduction apparatus, method and program therefor
JP2005295170A (ja) * 2004-03-31 2005-10-20 Brother Ind Ltd Voice communication apparatus
KR20100061582A (ko) * 2008-10-08 2010-06-08 Samsung Electronics Co., Ltd. Apparatus and method for providing an emotion display service in a mobile communication terminal
KR101314609B1 (ko) * 2007-01-17 2013-10-07 LG Electronics Inc. Method and apparatus for transmitting and receiving emotion information
KR101376148B1 (ko) * 2009-03-10 2014-03-27 SK Telecom Co., Ltd. Method for transmitting vibration during a video call, and mobile communication terminal therefor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110041065A (ko) * 2009-10-15 2011-04-21 SK Telecom Co., Ltd. System, server, terminal, and method for delivering haptic content during a video call



Also Published As

Publication number Publication date
KR101719868B1 (ko) 2017-03-27
KR20150145292A (ko) 2015-12-30

Similar Documents

Publication Publication Date Title
US20210344991A1 (en) Systems, methods, apparatus for the integration of mobile applications and an interactive content layer on a display
US20210019982A1 (en) Systems and methods for gesture recognition and interactive video assisted gambling
US9848244B2 (en) Method of providing user specific interaction using device and digital television (DTV), the DTV, and the user device
JP7254772B2 (ja) Method and device for robot interaction
KR20240011874A (ko) Directing live entertainment using biometric sensor data for detection of neurological state
CN104813642A (zh) Methods, apparatuses and computer-readable media for triggering a gesture recognition mode and for device pairing and sharing via non-touch gestures
WO2017122957A1 (fr) Display device and control method therefor
KR20020062325A (ko) Virtual creature displayed on a television
WO2021251713A1 (fr) Streaming sponsorship system using an Internet of Things device
WO2015194807A1 (fr) Method and apparatus for extracting and transmitting emotion information based on a camera shooting effect
Oskooyee et al. Neuro movie theatre: A real-time internet-of-people based mobile application
WO2018207768A1 (ja) Moving image distribution method
Guedes et al. Subjective evaluation of 360-degree sensory experiences
CN107115675A (zh) Kinect-based sports fitness game system and implementation method
KR101839406B1 (ko) Display apparatus and control method thereof
WO2015179466A1 (fr) Remote interactive media devices
US20170246534A1 System and Method for Enhanced Immersion Gaming Room
CN108777165A (zh) Method and device for evaluating social state
KR20140006424A (ko) Method for implementing bodily sensed vibration based on a sound source
JP6625809B2 (ja) Electronic device and control method thereof
WO2016089112A2 (fr) Interface device for an X-ray sensor, and X-ray sensor module comprising the device
KR20140084463A (ko) Apparatus and method for displaying speaker information, and server for editing video
WO2015076483A1 (fr) Scenario-controlled toy control system
WO2013094807A1 (fr) System and method for a service providing animation action experience content
JP6667878B1 (ja) Costume character performance support device, costume character performance support system, and costume character performance support method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15809790

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15809790

Country of ref document: EP

Kind code of ref document: A1