WO2015084286A1 - Method for generating and transmitting a user emoticon - Google Patents
Method for generating and transmitting a user emoticon
- Publication number
- WO2015084286A1 (PCT/UA2013/000141)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- emotions
- emotion
- input
- photographic images
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/175—Static expression
Definitions
- This invention relates to methods for transmitting messages, namely, the transmission of messages with photographic objects that express the emotions of the user.
- Non-verbal communication, such as the expression of emotions and gestures, plays an important role in people's daily lives.
- Text messaging has become the main means of communication, and since then people have not stopped searching for new tools and technologies that would make communication on the network more lively, vibrant, and emotional.
- A known method of detecting emotions provides subtle identification of human emotions, with a sensitivity close to that of a person. This result is achieved as follows: based on an input voice signal, the intensity, tempo, and intonation of each spoken word are determined; change values are obtained to reveal the content; and, from these change values, signals are generated that express the state of each of the emotions of anger, sadness, and pleasure. The partner's emotion and information about the situation are then introduced, thereby generating instinctive motivation information.
- Emotional information is generated that includes the basic emotional parameters of pleasure, anger, and sadness, which are controlled on the basis of personality information (Japanese Patent Application JP 2001-00002626 of January 6, 2001, IPC G06N5/00, G10L15/00, G06K9/00).
- Another known method is carried out by transmitting text, graphic, audio, video, or program messages or requests for information, together with information about the users' geographical location. Users first enter information, messages, and requests through an emotion analyzer into a hardware-software complex, where they are converted into a compact information code: + (positive emotion), - (negative emotion), 0 (neutral emotion). This code is transmitted to the server for analysis, classification, and placement in the message database, or to search for a relevant response (Ukrainian Patent No. 77517 dated 25.02.2013, IPC G06F 17/00, H04Q 9/00, G06F 17/21, G06F 17/27).
- The invention addresses the task of creating a method for generating and transmitting user emotions in which, by creating a photographic object, selecting a type of emotion, and creating a sequence of characters reflecting the photographic object with the selected emotion type, users can exchange photographs with real emotions over the Internet.
- FIG. 1 illustrates a flowchart of a method for creating and sending emograms in accordance with the invention.
- FIG. 2a and FIG. 2b illustrate a graphical presentation of the method for creating and sending emograms in accordance with the invention.
- The software automatically recognizes photographic images on the user's information input and/or output devices in accordance with a predefined algorithm. For example, the user's photos are uploaded to the user's mobile phone. During detection, the positions of the mouth, eyes, nose, and other points are determined. If an object found in the user's photo is identified as a face, it is highlighted with a geometric figure, for example, a rectangle.
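As a rough illustration of the detection step described above (not the patent's actual algorithm), the following sketch derives a bounding rectangle from hypothetical facial landmark points such as the eyes and mouth. All names and the landmark layout are invented for the example:

```python
def face_bounding_box(landmarks):
    """Return (x, y, w, h) enclosing the detected landmark points
    (mouth, eyes, nose, ...), or None if too few landmarks were
    found to identify the object as a face."""
    if len(landmarks) < 3:  # need at least both eyes and the mouth
        return None
    xs = [p[0] for p in landmarks]
    ys = [p[1] for p in landmarks]
    x, y = min(xs), min(ys)
    return (x, y, max(xs) - x, max(ys) - y)

# left eye, right eye, mouth
landmarks = [(40, 50), (80, 50), (60, 90)]
print(face_bounding_box(landmarks))  # (40, 50, 40, 40)
```

In practice this step would be performed by a face detector (e.g. a trained cascade or neural model); the sketch only shows how a highlighting rectangle can follow from landmark positions.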
- Emotion recognition is then carried out on each face in the photo.
- Such emotion recognition is carried out in accordance with a predefined algorithm. For example, the wider the user smiles in the recognized photo, the happier he is considered to be, and the higher the emotion-conformity rating, expressed as a percentage, assigned to the photograph.
- Photos whose emotion rating falls between 70 and 100% are automatically placed in the corresponding library of user emotion types.
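The 70-100% threshold rule could be sketched as follows; the data layout and function name are assumptions for illustration, not taken from the patent:

```python
def build_emotion_library(rated_photos, threshold=70):
    """Group photos whose emotion-conformity rating meets the
    threshold (70-100% in the text), keyed by emotion type."""
    library = {}
    for photo, emotion, rating in rated_photos:
        if threshold <= rating <= 100:
            library.setdefault(emotion, []).append(photo)
    return library

rated = [("a.jpg", "smile", 92), ("b.jpg", "smile", 55), ("c.jpg", "wink", 81)]
print(build_emotion_library(rated))  # {'smile': ['a.jpg'], 'wink': ['c.jpg']}
```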
- If there are no photographic images in the gallery, the user is offered the option to take a picture with the camera of the input and/or output device (mobile phone, tablet, laptop), or to connect to a social network account or another Internet resource that has access to the user's photographic images.
- The next step is the user's selection, through an information input and/or output device, of emograms (emotions) from the library of emotion types.
- The list of emotions can be part of the interface of a special program through which the user can select one or more emograms.
- The user can type a combination of characters that denotes a particular type of emotion, for example :) for a smile, :( for sadness, :D for laughter, ;) for a wink, and * for a kiss. After the combination of characters is entered, it is replaced by the corresponding photographic object of the user from the library of emotion types.
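A minimal sketch of this symbol-substitution step, assuming straightforward text replacement and a placeholder marker for the inserted photographic object (the symbol-to-emotion mapping mirrors the examples above; everything else is illustrative):

```python
# Mapping from typed emoticon symbols to emotion types, as in the text.
SYMBOL_TO_EMOTION = {":)": "smile", ":(": "sadness", ":D": "laughter",
                     ";)": "wink", "*": "kiss"}

def replace_symbols(text, library):
    """Replace each typed symbol with a placeholder for the user's
    photographic object, if one exists in the emotion library."""
    for symbol, emotion in SYMBOL_TO_EMOTION.items():
        photo = library.get(emotion)
        if photo:
            text = text.replace(symbol, f"[{photo}]")
    return text

library = {"smile": "me_smiling.jpg"}
print(replace_symbols("See you soon :)", library))
# See you soon [me_smiling.jpg]
```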
- After the emogram type is selected, the method provides for the transformation of the photographic object with the selected emogram type, through a pixel array generator, into the user's graphic emoticon.
- A graphical user emoticon is a photographic image of the user with the selected emogram type, for example, a photograph of the user in which he smiles or winks.
- The sequence of characters created in this way can have a predetermined size.
- The next step of the invention is to obtain the sequence of characters from the user through an input and/or output device, encode the sequence, and send it to the recipient.
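The patent does not specify how the character sequence is encoded for transmission; one plausible, purely assumed sketch is to wrap the emoticon image bytes in Base64 so they survive inside an ordinary text message body:

```python
import base64

def encode_emoticon(photo_bytes):
    """Encode the graphic emoticon's image bytes as a text-safe
    character sequence for embedding in a message."""
    return base64.b64encode(photo_bytes).decode("ascii")

def decode_emoticon(sequence):
    """Recover the image bytes on the recipient's device."""
    return base64.b64decode(sequence.encode("ascii"))

seq = encode_emoticon(b"\x89PNG\r\n")  # first bytes of a PNG file
print(seq)  # iVBORw0K
```

Base64 inflates the payload by about a third, which matters given the short-message size limits mentioned below in the text, so a real implementation might instead transmit a reference to a synchronized library entry.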
- The next step is to synchronize the user's photographic object with the library of photographic objects on the message recipient's input and/or output device.
- The user photographs himself, for example, on his mobile phone (step 1), writes a text message (step 2), and selects the emogram most suitable for his emotional state from the library of emotions (step 3).
- He then selects a photo (step 4) from the library of photographic objects that will be used instead of emoticon symbols.
- FIG. 2a and FIG. 2b illustrate a graphical presentation of the method for creating and sending an emoticon in accordance with the invention.
- The steps described above make it possible to send text messages with the selected emoticon type very quickly. They can also support the creation and transmission of short messages, which are usually limited in size, for example to 1-2 kilobytes or 400 characters.
- Information input and/or output devices can be mobile phones, tablets, laptops, ultrabooks, computers, and other devices that enable users to send messages, use chat functions, and communicate with each other in other ways.
Abstract
This invention relates to methods for transmitting messages, and more particularly to transmitting messages with photographic objects that express a user's emotions. The invention provides a method for generating and transmitting a user's emotions, comprising the following steps: automatically recognizing photographic images of the user on information input and/or output devices according to a predefined algorithm; recognizing the user's emotions by scanning the user's photographic images and selecting the most relevant ones according to a predefined algorithm; converting the most relevant photographic images, with the corresponding emotion type, into a photographic object; selection by the user, via the information input and/or output device, of the emotion type from a library of emotion types; transformation of the photographic object with the chosen emotion type, via a pixel array generator, into the user's graphic emoticon; and generation of the user's sequence of symbols via the information input and/or output device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
UAA201314074 | 2013-12-03 | ||
UA2013014074 | 2013-12-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015084286A1 (fr) | 2015-06-11 |
Family
ID=53273864
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/UA2013/000141 WO2015084286A1 (fr) | 2013-12-03 | 2013-12-05 | Method for generating and transmitting a user emoticon |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015084286A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021075996A1 (fr) * | 2019-10-16 | 2021-04-22 | Moscow Institute of Physics and Technology (National Research University) | System for generating images in a chat |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2382408C2 (ru) * | 2007-09-13 | 2010-02-20 | Institute of Applied Physics RAS | Method and system for identifying a person from a facial image |
US20100053363A1 (en) * | 2008-09-03 | 2010-03-04 | Samsung Digital Imaging Co., Ltd. | Photographing method and apparatus |
US20120004511A1 (en) * | 2010-07-01 | 2012-01-05 | Nokia Corporation | Responding to changes in emotional condition of a user |
EP2426902A1 (fr) * | 2010-09-07 | 2012-03-07 | Research In Motion Limited | Dynamic manipulation of an emoticon or avatar |
RU2488232C2 (ru) * | 2007-02-05 | 2013-07-20 | Amegoworld Ltd | Communication network and devices for converting text to speech and text to facial animation |
- 2013-12-05: WO PCT/UA2013/000141 patent/WO2015084286A1/fr active Application Filing
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13898561 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.10.2016) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13898561 Country of ref document: EP Kind code of ref document: A1 |