KR101585083B1 - Cloud system of smart devices-based spatial information centric for senior and method for providing contents using it - Google Patents

Cloud system of smart devices-based spatial information centric for senior and method for providing contents using it Download PDF

Info

Publication number
KR101585083B1
Authority
KR
South Korea
Prior art keywords
content
user
emotion
classification index
information
Prior art date
Application number
KR1020150072012A
Other languages
Korean (ko)
Other versions
KR20150135747A (en)
Inventor
허다혜
Original Assignee
허다혜
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 허다혜 filed Critical 허다혜
Publication of KR20150135747A publication Critical patent/KR20150135747A/en
Application granted granted Critical
Publication of KR101585083B1 publication Critical patent/KR101585083B1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Medical Informatics (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A smart device-based spatial information center cloud system for seniors according to the present invention, and a method for providing contents using the same, are disclosed. The system comprises a user terminal that activates an application, collects the user's emotion information about content each time the content is used through the activated application, and applies that emotion information when executing the content; and a service server that provides content selected by the user in cooperation with the user terminal.

Description

Technical Field

[0001] The present invention relates to a smart device-based spatial information cloud system for seniors, and a method for providing content using the same.

Description of the Related Art

[0002]

The present invention relates to a cloud system, and more particularly, to a smart device-based spatial information center cloud system for seniors and a method for providing contents using the same.

Korea entered the "aging society" in 2000 and is expected to enter the "aged society" by 2018. According to the 2013 Global AgeWatch Index rankings published by the University of Sussex in the UK, Korea ranked 67th among the 91 countries analyzed, lower than Venezuela, Kyrgyzstan, and South Africa.

In addition, at 31.7 suicides per 100,000 elderly people, Korea already has the highest elderly suicide rate in the world excluding Greenland. Elderly suicide stems from various problems such as family estrangement, loneliness, isolation, feelings of alienation, health problems, lack of social participation, and mental illness; psychological loneliness and alienation are the main causes.

Many institutions, including governments, local governments, and social welfare organizations, are providing policies and support for the elderly, and welfare-related budgets are increasing each year.

However, although support from governments, local governments, and related organizations, such as visiting services, field-centered customized welfare services, and the operation of hope-and-welfare support groups, is expanding, the shortage of professional manpower limits its reach. Efforts and measures that address the underlying causes are therefore urgently needed.

In particular, reflecting the tendency of the elderly to shift from rational judgment toward emotional behavior, it is necessary to provide a variety of contents and devices that the elderly can enjoy, together with the social effort to supply them.

SUMMARY OF THE INVENTION Accordingly, the present invention has been made to solve the above-mentioned problems occurring in the prior art, and it is an object of the present invention to provide a smart device-based spatial information center cloud system for seniors, and a method for providing contents using the same, that collect emotion information of a user every time the user uses various contents in cooperation with a service server through a cloud network and apply the collected emotion information in real time.

However, the objects of the present invention are not limited to those mentioned above, and other objects not mentioned can be clearly understood by those skilled in the art from the following description.

In order to achieve the above objects, according to one aspect of the present invention, there is provided a smart device-based spatial information center cloud system for seniors comprising: a user terminal that activates an application, collects the user's emotion information each time content is used through the activated application, and executes the content with the emotion information applied; and a service server that provides content selected by the user in cooperation with the user terminal.

Preferably, the user terminal analyzes the collected emotion information, calculates an emotion classification index that expresses the analysis result numerically, and applies the calculated emotion classification index to the corresponding content.

Preferably, when the user terminal receives the content menu from the service server after user authentication through the application succeeds, the user terminal checks the user's emotion classification index for all contents, rearranges the content menu according to the checked emotion classification index, displays the rearranged content menu, and receives the user's selection of content to use from the displayed content menu.

Preferably, when content is selected from the displayed content menu, the user terminal requests the selected content from the service server, receives the content in response, and checks the previous usage history of the provided content; if a previous usage history exists, the user terminal checks the user's emotion classification index for the content and executes the content with the emotion classification index applied according to the result of the check.

Preferably, the user terminal collects the user's emotion information about the content, converts the collected emotion information into a digital signal, detects R-peak information from the bio-signal converted into the digital signal, analyzes the R-peak information of the detected bio-signal in the time domain and the frequency domain, respectively, and derives the emotion classification index from the analysis results.

Preferably, the user terminal calculates an average change value for each parameter computed in the time domain and the frequency domain based on the R-peak information of the detected bio-signal, designates an emotion stage between predetermined arousal and excitement by combining the average change values calculated for the parameters, and derives the emotion classification index from the designated emotion stages based on an emotion classification table.

Preferably, the user terminal detects changes of the eye from the bio-signal, analyzes the positional shift of the eye and the surface reflectance of the eye in the detected changes, and applies the analysis result to supplement the derivation of the emotion classification index.

Preferably, the service server receives the emotion information collected by the user terminal, analyzes the received emotion information, calculates an emotion classification index that expresses the analysis result numerically, and stores and applies the calculated emotion classification index to the content.

Preferably, after user authentication through the application succeeds, the service server checks the user's emotion classification index for all contents, rearranges the content menu according to the checked emotion classification index, and provides the rearranged content menu to the successfully authenticated user terminal.

Preferably, when content is selected from the displayed content menu, the service server searches for the selected content and checks the previous usage history of the searched content; if a previous usage history exists, the service server checks the user's emotion classification index for the content and applies the emotion classification index to the content according to the result of the check.

According to another aspect of the present invention, there is provided a user terminal for a smart device-based spatial information cloud system for seniors, comprising: a communication unit for requesting content selected by a user in cooperation with a service server; a control unit for activating an application, collecting the user's emotion information each time content is used through the activated application, and executing the content with the emotion information applied; and a display unit for displaying information related to the content through the activated application.

Preferably, the controller analyzes the collected emotion information, calculates an emotion classification index that expresses the analysis result numerically, and applies the calculated emotion classification index to the corresponding content.

Preferably, when the content menu is provided from the service server after user authentication through the application succeeds, the control unit checks the user's emotion classification index for all contents, rearranges the content menu according to the checked emotion classification index, displays the rearranged content menu, and receives the user's selection of content to use from the displayed content menu.

Preferably, when content is selected from the displayed content menu, the controller requests the selected content from the service server, receives the content in response, and checks the previous usage history of the provided content; if a previous usage history exists, the controller checks the user's emotion classification index for the content and executes the content with the emotion classification index applied according to the result of the check.

Preferably, the controller collects the user's emotion information about the content, converts the collected emotion information into a digital signal, detects R-peak information from the bio-signal converted into the digital signal, analyzes the R-peak information of the detected bio-signal in the time domain and the frequency domain, respectively, and derives the emotion classification index from the analysis results.

Preferably, the controller calculates an average change value for each parameter computed in the time domain and the frequency domain based on the R-peak information of the detected bio-signal, designates an emotion stage between predetermined arousal and excitement by combining the average change values calculated for the parameters, and derives the emotion classification index from the designated emotion stages based on an emotion classification table.

Preferably, the controller detects changes of the eye from the bio-signal, analyzes the positional shift of the eye and the surface reflectance of the eye in the detected changes, and applies the analysis result to supplement the derivation of the emotion classification index.

According to another aspect of the present invention, there is provided a method of providing content using a spatial information center cloud system, the method comprising: receiving content selected by a user in cooperation with a service server; activating an application, collecting the user's emotion information each time the content is used through the activated application, and executing the content with the emotion information applied; and displaying information related to the content through the activated application.

Preferably, the executing step includes analyzing the collected emotion information, calculating an emotion classification index that expresses the analysis result numerically, and applying the calculated emotion classification index to the corresponding content.

Preferably, when the content menu is received from the service server after user authentication through the application succeeds, the executing step checks the user's emotion classification index for all contents, rearranges the content menu according to the checked emotion classification index, displays the rearranged content menu, and receives the user's selection of content to use from the displayed content menu.

Preferably, when content is selected from the displayed content menu, the executing step requests the selected content from the service server, receives the content in response, and checks the previous usage history of the provided content; if a previous usage history exists, the executing step checks the user's emotion classification index for the content and executes the content with the emotion classification index applied according to the result of the check.

Preferably, the executing step includes collecting the user's emotion information about the content, converting the collected emotion information into a digital signal, detecting R-peak information from the bio-signal converted into the digital signal, analyzing the R-peak information of the detected bio-signal in the time domain and the frequency domain, respectively, and deriving the emotion classification index from the analysis results.

Preferably, the executing step calculates an average change value for each parameter computed in the time domain and the frequency domain based on the R-peak information of the detected bio-signal, designates an emotion stage between predetermined arousal and excitement by combining the average change values calculated for the parameters, and derives the emotion classification index from the designated emotion stages based on an emotion classification table.

Preferably, the executing step detects changes of the eye from the bio-signal, analyzes the positional shift of the eye and the surface reflectance of the eye in the detected changes, and applies the analysis result to supplement the derivation of the emotion classification index.

Accordingly, the present invention collects the user's emotion information every time various contents are used in cooperation with the service server through a cloud network, and applies the collected emotion information in real time the next time content is used, so that content matching the user's emotions can be provided in real time.

In addition, since the present invention can provide a content suitable for a user's emotions in real time, the user's convenience of using the content can be improved.

FIG. 1 is a diagram illustrating a spatial information center cloud system according to an embodiment of the present invention.
FIG. 2 is a view for explaining the principle of deriving the emotion classification index according to an embodiment of the present invention.
FIG. 3 is a view showing a composition of a friend meeting screen according to an embodiment of the present invention.
FIG. 4 is a view illustrating a configuration of a homebound road screen according to an exemplary embodiment of the present invention.
FIG. 5 is a diagram illustrating a configuration of an internal home information screen according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating a configuration of a small-distance screen according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating a detailed configuration of a user terminal according to an exemplary embodiment of the present invention.
FIG. 8 is a diagram illustrating a method for providing content according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating a method for providing content according to another embodiment of the present invention.

Hereinafter, a smart device-based spatial information center cloud system for seniors and an operation method thereof according to an embodiment of the present invention will be described with reference to the accompanying drawings, focusing on the portions necessary for understanding the operation according to the present invention.

In describing the constituent elements of the present invention, different reference numerals may be given to elements having the same name, or the same reference numeral may be given to elements in different drawings. Even in such cases, this does not mean that the element has different functions in different embodiments, or that differently numbered elements have the same function; the function of each element should be judged based on its description in each embodiment.

In particular, the present invention proposes a new scheme that collects the user's emotion information every time various contents are used in cooperation with a service server through a cloud network, and applies the collected emotion information to the user in real time the next time content is used.

FIG. 1 is a diagram illustrating a spatial information center cloud system according to an embodiment of the present invention.

As shown in FIG. 1, the spatial information center cloud system according to the present invention may include a user terminal 100, a service server 200, and a database 300.

The user terminal 100 may receive a variety of contents for the user, in particular, the elderly, in cooperation with the service server through the cloud network, and may apply the emotion information of the user to the various contents provided in real time.

To this end, the user terminal 100 can activate a separate application and collect emotion information of the user whenever the content is reproduced through the activated application.

Here, the emotion information of the user is information from which the user's emotion can be judged. For example, it may include changes in the focus of the eye, flickering of the eyelid, biometric information such as electrocardiogram, pulse wave, and blood pressure, and usage information such as the music listened to.

At this time, the user terminal 100 may analyze the collected emotion information, calculate the emotion classification index expressed by the analysis result, and apply the calculated emotion classification index to the content.

Alternatively, the user terminal 100 may provide the collected emotion information to the service server without analyzing it directly, receive the quantified emotion classification index from the service server in response, and reproduce the content with the index applied.

FIG. 2 is a view for explaining the principle of deriving the emotion classification index according to an embodiment of the present invention.

Referring to FIG. 2, the user terminal 100 can measure bio-signals carrying emotion information, in particular electrocardiography (ECG), photoplethysmography (PPG), blood pressure, and changes of the eye (S210).

Next, the user terminal 100 may convert the measured bio-signal into a digital signal for storage in memory (S220). At this time, the user terminal 100 divides and stores the bio-signal in overlapping windows of consecutive time units.

For example, when the user terminal 100 stores the signal in windows that slide by one minute, the signal from 0 to 5 minutes is referred to as S1, the signal from 1 to 6 minutes as S2, and the signal from 2 to 7 minutes as S3.
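The overlapping-window storage described above can be sketched as follows. The function name and the 5-minute window / 1-minute step defaults follow the S1/S2/S3 example; everything else is an illustrative assumption rather than the patent's implementation.

```python
def segment_signal(samples, fs, window_s=300, step_s=60):
    """Split a sampled bio-signal into 5-minute windows that slide by
    1 minute, matching the S1 (0-5 min), S2 (1-6 min), S3 (2-7 min)
    scheme described above."""
    win, step = int(window_s * fs), int(step_s * fs)
    return [samples[i:i + win] for i in range(0, len(samples) - win + 1, step)]

# 10 minutes of a 1 Hz signal yields windows S1..S6
segments = segment_signal(list(range(600)), fs=1)
```

Each returned window can then be processed independently, which is what allows the per-window R-peak records r_S1, r_S2, ... described below.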

Next, the user terminal 100 may detect R peaks from the bio-signal converted into the digital signal and store the amplitude, time information, and the like of the detected R peaks (S230).

At this time, the user terminal 100 stores the R-peak information of each bio-signal window separately, labeling the records r_S1, r_S2, r_S3, ..., r_Sx to distinguish the windows.

Next, the user terminal 100 analyzes the stored R-peak information in the time domain and the frequency domain (S240, S250), and derives the emotion classification index from the analysis results (S290).
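R-peak detection (S230) can be illustrated with a deliberately naive amplitude-threshold detector. A real system would use an established algorithm such as Pan-Tompkins with band-pass filtering and adaptive thresholds; the function name and threshold values here are assumptions for illustration only.

```python
def detect_r_peaks(ecg, fs, threshold=0.6, refractory_s=0.25):
    """Naive R-peak detector: keep local maxima that exceed a fixed
    fraction of the signal maximum and are separated by at least a
    refractory period. Returns sample indices of detected peaks, from
    which amplitude and time can be stored per window as in step S230."""
    thr = threshold * max(ecg)
    gap = int(refractory_s * fs)
    peaks, last = [], -gap
    for i in range(1, len(ecg) - 1):
        if ecg[i] >= thr and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1] and i - last >= gap:
            peaks.append(i)
            last = i
    return peaks
```

The refractory period reflects the physiological minimum spacing between heartbeats, preventing one QRS complex from being counted twice.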

For example, in the time domain the following parameters can be calculated: AVNN (m_RRi, the average of all normal-to-normal RR intervals), SDNN (STD_RRi, the standard deviation of all normal-to-normal RR intervals), meanHR (m_HR, the mean heart rate), STD_HR (the standard deviation of the heart rate), and RMSSD (the root mean square of successive differences between adjacent RR intervals).

At this time, among the calculated parameters, SDNN and RMSSD are expressed by the following Equation (1).

[Equation 1]

SDNN = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(RR_i - \overline{RR}\right)^2}

RMSSD = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N-1}\left(RR_{i+1} - RR_i\right)^2}
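A minimal sketch of the time-domain parameters named above, computed from a list of RR intervals in milliseconds. The dictionary keys mirror the parameter names in the text; the function itself is an illustrative assumption, not the patent's code.

```python
from statistics import mean, pstdev

def time_domain_params(rr_ms):
    """Time-domain HRV parameters from RR intervals in milliseconds.
    SDNN and RMSSD follow the forms given in Equation 1; heart rate is
    derived as 60000 / RR (beats per minute)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    hr = [60000.0 / rr for rr in rr_ms]
    return {
        "AVNN": mean(rr_ms),                                       # m_RRi
        "SDNN": pstdev(rr_ms),                                     # STD_RRi
        "meanHR": mean(hr),                                        # m_HR
        "STD_HR": pstdev(hr),
        "RMSSD": (sum(d * d for d in diffs) / len(diffs)) ** 0.5,  # successive differences
    }
```

Running this per window (r_S1, r_S2, ...) yields the per-window values that are later averaged into aver_AVNN, aver_SDNN, and so on.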

In the frequency domain, VLF (very low frequency, 0.0033 to 0.04 Hz), LF (low frequency, 0.04 to 0.15 Hz), HF (high frequency, 0.15 to 0.4 Hz), and the LF/HF ratio can be calculated.
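For illustration, the band powers can be estimated with a plain DFT periodogram over an evenly sampled series. A production system would first resample the irregular RR tachogram (commonly at 4 Hz) and use Welch's method; this sketch assumes the input is already evenly sampled and is not the patent's method.

```python
import math

def band_powers(x, fs=4.0):
    """Estimate VLF/LF/HF band powers of an evenly sampled series with a
    plain DFT periodogram, then form the LF/HF ratio."""
    n = len(x)
    m = sum(x) / n
    x = [v - m for v in x]  # remove the DC component
    bands = {"VLF": (0.0033, 0.04), "LF": (0.04, 0.15), "HF": (0.15, 0.4)}
    power = {name: 0.0 for name in bands}
    for k in range(1, n // 2 + 1):
        f = k * fs / n  # frequency of DFT bin k
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        for name, (lo, hi) in bands.items():
            if lo <= f < hi:
                power[name] += (re * re + im * im) / n
    power["LF/HF"] = power["LF"] / power["HF"] if power["HF"] else float("inf")
    return power
```

The LF/HF ratio is the conventional sympathovagal-balance indicator that the text combines with the other band powers.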

In this manner, an average change value can be calculated for each parameter computed from the R-peak information, separately in the time domain and the frequency domain.

For example, the values of the time-domain parameter AVNN for r_S1, r_S2, r_S3, r_S4, ... can be denoted r_S1_AVNN, r_S2_AVNN, r_S3_AVNN, r_S4_AVNN, ..., r_Sx_AVNN, and their average aver_AVNN can be calculated.

For example, by combining aver_AVNN, aver_SDNN, aver_meanHR, and aver_RMSSD calculated for the time-domain parameters, one of a predetermined number of emotion stages between arousal and excitement, for example seven stages or five stages, can be designated. For example, the emotion stages can be subdivided into awaken very, awaken, awaken some, normal, excitement some, excitement, and excitement very.
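One simple way to map a combined parameter score onto the seven named stages is an even partition of a normalized score. The patent specifies only that the weighted parameters are combined (Equations 2 and 3), so the normalization into [0, 1] and the even partition here are assumptions for illustration.

```python
EMOTION_STAGES = ["awaken very", "awaken", "awaken some", "normal",
                  "excitement some", "excitement", "excitement very"]

def emotion_stage(score, n_stages=7):
    """Map a combined, weighted parameter score normalized into [0, 1]
    onto one of the seven stages, 0 = 'awaken very' .. 6 = 'excitement
    very'. The normalization and even partition are illustrative."""
    idx = min(int(score * n_stages), n_stages - 1)
    return idx, EMOTION_STAGES[idx]
```

The same mapping would be applied twice, once to the time-domain combination and once to the frequency-domain combination, giving the two stages that index Table 1.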

At this time, the selected weighting value α is determined by analyzing each gender and age group in the time domain, and is applied per parameter as shown in Equation 2 below.

[Equation 2]

(four weighted expressions applying the weighting value α to aver_AVNN, aver_SDNN, aver_meanHR, and aver_RMSSD, respectively)

Likewise, in the frequency domain, one of a predetermined number of emotion stages between arousal and excitement, for example seven stages or five stages, can be designated by combining aver_VLF, aver_LF, aver_HF, and aver_LF/HF.

At this time, the selected weighting value α is determined by analyzing each gender and age group in the frequency domain, and is applied as shown in Equation 3 below.

[Equation 3]

(four weighted expressions applying the weighting value α to aver_VLF, aver_LF, aver_HF, and aver_LF/HF, respectively)

In this way, the emotion classification index can be derived, based on a predetermined emotion classification table, from the emotion stages calculated in the time domain and the frequency domain, respectively.

Such an emotion classification table is shown in Table 1 below.

[Table 1]

|                       | Awaken very (avt) | Awaken (at) | Awaken some (ast) | Normal (nt) | Excitement some (est) | Excitement (et) | Excitement very (evt) |
| Excitement very (evf) | T6F6 | T5F6 | T4F6 | T3F6 | T2F6 | T1F6 | T0F6 |
| Excitement (ef)       | T6F5 | T5F5 | T4F5 | T3F5 | T2F5 | T1F5 | T0F5 |
| Excitement some (esf) | T6F4 | T5F4 | T4F4 | T3F4 | T2F4 | T1F4 | T0F4 |
| Normal (nf)           | T6F3 | T5F3 | T4F3 | T3F3 | T2F3 | T1F3 | T0F3 |
| Awaken some (asf)     | T6F2 | T5F2 | T4F2 | T3F2 | T2F2 | T1F2 | T0F2 |
| Awaken (af)           | T6F1 | T5F1 | T4F1 | T3F1 | T2F1 | T1F1 | T0F1 |
| Awaken very (avf)     | T6F0 | T5F0 | T4F0 | T3F0 | T2F0 | T1F0 | T0F0 |

(Columns: time-domain emotion stage; rows: frequency-domain emotion stage.)

As shown in Table 1, the emotion classification table of the present invention divides each of the time-domain and frequency-domain results into seven stages between arousal and excitement, yielding 49 cells. Here, T3F3 in Table 1 represents the user's baseline state at a normal time.
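Reading Table 1, the columns run T6 to T0 from "awaken very" to "excitement very", while the rows run F6 at the top ("excitement very") down to F0 ("awaken very"). A cell can therefore be derived from the two stages as follows; the 0..6 stage numbering is an assumption carried over for illustration.

```python
def classification_index(time_stage, freq_stage):
    """Derive the Table 1 cell from the time-domain and frequency-domain
    emotion stages (0 = 'awaken very' .. 6 = 'excitement very').
    Columns are labeled T6..T0 left to right, so the T index is
    6 - time_stage; rows are labeled F6 (top) down to F0 (bottom), so
    the F index equals freq_stage."""
    if not (0 <= time_stage <= 6 and 0 <= freq_stage <= 6):
        raise ValueError("stages must be in 0..6")
    return f"T{6 - time_stage}F{freq_stage}"
```

With both stages at "normal" (stage 3), this yields T3F3, the baseline cell named in the text.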

In addition, the user terminal 100 detects changes of the eye from the digitized signal (S260) and analyzes the positional shift of the eye and the surface reflectance of the eye in the detected changes (S270, S280); the analysis result can supplement the derivation of the emotion classification index (S290).

Therefore, one of the 49 cells can be selected using the emotion classification table, and the emotion classification index corresponding to the selected cell can be derived.

For example, if the eye-movement and eye-reflectance analysis shows a change between the average measured during the user's initial five minutes of use and the average measured while using each content, the emotion classification index selected from the table is adjusted one step higher.

In this way, the emotion classification index classified from the bio-signal is stored together with the content name and the personal information of the current user, such as region, hometown, age, and gender.

In addition, a separate emotion classification index is maintained for each content.

For example, on the next use of the system, the content menu is listed in descending order of emotion classification index, and the background color, background picture, background music, and sound volume are adjusted using the colors or music classified for the corresponding stage, and provided to the user.
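The menu reordering described above amounts to a descending sort over the user's stored per-content indices. The content names and the numeric index values below are illustrative assumptions, not data from the patent.

```python
def reorder_menu(menu, index_by_content):
    """Sort the content menu in descending order of the user's stored
    emotion classification index; content without a stored index sorts
    last (default 0), preserving its original relative order."""
    return sorted(menu, key=lambda name: index_by_content.get(name, 0), reverse=True)

menu = ["friend meeting", "beautiful hometown", "soil street", "norot"]
stored = {"norot": 5, "beautiful hometown": 3}
ordered = reorder_menu(menu, stored)
```

Because Python's sort is stable, contents the user has never rated keep their default order at the end of the menu.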

In addition, the emotion classification index is classified by age, gender, hometown, and region, and statistical techniques are applied to the contents with a high emotion classification index per user, so that the results can be accumulated in a database.

The statistically processed, accumulated emotion classification indices can later be used to identify the types of content that produce large emotional changes for each type of user, and can be provided to developers of applications for the elderly for specialized application development.

In addition, the database can be provided to developers so that it can be utilized in the online open market and in various fields for the elderly, such as shopping, travel, and hobbies.

The user terminal 100 may be a dedicated terminal installed in a welfare center used by the elderly, or may include a personal terminal such as a smart phone, a tablet PC, a PC, and the like.

The service server 200 may provide various contents stored in the database to the user terminal. Here, the various contents may include contents for seniors or the elderly.

At this time, the service server 200 may receive the user's emotion information from the user terminal, analyze it, calculate the emotion classification index that expresses the analysis result numerically, and provide the calculated emotion classification index to the user terminal.

The database 300 may store various contents, emotion information for each user, and emotion classification index.

FIG. 3 is a view showing a composition of a friend meeting screen according to an embodiment of the present invention.

Referring to FIG. 3, the present invention can provide "friend meeting" content: an interactive community (chat) supporting video-based VoIP and the exchange of video and voice mail with counterparts of various friendship levels, such as hometown friends, together with functions such as "find friends (family)" and "meet".

FIG. 4 is a view illustrating a configuration of a homebound road according to an exemplary embodiment of the present invention.

Referring to FIG. 4, the present invention can provide news of various hometowns, and can provide "beautiful hometown" content that offers information on one's hometown such as annual events, local pride, and restaurants.

FIG. 5 is a diagram illustrating a configuration of an internal home information screen according to an embodiment of the present invention.

Referring to FIG. 5, "soil street" content can be provided, linked to elderly job creation, covering for example the present appearance of one's hometown, ancestors' graves and photographs, and searches for the memories of local elders.

FIG. 6 is a view showing a configuration of a small-distance screen according to an embodiment of the present invention.

Referring to FIG. 6, "norot (an old word for play)" content can be provided, offering services such as videos and song recordings that enhance users' sense of identity and evoke nostalgia.

Although FIGS. 3 to 6 are described as embodiments of the present invention, the present invention is not limited thereto, and various contents may be additionally provided as needed.

FIG. 7 is a diagram illustrating a detailed configuration of a user terminal according to an exemplary embodiment of the present invention.

Referring to FIG. 7, the user terminal 100 according to the present invention may include a communication unit 110, an input unit 120, a short-range communication unit 130, a control unit 140, a display unit 150, a storage unit 160, a sensor unit 170, and the like.

The communication unit 110 can transmit and receive various data in cooperation with the service server through cloud networking. For example, the communication unit 110 transmits the login information received from the user to the service server or receives various contents from the service server.

The input unit 120 can receive information according to the user's menu or key operation.

The short-range communication unit 130 can transmit and receive various data in conjunction with a portable device through short-range wireless communication, such as RFID (Radio Frequency Identification) and NFC (Near Field Communication).

The control unit 140 can receive the content from the service server and execute the emotional classification index obtained by digitizing the emotional information of the user in real time.

The control unit 140 may acquire the user's emotion information through various sensors during execution of the content, and may calculate the user's emotion classification index by digitizing the obtained sensor information.

At this time, the control unit 140 can acquire the user's usage information together with the emotion information of the user.

The display unit 150 can display a screen for executing a content reflecting the emotion classification index.

The storage unit 160 may store the content provided from the service server, the emotion information of the user acquired at the time of execution for each content, and the emotion classification index obtained by digitizing the emotion information.

The sensor unit 170 can acquire the user's biometric information as the emotion information of the user. For example, the sensor unit 170 may be a camera capable of measuring eye movement, blinking, and relative reflectivity, an electrocardiogram or pulse wave measuring device, a blood pressure measuring device, or the like.

FIG. 8 is a diagram illustrating a method for providing content according to an embodiment of the present invention.

As shown in FIG. 8, the user terminal according to the present invention activates an application according to the user's menu or key operation (S801) and provides login information to the service server through the activated application to request a user login (S802).

At this time, the method of logging in may include a login method using an ID and a password, a login method using RFID (Radio Frequency Identification) or NFC (Near Field Communication), and a login method using ID recognition.

Next, the service server performs user login based on the provided login information and the user information previously registered in the database (S803). If the login is successful, the service server provides a content menu to the user terminal (S804).

Next, when the user terminal receives the content menu, it checks the user's emotion classification index for all of the contents (S805), rearranges the content menu according to the confirmed emotion classification index (S806), displays the rearranged content menu (S807), and checks whether the content to be used is selected from the displayed content menu (S808).

At this time, the user terminal arranges and displays the contents according to the previously calculated emotion classification index. For example, the user terminal displays the content having the highest emotion classification index first.
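The rearrangement in steps S805 to S807 can be pictured as a sort over per-content index values. The content names and index values below are illustrative only:

```python
def rearrange_menu(menu, emotion_index):
    """Sort a content menu so that contents with the highest stored
    emotion classification index come first (S805-S806)."""
    # Contents without a stored index default to 0 and sink to the end
    return sorted(menu, key=lambda c: emotion_index.get(c, 0), reverse=True)

# Hypothetical menu and stored per-content indices
menu = ["Soil Street", "Norot", "Hometown News"]
indices = {"Norot": 5, "Soil Street": 3}
ordered = rearrange_menu(menu, indices)  # "Norot" is displayed first
```

Since Python's `sorted` is stable, contents with equal index values keep their original menu order.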

Next, when the content is selected from the content menu, the user terminal requests the selected content to the service server (S809), and the service server searches the database for the requested content (S810) and provides the retrieved content to the user terminal (S811).

Next, when the user terminal is provided with the content, it checks whether there is a previous usage history of the content (S812). If a usage history exists, the user terminal can check the emotion classification index for the content (S813).

Next, the user terminal can apply the emotion classification index to the selected content (S814) and execute it (S815). For example, the user terminal can set the background color and driving music of the entire content according to the user's emotion classification index, or create an atmosphere by adjusting the brightness and color of the interior lighting.
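Step S814 amounts to mapping the index to presentation settings. The mapping table below is purely hypothetical; the disclosure does not specify concrete steps, colors, or files:

```python
# Hypothetical mapping from emotion classification steps to presentation
# settings; the actual steps and values are not specified in the disclosure.
PRESENTATION_BY_STEP = {
    1: {"background": "#2b3a55", "music": "calm.mp3", "lighting": 0.4},
    2: {"background": "#3f6b4f", "music": "easy.mp3", "lighting": 0.6},
    3: {"background": "#f2c14e", "music": "bright.mp3", "lighting": 0.9},
}

def apply_emotion_index(content, step):
    """Attach background color, driving music, and lighting level to the
    content before execution (S814), as described for the control unit."""
    # Fall back to a neutral middle step for unknown index values
    settings = PRESENTATION_BY_STEP.get(step, PRESENTATION_BY_STEP[2])
    return {**content, **settings}
```

The dictionary merge keeps the content's own fields and overlays only the presentation settings.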

Next, the user terminal can acquire the user's emotion information through various sensors while executing the content (S816). The emotion information acquired during execution can then be used to calculate the emotion classification index for the content, obtained by digitizing the user's emotion information.
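The claims additionally describe correcting this index using eye-movement analysis: when the average change measured while using a content differs from the user's initial baseline by more than a certain value, the index is raised by one step; otherwise the previously selected index is kept. A minimal sketch, with the threshold and step bound as illustrative assumptions:

```python
def correct_index(index, baseline_avg, content_avg, threshold=0.1, max_step=5):
    """Correct the emotion classification index using eye-movement analysis.

    If the average change measured while using a content differs from the
    user's initial baseline by more than `threshold`, the index is raised
    by one step (clamped to `max_step`); otherwise it is kept as-is.
    """
    if abs(content_avg - baseline_avg) > threshold:
        return min(index + 1, max_step)  # raise by one step, clamped
    return index
```

The same rule would be applied separately to the eye-position analysis and the eye-reflectance analysis described in the claims.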

Next, when the execution of the corresponding content is terminated (S817), the user terminal can terminate the execution of the activated application according to the user's menu or key operation (S818).

FIG. 9 is a diagram illustrating a method for providing content according to another embodiment of the present invention.

As shown in FIG. 9, the user terminal according to the present invention activates an application according to the user's menu or key operation (S901) and provides login information to the service server through the activated application to request a user login (S902).

Next, the service server performs user login based on the provided login information and the user information previously registered in the database (S903). If the login is successful, the service server checks the user's emotion classification index (S904), rearranges the content menu according to the confirmed emotion classification index (S905), and provides the content menu to the user terminal (S906).

Next, when the user terminal is provided with the content menu, it displays the provided content menu (S907) and checks whether the user has selected the content to use (S908).

Next, when the content is selected from the content menu, the user terminal can request the selected content to the service server (S909).

Next, when the service server receives a request for content, it searches for the requested content (S910) and checks whether there is a previous usage history for the content (S911). If a usage history is found, the service server can check the user's emotion classification index for the content (S912).

Next, the service server may apply the emotion classification index to the content (S913) and provide the resulting content to the user terminal (S914).

Next, the user terminal can receive and execute the content to which the emotion classification index is applied (S915).

Next, the user terminal acquires the emotion information of the user through various sensors while executing the content (S916), and provides the emotion information of the obtained user to the service server (S917).

Next, when the execution of the corresponding content is terminated (S918), the user terminal can deactivate the activated application in accordance with the user's menu or key operation (S919).

All of the elements constituting the embodiments of the present invention described above are described as being combined into one or operating in combination, but the present invention is not necessarily limited to these embodiments. That is, within the scope of the present invention, all of the components may be selectively combined into one or more units. In addition, although each of the components may be implemented as independent hardware, some or all of the components may be selectively combined and implemented as a computer program having program modules that perform some or all of the functions in one or a plurality of pieces of hardware. Such a computer program may be stored in a computer-readable medium such as a USB memory, a CD, or a flash memory, and read and executed by a computer to implement embodiments of the present invention. The storage medium of the computer program may include a magnetic recording medium, an optical recording medium, a carrier wave medium, and the like.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or essential characteristics thereof. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents should be construed as falling within the scope of the present invention.

100: User terminal
110: Communication unit
120: Input unit
130: Short-range communication unit
140: Control unit
150: Display unit
160: Storage unit
170: Sensor unit
200: Service server

Claims (24)

A user terminal that applies at least one of a background color and driving music according to emotion information of a user to the selected content, when the content is selected through the activated application by activating the application; And
A service server for providing content selected by a user in cooperation with the user terminal;
, Wherein the user terminal
Collecting emotion information of the user collected on the contents, converting the collected emotion information into a digital signal,
Detecting R-peak information from the bio-signal converted into the digital signal,
Calculating an average change value using the values of the respective parameters calculated in the time domain and the frequency domain based on the detected R-peak information,
Specifying an emotion step between predetermined awakening and excitement by combining all of the average change values calculated for each of the time domain and the frequency domain,
Deriving an emotion classification index based on the emotion classification table classified into a plurality of areas using the emotion step of the time domain and the emotion step of the frequency domain,
The parameters calculated through analysis in the time domain are AVNN (average of all normal-to-normal RR intervals), SDNN (standard deviation of all normal-to-normal RR intervals), meanHR (mean heart rate), STD_HR (standard deviation of heart rate values), and RMSSD (square root of the mean squared differences of successive normal-to-normal RR intervals),
The parameters calculated through the analysis in the frequency domain include very low frequency (VLF), low frequency (LF), high frequency (HF), and the low-frequency to high-frequency ratio (LF/HF),
A change of the eyeball converted into the digital signal is detected, the positional movement of the eyeball and the surface reflectance of the eyeball are each analyzed from the detected change of the eyeball, and the obtained emotion classification index is corrected using the analyzed result,
In the analysis of the positional movement of the eyeball and the analysis of the reflectance of the eyeball, when the difference between the average change of the user during an initial period of use and the average change during use of each content exceeds a predetermined value, the emotion classification index is increased by one step, and at a change equal to or less than the predetermined value, the emotion classification index selected among the plurality of steps is applied as-is.
The method according to claim 1,
The user terminal comprises:
Wherein the user terminal analyzes the collected emotion information and calculates an emotion classification index that numerically expresses the result of the analysis, in the smart devices-based spatial information centric cloud system for seniors.
3. The method of claim 2,
The user terminal comprises:
Upon receiving a content menu from the service server after successful user authentication through the application, the user's emotional classification index for all contents is confirmed,
Rearranges the contents menu according to the identified sensibility classification index to display the rearranged contents menu,
And the content to be used by the user is selected from the displayed content menu.
The method of claim 3,
The user terminal comprises:
When the content is selected from the displayed content menu, requesting the selected content to the service server and receiving the content in response thereto,
Checking the previous use history of the provided content, checking the user's emotion classification index for the content if the previous usage history exists,
And the emotional classification index is applied to the content and executed according to the result of the checking.
delete delete delete The method according to claim 1,
The service server,
Receiving the emotion information collected from the user terminal,
Wherein the service server analyzes the received emotion information, calculates an emotion classification index that numerically expresses the result of the analysis, and applies the calculated emotion classification index to the corresponding content, in the smart devices-based spatial information centric cloud system for seniors.
9. The method of claim 8,
The service server,
A user's emotional classification index for all contents after successful user authentication is confirmed through the application,
Rearrange the content menu according to the identified emotion classification index
And provides the rearranged contents menu to the user terminal that has successfully authenticated the user.
10. The method of claim 9,
The service server,
Searching the content selected when the content is selected from the displayed content menu,
Checking the previous use history of the retrieved content, checking the user's emotion classification index for the content if the previous usage history exists,
And the emotional classification index is applied to the contents according to a result of the checking.
A communication unit for requesting a content selected by a user in cooperation with a service server and receiving the requested content;
A controller for applying at least one of background color and driving music according to emotion information of the user to the selected content when the content is selected through the activated application by activating the application; And
A display unit for displaying information related to the content through the activated application;
, Wherein the control unit
Collecting emotion information of the user collected on the contents, converting the collected emotion information into a digital signal,
Detecting R-peak information from the bio-signal converted into the digital signal,
Calculating an average change value using the values of the respective parameters calculated in the time domain and the frequency domain based on the detected R-peak information,
Specifying an emotion step between predetermined awakening and excitement by combining all of the average change values calculated for each of the time domain and the frequency domain,
Deriving an emotion classification index based on the emotion classification table classified into a plurality of areas using the emotion step of the time domain and the emotion step of the frequency domain,
The parameters calculated through analysis in the time domain are AVNN (average of all normal-to-normal RR intervals), SDNN (standard deviation of all normal-to-normal RR intervals), meanHR (mean heart rate), STD_HR (standard deviation of heart rate values), and RMSSD (square root of the mean squared differences of successive normal-to-normal RR intervals),
The parameters calculated through the analysis in the frequency domain include very low frequency (VLF), low frequency (LF), high frequency (HF), and the low-frequency to high-frequency ratio (LF/HF),
A change of the eyeball converted into the digital signal is detected, the positional movement of the eyeball and the surface reflectance of the eyeball are each analyzed from the detected change of the eyeball, and the obtained emotion classification index is corrected using the analyzed result,
In the analysis of the positional movement of the eyeball and the analysis of the reflectance of the eyeball, when the difference between the average change of the user during an initial period of use and the average change during use of each content exceeds a predetermined value, the emotion classification index is increased by one step, and at a change equal to or less than the predetermined value, the emotion classification index selected among the plurality of steps is applied as-is.
12. The method of claim 11,
Wherein,
Wherein the control unit analyzes the collected emotion information and calculates an emotion classification index that numerically expresses the result of the analysis.
13. The method of claim 12,
Wherein,
Upon receiving a content menu from the service server after successful user authentication through the application, the user's emotional classification index for all contents is confirmed,
Rearranges the contents menu according to the identified sensibility classification index to display the rearranged contents menu,
And the content to be used by the user is selected from the displayed content menu.
14. The method of claim 13,
Wherein,
When the content is selected from the displayed content menu, requesting the selected content to the service server and receiving the content in response thereto,
Checking the previous use history of the provided content, checking the user's emotion classification index for the content if the previous usage history exists,
And the emotional classification index is applied to the content and executed according to the result of the checking.
delete delete delete Requesting a content selected by a user in association with a service server and receiving the requested content;
Applying at least one of a background color and driving music according to emotional information of the user to the selected content when the content is selected through the activated application by activating the application; And
Displaying information related to the content through the activated application;
Wherein the executing step comprises:
Collecting emotion information of the user collected on the contents, converting the collected emotion information into a digital signal,
Detecting R-peak information from the bio-signal converted into the digital signal,
Calculating an average change value using the values of the respective parameters calculated in the time domain and the frequency domain based on the detected R-peak information,
Specifying an emotion step between predetermined awakening and excitement by combining all of the average change values calculated for each of the time domain and the frequency domain,
Deriving an emotion classification index based on the emotion classification table classified into a plurality of areas using the emotion step of the time domain and the emotion step of the frequency domain,
The parameters calculated through analysis in the time domain are AVNN (average of all normal-to-normal RR intervals), SDNN (standard deviation of all normal-to-normal RR intervals), meanHR (mean heart rate), STD_HR (standard deviation of heart rate values), and RMSSD (square root of the mean squared differences of successive normal-to-normal RR intervals),
The parameters calculated through the analysis in the frequency domain include very low frequency (VLF), low frequency (LF), high frequency (HF), and the low-frequency to high-frequency ratio (LF/HF),
A change of the eyeball converted into the digital signal is detected, the positional movement of the eyeball and the surface reflectance of the eyeball are each analyzed from the detected change of the eyeball, and the obtained emotion classification index is corrected using the analyzed result,
In the analysis of the positional movement of the eyeball and the analysis of the reflectance of the eyeball, when the difference between the average change of the user during an initial period of use and the average change during use of each content exceeds a predetermined value, the emotion classification index is increased by one step, and at a change equal to or less than the predetermined value, the emotion classification index selected among the plurality of steps is applied as-is.
19. The method of claim 18,
Wherein the performing comprises:
Wherein the emotion information is analyzed, an emotion classification index that numerically expresses the result of the analysis is calculated, and the calculated emotion classification index is applied to the corresponding content and executed, in the method for providing contents.
19. The method of claim 18,
Wherein the performing comprises:
Upon receiving a content menu from the service server after successful user authentication through the application, the user's emotional classification index for all contents is confirmed,
Rearranges the contents menu according to the identified sensibility classification index to display the rearranged contents menu,
And selecting a content to be used by the user from the displayed content menu.
21. The method of claim 20,
Wherein the performing comprises:
When the content is selected from the displayed content menu, requesting the selected content to the service server and receiving the content in response thereto,
Checking the previous use history of the provided content, checking the user's emotion classification index for the content if the previous usage history exists,
And applying the emotion classification index to the contents according to a result of the checking, and executing the emotion classification index to the contents.
delete delete delete
KR1020150072012A 2014-05-22 2015-05-22 Cloud system of smart devices-based spatial information centric for senior and method for providing contents using it KR101585083B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140061877 2014-05-22
KR20140061877 2014-05-22

Publications (2)

Publication Number Publication Date
KR20150135747A KR20150135747A (en) 2015-12-03
KR101585083B1 true KR101585083B1 (en) 2016-01-22

Family

ID=54872029

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150072012A KR101585083B1 (en) 2014-05-22 2015-05-22 Cloud system of smart devices-based spatial information centric for senior and method for providing contents using it

Country Status (1)

Country Link
KR (1) KR101585083B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101913850B1 (en) * 2018-04-02 2018-11-01 (주)에프앤아이 Method for assessing psychological state of user by analysing psychological state data obtained through psychological assessment kit and server using the same

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102256640B1 (en) 2019-10-15 2021-05-26 주식회사 테크포아이 A system and method for finding the senior memories
KR102254171B1 (en) * 2019-10-29 2021-05-20 황록주 Apparatus and Method for Providing Potal servic for senior people

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120030789A (en) * 2010-09-20 2012-03-29 한국전자통신연구원 System and method for service or contents based on emotional information
KR101203182B1 (en) * 2010-12-22 2012-11-20 전자부품연구원 System for emotional contents community service
KR20120101233A (en) * 2011-02-28 2012-09-13 (주)다음소프트 Method for providing sentiment information and method and system for providing contents recommendation using sentiment information
KR20130119246A (en) * 2012-04-23 2013-10-31 한국전자통신연구원 Apparatus and method for recommending contents based sensibility


Also Published As

Publication number Publication date
KR20150135747A (en) 2015-12-03

Similar Documents

Publication Publication Date Title
US10382670B2 (en) Cognitive recording and sharing
EP2950551B1 (en) Method for recommending multimedia resource and apparatus thereof
KR102363794B1 (en) Information providing method and electronic device supporting the same
US9886454B2 (en) Image processing, method and electronic device for generating a highlight content
AU2015201759B2 (en) Electronic apparatus for providing health status information, method of controlling the same, and computer readable storage medium
KR102558437B1 (en) Method For Processing of Question and answer and electronic device supporting the same
US9354702B2 (en) Manipulation of virtual object in augmented reality via thought
US9535499B2 (en) Method and display apparatus for providing content
WO2017172954A1 (en) Content collection navigation and autoforwarding
US8909636B2 (en) Lifestyle collecting apparatus, user interface device, and lifestyle collecting method
KR20160054392A (en) Electronic apparatus and operation method of the same
US10755487B1 (en) Techniques for using perception profiles with augmented reality systems
CN113383295A (en) Biofeedback methods to adjust digital content to elicit greater pupil radius response
US11934643B2 (en) Analyzing augmented reality content item usage data
CN109982124A (en) User's scene intelligent analysis method, device and storage medium
CN110568930B (en) Method for calibrating fixation point and related equipment
CN114648354A (en) Advertisement evaluation method and system based on eye movement tracking and emotional state
KR101585083B1 (en) Cloud system of smart devices-based spatial information centric for senior and method for providing contents using it
US11204949B1 (en) Systems, devices, and methods for content selection
KR102457247B1 (en) Electronic device for processing image and method for controlling thereof
CN113764099A (en) Psychological state analysis method, device, equipment and medium based on artificial intelligence
US20200226012A1 (en) File system manipulation using machine learning
KR20140002238A (en) Method of measuring sensitivity based on user feedback and storage media storing the same
CN105450868B (en) Harmful light prompting method and device
Lee et al. Real‐Time Mobile Emotional Content Player Using Smartphone Camera‐Based PPG Measurement

Legal Events

Date Code Title Description
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20181213

Year of fee payment: 6