KR20170106793A - Apparatus and method for controlling devices - Google Patents

Apparatus and method for controlling devices

Info

Publication number
KR20170106793A
Authority
KR
South Korea
Prior art keywords
information
sensory
user
humidity
preference
Prior art date
Application number
KR1020160030363A
Other languages
Korean (ko)
Inventor
양승준
안충현
서정일
장인선
최지훈
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to KR1020160030363A
Publication of KR20170106793A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/2866 Architectures; Arrangements
    • H04L67/30 Profiles
    • H04L67/306 User profiles
    • H04L67/36

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A device control apparatus and a method thereof are disclosed. The device control method includes: receiving information on image content; processing a sensory/emotional effect syntax associated with the received information; generating a sensory/emotional effect command for operating a plurality of devices using the processed syntax; and controlling the plurality of devices during reproduction according to the generated command using the user's preference and surrounding environment information. The present invention can maximize the user's experience by providing an optimized sensory/emotional effect service.

Description

APPARATUS AND METHOD FOR CONTROLLING DEVICES

The present invention relates to a device control apparatus and method, and more particularly, to an apparatus and method for controlling sensory/emotional effect devices that enhance a user's experience by using the user's preference and surrounding environment information.

Broadcasting has shifted from analog to digital, and users' demands on content have been increasing. Following HDTV and 3DTV broadcasting, the resolution of content has rapidly increased up to UHDTV, raising viewers' visual satisfaction.

In recent years, interest in realistic content consumption, represented by terms such as realistic media, augmented and diminished reality, 4D, and emotional ICT, has been increasing. Consumption is shifting from content represented mainly by video and audio toward content combined with the user's senses, emotions, and environmental factors, so that users can feel as if they were the main characters in the content.

To engage all five senses, other senses such as smell and touch must be properly stimulated in addition to conventional sight and hearing. To this end, there is growing interest in sensory effect services that, alongside ultra-high-resolution 3D video and multi-channel audio, use devices providing effects that stimulate the senses, such as lighting, temperature, wind, vibration, aroma, and motion.

ISO/IEC 23005 (MPEG-V), a representative international standard for enhancing the user experience that has been developed since 2008, consists of seven parts in total. Part 2 (Control information) defines user preference metadata and device capability metadata. Part 3 (Sensory information) defines sensory effects. Part 4 (Virtual world object characteristics) defines objects, characteristics, and avatars of virtual environments. Part 5 (Data formats for interaction devices) standardizes metadata for sensor and device commands. The MPEG-V standard aims to standardize the interface between virtual and real spaces so that the intended sensory effect can be conveyed accurately to the user even across heterogeneous devices and broadcasting environments.

Such a sensory/emotional effect service can maximize immersion, realism, and a sense of unity with the content for non-disabled people, and it is also expected to be very helpful for the education and rehabilitation of people with disabilities. However, the 4D theater, a representative service currently provided, transmits the sensory/emotional effects specified for the content equally to all users, and is therefore not suitable for personalized consumption according to each user's preferences, aversions, and environment.

The present invention provides an apparatus and method for maximizing a user's experience by providing a sensory/emotional effect service optimized for the user, by controlling sensory/emotional effect devices using the user's preference and surrounding environment information.

According to an embodiment of the present invention, there is provided a device control method comprising: receiving information on image content; processing a sensory/emotional effect syntax associated with the received information; generating a sensory/emotional effect command for operating a plurality of devices using the processed sensory/emotional effect syntax; and controlling the plurality of devices during reproduction according to the generated sensory/emotional effect command using the user's preference and surrounding environment information.

The surrounding environment information may include at least one of illuminance information, temperature information, humidity information, and discomfort index information obtained by combining the temperature information and the humidity information.

The plurality of devices may be operated based on the user's preference for the surrounding environment information.

The controlling may include: receiving at least one of illuminance information, temperature information, humidity information, and the user's preference for the illuminance, temperature, and humidity information; calculating a discomfort index using the received temperature information and humidity information; and turning the plurality of devices on or off by comparing the calculated discomfort index and the received illuminance, temperature, and humidity information with the user's preference.

The information about the video content may include at least one of a title and a duration of the video content.

The generating may include: receiving, from the plurality of devices, at least one of availability information for each device and maximum performance information for each device; and modifying the effect intensity of the generated sensory/emotional effect command using the received information.

A device control apparatus according to an embodiment of the present invention includes: a receiving unit for receiving information on image content; a processing unit for processing a sensory/emotional effect syntax associated with the received information; a generating unit for generating a sensory/emotional effect command for operating a plurality of devices using the processed sensory/emotional effect syntax; and a control unit for controlling the plurality of devices during reproduction according to the generated sensory/emotional effect command using the user's preference and surrounding environment information.

The surrounding environment information may include at least one of illuminance information, temperature information, humidity information, and discomfort index information obtained by combining the temperature information and the humidity information.

The control unit may receive at least one of the illuminance information, the temperature information, the humidity information, and the user's preference for the illuminance, temperature, and humidity information, calculate a discomfort index using the received temperature and humidity information, and turn the plurality of devices on or off by comparing the calculated discomfort index and the received illuminance, temperature, and humidity information with the user's preference.

The generating unit may receive, from the plurality of devices, at least one of availability information for each device and maximum performance information for each device, and may modify the effect intensity of the generated sensory/emotional effect command using the received information.

According to an embodiment of the present invention, a sensory/emotional effect service optimized for a user can be provided by controlling sensory/emotional effect devices using the user's preference and surrounding environment information, so that the user's experience can be maximized.

FIG. 1 is a block diagram illustrating a device control apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating, in order, a device control method according to an embodiment of the present invention.
FIG. 3 is a diagram showing an example of device control performed by the device control apparatus according to an embodiment of the present invention.

The specific structural or functional descriptions of the embodiments disclosed herein are presented merely for the purpose of describing embodiments according to the concepts of the present invention; the embodiments may be implemented in various forms and are not limited to those described herein.

Since embodiments according to the concepts of the present invention allow various modifications and may take various forms, the embodiments are illustrated in the drawings and described in detail herein. However, this is not intended to limit the embodiments to the specific forms disclosed, and the invention includes all changes, equivalents, and alternatives falling within its spirit and scope.

Terms such as first and second may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one element from another; for example, without departing from the scope of rights according to the concepts of the present invention, a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. Other expressions describing relationships between elements, such as "between" versus "immediately between" or "adjacent to" versus "directly adjacent to," should be interpreted in the same way.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, numbers, steps, operations, elements, or parts, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the relevant art, and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. However, the scope of the patent application is not limited or restricted by these embodiments. Like reference numerals in the drawings denote like elements.

The device control apparatus 100 according to an embodiment of the present invention can control whether each sensory/emotional effect device is operated according to the user's preference and surrounding environment information when the user personally consumes sensory/emotional effects. The device control apparatus 100 may select devices providing sensory/emotional effects the user wants or does not want according to the user's preference, and may also adjust their intensity. For example, a user who is easily startled, such as a person in poor health or a visually impaired person, may turn off sensory/emotional effect devices providing physical effects such as vibration or motion, or may have their intensity weakened.

In addition, the device control apparatus 100 may take the user's state into account. For example, it may reflect emotion analysis results obtained through facial expression recognition, which has recently been actively studied, or may appropriately adjust sensory/emotional effects according to the user's physiological information (blood pressure, body temperature, heart rate, etc.).

The device control apparatus 100 can also control the sensory/emotional effect devices in consideration of physical environmental factors such as temperature, humidity, and illuminance. For example, reproducing sensory/emotional effects such as hot wind or water spray in a hot and humid summer can be uncomfortable for the user.

In the present invention, a sensory/emotional effect service optimized for the user is provided by controlling the sensory/emotional effect devices based on the user's preference combined with environmental factors such as the illuminance and the discomfort index computed from temperature and humidity, so that the user's experience can be maximized.

As shown in Table 1 below, the proportion of people who feel discomfort at a given discomfort index differs slightly by nationality; however, when the discomfort index is 80 or more, at least 50% of people feel uncomfortable.

[Table 1]

(table provided as an image in the original publication)

The discomfort index (DI) can be calculated from the dry-bulb temperature (Td) and the wet-bulb temperature (Tw) according to Formula 1 below.

[Formula 1]

DI = 0.4 × (Td + Tw) + 15

Here, the temperatures are in degrees Fahrenheit (°F). Converting Formula 1 to degrees Celsius gives Formula 2 below.

[Formula 2]

DI = 0.72 × (Td + Tw) + 40.6

The discomfort index may also be calculated by rewriting Formula 2 in terms of the air temperature ta (°C) and the relative humidity RH (%), as shown in Formula 3 below.

[Formula 3]

DI = 9/5 × ta − 0.55 × (1 − RH) × (9/5 × ta − 26) + 32
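As a minimal sketch (not part of the patent text), the three formulas can be written as Python functions; the function names are assumptions, and Formula 3 is evaluated with RH taken as a fraction of 1, which matches the (1 − RH) term above.

```python
def discomfort_index_f(td_f: float, tw_f: float) -> float:
    """Formula 1: dry-bulb (Td) and wet-bulb (Tw) temperatures in degrees Fahrenheit."""
    return 0.4 * (td_f + tw_f) + 15


def discomfort_index_c(td_c: float, tw_c: float) -> float:
    """Formula 2: dry-bulb (Td) and wet-bulb (Tw) temperatures in degrees Celsius."""
    return 0.72 * (td_c + tw_c) + 40.6


def discomfort_index(ta_c: float, rh_percent: float) -> float:
    """Formula 3: air temperature ta in degrees Celsius, relative humidity RH in percent."""
    rh = rh_percent / 100.0  # the (1 - RH) term uses RH as a fraction
    return 9 / 5 * ta_c - 0.55 * (1 - rh) * (9 / 5 * ta_c - 26) + 32


# Example: 30 degrees Celsius at 70 % relative humidity gives a DI of about 81.4,
# which the description of Table 2 below places at the level where virtually
# everyone feels uncomfortable.
print(round(discomfort_index(30.0, 70.0), 1))
```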

Table 2 below shows the body sensation experienced by the user according to the discomfort index classified by the Korea Meteorological Administration.

[Table 2]

(table provided as an image in the original publication)

In other words, as shown in Table 2, when the discomfort index is between 75 and 80, about half of the population of the Republic of Korea feels uncomfortable, and when the discomfort index is 80 or more, virtually the entire population may feel uncomfortable.

Table 3 below shows the illuminance standard stipulated in Korea KS A 3011.

[Table 3]

(table provided as an image in the original publication)

For example, if the user's environment is a living room at home, the illuminance can be expected to be in the range of 60-100-150 lux (minimum-standard-maximum), and the sensory/emotional effect devices related to lighting may be controlled based on this illuminance.

As for relative humidity, the Meteorological Administration presents an appropriate humidity for each temperature. For example, in summer a high temperature is tolerable if the humidity is low, but the same temperature can feel sweltering if the humidity is high. Conversely, at low winter temperatures, dry air aggravates the discomfort of dryness, while high humidity makes the cold feel more severe.

Taking this into consideration, the Meteorological Administration recommends a humidity of about 70 RH% at 15 °C, 60 RH% at 18 to 20 °C, and 40 RH% at 24 °C. According to the indoor air quality standards of the USA, Japan, and Korea, the minimum humidity limit is set at 30 RH%; below this minimum, the air dries out the mucous membranes and takes moisture away from the eyes and skin, causing colds, dryness, and itching.

Therefore, although it depends on the season, when the temperature is kept within the 18 to 26 °C range that is suitable for people, lowering the humidity as the temperature rises may be ideal.
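As a small sketch based on the guidance quoted above (the linear interpolation between the quoted points and the use of 19 °C as a midpoint for the 18-20 °C range are illustrative assumptions, not part of the patent):

```python
# Recommended relative humidity (RH %) at given temperatures (deg C), per the guidance above.
RECOMMENDED_RH = [(15.0, 70.0), (19.0, 60.0), (24.0, 40.0)]
MIN_RH = 30.0  # indoor air-quality lower limit mentioned above


def recommended_humidity(temp_c: float) -> float:
    """Return a target RH % for a temperature, interpolating between the quoted points."""
    if temp_c <= RECOMMENDED_RH[0][0]:
        return RECOMMENDED_RH[0][1]
    if temp_c >= RECOMMENDED_RH[-1][0]:
        return max(RECOMMENDED_RH[-1][1], MIN_RH)
    for (t0, h0), (t1, h1) in zip(RECOMMENDED_RH, RECOMMENDED_RH[1:]):
        if t0 <= temp_c <= t1:
            return h0 + (h1 - h0) * (temp_c - t0) / (t1 - t0)


print(recommended_humidity(22.0))  # about 48 % RH
```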

The device control apparatus 100 according to an embodiment of the present invention can control the operation of the sensory/emotional effect devices according to the user's preference and various kinds of surrounding environment information.

FIG. 1 is a block diagram illustrating a device control apparatus according to an embodiment of the present invention.

The device control apparatus 100 according to an embodiment of the present invention may include a receiving unit 110, a processing unit 120, a generating unit 130, and a control unit 140. The receiving unit 110 can receive information about image content to be reproduced on a display such as a TV screen or a movie theater screen. The information about the image content received by the receiving unit 110 may include at least one of the title and the duration of the image content.

For example, the receiving unit 110 may correspond to a receiving module included in a TV or a set-top box (STB). That is, in order to operate the sensory/emotional effects related to the image content reproduced on the TV screen using the plurality of sensory/emotional effect devices 150, the title of the image content and a list of the sensory/emotional effect phrases associated with it may be required. In addition, since a sensory/emotional effect is intended to operate only for the duration in which a specific scene (e.g., a scene of burning lava) is reproduced, the receiving unit 110 can receive the duration information of the specific scene together with the title.

The processing unit 120 may process the sensory/emotional effect syntax associated with the information about the received image content. Here, the sensory/emotional effect syntax is text information written according to a set of rules for a sensory/emotional effect service intended to maximize the user's experience. The processing unit 120 interprets, among the plurality of sensory/emotional effect syntaxes, those corresponding to the information about the image content received by the receiving unit 110, and generates a list of the interpreted sensory/emotional effect syntaxes.

For example, a sensory/emotional effect syntax may read 'Spray water on the user's face for 5 seconds at 1 minute 10 seconds' or 'Let hot wind blow for 4 seconds at 3 minutes 30 seconds'. Such text information may be written in a language such as XML (extensible markup language) according to an embodiment of the present invention.
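The patent does not specify a concrete schema for this syntax; the snippet below is a hypothetical illustration of how the two phrases above might be encoded in XML and read back, and every element and attribute name is an assumption rather than a standardized format.

```python
import xml.etree.ElementTree as ET

# Hypothetical encoding of the two effect phrases quoted above.
# The element and attribute names are illustrative, not a standardized schema.
EFFECT_XML = """
<SensoryEffectList>
    <Effect type="water-spray" target="face" start="00:01:10" durationSeconds="5"/>
    <Effect type="hot-wind" start="00:03:30" durationSeconds="4"/>
</SensoryEffectList>
"""

effects = [
    {"type": e.get("type"), "start": e.get("start"), "duration": e.get("durationSeconds")}
    for e in ET.fromstring(EFFECT_XML).findall("Effect")
]
print(effects)
```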

The generating unit 130 may generate sensory/emotional effect commands for operating the plurality of devices using the processed sensory/emotional effect syntax. Based on the list of sensory/emotional effect syntaxes delivered by the processing unit 120, the generating unit 130 converts them, according to a predetermined protocol, into sensory/emotional effect commands matching the order in which the received image content is played.

Since the sensory/emotional effect phrases received by the generating unit 130 through the processing unit 120 are written in a human-readable language (e.g., XML), they cannot directly control the plurality of machine-based sensory/emotional effect devices 150. Accordingly, the generating unit 130 can generate the sensory/emotional effect commands by converting the human-readable sensory/emotional effect phrases into codes the machines can understand, according to a predetermined protocol.

At this time, the generating unit 130 may receive, from the plurality of sensory/emotional effect devices 150, at least one of availability information for each device and maximum performance information for each device, and may modify the effect intensity of the converted sensory/emotional effect commands using the received information. The generating unit 130 may then transmit the converted sensory/emotional effect commands to the plurality of sensory/emotional effect devices 150.

The plurality of sensory/emotional effect devices 150, operating according to the preset information, process the sensory/emotional effect commands transmitted through the generating unit 130 in the order in which the image content progresses, thereby providing the user with a sensory/emotional effect service.

The control unit 140 can control the plurality of devices 150 in operation according to the sensory/emotional effect commands using the user's preference and surrounding environment information. The surrounding environment information may include at least one of illuminance information, temperature information, humidity information, and discomfort index information that combines the temperature and humidity information.

To this end, the control unit 140 may receive illuminance, temperature, and humidity information through a plurality of sensors installed around the user. The control unit 140 then calculates the discomfort index using the received temperature and humidity information, and controls the plurality of sensory/emotional effect devices 150 by comparing the calculated discomfort index with the user's preference. A more specific control method is described with reference to the subsequent drawings.

FIG. 2 is a diagram illustrating, in order, a device control method according to an embodiment of the present invention.

In step 210, the device control apparatus 100 can receive information about the image content to be reproduced on a display such as a TV screen or a movie theater screen. The information about the image content received by the device control apparatus 100 may include at least one of the title and the duration of the image content.

For example, assuming that a scene in which a typhoon is approaching is about to be reproduced in the image content shown on the display, the device control apparatus 100 can receive the title of the typhoon-related image content and the duration information of the corresponding scene.

In step 220, the device control apparatus 100 may process the sensory/emotional effect syntax associated with the information about the received image content. The device control apparatus 100 may interpret, among the plurality of sensory/emotional effect syntaxes, those corresponding to the received information and generate a list of the interpreted sensory/emotional effect syntaxes.

For example, if the information about the received image content includes the title of the typhoon-related image content and the duration of the scene, the device control apparatus 100 interprets the sensory/emotional effect syntax related to wind and includes the interpreted syntax in the list.

In step 230, the device control apparatus 100 may generate sensory/emotional effect commands for operating the plurality of devices using the generated list of sensory/emotional effect phrases. Based on the list, the device control apparatus 100 may convert the sensory/emotional effect phrases matching the order in which the received image content is played into sensory/emotional effect commands according to a predetermined protocol.

Since the sensory/emotional effect phrases are written in a human-readable language (e.g., XML), they cannot directly control the plurality of machine-based sensory/emotional effect devices 150. Accordingly, the device control apparatus 100 can generate the sensory/emotional effect commands by converting the human-readable phrases into codes the machines can understand, according to a predetermined protocol.

At this time, the device control apparatus 100 may receive, from the plurality of sensory/emotional effect devices 150, at least one of availability information for each device and maximum performance information for each device, and can modify the effect intensity of the converted sensory/emotional effect commands using the received information.

For example, suppose that the device control apparatus 100 converts the sensory/emotional effect phrases into a sensory/emotional effect command that raises the temperature to 50 degrees according to the predetermined protocol. If, however, the maximum performance information of the temperature-related devices among the plurality of sensory/emotional effect devices 150 indicates that they can provide effects only up to 40 degrees, the device control apparatus 100 can lower the effect intensity of the sensory/emotional effect command according to that information.
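A minimal sketch of that capability check follows; the dictionary layout and field names are assumptions for illustration, not the patent's actual command format.

```python
def clamp_intensity(command: dict, capability: dict) -> dict:
    """Lower a command's target intensity to the device's reported maximum, if needed."""
    max_value = capability.get("max_intensity")
    if max_value is not None and command["intensity"] > max_value:
        return {**command, "intensity": max_value}
    return command


# The 50-degree heating command above would be lowered to the device's 40-degree maximum.
print(clamp_intensity({"effect": "heat", "intensity": 50}, {"max_intensity": 40}))
```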

In step 240, the device control apparatus 100 can control the plurality of devices 150 in operation according to the generated sensory/emotional effect commands using the user's preference and surrounding environment information. The surrounding environment information may include at least one of illuminance information, temperature information, humidity information, and discomfort index information that combines the temperature and humidity information.

The device control apparatus 100 can turn the plurality of devices 150 on or off by comparing the user's preference with the surrounding environment information.

For example, when the discomfort index calculated from the temperature and humidity information received by the device control apparatus 100 is higher than the discomfort index set in the user's preference, the execution of all the devices 150 can be stopped.

FIG. 3 is a diagram showing an example of device control performed by the device control apparatus according to an embodiment of the present invention.

The device control apparatus 100 can receive surrounding environment information such as illuminance, temperature, and humidity, and can calculate the discomfort index information by combining the temperature and humidity information. As in the following examples, the device control apparatus 100 can set in advance minimum and maximum values for illuminance, humidity, temperature, and the discomfort index according to the user's preference.

<Example 1>

1. Temperature: min = 18, max = 26

2. If the current temperature is between 18 and 26, all related devices are operated

3. If the current temperature is < 18, operation of cooling/wind-related devices is prohibited

4. If the current temperature is > 26, operation of heating-related devices is prohibited

<Example 2>

1. Humidity: min = 40, max = 60

2. If the current humidity is between 40 and 60, all related devices are operated

3. If the current humidity is < 40, operation of devices that lower the humidity (heaters, etc.) is prohibited

4. If the current humidity is > 60, operation of devices that raise the humidity (atomizers, sprayers, etc.) is prohibited

<Example 3>

1. Discomfort index: max = 70

2. If the discomfort index is below 70, all devices are operated

3. If the discomfort index is above 70, operation of devices that raise the temperature or humidity is prohibited
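Taken together, Examples 1 to 3 amount to a per-user threshold table. A minimal sketch of such a structure is shown below; the dictionary layout and the illuminance value, borrowed from the living-room figure quoted earlier, are illustrative assumptions.

```python
# User preference thresholds corresponding to Examples 1 to 3 above.
USER_PREFERENCE = {
    "temperature": {"min": 18, "max": 26},   # degrees Celsius
    "humidity": {"min": 40, "max": 60},      # relative humidity, %
    "discomfort_index": {"max": 70},
    "illuminance": {"max": 150},             # lux; example living-room value (KS A 3011)
}
```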

In step 301, the device control apparatus 100 receives the surrounding environment information such as illuminance, humidity, temperature, and discomfort index, together with the user's preference for that information. The user's preference may include minimum and maximum values for illuminance, humidity, temperature, and the discomfort index.

In step 302, the device control apparatus 100 may determine how to respond for each of the plurality of sensory/emotional effect devices 150 based on the user's preference. For example, a user who is easily startled, such as a person in poor health or a visually impaired person, may be sensitive to sensory/emotional effect devices 150 that provide physical effects such as vibration or motion. Therefore, based on the user's preference, the device control apparatus 100 can stop such devices in advance, as in step 310.

In step 303, the device control apparatus 100 may calculate the discomfort index using the humidity and temperature input in step 301.

In step 304, the device control apparatus 100 may compare the discomfort index calculated in step 303 with the discomfort index predetermined according to the user's preference to determine whether it is within a comfortable range. If the comparison indicates discomfort, the device control apparatus 100 can stop the operation of all devices, as in step 311.

In steps 305 to 306, the device control apparatus 100 can compare the humidity input in step 301 with the humidity range preset according to the user's preference. If the input humidity is at or above the preset maximum, the device control apparatus 100 can stop the operation of devices that raise the humidity (water sprayers, foam ejectors, humidifiers, etc.). Conversely, if the input humidity is at or below the preset minimum, the device control apparatus 100 can stop the operation of devices that lower the humidity (heaters, air conditioners, etc.).

In steps 307 to 308, the device control apparatus 100 can compare the temperature input in step 301 with the temperature range preset according to the user's preference. If the input temperature is at or above the preset maximum, the device control apparatus 100 may stop the operation of devices that raise the temperature (heaters, etc.), as in step 314. Conversely, if the input temperature is at or below the preset minimum, the device control apparatus 100 may stop the operation of devices that lower the temperature (fans, air conditioners, etc.).

In step 309, the device control apparatus 100 may compare the illuminance input in step 301 with the illuminance reference preset according to the user's preference. If the input illuminance is at or above the preset reference value, the device control apparatus 100 can stop the operation of lighting devices (LEDs and other lights).

The device control apparatus 100 may perform steps 301 to 309 only once during the reproduction of the image content, or may repeat them periodically as necessary. For example, the device control apparatus 100 may check changes in the surrounding environment information such as illuminance, humidity, temperature, and discomfort index at predetermined time intervals (e.g., every minute) and repeat the control accordingly.
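The decision cascade of steps 303 to 309 can be sketched as follows; the device grouping, helper names, and reading/preference layouts are illustrative assumptions based on the description above, not the patent's actual interface.

```python
def control_pass(env: dict, prefs: dict, devices: dict) -> None:
    """One pass over steps 303-309: stop device groups that would worsen comfort.

    env     - current readings, e.g. {"temperature": 27, "humidity": 65, "illuminance": 170}
    prefs   - user thresholds, e.g. the USER_PREFERENCE structure sketched earlier
    devices - mapping from an (assumed) group name to a list of controllable device objects
    """
    t, rh = env["temperature"], env["humidity"]
    di = 9 / 5 * t - 0.55 * (1 - rh / 100.0) * (9 / 5 * t - 26) + 32  # Formula 3 above

    if di > prefs["discomfort_index"]["max"]:
        for group in devices:                      # step 311: stop every effect device
            stop_group(devices, group)
        return
    if rh >= prefs["humidity"]["max"]:
        stop_group(devices, "raises_humidity")     # sprayers, humidifiers, ...
    elif rh <= prefs["humidity"]["min"]:
        stop_group(devices, "lowers_humidity")     # heaters, air conditioners, ...
    if t >= prefs["temperature"]["max"]:
        stop_group(devices, "raises_temperature")  # heaters, ...
    elif t <= prefs["temperature"]["min"]:
        stop_group(devices, "lowers_temperature")  # fans, air conditioners, ...
    if env["illuminance"] >= prefs["illuminance"]["max"]:
        stop_group(devices, "lighting")            # LEDs and other lights


def stop_group(devices: dict, group: str) -> None:
    """Turn off every device registered under the given group (assumed device API)."""
    for device in devices.get(group, []):
        device.turn_off()
```

As the preceding paragraph notes, such a pass may be run once per title or repeated at fixed intervals (for example, once a minute) as the readings change.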

In this way, the device control apparatus 100 can individually control the sensory/emotional effect devices 150 based on surrounding environment information such as illuminance, humidity, temperature, and discomfort index, and on the user's preference for that information. Accordingly, the device control apparatus 100 can provide different sensory/emotional effect services to individual users, maximizing the user experience by stimulating senses and emotions beyond sight and hearing for non-disabled users, and providing a non-visual or non-auditory user experience for users with visual or hearing impairments.

The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system, and may also access, store, manipulate, process, and generate data in response to the execution of the software. For convenience of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that it may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device independently or collectively. The software and/or data may be embodied in any type of machine, component, physical device, virtual equipment, computer storage medium, or device, or permanently or temporarily in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to it. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and available to those skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, the described techniques may be performed in an order different from the described method, and/or components of the described systems, structures, devices, and circuits may be combined or coupled in a form different from the described method, or replaced or substituted by other components or equivalents, while still achieving appropriate results.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

100: device control apparatus
110: receiving unit
120: processing unit
130: generating unit
140: control unit
150: sensory/emotional effect device

Claims (10)

A device control method comprising:
Receiving information on image content;
Processing a sensory/emotional effect syntax associated with the information on the received image content;
Generating a sensory/emotional effect command for operating a plurality of devices using the processed sensory/emotional effect syntax; and
Controlling the plurality of devices during reproduction according to the generated sensory/emotional effect command using the user's preference and surrounding environment information.
The method according to claim 1,
Wherein the surrounding environment information includes at least one of illuminance information, temperature information, humidity information, and discomfort index information that is a combination of the temperature information and the humidity information.
The method according to claim 1,
Wherein the plurality of devices are operated during reproduction based on the user's preference for the surrounding environment information.
The method according to claim 1,
Wherein the controlling comprises:
Receiving at least one of illuminance information, temperature information, humidity information, and the user's preference for the illuminance information, the temperature information, and the humidity information;
Calculating a discomfort index using the received temperature information and humidity information; and
Turning the plurality of devices on or off by comparing the calculated discomfort index and the received illuminance information, temperature information, and humidity information with the user's preference.
The method according to claim 1,
Wherein the information about the image content includes at least one of a title and a duration of the image content.
The method according to claim 1,
Wherein the generating comprises:
Receiving, from the plurality of devices, at least one of availability information for each device and maximum performance information for each device; and
Modifying the effect intensity of the generated sensory/emotional effect command using the received information.
A device control apparatus comprising:
A receiving unit for receiving information on image content;
A processing unit for processing a sensory/emotional effect syntax associated with the information on the received image content;
A generating unit for generating a sensory/emotional effect command for operating a plurality of devices using the processed sensory/emotional effect syntax; and
A control unit for controlling the plurality of devices during reproduction according to the generated sensory/emotional effect command using the user's preference and surrounding environment information.
The apparatus according to claim 7,
Wherein the surrounding environment information includes at least one of illuminance information, temperature information, humidity information, and discomfort index information obtained by combining the temperature information and the humidity information.
The apparatus according to claim 7,
Wherein the control unit receives at least one of the illuminance information, the temperature information, the humidity information, and the user's preference for the illuminance information, the temperature information, and the humidity information, calculates a discomfort index using the received temperature information and humidity information, and turns the plurality of devices on or off by comparing the calculated discomfort index and the received illuminance information, temperature information, and humidity information with the user's preference.
The apparatus according to claim 7,
Wherein the generating unit receives, from the plurality of devices, at least one of availability information for each device and maximum performance information for each device, and modifies the effect intensity of the generated sensory/emotional effect command using the received information.
KR1020160030363A 2016-03-14 2016-03-14 Apparatus and method for controlling devices KR20170106793A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160030363A KR20170106793A (en) 2016-03-14 2016-03-14 Apparatus and method for controlling devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160030363A KR20170106793A (en) 2016-03-14 2016-03-14 Apparatus and method for controlling devices

Publications (1)

Publication Number Publication Date
KR20170106793A (en) 2017-09-22

Family

ID=60034833

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160030363A KR20170106793A (en) 2016-03-14 2016-03-14 Apparatus and method for controlling devices

Country Status (1)

Country Link
KR (1) KR20170106793A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2021005757A1 (en) * 2019-07-10 2021-01-14
KR102516431B1 (en) 2022-04-14 2023-04-03 케이지케미칼 주식회사 Microbial Fertilizer for Carbon Dioxide Reduction and Preparation Method of the Same

