US20170169296A1 - Intelligence system and method thereof

Intelligence system and method thereof

Info

Publication number
US20170169296A1
Authority
US
United States
Prior art keywords
module
image
key characteristic
intelligence system
predetermined space
Prior art date
Legal status
Abandoned
Application number
US15/353,636
Inventor
Jung-Ya Hsieh
Kuei-Yuan Chen
Ming-Te Lin
Current Assignee
Squarex Inc
Original Assignee
Squarex Inc
Priority date
Filing date
Publication date
Application filed by Squarex Inc
Priority to US15/353,636
Assigned to SQUAREX INC. (Assignors: CHEN, KUEI-YUAN; HSIEH, JUNG-YA; LIN, MING-TE)
Publication of US20170169296A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/105 Controlling the light source in response to determined parameters
    • H05B 47/11 Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • G06K 9/00718
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47G HOUSEHOLD OR TABLE EQUIPMENT
    • A47G 1/00 Mirrors; Picture frames or the like, e.g. provided with heating, lighting or ventilating means
    • A47G 1/02 Mirrors used as equipment
    • G06K 9/00335
    • G06K 9/00744
    • G06K 9/00758
    • G06K 9/66
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/48 Matching video sequences
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10 Character recognition
    • G06V 30/19 Recognition using electronic means
    • G06V 30/192 Recognition using electronic means using simultaneous comparisons or correlations of the image signals with a plurality of references
    • G06V 30/194 References adjustable by an adaptive method, e.g. learning
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 1/00 Details of transducers, loudspeakers or microphones
    • H04R 1/02 Casings; Cabinets; Supports therefor; Mountings therein
    • H04R 1/028 Casings; Cabinets; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 29/00 Monitoring arrangements; Testing arrangements
    • H04R 29/001 Monitoring arrangements; Testing arrangements for loudspeakers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 3/00 Circuits for transducers, loudspeakers or microphones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 3/00 Circuits for transducers, loudspeakers or microphones
    • H04R 3/12 Circuits for transducers, loudspeakers or microphones for distributing signals to two or more loudspeakers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 5/00 Stereophonic arrangements
    • H04R 5/02 Spatial or constructional arrangements of loudspeakers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 5/00 Stereophonic arrangements
    • H04R 5/04 Circuit arrangements, e.g. for selective connection of amplifier inputs/outputs to loudspeakers, for loudspeaker detection, or for adaptation of settings to personal preferences or hearing impairments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S 7/30 Control circuits for electronic adaptation of the sound field
    • H04S 7/308 Electronic adaptation dependent on speaker or headphone connection
    • G06K 2009/00738
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/44 Event detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 2420/00 Details of connection covered by H04R, not provided for in its groups
    • H04R 2420/01 Input selection or mixing for amplifiers or loudspeakers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 2420/00 Details of connection covered by H04R, not provided for in its groups
    • H04R 2420/05 Detection of connection of loudspeakers or headphones to amplifiers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 2430/00 Signal processing covered by H04R, not provided for in its groups
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 Control techniques providing energy savings, e.g. smart controller or presence detection


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Acoustics & Sound (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Quality & Reliability (AREA)
  • Manufacturing & Machinery (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Circuit For Audible Band Transducer (AREA)
  • Selective Calling Equipment (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • Mirrors, Picture Frames, Photograph Stands, And Related Fastening Devices (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

Disclosed are an intelligence system and a method for providing a message and/or service. This intelligence system is configured and used in a predetermined space, and comprises an event recognition module, an event analysis module and a service-information providing module. The event recognition module comprises an image capturing unit and an image recognition unit. The image capturing unit captures an image within the predetermined space, and the image recognition unit recognizes a key characteristic of the image. The event analysis module compares the key characteristic with a key characteristic-behavior chart, a key characteristic-relative information chart, a key characteristic-situation chart, a key characteristic-emotion chart or a key characteristic-style chart, and then generates a comparison result. The service-information providing module provides at least a message or a service according to the comparison result.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The instant disclosure relates to an intelligence system; in particular, to an intelligence system configured and used in a predetermined space.
  • 2. Description of Related Art
  • In the field of interior design, an accurate model of a fully designed living space, including furniture and home appliances, is a masterpiece of the host or of a designer. Modern home appliances are no longer bulky; however, there is no uniform size for home appliances manufactured by different companies, so designers often choose pieces according to their experience or the host's preferences. In addition, different home appliances use different communication protocols, and there is no single platform for a user to control all of them. Thus, an intelligence system composed of system units of the same size that can communicate with different home appliances is needed.
  • SUMMARY OF THE INVENTION
  • The instant disclosure provides an intelligence system. This intelligence system is configured and used in a predetermined space, and comprises an event recognition module, an event analysis module and a service-information providing module. The event recognition module comprises an image capturing unit and an image recognition unit. The image capturing unit captures an image within the predetermined space, and the image recognition unit recognizes a key characteristic of the image. The event analysis module compares the key characteristic with a key characteristic-behavior chart, a key characteristic-relative information chart, a key characteristic-situation chart, a key characteristic-emotion chart or a key characteristic-style chart, and then generates a comparison result. The service-information providing module provides at least a message or a service according to the comparison result.
  • The instant disclosure further provides an intelligence system. This intelligence system is configured and used in a predetermined space, and comprises a control module. The control module comprises a smart image module and a smart speaker module. The smart image module comprises a plurality of smart image units to show images, and the smart speaker module comprises a plurality of smart speaker units to transmit an audio signal. In addition, the smart image module is panel-shaped.
  • The instant disclosure further provides a method for providing a message and/or service adapted to an intelligence system, wherein the intelligence system is configured and used in a predetermined space. The method comprises: capturing an image within the predetermined space by an image capturing unit; determining whether a user enters the predetermined space; predicting a user behavior happening in the predetermined space; and providing a message or a service according to a prediction result.
  • The instant disclosure further provides a method for providing a message and/or service adapted to an intelligence system, wherein the intelligence system is configured and used in a predetermined space. The method comprises: determining whether an event is triggered; capturing an image within the predetermined space, and recognizing a key characteristic of the image; and providing a message or a service according to the key characteristic.
  • To sum up, the intelligence system provided by the instant disclosure is composed of a plurality of system units, which can reduce the cost and the complexity of assembly. In addition, the appearance of the intelligence system provided by the instant disclosure is a large plane, and thus it can easily fit a room and provide a diversity of room designs for a user. Moreover, the intelligence system provided by the instant disclosure can recognize a key characteristic of the user to predict the user behavior, and thus can provide a useful message and a cozy environment for the user.
  • For further understanding of the instant disclosure, reference is made to the following detailed description illustrating the embodiments of the instant disclosure. The description is only for illustrating the instant disclosure, not for limiting the scope of the claim.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
  • FIG. 1 shows a schematic diagram of an intelligence system of one embodiment of the instant disclosure.
  • FIG. 2A shows a schematic diagram of a key characteristic-behavior chart of one embodiment of the instant disclosure.
  • FIG. 2B shows a schematic diagram of a key characteristic-relative information chart of one embodiment of the instant disclosure.
  • FIG. 2C shows a schematic diagram of a key characteristic-situation chart of one embodiment of the instant disclosure.
  • FIG. 2D shows a schematic diagram of a key characteristic-emotion chart of one embodiment of the instant disclosure.
  • FIG. 2E shows a schematic diagram of a key characteristic-style chart of one embodiment of the instant disclosure.
  • FIG. 3 is a schematic diagram showing that an intelligence system of one embodiment of the instant disclosure is configured and used in a predetermined space.
  • FIG. 4 shows a schematic diagram of an intelligence system of another embodiment of the instant disclosure.
  • FIG. 5 is a schematic diagram showing a system unit of the intelligence system shown in FIG. 3.
  • FIG. 6 is a schematic diagram showing that the system units shown in FIG. 3 form a large plane.
  • FIG. 7 shows a schematic diagram of an intelligence system of still another embodiment of the instant disclosure.
  • FIG. 8 shows a flow chart of a method for providing a message and/or service adapted to an intelligence system of one embodiment of the instant disclosure.
  • FIG. 9 shows another flow chart of a method for providing a message and/or service adapted to an intelligence system of one embodiment of the instant disclosure.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • The aforementioned illustrations and following detailed descriptions are exemplary for the purpose of further explaining the scope of the instant disclosure. Other objectives and advantages related to the instant disclosure will be illustrated in the subsequent descriptions and appended drawings.
  • It will be understood that, although the terms first, second, third, and the like, may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element, region or section discussed below could be termed a second element, region or section without departing from the teachings of the instant disclosure. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
  • There is at least one embodiment provided in the following description to illustrate but not to restrict the intelligence system provided by the instant disclosure.
  • [One Embodiment of the Intelligence System]
  • Referring to FIGS. 1˜2E, FIG. 1 shows a schematic diagram of an intelligence system of one embodiment of the instant disclosure, FIG. 2A shows a schematic diagram of a key characteristic-behavior chart of one embodiment of the instant disclosure, FIG. 2B shows a schematic diagram of a key characteristic-relative information chart of one embodiment of the instant disclosure, FIG. 2C shows a schematic diagram of a key characteristic-situation chart of one embodiment of the instant disclosure, FIG. 2D shows a schematic diagram of a key characteristic-emotion chart of one embodiment of the instant disclosure, and FIG. 2E shows a schematic diagram of a key characteristic-style chart of one embodiment of the instant disclosure.
  • In this embodiment, the intelligence system 1 is used and configured in a predetermined space A1. For example, the predetermined space A1 can be a living room, a kitchen, a bedroom, a study room, a conference room and the like. The intelligence system 1 comprises a database 11, an event recognition module 12, an event analysis module 13 and a service-information providing module 14.
  • The event recognition module 12 comprises an image capturing unit 121 and an image recognition unit 122. The image capturing unit 121 captures an image within the predetermined space A1, and transmits the image to the image recognition unit 122. The image recognition unit 122 recognizes a key characteristic in the image, and then captures part of the key characteristic as a key characteristic image. The image and the key characteristic image are both stored in the database 11.
  • In this embodiment, when there is a difference among the images captured by the image capturing unit 121, an event is triggered. In other words, the articles, the movements of the articles and the illuminance in the relevant frames of images are all recorded as an event.
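  • The event-triggering step described above can be illustrated with a short sketch. The following Python example is only an illustrative sketch, not the disclosed implementation: it triggers an event when enough pixels change between two captured frames, and the frame representation, thresholds and function names are assumptions made for the example.

```python
# Illustrative sketch (not the patent's implementation): trigger an "event" when
# consecutive frames captured in the predetermined space differ noticeably.
# Frames are modelled as 2-D lists of grey-level integers; names are hypothetical.

def frame_difference_ratio(prev_frame, curr_frame, pixel_threshold=25):
    """Fraction of pixels whose grey level changed by more than pixel_threshold."""
    changed = total = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(p - c) > pixel_threshold:
                changed += 1
    return changed / total if total else 0.0

def is_event_triggered(prev_frame, curr_frame, ratio_threshold=0.05):
    """An event is triggered when enough of the image has changed."""
    return frame_difference_ratio(prev_frame, curr_frame) > ratio_threshold

if __name__ == "__main__":
    empty_room = [[10, 10, 10], [10, 10, 10]]
    person_enters = [[10, 200, 10], [10, 180, 10]]
    print(is_event_triggered(empty_room, person_enters))   # True
```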
  • In this embodiment, the database 11 can be a remote database or a local database. For example, the data stored in the database 11 can be information in any field, such as financial information, political information, legal information, international trade information, or information relevant to movies, video games or images. In addition, the above information can be acquired for free or for a fee.
  • The intelligence system 1 further comprises a communication module (not shown), and this communication module can be a wired communication module or a wireless communication module. The intelligence system 1 is connected to an external network or the database 11 through this communication module. For example, if the database is a local database, the intelligence system 1 can be connected to an external network or a remote database through this communication module.
  • The event analysis module 13 compares the key characteristic with a key characteristic-behavior chart, a key characteristic-relative information chart, a key characteristic-situation chart, a key characteristic-emotion chart or a key characteristic-style chart to generate a comparison result. In this embodiment, the service-information providing module 14 is a display screen, which is configured as a system unit of the intelligence system 1. The event recognition module 12 and the event analysis module 13 are configured as another system unit of the intelligence system 1. It is worth noting that the system unit comprising the service-information providing module 14 and the system unit comprising the event recognition module 12 and the event analysis module 13 have the same size. In this embodiment, the system unit comprising the service-information providing module 14 is 12″×12″.
  • The key characteristic-behavior chart records different behavior modes corresponding to each key characteristic. For example, there are several possible behavior modes corresponding to the key characteristic that “a user U is sitting on the sofa” as follows. One behavior mode may refer to watching TV, another behavior mode may refer to surfing the Internet, still another may refer to working, and the other behavior mode may refer to sitting idle. It should be noted that different behavior modes recorded in the key characteristic-behavior chart are given various weights. The event analysis module 13 calculates and then determines which behavior mode a user is likely to have by using algorithms from the fields of machine learning and deep learning, such as the Deep Belief Network (DBN), Artificial Neural Network (ANN), Deep Neural Network (DNN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN) and the like. Thus, the key characteristic-behavior chart can be continually updated and recorded. When the user's data is recorded by the intelligence system 1, the event analysis module 13 of the intelligence system 1 can generate a user preference chart (not shown) by collecting each key characteristic of the user, behavior modes and scenarios. This user preference chart is stored in the database 11 as a reference for predicting the user behavior. In addition, the event analysis module 13 can periodically record and predict the user's health according to key characteristics, such as the user's expressions, body temperature and figure. In this embodiment, different key characteristics may correspond to the same behavior mode. Thus, different weights may be given to the same behavior mode.
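  • As an illustration of how such a weighted chart could be consulted and updated, the following sketch stores a key characteristic-behavior chart as a dictionary of weights, picks the most heavily weighted behavior mode, and reinforces a mode when it is later confirmed. The chart contents, learning rate and function names are hypothetical and are not taken from the instant disclosure.

```python
# Illustrative sketch: a key characteristic-behavior chart as a weighted mapping.
# The most likely behavior mode is the one with the largest weight, and weights
# are nudged upward whenever a predicted mode is confirmed by the user's actual
# behavior. Chart contents and the learning rate are hypothetical examples.

behavior_chart = {
    "sitting on the sofa": {"watching TV": 0.5, "surfing the Internet": 0.2,
                            "working": 0.2, "sitting idle": 0.1},
    "holding a mobile phone": {"surfing the Internet": 0.6, "playing a game": 0.4},
}

def predict_behavior(key_characteristic):
    modes = behavior_chart[key_characteristic]
    return max(modes, key=modes.get)

def update_chart(key_characteristic, observed_mode, learning_rate=0.1):
    """Reinforce the observed mode and renormalize the weights."""
    modes = behavior_chart[key_characteristic]
    modes[observed_mode] = modes.get(observed_mode, 0.0) + learning_rate
    total = sum(modes.values())
    for mode in modes:
        modes[mode] /= total

print(predict_behavior("sitting on the sofa"))      # watching TV
update_chart("sitting on the sofa", "working")      # the chart adapts over time
```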
  • The key characteristic-relative information chart records different information corresponding to various key characteristics. For instance, when a user U takes his mobile phone and enters the predetermined space A1 and the image recognition unit 122 recognizes the game application used by the user as a key characteristic, the event analysis module 13 collects the information relevant to the game application through the external network or the database 11, and the service-information providing module 14 provides players’ comments, the players’ discussion forum or game tips to the user. Moreover, the event analysis module 13 collects the information relevant to the players’ discussion forum, game tips and in-app purchases, and updates the weights recorded in the key characteristic-relative information chart according to the content required by the user.
  • The key characteristic-situation chart records scenarios corresponding to different key characteristics. As shown in FIG. 2C, if the computer is recognized as the key characteristic, the corresponding scenario may be a working scenario or a recreation scenario, and these scenarios are given different weights. The intelligence system 1 provides the information relevant to the working scenario or the recreation scenario to the user U according to the comparison result generated by the event analysis module 13. For example, if the event analysis module 13 determines that the scenario corresponding to the recognized key characteristic is the working scenario, the intelligence system 1 provides relevant profiles, websites or conference call software to the user according to the user's needs for work.
  • When the user enters the predetermined space A1, the image capturing unit 121 captures an image of the user and transmits this image to the image recognition unit 122. The image recognition unit 122 recognizes the image and obtains key characteristics of the image, such as the user's appearance, gender, hair style, outfit, body temperature, expression, personal belongings and even the position where the user is standing. The parts of the image which correspond to the various key characteristics are taken as key characteristic images of the different key characteristics, and are all stored in the database 11. The event analysis module 13 analyzes these key characteristics by comparing them with the above described key characteristic-behavior chart, key characteristic-relative information chart or key characteristic-situation chart, and generates a comparison result. It should be noted that, in addition to the key characteristic-behavior chart, the key characteristic-relative information chart or the key characteristic-situation chart, the time information, geographical information, weather, air temperature, moisture and environmental events (such as natural disasters or man-made hazards) can also be considered by the event analysis module 13 to predict the user's needs. In other embodiments, the intelligence system 1 can work with other smart home appliances (not shown). For example, the intelligence system 1 can provide a control message to the smart home appliances, such as a smart LED lamp, a smart refrigerator, a smart air-conditioner and the like, according to the comparison result.
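  • The following sketch illustrates, under assumed scenario names and appliance commands, how a comparison result could be turned into control messages for cooperating smart home appliances; the instant disclosure does not fix any particular command set or protocol, so the rule table below is purely hypothetical.

```python
# Illustrative sketch: turning a comparison result into control messages for
# cooperating smart home appliances. The scenario names, appliance commands and
# the simple rule table are hypothetical assumptions for this example.

def control_messages(comparison_result):
    """Map a predicted scenario plus context to appliance commands."""
    scenario = comparison_result["scenario"]
    ambient_lux = comparison_result.get("ambient_lux", 300)
    messages = []
    if scenario == "working":
        messages.append(("smart LED lamp", "set_brightness", 80))
        messages.append(("smart air-conditioner", "set_temperature_c", 24))
    elif scenario == "recreation":
        messages.append(("smart LED lamp", "set_brightness", 40))
    if ambient_lux < 100:                      # dark room regardless of scenario
        messages.append(("smart LED lamp", "power", "on"))
    return messages

print(control_messages({"scenario": "working", "ambient_lux": 50}))
```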
  • The image capturing unit 121 can be a common camera module, an infrared camera module, a 3D camera module, an ultraviolet camera module and the like. In addition, the event recognition module 12 may have other sensors, such as a microphone, a 1D acceleration sensor, a 3D acceleration sensor, a gyroscope, a G-sensor and the like, to obtain each kind of sensing signal within the predetermined space A1 for further analysis.
  • The event analysis module 13 can recognize and analyze the key characteristic of the static articles within the predetermined space A1. Specifically speaking, the intelligence system 1, which is configured within the predetermined space A1, captures an image of the predetermined space A1, and recognizes the key characteristic of each static article within the predetermined space A1. For instance, the predetermined space A1 can be a living room. The event analysis module 13 of the intelligence system 1 recognizes the key characteristic of each static article within the predetermined space A1, such as the style or the color of the sofa, the coffee table, the carpet, the telephone and the like, and stores the recognized key characteristic as the predetermined data of the predetermined space A1. According to the room design of the predetermined space A1, the intelligence system 1 can recognize the key characteristics obtained within the predetermined space A1 to further analyze the user's preferences. Moreover, when the articles within the predetermined space A1 are moved to other places, taken away or there is an article brought into the predetermined space A1, the intelligence system 1 will provide relevant information to the user. For example, when the event analysis module 13 of the intelligence system 1 recognizes that one article has been moved to another place, the service-information providing module 14 provides relevant information to the user, such as the time when the article was moved to another place, the person who took the article to the other place, the original position of the article, the original spatial distribution, the current spatial distribution or the current article usage rate of the room. In addition, the intelligence system 1 can provide information relevant to the room design of the predetermined space A1, such as the styles of the room design, information relevant to the articles in the room, and the prices of the articles in the room.
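  • A minimal sketch of the article-tracking behavior described above is given below; the article names, positions and report wording are hypothetical examples rather than the disclosed implementation.

```python
# Illustrative sketch: keep a baseline of the static articles recognized in the
# predetermined space and report which articles were moved, removed or newly
# brought in. Article names and (row, column) positions are hypothetical.

baseline = {"sofa": (1, 1), "coffee table": (2, 1), "telephone": (2, 2)}

def compare_inventory(baseline, current):
    report = []
    for article, position in baseline.items():
        if article not in current:
            report.append(f"{article} was taken out of the room (was at {position})")
        elif current[article] != position:
            report.append(f"{article} moved from {position} to {current[article]}")
    for article in current:
        if article not in baseline:
            report.append(f"{article} was brought into the room at {current[article]}")
    return report

current = {"sofa": (1, 1), "telephone": (3, 2), "laptop": (2, 1)}
for line in compare_inventory(baseline, current):
    print(line)
```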
  • In this embodiment, the intelligence system 1 can predict the user behavior according to the user's expressions (as shown in FIG. 2D) or the user's outfits (as shown in FIG. 2E), or can provide messages or services according to the user's personal style.
  • [Another Embodiment of the Intelligence System]
  • Referring to FIGS. 3˜6, FIG. 3 is a schematic diagram showing that an intelligence system of one embodiment of the instant disclosure is configured and used in a predetermined space, FIG. 4 shows a schematic diagram of an intelligence system of another embodiment of the instant disclosure, FIG. 5 is a schematic diagram showing a system unit of the intelligence system shown in FIG. 3, and FIG. 6 is a schematic diagram showing that the system units shown in FIG. 3 form a large plane.
  • The intelligence system 2 shown in FIG. 3 has functions similar to the functions that the intelligence system 1 shown in FIG. 1 has. The intelligence system 2 comprises a database 20, a smart image module 21, a first smart speaker module 22, a second smart speaker module 23, a third smart speaker module 24, a control module 25, a smart air-conditioner 26, a fragrance providing module 27 and a plurality of general function modules 28.
  • In this embodiment, the smart image module 21, the first smart speaker module 22, the second smart speaker module 23, the third smart speaker module 24, the control module 25, the smart air-conditioner 26 and the fragrance providing module 27 can be all considered smart function modules.
  • The smart image module 21 comprises nine smart image units, and these nine smart image units have the same size. The first smart speaker module 22 and the second smart speaker module 23 are respectively composed of two smart speaker units 221 and 231 which have the same size. The third smart speaker module 24, the control module 25, the smart air-conditioner 26, the fragrance providing module 27 and the general function modules 28 are all system units having the same size but different functions. The sizes of the third smart speaker module 24, the control module 25, the smart air-conditioner 26, the fragrance providing module 27, the general function modules 28, the smart image unit 211 and the smart speaker units 221 and 231 are all 12″×12″. In this embodiment, the control module 25 is configured within the predetermined space A2. In other embodiments, the control module 25 can be configured at a remote server (not shown) connected to the database 20. The control module 25 can be implemented by hardware, software, firmware or a combination thereof to provide the functions required by the user.
  • The general function module 28 can be used to accommodate things or can be used as decoration. In other words, the general function module 28 can have no electronic device. Thus, when designing the room, the user can also use the general function module 28 to fill a vacancy in the intelligence system 2.
  • The smart image units 211, the smart speaker units 221 and 231, the third smart speaker module 24, the control module 25, the smart air-conditioner 26, the fragrance providing module 27 and the general function module 28 all have the structure of the system unit 40 shown in FIG. 5. The system unit 40 in FIG. 5 has a square plane structure 401 and an engaging structure 402, wherein the engaging structure 402 is configured behind the square plane structure 401. Two system units 40 can be engaged by their engaging structures 402. As shown in FIG. 6, four system units 40 can be combined as a square plane.
  • The intelligence system 2 is configured on the wall within the predetermined space A2. There is a plurality of slide rails (not shown) configured behind the intelligence system 2. The smart image module 21, the first smart speaker module 22, the second smart speaker module 23, the third smart speaker module 24, the control module 25, the smart air-conditioner 26, the fragrance providing module 27 and the general function module 28 can be sequentially configured on the slide rails, such that these modules are all configured on a large plane. The intelligence system 2 can further comprise a smart lamp module (not shown). According to the user's need, the smart lamp module is configured on the wall or at the ceiling within the predetermined space A2. In addition, the intelligence system 2 further comprises a plurality of intelligent power modules (not shown). An intelligent power module is configured behind each system unit (including the smart image module 21, the first smart speaker module 22, the second smart speaker module 23, the third smart speaker module 24, the control module 25, the smart air-conditioner 26 and the fragrance providing module 27) to supply a driving power.
  • In this embodiment, each smart image unit 211 has a display screen. One smart image unit 211 can independently display images, or some smart image units 211 can be combined to have a larger display region to display data. In addition, each smart image unit 211 has an image capturing unit 2110, which can be controlled by the control module 25.
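  • The grouping of smart image units into a larger display region can be sketched as follows, assuming the nine 12″×12″ units are addressed as a 3×3 grid; the grid coordinates and the helper function are illustrative assumptions only.

```python
# Illustrative sketch: the nine 12"x12" smart image units modelled as a 3x3 grid,
# with a helper that groups a rectangular block of adjacent units into one larger
# logical display region. The coordinate scheme and naming are hypothetical.

GRID_ROWS, GRID_COLS = 3, 3
UNIT_INCHES = 12

def display_region(top, left, rows, cols):
    """Return the unit coordinates and overall size of a combined display region."""
    if top + rows > GRID_ROWS or left + cols > GRID_COLS:
        raise ValueError("region does not fit the 3x3 smart image module")
    units = [(r, c) for r in range(top, top + rows) for c in range(left, left + cols)]
    size = (rows * UNIT_INCHES, cols * UNIT_INCHES)   # height, width in inches
    return {"units": units, "size_inches": size}

# A 2x2 block in the upper-left corner acts as a single 24" x 24" display.
print(display_region(0, 0, 2, 2))
```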
  • In this embodiment, the first smart speaker module 22 and the second smart speaker module 23 are respectively composed of two smart speaker units 221 and 231, and these two smart speaker units 221 and 231 have the same size and the same function. For example, the first smart speaker module 22 can be the speaker of the left channel and the second smart speaker module 23 can be the speaker of the right channel. The third smart speaker module 24 can be a heavy bass speaker. The first smart speaker module 22, the second smart speaker module 23 and the third smart speaker module 24 are controlled by the control module 25 to play music or audio messages.
  • The control module 25 can further comprise a communication module (not shown) to access data stored in the database 20 through a wired network or a wireless network. The control module 25 has an event recognition module 251 and an event analysis module 252. The event recognition module 251 has various kinds of sensors or image capturing units. In addition to the sensors or image capturing units of the event recognition module 251, the image capturing units of the intelligence system can also be used, such as the image capturing unit 2110 of the smart image module 21. The sensors of the event recognition module 251 can be a microphone, a 1D acceleration sensor, a 3D acceleration sensor, a gyroscope, a G-sensor and the like. It is worth noting that these sensors can be configured within the predetermined space A2 or outside the predetermined space A2.
  • The event recognition module 251 can capture at least one key characteristic relevant to the user using each kind of sensor. The event analysis module 252 analyzes the key characteristic relevant to the user to predict the user behavior. For example, the event analysis module 252 analyzes the key characteristic by comparing the user's key characteristic with the key characteristic-behavior chart, the key characteristic-relative information chart or the key characteristic-situation chart to generate a comparison result. The user behavior can be predicted according to this comparison result, which includes the behavior mode and the relevant information corresponding to the key characteristic.
  • The event analysis module 252 can calculate the possible behavior mode of the user and the relevant information corresponding to the key characteristic by using machine learning and deep learning algorithms, such as the Deep Belief Network (DBN), Artificial Neural Network (ANN), Deep Neural Network (DNN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN) and the like. In addition, the key characteristic-behavior chart, the key characteristic-relative information chart and the key characteristic-situation chart can be updated and recorded from time to time.
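  • As a purely illustrative example of the kind of computation such a module could perform, the following sketch scores behavior modes from a numeric feature vector with a small feed-forward network; the architecture, the feature encoding and the random weights are assumptions and do not represent any trained model of the instant disclosure.

```python
# Illustrative sketch: a minimal feed-forward network that scores behavior modes
# from a numeric key-characteristic feature vector. Random, untrained weights are
# used only to show the shape of the computation; everything here is hypothetical.

import numpy as np

rng = np.random.default_rng(0)
BEHAVIOR_MODES = ["watching TV", "surfing the Internet", "working", "sitting idle"]

# Feature vector: [is_sitting_on_sofa, holds_remote, holds_laptop, evening_time]
W1 = rng.normal(size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, len(BEHAVIOR_MODES)))
b2 = np.zeros(len(BEHAVIOR_MODES))

def predict(features):
    hidden = np.tanh(features @ W1 + b1)          # hidden layer
    logits = hidden @ W2 + b2
    probs = np.exp(logits - logits.max())         # softmax over behavior modes
    probs /= probs.sum()
    return BEHAVIOR_MODES[int(np.argmax(probs))], probs

mode, probs = predict(np.array([1.0, 1.0, 0.0, 1.0]))
print(mode, np.round(probs, 2))
```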
  • According to the comparison result, the control module 25 controls the smart image module 21 to display images of a television or a website and also to display the operating image of each kind of software. Also, the control module 25 can control the first smart speaker module 22, the second smart speaker module 23 and the third smart speaker module 24 to play audio data. In addition, the control module 25 can control the fragrance providing module 27 to spread fragrance, and can control the smart air-conditioner 26 to adjust the air temperature, the moisture and the air quality within the predetermined space A2. Moreover, the control module 25 can control the smart lamp module (not shown) to adjust the light emitted into the predetermined space A2. Briefly, by recognizing the key characteristic relevant to the user, the control module 25 can provide all of the messages and information needed by the user as well as the user's preferred environment.
  • [Another Embodiment of the Intelligence System]
  • Referring to FIG. 7, FIG. 7 shows a schematic diagram of an intelligence system of still another embodiment of the instant disclosure. In this embodiment, the intelligence system 3 is configured within the predetermined space A3. The intelligence system 3 comprises a first intelligent sub-system S31, a second intelligent sub-system S32, a third intelligent sub-system S33, a fourth intelligent sub-system S34, a fifth intelligent sub-system S35 and a sixth intelligent sub-system S36. The first intelligent sub-system S31 is configured within a first region A31 of the predetermined space A3, the second intelligent sub-system S32 is configured within a second region A32 of the predetermined space A3, the third intelligent sub-system S33 is configured within a third region A33 of the predetermined space A3, the fourth intelligent sub-system S34 is configured within a fourth region A34 of the predetermined space A3, the fifth intelligent sub-system S35 is configured within a fifth region A35 of the predetermined space A3, and the sixth intelligent sub-system S36 is configured within a sixth region A36 of the predetermined space A3.
  • For example, the predetermined space A3 can be a home. The first region A31 is a living room, the second region A32 is a study room, the third region A33 is a first bedroom, the fourth region A34 is a second bedroom, the fifth region A35 is a room for working out, and the sixth region A36 is composed of a kitchen and a dining room. FIG. 7 is only for illustrating an example of the predetermined space A3, so the details in FIG. 7 are omitted.
  • The structures and the functions of the first intelligent sub-system S31, the second intelligent sub-system S32, the third intelligent sub-system S33, the fourth intelligent sub-system S34, the fifth intelligent sub-system S35 and the sixth intelligent sub-system S36 are similar to the structures and the functions of the intelligence system 2 in the last embodiment.
  • The intelligence system 3 further comprises a remote server RS to connect the first intelligent sub-system S31, the second intelligent sub-system S32, the third intelligent sub-system S33, the fourth intelligent sub-system S34, the fifth intelligent sub-system S35 and the sixth intelligent sub-system S36 together. The remote server RS plays the role of the control module 25 in the previous embodiment, and can receive the images of the regions captured by the first intelligent sub-system S31, the second intelligent sub-system S32, the third intelligent sub-system S33, the fourth intelligent sub-system S34, the fifth intelligent sub-system S35 and the sixth intelligent sub-system S36, so as to recognize the user's key characteristic and then to predict the user behavior.
  • In addition, the first intelligent sub-system S31, the second intelligent sub-system S32, the third intelligent sub-system S33, the fourth intelligent sub-system S34, the fifth intelligent sub-system S35 and the sixth intelligent sub-system S36 all have the image capturing unit (not shown in FIG. 7) as described in the last embodiment to capture images of different regions of the predetermined space A3. The first intelligent sub-system S31, the second intelligent sub-system S32, the third intelligent sub-system S33, the fourth intelligent sub-system S34, the fifth intelligent sub-system S35 and the sixth intelligent sub-system S36 are configured within different regions of the predetermined space A3, so according to the images captured within the same time interval but within different regions of the predetermined space A3, the position of the user can be obtained. Different intelligent sub-systems S31˜S36 can recognize the user's key characteristic within various regions of the predetermined space A3 to further predict and record the user's behavior modes in different regions.
  • For example, when the moving path of a user is a first path P1 shown in FIG. 7, the intelligence system 3 can determine that the user enters the first region A31 of the predetermined space within the time interval from 19:00 to 19:20 according to the first intelligent sub-system S31. According to the second moving trace P2 of the user within the time interval from 19:30 to 21:00 obtained by the first intelligent sub-system S31, the fourth intelligent sub-system S34 and the fifth intelligent sub-system S35, the intelligence system 3 can determine that the second moving trace P2 of the user within the time interval from 19:30 to 21:00 is from the first region A31 (the living room), to the fourth region A34 (second bedroom) and then to the fifth region A35 (the room for working out). According to the third moving trace P3 of the user within the time interval from 21:00 to 21:20 obtained by the fifth intelligent sub-system S35 and the sixth intelligent sub-system S36, the intelligence system 3 can determine that the third moving trace P3 of the user within the time interval from 21:00 to 21:20 is from the fifth region A35 (the room for working out) to the sixth region A36 (the kitchen). According to the fourth moving trace P4 of the user within the time interval from 21:20 to 22:30 obtained by the sixth intelligent sub-system S36 and the first intelligent sub-system S31, the intelligence system 3 can determine that the fourth moving trace P4 of the user within the time interval from 21:20 to 22:30 is from the sixth region A36 (the kitchen) to the first region A31 (the living room). In other words, the intelligence system 3 can determine the behavior modes of different users within different regions and different time intervals, so as to further provide different information to different users in various regions, such as the information relevant to TV programs, movies, financial news, sports news and the like.
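  • The reconstruction of such a moving trace from the detections of the intelligent sub-systems S31˜S36 can be sketched as follows; the detection data, region labels and time format are hypothetical examples and not part of the disclosed implementation.

```python
# Illustrative sketch: reconstruct a user's moving trace from timestamped region
# detections reported by the intelligent sub-systems S31~S36. All data below are
# hypothetical examples loosely following the FIG. 7 discussion.

detections = [
    ("19:00", "A31 living room"), ("19:20", "A31 living room"),
    ("19:30", "A31 living room"), ("20:10", "A34 second bedroom"),
    ("21:00", "A35 workout room"), ("21:20", "A36 kitchen"),
    ("22:30", "A31 living room"),
]

def moving_trace(detections):
    """Collapse consecutive detections in the same region into one trace segment."""
    trace = []
    for time, region in detections:
        if trace and trace[-1]["region"] == region:
            trace[-1]["until"] = time           # extend the current segment
        else:
            trace.append({"region": region, "since": time, "until": time})
    return trace

for segment in moving_trace(detections):
    print(f'{segment["since"]}-{segment["until"]}: {segment["region"]}')
```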
  • Briefly, the intelligence system 3 can obtain the position of the user and also can predict the user's movement. Additionally, by combining the intelligent sub-systems S31˜S36 configured within different regions A31˜A36, the intelligence system 3 can predict the user's behavior. The intelligence system 3 can not only predict the position of the user, but can also predict the moving path of the user over time.
  • [One Embodiment of the Method for Providing a Message and/or Service Adapted to an Intelligence System]
  • Referring to FIG. 8, FIG. 8 shows a flow chart of a method for providing a message and/or service adapted to an intelligence system of one embodiment of the instant disclosure. It should be noted that, the method for providing a message and/or service provided in this embodiment can be adapted to the intelligence system 2 described in the above embodiment, so the following illustration is based on the structure of the intelligence system 2.
  • The method for providing a message and/or service provided in this embodiment at least comprises the following steps. Step S500 is to determine whether an event has happened; if not, step S500 is repeatedly executed, and if so, the method proceeds to step S510. Step S510 is to capture an image within the predetermined space and to recognize a key characteristic from the captured image. After that, the method proceeds to step S520, and step S520 is to provide a service or a message according to the key characteristic.
  • In step S500, the image capturing unit 2110 of the intelligence system 2 continually detects whether an event has happened within the predetermined space. When there is a difference among the images captured by the image capturing unit 2110, an event is triggered. In other words, the articles, the movements of the articles and the illuminance in the relevant frames of the images are all recorded as an event. The event recognition module 251 of the intelligence system 2 can recognize at least one key characteristic from the image captured by the image capturing unit 2110. In addition, the intelligence system 2 can recognize and analyze the key characteristic for different events triggered by static articles or people. The image captured by the image capturing unit 2110 is transmitted to the control module 25.
  • In step S510, the control module 25 of the intelligence system 2 controls the event recognition module 251 to recognize the key characteristic for the event recorded in step S500. Also, the control module 25 controls the image recognition unit to capture part of the key characteristic as a key characteristic image, and the image and the key characteristic image are stored in the database 20.
  • In step S520, the event analysis module 252 compares the key characteristic with the key characteristic-behavior chart, the key characteristic-relative information chart or the key characteristic-situation chart and generates a prediction result to analyze and predict the user behavior. In addition, according to the prediction result, the control module 25 controls the smart image module 21, the first smart speaker module 22, the second smart speaker module 23, the third smart speaker module 24, the smart air-conditioner module 26 or the fragrance providing module 27 to provide the message, data or environment that a user may need.
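  • The flow of FIG. 8 (steps S500, S510 and S520) can be sketched as a simple polling loop, shown below with stand-in functions for the capture, recognition and service modules; the function names and wiring are assumptions made only for illustration.

```python
# Illustrative sketch of the FIG. 8 flow (steps S500, S510, S520) as a polling
# loop. The capture, recognition and service callables are stand-ins for the
# modules of the intelligence system 2; their names are hypothetical.

import time

def run_intelligence_loop(capture_image, event_happened, recognize_key_characteristic,
                          provide_service, poll_seconds=1.0, max_iterations=10):
    previous = capture_image()
    for _ in range(max_iterations):                       # bounded for the example
        current = capture_image()                         # step S500: watch the space
        if event_happened(previous, current):
            key = recognize_key_characteristic(current)   # step S510
            provide_service(key)                          # step S520
        previous = current
        time.sleep(poll_seconds)

# Example wiring with trivial stand-in functions:
frames = iter([[0, 0], [0, 0], [9, 9], [9, 9]] + [[9, 9]] * 10)
run_intelligence_loop(
    capture_image=lambda: next(frames),
    event_happened=lambda prev, curr: prev != curr,
    recognize_key_characteristic=lambda img: "user entered",
    provide_service=lambda key: print("service for:", key),
    poll_seconds=0.0,
    max_iterations=5,
)
```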
  • [Another Embodiment of the Method for Providing a Message and/or Service Adapted to an Intelligence System]
  • Referring to FIG. 9, FIG. 9 shows another flow chart of a method for providing a message and/or service adapted to an intelligence system of one embodiment of the instant disclosure. It should be noted that, the method for providing a message and/or service provided in this embodiment can be adapted to the intelligence system 2 described in the above embodiment, so the following illustration is based on the structure of the intelligence system 2.
  • The method for providing a message and/or service provided in this embodiment at least comprises the following steps. In step S600, the image capturing unit captures an image of the predetermined space A2. In step S610, it is determined whether a user enters the predetermined space A2; if so, the method proceeds to step S620, and if not, it returns to step S600. In step S620, the possible user behavior within the predetermined space is predicted. In step S630, a message or a service is provided according to the prediction result.
  • In step S600, the image capturing unit 2110 of the intelligence system 2 captures an image of the predetermined space A2. In step S610, the image captured by the image capturing unit 2110 is used to determine whether there is a person entering the predetermined space A2. When there is a person entering the predetermined space A2, the key characteristic relevant to this person will be recognized and analyzed. Specifically, the image captured by the image capturing unit 2110 is transmitted to the control module 25 for recognition and analysis.
  • In step S620, the control module 25 of the intelligence system 2 recognizes and analyzes the key characteristic relevant to the user of an authenticated mobile device or the person who enters the predetermined space A2. Specifically speaking, the event recognition module 251 recognizes the key characteristic, and the user behavior is predicted according to the key characteristic-behavior chart, the key characteristic-relative information chart or the key characteristic-situation chart. In this embodiment, the user behavior is predicted by using machine learning and deep learning algorithms, such as the Deep Belief Network (DBN), Artificial Neural Network (ANN), Deep Neural Network (DNN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN) and the like.
  • In step S630, the event analysis module 252 of the control module 25 compares the key characteristic recognized in step S620 with a key characteristic-behavior chart, a key characteristic-relative information chart or a key characteristic-situation chart to generate a prediction result. After that, the control module 25 controls the smart image module 21, the first smart speaker module 22, the second smart speaker module 23, the third smart speaker module 24, the smart air-conditioner 26 or the fragrance providing module 27 to provide the message, data or environment that a user may need.
  • To sum up, the intelligence system provided by the instant disclosure is composed of a plurality of system units, which can reduce the cost and the complexity of assembly. In addition, the appearance of the intelligence system provided by the instant disclosure is a large plane, and thus it can easily fit a room and provide a diversity of room designs for a user. Moreover, the intelligence system provided by the instant disclosure can recognize a key characteristic of the user to predict the user behavior, and thus can provide a useful message and a cozy environment for the user.
  • The descriptions illustrated supra set forth simply the preferred embodiments of the instant disclosure; however, the characteristics of the instant disclosure are by no means restricted thereto. All changes, alterations, or modifications conveniently considered by those skilled in the art are deemed to be encompassed within the scope of the instant disclosure delineated by the following claims.

Claims (19)

What is claimed is:
1. An intelligence system, configured and used in a predetermined space, comprising:
an event recognition module, comprising:
an image capturing unit, capturing an image within the predetermined space; and
an image recognition unit, recognizing a key characteristic of the image;
an event analysis module, comparing the key characteristic with a key characteristic-behavior chart, a key characteristic-relative information chart, a key characteristic-situation chart, a key characteristic-emotion chart or a key characteristic-style chart, and generating a comparison result; and
a service-information providing module, providing at least a message or a service according to the comparison result.
2. The intelligence system according to claim 1, wherein the image recognition unit captures part of the key characteristic as a key characteristic image, the image, the key characteristic image and the comparison result are stored in a database, and the key characteristic-behavior chart, the key characteristic-relative information chart and the key characteristic-situation chart are updated according to the comparison result.
3. The intelligence system according to claim 1, wherein the image captured within the predetermined space is the image of all of the predetermined space.
4. The intelligence system according to claim 3, wherein the intelligence system is configured on a wall in the predetermined space.
5. The intelligence system according to claim 1, wherein the service-information providing module is independently configured as a system unit, and the event recognition module and the event analysis module are configured as another system unit.
6. The intelligence system according to claim 5, wherein the system unit comprising the service-information providing module and the system unit comprising the event recognition module and the event analysis module have the same size.
7. An intelligence system, configured and used in a predetermined space, comprising:
a control module, comprising:
a smart image module, comprising a plurality of smart image units to show images; and
a smart speaker module, comprising a plurality of smart speaker units to transmit an audio signal;
wherein the smart image module is panel-shaped.
8. The intelligence system according to claim 7, wherein the smart image units and the smart speaker units have the same size.
9. The intelligence system according to claim 7, wherein the smart image module and the smart speaker module are configured in the same plane.
10. The intelligence system according to claim 7, wherein the control module is configured within the predetermined space or at a remote server.
11. The intelligence system according to claim 7, wherein the smart image module and the smart speaker module respectively have an intelligent power module.
12. The intelligence system according to claim 7, further comprising a general function module, wherein the control module, the smart image module, the smart speaker module and the general function module are configured in the same plane.
13. A method for providing a message and/or service adapted to an intelligence system, wherein the intelligence system is configured and used in a predetermined space, the method comprising:
capturing an image within the predetermined space by an image capturing unit;
determining whether a user enters the predetermined space;
predicting a user behavior happening in the predetermined space; and
providing a message or a service according to a prediction result.
14. The method according to claim 13, wherein the step of predicting the user behavior happening in the predetermined space further comprises:
predicting the user behavior according to a key characteristic-behavior chart, a key characteristic-relative information chart, a key characteristic-situation chart, a key characteristic-emotion chart or a key characteristic-style chart, and generating the prediction result.
15. The method according to claim 14, wherein the intelligence system further comprises a smart image module and a smart speaker module, wherein the intelligence system provides the message by the smart image module or the smart speaker module according to the prediction result.
16. A method for providing a message and/or service adapted to an intelligence system, wherein the intelligence system is configured and used in a predetermined space, the method comprising:
determining whether an event is triggered;
capturing an image within the predetermined space, and recognizing a key characteristic of the image; and
providing a message or a service according to the key characteristic.
17. The method according to claim 16, wherein the intelligence system further comprises an event recognition module configured for recognizing whether the event is triggered.
18. The method according to claim 17, wherein the intelligence system further comprises an event analysis module configured for analyzing the key characteristic to generate a prediction result.
19. The method according to claim 18, wherein the intelligence system further comprises a smart image module, and the intelligence system provides the message or the service by the smart image module according to the prediction result.
US15/353,636 2015-12-11 2016-11-16 Intelligence system and method thereof Abandoned US20170169296A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/353,636 US20170169296A1 (en) 2015-12-11 2016-11-16 Intelligence system and method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562266005P 2015-12-11 2015-12-11
US15/353,636 US20170169296A1 (en) 2015-12-11 2016-11-16 Intelligence system and method thereof

Publications (1)

Publication Number Publication Date
US20170169296A1 true US20170169296A1 (en) 2017-06-15

Family

ID=57391790

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/353,636 Abandoned US20170169296A1 (en) 2015-12-11 2016-11-16 Intelligence system and method thereof
US15/372,398 Expired - Fee Related US9984292B2 (en) 2015-12-11 2016-12-08 Speaker device and speaker control method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/372,398 Expired - Fee Related US9984292B2 (en) 2015-12-11 2016-12-08 Speaker device and speaker control method

Country Status (5)

Country Link
US (2) US20170169296A1 (en)
EP (1) EP3203414A3 (en)
JP (1) JP2017107547A (en)
CN (4) CN107085695A (en)
TW (4) TW201721473A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108712311A (en) * 2018-07-06 2018-10-26 李树金 A kind of intelligent domestic system
US10558863B2 (en) 2017-07-19 2020-02-11 Pegatron Corporation Video surveillance system and video surveillance method

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201721473A (en) * 2015-12-11 2017-06-16 富奇想股份有限公司 Intelligent system
US10310803B2 (en) * 2016-09-02 2019-06-04 Bose Corporation Systems and methods for controlling a modular speaker system
TWI666895B (en) * 2017-02-08 2019-07-21 財團法人工業技術研究院 Connection management method for mobile device group
JP6733820B2 (en) * 2017-07-11 2020-08-05 東芝三菱電機産業システム株式会社 Computer update test support device
CN107360508A (en) * 2017-08-24 2017-11-17 蒋志斌 Intelligent music play system
TWI666594B (en) * 2017-09-01 2019-07-21 潘品睿 Indoor object management system and indoor object management method
WO2019079818A1 (en) * 2017-10-22 2019-04-25 Todd Martin System and method for image recognition registration of an athlete in a sporting event
CN108143191A (en) * 2017-12-25 2018-06-12 广东尚高科技有限公司 A kind of intelligent cosmetic mirror
US10158960B1 (en) * 2018-03-08 2018-12-18 Roku, Inc. Dynamic multi-speaker optimization
TWI672595B (en) * 2018-04-09 2019-09-21 宏碁股份有限公司 Monitering method and electronic device using the same
CN108606453A (en) * 2018-04-19 2018-10-02 郑蒂 A kind of intelligent cosmetic mirror
JP7320746B2 (en) * 2018-09-20 2023-08-04 パナソニックIpマネジメント株式会社 Control unit and unit system
CN112804914A (en) * 2018-09-21 2021-05-14 上海诺基亚贝尔股份有限公司 Mirror
TW202021377A (en) * 2018-11-23 2020-06-01 廣達電腦股份有限公司 Environmental detection system and sound control method using the same
TWI675958B (en) 2018-11-23 2019-11-01 仁寶電腦工業股份有限公司 Space adjustment system and control method thereof
WO2020191755A1 (en) * 2019-03-28 2020-10-01 李修球 Implementation method for smart home and smart device
TWI735250B (en) * 2020-06-04 2021-08-01 薩柏科技有限公司 Expandable wall-mounted air purifier
CN115191801A (en) * 2022-06-24 2022-10-18 深圳拓邦股份有限公司 Intelligent mirror and intelligent mirror system

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CH690875A5 (en) * 1996-05-21 2001-02-15 Hts High Technology Systems Ag Home and building automation system.
US7123728B2 (en) * 2001-08-15 2006-10-17 Apple Computer, Inc. Speaker equalization tool
US20030135552A1 (en) * 2002-01-14 2003-07-17 Blackstock Michael A. Method for discovering and discriminating devices on local collaborative networks to facilitate collaboration among users
FR2837945B1 (en) * 2002-03-28 2005-04-08 Celec Conception Electronique RANGE OF PRODUCTS CONFIGURABLE TO THE INSTALLATION, CONFIGURATION TOOL AND METHOD FOR CONFIGURING SUCH PRODUCTS
US20030185400A1 (en) * 2002-03-29 2003-10-02 Hitachi, Ltd. Sound processing unit, sound processing system, audio output unit and display device
US7298871B2 (en) * 2002-06-07 2007-11-20 Koninklijke Philips Electronics N.V. System and method for adapting the ambience of a local environment according to the location and personal preferences of people in the local environment
KR100611993B1 (en) * 2004-11-18 2006-08-11 삼성전자주식회사 Apparatus and method for setting speaker mode automatically in multi-channel speaker system
US8880205B2 (en) * 2004-12-30 2014-11-04 Mondo Systems, Inc. Integrated multimedia signal processing system using centralized processing of signals
JP4829563B2 (en) * 2005-08-03 2011-12-07 キヤノン株式会社 Control method and control apparatus
US8325931B2 (en) * 2008-05-02 2012-12-04 Bose Corporation Detecting a loudspeaker configuration
US8063698B2 (en) * 2008-05-02 2011-11-22 Bose Corporation Bypassing amplification
CN101420371B (en) * 2008-07-03 2010-12-01 江苏华丽网络工程有限公司 Dynamic function supporting method and system for ASIC fusion network device
JP5223595B2 (en) * 2008-10-29 2013-06-26 ヤマハ株式会社 Audio processing circuit and audio processing method
EP2494417B1 (en) * 2009-09-23 2022-04-06 Schneider Electric Buildings, LLC Digital control manager
EP2486736B1 (en) * 2009-10-05 2022-04-13 Harman International Industries, Incorporated Multichannel audio system having audio channel compensation
FR2967003B1 (en) * 2010-11-03 2014-03-14 Focal Jmlab MULTI-CHANNEL ACOUSTIC SPEAKER
US20120148075A1 (en) * 2010-12-08 2012-06-14 Creative Technology Ltd Method for optimizing reproduction of audio signals from an apparatus for audio reproduction
CN202536469U (en) * 2011-12-26 2012-11-21 台州市福斯特科技电子有限公司 Cosmetic mirror
CN103428607A (en) * 2012-05-25 2013-12-04 华为技术有限公司 Audio signal playing system and electronic device
KR101317047B1 (en) * 2012-07-23 2013-10-11 충남대학교산학협력단 Emotion recognition appatus using facial expression and method for controlling thereof
WO2014024442A1 (en) * 2012-08-07 2014-02-13 パナソニック株式会社 Coordination processing execution method and coordination processing execution system
JP5989505B2 (en) * 2012-10-29 2016-09-07 シャープ株式会社 Message management apparatus, message presentation apparatus, message presentation system, message management apparatus, message presentation apparatus control method, control program, and recording medium
CN103941653B (en) * 2013-01-21 2017-10-31 广东美的制冷设备有限公司 The master and slave control method of intelligent appliance and system
CN104102141B (en) * 2013-09-09 2017-01-04 珠海优特电力科技股份有限公司 A kind of SMART COOKWARE and the intelligence collaboration working method of stove and device
CN203761573U (en) * 2013-11-26 2014-08-06 广州天逸电子有限公司 Electronic acoustic splitter
CN104766456A (en) * 2014-01-06 2015-07-08 上海本星电子科技有限公司 Intelligent interactive system with portable intelligent device as control center
CN103824099A (en) * 2014-03-03 2014-05-28 季攀 Method and system for connecting and controlling equipment automatically by intelligent terminal via identification tags
CN104602398B (en) * 2014-11-28 2017-11-03 常州市武进区半导体照明应用技术研究院 Fitting room Lighting Control Assembly and method
US9747573B2 (en) * 2015-03-23 2017-08-29 Avatar Merger Sub II, LLC Emotion recognition for workforce analytics
CN204667136U (en) * 2015-06-01 2015-09-23 中国人民解放军装甲兵工程学院 Multirobot cooperative control system
CN105116859B (en) * 2015-08-21 2018-01-16 杨珊珊 A kind of intelligent domestic system realized using unmanned vehicle and method
TW201721473A (en) * 2015-12-11 2017-06-16 富奇想股份有限公司 Intelligent system

Also Published As

Publication number Publication date
EP3203414A2 (en) 2017-08-09
US20170171686A1 (en) 2017-06-15
CN107046752A (en) 2017-08-15
TWI623829B (en) 2018-05-11
EP3203414A3 (en) 2017-12-06
CN107085695A (en) 2017-08-22
TW201722174A (en) 2017-06-16
CN106970596A (en) 2017-07-21
TW201721473A (en) 2017-06-16
US9984292B2 (en) 2018-05-29
TW201720345A (en) 2017-06-16
CN106878875A (en) 2017-06-20
JP2017107547A (en) 2017-06-15
TW201721318A (en) 2017-06-16

Similar Documents

Publication Publication Date Title
US20170169296A1 (en) Intelligence system and method thereof
US11721107B2 (en) Systems and methods for locating image data for selected regions of interest
US20220237948A1 (en) Systems and Methods of Detecting and Responding to a Visitor to a Smart Home Environment
US20220247978A1 (en) Systems and Methods of Detecting and Responding to a Visitor to a Smart Home Environment
US20210209348A1 (en) Computer vision system
US10721527B2 (en) Device setting adjustment based on content recognition
KR102354952B1 (en) System and method for output display generation based on ambient conditions
CN105229629B (en) For estimating the method to the user interest of media content, electronic equipment and medium
US20210279475A1 (en) Computer vision systems
CN103760968B (en) Method and device for selecting display contents of digital signage
CN112166350B (en) System and method for ultrasonic sensing in smart devices
CN102346898A (en) Automatic customized advertisement generation system
KR101671760B1 (en) Set-top box, photographing apparatus for providing context-awareness services on the basis of multi-modal information to thereby learn and enhance user interface and user experience and method and computer-readable recording medium using the same
US20180206725A1 (en) Devices and systems for collective impact on mental states of multiple users
CN109754316A (en) Products Show method, Products Show system and storage medium
WO2018208365A1 (en) Methods and systems for presenting image data for detected regions of interest
WO2019209358A1 (en) Systems and methods of power-management on smart devices
KR20160136555A (en) Set-top box for obtaining user information by using multi-modal information, server for managing user information obtainied from set-top box and method and computer-readable recording medium using the same
KR102144554B1 (en) personalized living service apparatus and system for providing personalized living services
KR20160093013A (en) Identification of an appliance user
CN116300491B (en) Control method, device, equipment and medium based on intelligent wearable equipment
Thyagaraju et al. Interactive democratic group preference algorithm for interactive context aware TV
KR20160111777A (en) Set-top box, photographing apparatus for providing context-awareness services on the basis of multi-modal information to thereby learn and enhance user interface and user experience and method and computer-readable recording medium using the same
CN104869131B (en) Multiple electronic device interactive approaches and its system based on user's purpose

Legal Events

Date Code Title Description
AS Assignment

Owner name: SQUAREX INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HSIEH, JUNG-YA;CHEN, KUEI-YUAN;LIN, MING-TE;REEL/FRAME:040478/0765

Effective date: 20161114

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION