GB2587351A - A system for displaying content information - Google Patents

A system for displaying content information

Info

Publication number
GB2587351A
GB2587351A GB1913693.6A GB201913693A GB2587351A GB 2587351 A GB2587351 A GB 2587351A GB 201913693 A GB201913693 A GB 201913693A GB 2587351 A GB2587351 A GB 2587351A
Authority
GB
United Kingdom
Prior art keywords
master device
content information
subsidiary
user
user motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB1913693.6A
Other versions
GB201913693D0 (en)
Inventor
Jean Patrice Bernard Thomas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to GB1913693.6A priority Critical patent/GB2587351A/en
Publication of GB201913693D0 publication Critical patent/GB201913693D0/en
Publication of GB2587351A publication Critical patent/GB2587351A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2356/00Detection of the display position w.r.t. other display screens

Abstract

A system for displaying content information comprising a plurality of subsidiary devices (1a, 1b, 1c, 1d, 1e) and a master device (2). Each subsidiary device (1a, 1b, 1c, 1d, 1e) and the master device (2) comprises a display for displaying a content information item received from a content provision apparatus (3) in response to determining a predetermined user motion, a receiver configured to receive the content information item from the content provision apparatus (3) and a sensor configured to sense user motion. Each subsidiary device (1a, 1b, 1c, 1d, 1e) further comprises a transmitter to transmit a signal to the master device (2) in response to the sensor sensing user motion. The master device (2) further comprises a communication unit to receive the signal transmitted from each subsidiary device (1a, 1b, 1c, 1d, 1e). The master device (2) further comprises a determination unit configured to determine the relative location of the subsidiary devices (1a, 1b, 1c, 1d, 1e) based on the received signals and to determine whether the sensed user motion is the predetermined user motion (4).

Description

A System for Displaying Content Information

Technical Background

[0001] The invention relates to a system for displaying content information and a method for displaying content information.

Background

[0002] Currently, existing searches of content information are typically performed using basic linear lists. Additional lists can be provided to allow more advanced searching (e.g. searching by a content information category, such as genre). However, even when such additional lists are provided, the principle by which the lists are searched is unchanged. In other words, existing searching of content information relies on selection of a content information item from a linear list of content information items. Existing searching of content information also has the disadvantages of not allowing grouping of content information items or organisation of content information items in multiple dimensions; multiple lists of content information items typically cannot coexist due to the lack of available space. As a result, current searches of content information are inefficient and cumbersome. Taking the example of Netflix, the average worldwide Netflix user spends a significant amount of time looking for something to watch.
[0003] The present application describes an invention that enables creation of advanced lists of content information items using tangible interfaces. Subsidiary devices, such as small e-paper displays, represent selectable items or choices of content information. The invention enables the user to group, organise and exploit the relationships between the selectable content information items, providing a more efficient way of performing advanced searches of content information than current methods.
Summary of the Invention
According to a first aspect of the invention, there is provided a system for displaying content information comprising a plurality of subsidiary devices and a master device. Each subsidiary device and the master device comprises a display for displaying a content information item received from a content provision apparatus in response to determining a predetermined user motion, a receiver configured to receive the content information item from the content provision apparatus and a sensor configured to sense user motion. Each subsidiary device further comprises a transmitter to transmit a signal to the master device in response to the sensor sensing user motion. The master device further comprises a communication unit to receive the signal transmitted from each subsidiary device. The master device further comprises a determination unit configured to determine the relative location of the subsidiary devices based on the received signals and to determine whether the sensed user motion is the predetermined user motion.
Each of the plurality of subsidiary devices and the master device may further comprise a user interface configured to receive user input.
The system may power on in response to one of the user interfaces receiving a first user input.
The content information category displayed by each of the plurality of subsidiary devices and by the master device may be updated in response to a user motion being determined as the predetermined user motion.
The content information item displayed by one of the plurality of subsidiary devices or the master device may be updated in response to the corresponding user interface receiving a second user input.
The transmitter of each of the plurality of subsidiary devices and the communication unit of the master device may be configured to transmit a signal to the content provision apparatus in response to a third user input being received by the user interface, indicating selection of the content information item by the user, to cause the content provision apparatus to open a content corresponding to the selected content information item.
The predetermined user motion may be a contact or non-contact user motion.

The user input may be a gesture.

According to a second aspect of the invention, there is provided a method for displaying content information comprising sensing user motion by a sensor of a plurality of subsidiary devices and a master device, transmitting a signal to the master device by a transmitter of each subsidiary device in response to the sensor sensing user motion, receiving the signal transmitted from each subsidiary device by a communication unit of the master device, determining the relative location of the subsidiary devices based on the received signals and determining whether the sensed user motion is a predetermined user motion by a determination unit of the master device, and receiving a content information item from a content provision apparatus and displaying the received content information item by, respectively, a receiver and a display of each of the plurality of subsidiary devices and the master device in response to the sensed user motion being determined as the predetermined user motion.
Brief Description of the Drawings
[0004] For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example, to the accompanying diagrammatic drawings, in which:

[0005] Figure 1A shows an example embodiment of a system for displaying content information in accordance with the present invention;

[0006] Figure 1B shows the system of Figure 1A following user input to update the content information category; and

[0007] Figure 2 shows an example embodiment of a method for displaying content information in accordance with the present invention.
Detailed Description
[0008] Figure 1A shows an example embodiment of a system for displaying content information in accordance with the present invention. As shown in Figure 1A, the system comprises a plurality of subsidiary devices (1a, 1b, 1c, 1d, 1e), a master device (2) and an apparatus for providing content information, termed a content provision apparatus (3).
[0009] The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) are typically laid out on a flat surface. Alternatively, for example, the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be attached to a vertical surface. Either way, the user can easily view each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) simultaneously. Figure 1A shows the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) laid out in a substantially circular arrangement. However, the arrangement in which the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) can be laid out is not limited to a particular geometry. Figure 1A shows the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) spaced apart, but the devices may instead be in contact with each other. The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be configured to position themselves into a specific arrangement. The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be considered to be, for example, electronic post-it notes.
[0010] The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be configured to recognise each other as belonging to the same group of devices that forms the system for displaying content. For example, they may recognise each other as belonging to the same system by being brought into contact with each other, wherein they each include a sensor for sensing a master device (2) or a subsidiary device (1a, 1b, 1c, 1d, 1e) that is in contact. Similarly, the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be selectable by a user, for example, in response to detection of a user touch, as being part of the same group.
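Purely as an illustration of the grouping step described above, and not as the claimed implementation, the contact-based recognition could be modelled as follows. The `DeviceRegistry` and `ContactEvent` names, the device identifiers and the contact-event format are all assumptions introduced for this sketch.

```python
from dataclasses import dataclass, field

@dataclass
class ContactEvent:
    """A hypothetical record produced when two devices touch."""
    device_a: str    # e.g. "1a"
    device_b: str    # e.g. "2" (the master device)
    timestamp: float

@dataclass
class DeviceRegistry:
    """Tracks which devices have been grouped into one system."""
    master_id: str
    members: set = field(default_factory=set)

    def register_contact(self, event: ContactEvent) -> None:
        # A device joins the group once it has touched the master
        # device or any device already in the group.
        known = self.members | {self.master_id}
        if event.device_a in known:
            self.members.add(event.device_b)
        elif event.device_b in known:
            self.members.add(event.device_a)

registry = DeviceRegistry(master_id="2")
registry.register_contact(ContactEvent("1a", "2", 0.0))   # 1a touches the master
registry.register_contact(ContactEvent("1b", "1a", 1.5))  # 1b touches 1a
print(sorted(registry.members))  # ['1a', '1b']
```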
[0011] The content provision apparatus (3) may be, for example, a television, as shown in Figure 1A. In another example, the content provision apparatus (3) may be a smart fridge, which typically senses and tracks the food and drink items stored in it.
The content provision apparatus (3) can electronically communicate with each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2).
[0012] Each of the plurality of subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) are configured to receive content information from the content provision apparatus (3), as indicated by the dashed lines in Figure 1A. In the example of the content provision apparatus (3) being a television, the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may receive content information corresponding to information regarding content shown by an online platform for viewing television programmes and films. For example, the content information may be details about a film which is available to watch. The content information may contain details such as the availability of the film, a synopsis of the film, a rating of the film and details of the film cast. The content information is not restricted to any particular information regarding the content. In the example of the content provision apparatus (3) being a smart fridge, the content information may contain details regarding the amount of a particular food stored in the fridge and when that food item was first placed in the fridge.
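One possible way to represent a single content information item is sketched below. The field names follow the examples given above (availability, synopsis, rating, cast), but the data structure itself is an assumption rather than anything defined by the application.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ContentInformationItem:
    """One selectable item shown on a subsidiary or master device."""
    title: str
    category: str                  # e.g. a genre such as "A" or "Drama"
    synopsis: Optional[str] = None
    rating: Optional[float] = None
    cast: Optional[List[str]] = None
    available: bool = True

# Example: a film entry provided by a television acting as the
# content provision apparatus (values are illustrative only).
item = ContentInformationItem(
    title="Example Film",
    category="Drama",
    synopsis="A short illustrative synopsis.",
    rating=4.2,
    cast=["Actor One", "Actor Two"],
)
print(item.title, item.category)
```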
[0013] The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) have a display for displaying the content information as a content information item. The content information item may be an icon showing particular content information. The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be configured to allow a user to customise the content information shown by the content information item. Continuing with the example of the content provision apparatus (3) being a television, as shown in Figure 1A, and the content information relating to programmes and films available to watch, each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may separately display content information items corresponding to a single content (i.e. a programme or film). However, content information is only displayed by the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) when a predetermined motion of the user has been detected and recognised.
[0014] The predetermined user motion (4) may be a gesture. For example, the predetermined user motion (4) may correspond to the user moving their hand or finger in a circular motion or in a motion to trace out some other predetermined shape with their hand or finger. The predetermined user motion (4) may be stored in a storage unit of the master device (2). The subsidiary devices (1a, 1b, 1c, 1d, 1e) may also comprise a storage unit for storing the predetermined user motion (4).
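As a hedged illustration of what such a storage unit might hold, a predetermined motion such as a circular gesture could be stored as a sequence of sampled points. The template format and the `PREDETERMINED_MOTIONS` mapping are hypothetical.

```python
import math

def circle_template(num_points: int = 16) -> list:
    """A hypothetical stored representation of a circular gesture:
    a sequence of points sampled around the unit circle."""
    return [
        (math.cos(2 * math.pi * i / num_points),
         math.sin(2 * math.pi * i / num_points))
        for i in range(num_points)
    ]

# The master device's storage unit might hold several such templates,
# keyed by the action they trigger (names are assumptions).
PREDETERMINED_MOTIONS = {
    "synchronise": circle_template(),
    # "update_categories": another_template(), ...
}
print(len(PREDETERMINED_MOTIONS["synchronise"]))  # 16 sampled points
```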
[0015] Each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) has sensors in order to sense user motion. Examples of suitable sensors include cameras, recognition technology (e.g. face recognition technology), microphones, accelerometers, gyroscopes and pressure sensors. The sensors may use, for example, gesture sensing radar technology in order to sense the motion of a user. The gesture may be a contact gesture or a non-contact gesture (i.e. the predetermined user motion (4) may or may not be a gesture that involves the user touching one or more of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2)). Examples of contact gestures include single or multiple touches for a predetermined time period; movement, shaking or rotation of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2); contact between the subsidiary devices (1a, 1b, 1c, 1d, 1e) and/or the master device (2); and movement of the surface on which the devices are laid. Examples of non-contact gestures include the eye movement of the user, which may be tracked by multiple cameras, and the motion of a body part of a user. Gesture tracking may be performed using near-field communication (NFC).
[0016] Each subsidiary device (1a, 1b, 1c, 1d, 1e) comprises a transmitter for transmitting a signal to the master device (2) in response to the sensor of a subsidiary device (1a, 1b, 1c, 1d, 1e) sensing user motion. This signal may contain information about, for example, the motion of the hand of the user over the subsidiary device (1a, 1b, 1c, 1d, 1e). Such information may include temporal and spatial information. In other words, in response to detection of user motion by the sensor, a signal is sent from the subsidiary device (1a, 1b, 1c, 1d, 1e) to the master device (2) containing information regarding the time at which the motion was detected and the direction of the motion.
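The signal described above carries temporal and spatial information; one conceivable payload format is sketched below. The `MotionSignal` fields and the JSON encoding are assumptions, not a format specified by the application.

```python
import json
from dataclasses import dataclass

@dataclass
class MotionSignal:
    """Hypothetical payload sent from a subsidiary device to the master
    device when its sensor detects user motion."""
    device_id: str       # identifier of the reporting subsidiary device
    timestamp: float     # when the motion was detected (seconds)
    direction: float     # direction of motion over the device (radians)
    speed: float = 0.0   # optional magnitude of the motion

def encode(signal: MotionSignal) -> bytes:
    """Serialise the signal for transmission; JSON is an arbitrary choice."""
    return json.dumps(signal.__dict__).encode("utf-8")

print(encode(MotionSignal("1c", 12.875, 1.57)))
```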
[0017] A communication unit of the master device (2) receives each of these signals. Based on the information contained in the signals, a determination unit can then determine the relative positions of the subsidiary devices (1a, 1b, 1c, 1d, 1e) with respect to the master device (2). As a result, the determination unit can determine whether the sensed user motion corresponds to the predetermined user motion (4) by comparison of the sensed motion with the stored predetermined user motion (4). Without communication between the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) to determine their relative position, the possible arrangements of the subsidiary devices (1a, 1b, 1c, 1d, 1e) would be restricted to an arrangement pre-programmed into the system, impacting the efficiency of searching by limiting the organisation of the selectable items. If the sensed user motion corresponds to the predetermined user motion (4), a receiver of each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) receives the content information from the content provision apparatus (3), and the display of each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) displays the content information as a content information item. The process of receiving the content information from the content provision apparatus (3) and displaying the content information on the display of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) is termed synchronisation. If the sensed user motion does not correspond to the predetermined user motion (4), the receivers of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) do not receive a signal from the content provision apparatus (3) and the content information is not displayed.
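A minimal sketch of how the determination unit might use the timestamps in the received signals to infer the order in which the user's hand passed over the devices, and then test that sweep against a stored predetermined motion. The ordering-plus-time-window test is a deliberately simplified assumption; a real determination unit would compare full spatio-temporal trajectories.

```python
from typing import List, Tuple

Signal = Tuple[str, float]   # (device_id, timestamp), a simplified report

def relative_order(signals: List[Signal]) -> List[str]:
    """Order in which the devices were swept over, inferred from timestamps."""
    return [device for device, _ in sorted(signals, key=lambda s: s[1])]

def matches_predetermined(signals: List[Signal],
                          expected_order: List[str],
                          max_spread_s: float = 3.0) -> bool:
    """Crude test: the sweep must visit the devices in the expected order
    and complete within max_spread_s seconds."""
    if not signals:
        return False
    times = [t for _, t in signals]
    return (relative_order(signals) == expected_order
            and max(times) - min(times) <= max_spread_s)

# A circular sweep over devices 1a..1e reported within about 1.2 seconds:
reports = [("1a", 0.0), ("1b", 0.3), ("1c", 0.6), ("1d", 0.9), ("1e", 1.2)]
print(matches_predetermined(reports, ["1a", "1b", "1c", "1d", "1e"]))  # True
```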
[0018] In an alternative embodiment, rather than the receiver of each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) receiving the content information from the content provision apparatus (3) as described in the above paragraph, the content information may be received by the master device (2) and transmitted to each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) by a communication unit of the master device (2).
[0019] Each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may further comprise a user interface configured to receive user input. This user interface may be, for example, a touch pad. The touch pad may be involved in the sensing of the user motion for synchronisation, for example, in the case that the gesture is a contact gesture.
[0020] The system may be configured to power on in response to receiving a particular preset user input indicating that the user wants to launch the system. This user input, herein denoted the first user input, may correspond to, for example, tapping a touch pad corresponding to the user interface a set number of times, or movement of the fingers of the user across, above or adjacent to the touch pad. Similar to the predetermined user motion (4), the first user input may also be a contact or a non-contact gesture. The first user input may also be stored in the storage unit. Once the system has been powered on, the first user input may be used to power off the system.
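For example, if the first user input were defined as a fixed number of taps within a short window, it might be recognised as sketched below; the tap count and time window are arbitrary assumptions made for illustration.

```python
def is_power_toggle(tap_times: list, required_taps: int = 3,
                    window_s: float = 1.0) -> bool:
    """Return True if the user tapped the touch pad required_taps times
    within window_s seconds, a hypothetical form of the first user input."""
    if len(tap_times) < required_taps:
        return False
    recent = sorted(tap_times)[-required_taps:]
    return recent[-1] - recent[0] <= window_s

print(is_power_toggle([0.1, 0.4, 0.7]))   # True: three taps within 0.6 s
print(is_power_toggle([0.1, 0.9, 2.5]))   # False: taps too spread out
```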
[0021] Initially, each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may display on their displays content information items corresponding to the content information shown by the content provision apparatus (3), which may be grouped by content information category. For example, the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may each show a single programme or film from a particular genre, as shown in Figure 1A (each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) shows a different content information item corresponding to a different genre, these genres being labelled A, B, C, D, etc.). However, if none of the displayed programme or film genres are of interest to the user, the user may wish to see a different set of genres. The displays of each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be updated to display content information items corresponding to different content information categories. In the same way as described above in relation to the synchronisation process, the displayed content information items may be updated for each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) in response to the determination unit of the master device (2) identifying a user motion as a predetermined user motion (4). The predetermined user motion (4) may be the same as, or different from, the user motion used in the synchronisation process.
Alternatively, the displayed content information items may be updated for each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) in response to a user interface receiving a user input similar to that described for the first user input.
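A sketch of how the master device might distribute one content information item per category across the devices, and move to a different set of categories when the update is triggered. The catalogue layout, the `assign_items` function and the use of an offset are assumptions made for illustration only.

```python
from typing import Dict, List

def assign_items(catalogue: Dict[str, List[str]],
                 device_ids: List[str],
                 offset: int = 0) -> Dict[str, str]:
    """Map each device to one representative item from one category.
    offset selects which block of categories is currently shown;
    incrementing it implements the 'show a different set of genres' update."""
    categories = sorted(catalogue)
    assignment = {}
    for i, device in enumerate(device_ids):
        category = categories[(offset + i) % len(categories)]
        assignment[device] = catalogue[category][0]   # first item of that genre
    return assignment

catalogue = {"A": ["Film A1"], "B": ["Film B1"], "C": ["Film C1"],
             "D": ["Film D1"], "E": ["Film E1"], "F": ["Film F1"],
             "G": ["Film G1"]}
devices = ["1a", "1b", "1c", "1d", "1e", "2"]
print(assign_items(catalogue, devices, offset=0))  # genres A to F
print(assign_items(catalogue, devices, offset=6))  # the next set, wrapping round
```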
[0022] Figure 1B shows the displayed content information updated following the system shown in Figure 1A either determining the predetermined user motion (4) or receiving user input through the user interface. As shown in Figure 1B, the content information category of each subsidiary device (1a, 1b, 1c, 1d, 1e) and the master device (2) has been updated to show a content information item corresponding to a different genre. For instance, the master device (2) shows a content information item corresponding to genre F in Figure 1A, whereas in Figure 1B the master device (2) shows a content information item corresponding to genre L.

[0023] Similarly, while the user may be interested in the content information category corresponding to the content information item displayed by a particular subsidiary device (1a, 1b, 1c, 1d, 1e) or the master device (2), the user may not be interested in the particular content information item displayed. For example, the user may be interested in the film genre shown by a subsidiary device (1a, 1b, 1c, 1d, 1e), but the film shown by the subsidiary device (1a, 1b, 1c, 1d, 1e) may not be of interest to the user. Therefore, the user desires to see more content information corresponding to films of the particular genre shown by the subsidiary device (1a, 1b, 1c, 1d, 1e).
[0024] The content information item displayed by one of the plurality of subsidiary devices (1a, 1b, 1c, 1d, 1e) or the master device (2) is updated in response to the corresponding user interface receiving a user input. This user input, herein denoted the second user input (5), is similar to the first user input. An example of the second user input is shown in Figure 1B, in which the user is moving their finger across the user interface of a subsidiary device (1e) to change the content information item to a different content information item in genre K.

[0025] Once the user has found a content information item that they are interested in (e.g. a film they want to watch), the user can cause the content provision apparatus (3) to open the content corresponding to the content information item via use of the user interface. The transmitter of each of the plurality of subsidiary devices (1a, 1b, 1c, 1d, 1e) and the communication unit of the master device (2) are configured to transmit a signal to the content provision apparatus (3) in response to a user input (a third user input) being received by the user interface, indicating selection of the content information item by the user. Consequently, the content provision apparatus (3) opens the content corresponding to the selected content information item. The third user input may correspond to, for example, tapping a touch pad corresponding to the user interface a set number of times, or movement of the fingers of the user across, above or adjacent to the touch pad, as described for the first user input and the second user input.
[0026] In an alternative embodiment, rather than each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) being configured to transmit a signal to the content provision apparatus (3) in response to user input, the signal may be transmitted from the subsidiary device (1a, 1b, 1c, 1d, 1e) in question to the master device (2), the master device (2) then relaying the signal to the content provision apparatus (3) by the communication unit to cause the content provision apparatus (3) to provide content information to the subsidiary device (1a, 1b, 1c, 1d, 1e).
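The selection flow of the two preceding paragraphs (a signal sent either directly to the content provision apparatus or relayed through the master device) could be expressed as a small routing function. The message format and function names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SelectionMessage:
    """Hypothetical 'open this content' request triggered by the third user input."""
    device_id: str
    content_id: str

def route_selection(msg: SelectionMessage, via_master: bool,
                    master_send, apparatus_send) -> None:
    """Send the selection either straight to the content provision
    apparatus or through the master device, which relays it on."""
    if via_master:
        master_send(msg)       # master's communication unit forwards the request
    else:
        apparatus_send(msg)    # subsidiary device transmits directly

# Demonstration with print functions standing in for the radio links:
route_selection(SelectionMessage("1e", "film-42"), via_master=True,
                master_send=lambda m: print("master relays", m),
                apparatus_send=lambda m: print("direct to apparatus", m))
```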
[0027] Lighting may be used to indicate that the system is powered on. For example, the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may each comprise an LED that flashes when the system is powered on. Lighting of different colours may be used to distinguish between the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2). Similarly, lighting may be used to indicate which among the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) belong to the group of devices forming the system.
[0028] The lighting may also be used to bring a particular subsidiary device (1a, 1b, 1c, 1d, 1e) or the master device (2) to the attention of a user. Alternatively, a subsidiary device (1a, 1b, 1c, 1d, 1e) or the master device (2) may vibrate to bring the device to the attention of a user. It may be useful to bring the device to the attention of the user to alert them to the availability of content information or that the system has powered on, for example.
[0029] The system may be used by a plurality of users simultaneously.
[0030] The subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be chargeable when stacked on top of one another, such that they may be charged simultaneously by means of charging plates of each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) contacting each other and a charging station.
[0031] The system may include more than one master device (2), and the subsidiary devices (1a, 1b, 1c, 1d, 1e) and the master device (2) may be electronically labelled so that they receive content information according to their electronic label. Each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) may be configured to be operable as a master device (2). For example, each of the subsidiary devices (1a, 1b, 1c, 1d, 1e) may comprise a storage unit and a communication unit.
[0032] Figure 2 shows a method of displaying content information according to the present invention. As shown in Figure 2, the first step (S1) of the method is the sensing of user motion by the sensor of a plurality of subsidiary devices and a master device. As discussed, this motion may be a gesture. Subsequently, in the second step (S2), a signal is transmitted to the master device by the transmitter of each subsidiary device in response to the sensor sensing user motion (e.g. the gesture). In the third step (S3), the signal transmitted from each subsidiary device is received by the communication unit of the master device, the signal, for example, containing temporal and spatial information. As a result of the information provided by the signal sent from each subsidiary device to the master device, in the fourth step (S4), the relative location of the subsidiary devices can be determined by the master device based on the received signals, and the user motion can be determined. Consequently, the determination unit can determine whether the sensed user motion is a predetermined user motion. In the fifth step (S5), in response to the sensed user motion being determined as the predetermined user motion, a receiver of each of the plurality of subsidiary devices and the master device receives content information from the content provision apparatus, and the display of each of the plurality of subsidiary devices and the master device displays the received content information.
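Bringing the earlier sketches together, steps S1 to S5 could be outlined as a single routine running on the master device. This reuses the `matches_predetermined` function from the earlier sketch and is only an illustrative outline under those same assumptions, not the claimed implementation.

```python
def run_synchronisation(received_signals, expected_order,
                        fetch_content, displays) -> bool:
    """Steps S1-S5 of Figure 2, seen from the master device.

    S1/S2 (sensing and transmitting) happen on the subsidiary devices;
    received_signals is the outcome of S3 (signals gathered by the
    master's communication unit). S4 determines the relative location
    and checks the motion; S5 fetches and displays the content information.
    """
    # S4: infer relative location and compare against the predetermined motion
    if not matches_predetermined(received_signals, expected_order):
        return False                      # motion not recognised, no synchronisation

    # S5: each device receives and displays a content information item
    items = fetch_content()               # e.g. from the content provision apparatus
    for device_id, show in displays.items():
        show(items.get(device_id, ""))
    return True

# Example wiring, with print standing in for the device displays:
reports = [("1a", 0.0), ("1b", 0.3), ("1c", 0.6), ("1d", 0.9), ("1e", 1.2)]
displays = {d: (lambda text, d=d: print(d, "shows", text))
            for d in ["1a", "1b", "1c", "1d", "1e", "2"]}
ok = run_synchronisation(
    reports,
    ["1a", "1b", "1c", "1d", "1e"],
    fetch_content=lambda: {"1a": "Film A1", "1b": "Film B1", "2": "Film F1"},
    displays=displays,
)
print("synchronised:", ok)
```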
[0033] Although a few preferred embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.
[0034] At least some of the example embodiments described herein may be constructed, partially or wholly, using dedicated special-purpose hardware. Terms such as 'component', 'module' or 'unit' used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality. In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors. These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements. Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive. Throughout this specification, the term "comprising" or "comprises" means including the component(s) specified but not to the exclusion of the presence of others.
[0035] Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
[0036] All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
[0037] Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
[0038] The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (9)

  1. A system for displaying content information comprising a plurality of subsidiary devices and a master device, each subsidiary device and the master device comprising: a display for displaying a content information item received from a content provision apparatus in response to determining a predetermined user motion; a receiver configured to receive the content information item from the content provision apparatus; and a sensor configured to sense user motion, wherein each subsidiary device further comprises a transmitter to transmit a signal to the master device in response to the sensor sensing user motion, wherein the master device further comprises a communication unit to receive the signal transmitted from each subsidiary device, and wherein the master device further comprises a determination unit configured to determine the relative location of the subsidiary devices based on the received signals and to determine whether the sensed user motion is the predetermined user motion.
  2. The system of claim 1, wherein each of the plurality of subsidiary devices and the master device further comprise: a user interface configured to receive user input.
  3. The system of claims 1 or 2, wherein the system powers on in response to one of the user interfaces receiving a first user input.
  4. The system of any one of claims 1 to 3, wherein the content information category displayed by each of the plurality of subsidiary devices and by the master device is updated in response to a user motion being determined as the predetermined user motion.
  5. The system of any one of claims 2 to 4, wherein the content information item displayed by one of the plurality of subsidiary devices or the master device is updated in response to the corresponding user interface receiving a second user input.
  6. The system of any one of claims 2 to 5, wherein the transmitter of each of the plurality of subsidiary devices and the communication unit of the master device are configured to transmit a signal to the content provision apparatus in response to a third user input being received by the user interface, indicating selection of the content information item by the user, to cause the content provision apparatus to open a content corresponding to the selected content information item.
  7. The system of any one of claims 1 to 6, wherein the predetermined user motion is a contact or non-contact user motion.
  8. The system of any one of claims 2 to 7, wherein the user input is a gesture.
  9. A method for displaying content information comprising: sensing user motion by a sensor of a plurality of subsidiary devices and a master device; transmitting a signal to the master device by a transmitter of each subsidiary device in response to the sensor sensing user motion; receiving the signal transmitted from each subsidiary device by a communication unit of the master device; determining the relative location of the subsidiary devices based on the received signals and determining whether the sensed user motion is a predetermined user motion by a determination unit of the master device; and receiving a content information item from a content provision apparatus and displaying the received content information item by, respectively, a receiver and a display of each of the plurality of subsidiary devices and the master device in response to the sensed user motion being determined as the predetermined user motion.
GB1913693.6A 2019-09-23 2019-09-23 A system for displaying content information Pending GB2587351A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1913693.6A GB2587351A (en) 2019-09-23 2019-09-23 A system for displaying content information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1913693.6A GB2587351A (en) 2019-09-23 2019-09-23 A system for displaying content information

Publications (2)

Publication Number Publication Date
GB201913693D0 GB201913693D0 (en) 2019-11-06
GB2587351A true GB2587351A (en) 2021-03-31

Family

ID=68425418

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1913693.6A Pending GB2587351A (en) 2019-09-23 2019-09-23 A system for displaying content information

Country Status (1)

Country Link
GB (1) GB2587351A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114942740B (en) * 2022-07-12 2022-10-21 江苏润和软件股份有限公司 Transformer substation auxiliary equipment online real-time monitoring method and system based on artificial intelligence

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130227495A1 (en) * 2012-02-24 2013-08-29 Daniel Tobias RYDENHAG Electronic device and method of controlling a display
US20140028921A1 (en) * 2012-07-25 2014-01-30 Samsung Electronics Co., Ltd. Method and mobile terminal for displaying information, method and display device for providing information, and method and mobile terminal for generating control signal
US20190146742A1 (en) * 2017-11-15 2019-05-16 Futurewei Technologies, Inc. Providing enriched e-reading experience in multi-display environments

Also Published As

Publication number Publication date
GB201913693D0 (en) 2019-11-06
