CN108141474B - Electronic device for sharing content with external device and method for sharing content thereof - Google Patents


Info

Publication number
CN108141474B
CN108141474B (application CN201680060049.3A)
Authority
CN
China
Prior art keywords
content
external device
information
display
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201680060049.3A
Other languages
Chinese (zh)
Other versions
CN108141474A (en)
Inventor
金泰正
金铉埈
朴基哲
宋振宇
禹泓郁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority claimed from PCT/KR2016/011600 external-priority patent/WO2017065582A1/en
Publication of CN108141474A publication Critical patent/CN108141474A/en
Application granted granted Critical
Publication of CN108141474B publication Critical patent/CN108141474B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • H04L67/1095Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/12Use of codes for handling textual entities
    • G06F40/14Tree-structured documents
    • G06F40/143Markup, e.g. Standard Generalized Markup Language [SGML] or Document Type Definition [DTD]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/951Indexing; Web crawling techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/958Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • G06F16/972Access to data in other repository systems, e.g. legacy data or dynamic Web page generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/51Discovery or management thereof, e.g. service location protocol [SLP] or web services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/75Indicating network or usage conditions on the user display
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G2370/042Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/16Use of wireless transmission of display information
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/06Remotely controlled electronic signs other than labels

Abstract

A method and apparatus for sharing content between an electronic device and an external device are provided. A web document containing a plurality of pieces of content is received. At least one piece of content executable in the external device is determined from among the plurality of pieces of content, based on the respective types of the pieces of content and function information of the external device. Information on the at least one piece of content is then transmitted to the external device.

Description

Electronic device for sharing content with external device and method for sharing content thereof
Technical Field
The present disclosure relates generally to an electronic device for sharing content and a method of sharing content, and more particularly, to an electronic device that transmits content to an external device and a method of sharing content thereof.
Background
With the development of wired and wireless communication networks, it has become possible to interconnect electronic devices having display screens that visually output data.
Accordingly, various data may be transmitted and received between electronic devices through a wired or wireless communication network. For example, a first electronic device and a second electronic device may share screens through content sharing methods such as, for example, mirroring and streaming.
As an example, according to the mirroring method, a screen of the first electronic device may be compressed and then transmitted to the second electronic device. The second electronic device may decompress and display the screen.
As another example, according to a streaming method, compressed image content in a first electronic device may be transmitted to a second electronic device. The second electronic device may decompress and display the image content.
With improvements in wired and wireless connection methods, such as, for example, cable or wireless fidelity (Wi-Fi), content sharing methods have evolved rapidly. In particular, a content sharing method is being developed to be applied to various electronic devices with screens, such as, for example, a portable computer including a notebook computer, a Personal Computer (PC), a netbook, a tablet computer, a portable terminal including a smart phone and a Personal Digital Assistant (PDA), a television, and the like.
The content sharing method may be used when a user wishes to view content through a larger screen of the second electronic device instead of a small screen of the first electronic device.
In this case, however, the user may find it inconvenient that several steps must be completed in order to share the content.
For example, when there are two second electronic devices capable of sharing content, the user must determine and select one device suitable for sharing content from the two second electronic devices.
Disclosure of Invention
Technical Solution
The present disclosure has been provided to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a content sharing method that offers intuitive usability to a user, so that first and second electronic devices can share content.
According to an embodiment of the present disclosure, there is provided a method of sharing content between an electronic device and an external device. A web document containing a plurality of pieces of content is received. At least one piece of content executable in the external device is determined from among the plurality of pieces of content based on the respective types of the pieces of content and function information of the external device. Information about the at least one piece of content is transmitted to the external device.
According to another embodiment of the present disclosure, an electronic device for sharing content with an external device is provided. The electronic device includes a communicator configured to communicate with an external device. The electronic device also includes a display configured to display a web document containing a plurality of pieces of content. The electronic device further includes: a processor configured to determine at least one piece of content executable in the external device from among the plurality of pieces of content based on the respective types of the plurality of pieces of content and the function information of the external device, and to transmit information about the at least one piece of content to the external device through the communicator.
In accordance with another embodiment of the present disclosure, an article of manufacture for sharing content between an electronic device and an external device is provided. The article of manufacture includes a non-transitory machine-readable medium containing one or more programs which, when executed, perform the steps of: receiving a web document containing a plurality of pieces of content; determining at least one piece of content executable in an external device from among the plurality of pieces of content based on the respective types of the pieces of content and function information of the external device; and transmitting information on the at least one piece of content to the external device.
Advantageous technical effects
According to the above-described embodiments of the present disclosure, user convenience can be improved by using the method of sharing content.
Drawings
The above and other aspects, features and advantages of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings in which:
FIG. 1 is a diagram illustrating an electronic device sharing content with an external device according to an embodiment of the present disclosure;
FIG. 2 is a block diagram illustrating the structure of an electronic device according to an embodiment of the present disclosure;
FIG. 3 is a block diagram illustrating the structure of an electronic device according to another embodiment of the present disclosure;
FIG. 4 is a diagram illustrating a configuration of software stored in an electronic device according to an embodiment of the present disclosure;
FIGS. 5 to 13 are diagrams illustrating an electronic device sharing content according to an embodiment of the present disclosure;
FIG. 14 is a diagram illustrating an electronic device sharing content with an external device according to another embodiment of the present disclosure;
FIG. 15 is a diagram illustrating the generation of search information according to an embodiment of the present disclosure;
FIG. 16 is a diagram illustrating an electronic device sharing content with an external device according to an embodiment of the present disclosure;
FIG. 17 is a diagram illustrating a web document according to an embodiment of the present disclosure; and
FIGS. 18 and 19 are flowcharts illustrating sharing of content by an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Although illustrated in different figures, the same or similar components may be referred to by the same or similar reference numerals. Detailed descriptions of well-known constituents or processes in the art will be omitted so as to avoid obscuring the subject matter of the present disclosure.
Hereinafter, terms used in the following description will be briefly described before the embodiments of the present disclosure.
The terms used herein are general expressions used broadly and selected in consideration of the functions of the present disclosure. However, such terms may have different meanings according to the intention of those of ordinary skill in the art, previous examples, or development of new technologies. Accordingly, terms used herein should be defined based on the meanings of the terms and the overall description of the present disclosure.
Terms including an ordinal number (e.g., "first" or "second") may be used to distinguish between elements, but an element is not limited by the ordinal number. The ordinal numbers are used only to distinguish between identical or similar elements.
Unless the context clearly indicates otherwise, terms in the singular include the plural. Furthermore, terms such as "comprising" and "consisting of" refer to the disclosed features, numbers, steps, operations, elements, parts, or combinations thereof, and are not intended to preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.
Here, the terms "module" and "unit" mean an element that performs at least one function or operation. The module or unit may be implemented in hardware, software or a combination thereof. A plurality of modules, components or parts thereof may be integrated into at least one module or chip and implemented as at least one processor, except for the case where each module or unit needs to be implemented in separate specific hardware.
When a portion is described as being connected to another portion, such connection may be direct connection of the portions or electrical connection through another medium. Moreover, unless explicitly described, the statement that a part includes an element indicates that the part may include other elements.
Here, the term "user input" may include at least one of a touch input, a bending input, a voice input, a button input, an action input, a multi-mode input, but is not limited thereto.
The touch input may include a touch gesture performed by a user with respect to a display or a cover of the device. Touch inputs may also include inputs that do not contact the display but are within a certain distance of it (e.g., floating or hovering inputs). The touch input may include, for example, a touch-and-hold gesture, a tap gesture (touch and release), a double-tap gesture, a pan gesture, a flick gesture, a touch-and-drag gesture (touch and move in a particular direction), a pinch gesture, or the like, but is not limited thereto.
A button input (key input) refers to user input that controls a device using a physical key on the device.
Motion input refers to a user motion performed with respect to a device for controlling the device. For example, the motion input may include a user input to rotate, tilt, or move the device in a vertical and/or horizontal direction.
Multimodal input refers to input that combines at least two or more input methods. For example, the device may receive touch input and motion input from a user or may receive touch input and voice input.
An application here refers to a set of computer programs configured to perform a specific task. Various types of applications may be provided according to embodiments disclosed herein. For example, the applications may include a game application, a video reproduction application, a map application, a memo application, a schedule application, a phonebook application, a broadcast application, an exercise support application, a payment application, a picture folder application, a medical device control application, an application providing a UI for a plurality of medical devices, or the like, but are not limited thereto.
Here, the application identification information may be unique information for distinguishing a specific application from other applications. For example, the application identification information may be an icon, an index item, link information, or the like, but is not limited thereto.
A user interaction element refers to an element that provides visual, auditory, or olfactory feedback according to user input through interaction with a user.
Also, the term "user" may refer to a person using an electronic device or a device using an electronic device (e.g., an electronic device with artificial intelligence).
Fig. 1 is a diagram illustrating an electronic device sharing content with an external device according to an embodiment of the present disclosure.
Referring to fig. 1, the electronic device 10 may be implemented as one of a smartphone, a tablet PC, a mobile phone, a video phone, an e-book reader, a desktop PC, a notebook PC, a netbook computer, a workstation, a server, a PDA, a Portable Multimedia Player (PMP), an MP3 player, an ambulatory medical device, a camera, and a wearable device, for example. According to various embodiments, the wearable device may include at least one of an accessory-type device (e.g., a watch, ring, bracelet, foot ring, necklace, glasses, contact lens, head-mounted device (HMD), or the like), a garment-integrated device (e.g., an electronic garment), a body-worn device (e.g., a skin pad or tattoo), or an implantable biological device (e.g., an implanted circuit).
According to another embodiment, the electronic device 10 may be implemented as a household appliance. For example, the home appliance may include at least one of a TV, a Digital Versatile Disc (DVD) player, a stereo system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation panel, a security panel, a TV box, a game console, an electronic dictionary, an electronic key, a video camera, and an electronic photo frame.
According to another embodiment, the electronic device 10 may be implemented as at least one of various medical devices (e.g., various portable medical measurement devices (glucometers, heart rate monitors, sphygmomanometers, thermometers, or the like), Magnetic Resonance Angiography (MRA) units, Magnetic Resonance Imaging (MRI) units, Computed Tomography (CT) units, and ultrasound units), navigation devices, Global Navigation Satellite Systems (GNSS), Event Data Recorders (EDR), Flight Data Recorders (FDR), in-vehicle infotainment devices, marine electronics (e.g., marine navigation systems, gyrocompasses, or the like), avionics, security devices, in-vehicle head units, industrial robots, home robots, Automated Teller Machines (ATMs) in financial facilities, point of sale (POS) terminals in stores, and Internet of Things (IoT) devices (e.g., light bulbs, various sensors, electricity meters, gas meters, sprinklers, fire alarms, temperature control systems (thermostats), street lights, toasters, sporting goods, hot water tanks, heaters, boilers, or the like).
According to yet another embodiment, the electronic device 10 may be implemented as, for example, furniture, a building, a portion of a structure, a circuit board, an electronic signature receiving device, a projector, or at least one of a variety of measurement devices (e.g., a water meter, an electric meter, a gas meter, an electric wave meter, or the like).
According to yet another embodiment, the electronic device 10 may be implemented as a combination of one or more of the above-described devices.
According to yet another embodiment, the electronic device 10 may be implemented as a bendable electronic device. The electronic device 10 is not limited to the above-described examples, and may include a new electronic device manufactured according to the development of electronic technology.
Hereinafter, the operation of the electronic device 10 will be described based on an example of a smartphone.
In fig. 1, the external device 20 may be a device capable of reproducing or displaying content. For example, the external device may be implemented as one of a video reproduction device, an image display device, a text display device, or an audio reproduction device. The external device 20 may be implemented as a combination of two or more of the above devices.
The video reproduction device or the image display device may include, for example, at least one of a TV, an electronic photo frame, a smart phone, a tablet PC, a mobile phone, a video phone, a desktop PC, a notebook PC, a netbook computer, a PDA, a PMP, a camera, or a wearable device.
The audio reproduction device may comprise, for example, at least one of a stereo system, MP3 player, speaker, TV, smart phone, tablet PC, mobile phone, video phone, desktop PC, notebook PC, netbook computer, PDA, PMP, or wearable device.
The text display device may include, for example, at least one of an e-book reader, a TV, an electronic photo frame, a smart phone, a tablet PC, a mobile phone, a video phone, a desktop PC, a notebook PC, a netbook computer, a PDA, a PMP, a camera, or a wearable device.
In fig. 1, an electronic device 10 may receive and display an electronic document 101 (e.g., a web document) containing a plurality of contents.
The content may be, for example, audio content, video, text, or images. The content may be a link address, such as a Uniform Resource Locator (URL), indicating a location where the content is stored. That is, the content may be an audio link address, a video link address, a text link address, or an image link address. The content may be a thumbnail of the content. That is, the content may be a video thumbnail, a text thumbnail, or an image thumbnail. By way of example, the content may include two or more types of content as described above. As another example, the content may include a video thumbnail and a video link address.
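As a sketch of how the pieces of content in a web document and their types might be collected, the document's markup can be scanned for media elements and link addresses. This is an illustration, not the patent's implementation; the tag names and file extensions used for classification are assumptions.

```python
from html.parser import HTMLParser

class ContentScanner(HTMLParser):
    """Collect (content_type, link_address) pairs from a web document."""
    VIDEO_EXT = (".mp4", ".webm")
    AUDIO_EXT = (".mp3", ".ogg")

    def __init__(self):
        super().__init__()
        self.items = []  # list of (type, link_address)

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "video" and "src" in attrs:
            self.items.append(("video", attrs["src"]))
        elif tag == "audio" and "src" in attrs:
            self.items.append(("audio", attrs["src"]))
        elif tag == "img" and "src" in attrs:
            self.items.append(("image", attrs["src"]))
        elif tag == "a" and "href" in attrs:
            # Classify a link address by its file extension.
            href = attrs["href"]
            if href.endswith(self.VIDEO_EXT):
                self.items.append(("video", href))
            elif href.endswith(self.AUDIO_EXT):
                self.items.append(("audio", href))

doc = '<a href="http://example.com/clip.mp4">clip</a><img src="thumb.jpg">'
scanner = ContentScanner()
scanner.feed(doc)
print(scanner.items)
# [('video', 'http://example.com/clip.mp4'), ('image', 'thumb.jpg')]
```

In practice a thumbnail and a link address for the same piece of content could be grouped together, matching the example above in which content includes both a video thumbnail and a video link address.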
In fig. 1, the operation of the electronic device 10 will be described based on an example in which the content is a video link address.
The electronic device 10 may acquire function information of an external device capable of communicating with the electronic device 10.
The function information of the external device 20 may be information indicating functions executable in the external device 20. Also, the function information of the external device 20 may be information indicating a function mainly performed by the external device 20. As an example, the function information of the stereo system may indicate an audio reproduction function, and the function information of the TV may indicate a video reproduction function or an image display function. The function information of the e-book reader may indicate a text display function. In the embodiment described herein, the function information of the external device 20 may refer to a profile of the external device 20.
When there are a plurality of external devices 20 communicating with the electronic device 10, the function information about one external device 20 may be information indicating a function of the external device that is relatively higher in performance than other external devices. As an example, when one external device 20 of the plurality of external devices 20 is a stereo system and the other external device 20 is a TV, the function information on the stereo system may indicate an audio reproduction function. The TV can reproduce both audio and video, but the performance of the audio reproduction function of the TV is relatively lower than that of the video reproduction function. Accordingly, the function information of the TV may indicate a video reproduction function.
The electronic device 10 may retrieve the function information about the external device 20 from the memory. In this case, the functional information may be stored in the memory in advance during the manufacturing process of the electronic device 10. Also, the function information may be acquired from a server or from the external device 20 and stored in advance. In response to receiving a user input to reproduce content in the external device 20, the electronic device 10 may acquire function information from a server or the external device 20. In response to the electronic device 10 communicating with the external device 20, the electronic device 10 may acquire the function information from the external device 20.
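The acquisition paths above can be sketched as a simple lookup with fallbacks. The ordering (memory first, then server, then the external device itself) and the caching behavior are assumptions for illustration, not the patent's required implementation; `query_server` and `query_device` are hypothetical callbacks.

```python
def get_function_info(device_id, memory, query_server, query_device):
    """Return function information for an external device."""
    # Function information stored in advance (e.g., during manufacturing).
    if device_id in memory:
        return memory[device_id]
    # On a miss, ask the server, then fall back to the external device.
    info = query_server(device_id) or query_device(device_id)
    if info is not None:
        memory[device_id] = info  # store so later lookups hit memory
    return info

memory = {"tv-1": ["video", "image"]}  # hypothetical device id and profile
print(get_function_info("tv-1", memory, lambda d: None, lambda d: None))
# ['video', 'image']
print(get_function_info("spk-1", memory, lambda d: ["audio"], lambda d: None))
# ['audio']  (fetched from the "server" and cached)
```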
The electronic device 10 may determine at least one piece of content that can be run in the external device 20 from among the pieces of content based on the types of the pieces of content contained in the electronic document 101 and the function information about the external device 20.
For example, the plurality of pieces of content may include a video link address, an audio link address, and a text link address. In this case, the type of the content corresponding to the video link address may be video, the type of the content corresponding to the audio link address may be audio, and the type of the content corresponding to the text link address may be text.
According to an embodiment, in response to the external device 20 being implemented as a TV and the function information on the TV indicating a video reproduction function, the content determined to be executed in the TV may be a video link address.
According to another embodiment, in response to the external device 20 being implemented as a stereo system and the function information on the stereo system indicating an audio reproduction function, the content determined to be executed in the stereo system may be an audio link address.
According to still another embodiment, the content determined to be executed in the electronic book reader or the tablet PC may be text in response to the external device 20 being implemented as the electronic book reader or the tablet PC and the function information about the electronic book reader or the tablet PC indicating a text display function.
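The selection step in the three embodiments above amounts to matching content types against the device's profile. A minimal sketch follows; the profile table is illustrative, not taken from the disclosure.

```python
# Hypothetical profiles: each maps a device to the content types it can
# execute, mirroring the function information described above.
DEVICE_PROFILES = {
    "TV": {"video", "image"},        # video reproduction / image display
    "stereo system": {"audio"},      # audio reproduction
    "e-book reader": {"text"},       # text display
}

def executable_content(device, contents):
    """contents: iterable of (content_type, link_address) pairs."""
    supported = DEVICE_PROFILES.get(device, set())
    return [(t, link) for t, link in contents if t in supported]

contents = [
    ("video", "http://example.com/clip.mp4"),
    ("audio", "http://example.com/song.mp3"),
    ("text", "http://example.com/article.html"),
]
print(executable_content("TV", contents))
# [('video', 'http://example.com/clip.mp4')]
print(executable_content("stereo system", contents))
# [('audio', 'http://example.com/song.mp3')]
```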
As shown in fig. 1(a), in response to at least one piece of content executable in the external device 20 being determined from among the plurality of pieces of content, the electronic device 10 displays a content list 102 containing at least one piece of content representative information corresponding to the determined content. The content list 102 may be displayed on the screen of the electronic device 10 in the form of a pop-up window, or in the form of a floating pop-up window that can be moved by a user touch-and-drag input. Also, the web document may be reduced in size, and the content list 102 may be displayed in the area formed by reducing the web document. That is, the content list 102 may be displayed in an area different from the area in which the web document is displayed.
The content representative information may be, for example, a thumbnail, a title, content, or content details.
For example, in response to the external device 20 being implemented as a TV and the at least one piece of content determined to be executed in the TV being a video link address, the content representative information corresponding to the content may include a video thumbnail, a portion of a video, a video title, a video summary, a video character, a video producer, or the like. The content representative information corresponding to the content may be acquired from the electronic document 101 or the server.
In response to the content list 102 being displayed, the electronic device 10 senses a user input selecting at least one piece of content representative information 102-1 from the content list 102.
In response to at least one piece of content representative information 102-1 being selected, electronic device 10 transmits content related information corresponding to the selected piece of content representative information 102-1 to external device 20. As an example, in response to the selected content representative information 102-1 being a thumbnail of a video, the electronic device 10 may transmit information containing a video link address of the video to the external device 20.
As shown in fig. 1(b), in response to the content-related information being transmitted to the external device 20, the external device 20 reproduces the content based on the received content-related information. For example, in response to the received content-related information being a video link address, the external device 20 may acquire the video indicated by the video link address by accessing the server and reproduce the acquired video.
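The external device's handling of the received information can be sketched as a small dispatch: if a link address is received, fetch the content from the server before reproducing it; if the content itself is received, reproduce it directly. The `fetch` and `play` functions below are hypothetical placeholders for the device's network access and reproduction functions.

```python
def fetch(url):
    # Placeholder for accessing the server indicated by the link address.
    return f"video-data-from:{url}"

played = []

def play(data):
    # Placeholder for the external device's reproduction function.
    played.append(data)

def handle_received(info):
    """Dispatch received content-related information, as described above."""
    if info["kind"] == "video_link":
        play(fetch(info["url"]))  # resolve the link, then reproduce
    elif info["kind"] == "video":
        play(info["data"])        # the content itself was transmitted
```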
Fig. 2 is a block diagram illustrating an electronic device architecture according to an embodiment of the present disclosure.
Referring to fig. 2, the electronic device 10 includes a display 130, a communicator 140, and a processor 190.
The display 130 may display a web document including a plurality of pieces of content. According to an embodiment, display 130 may display a content list including at least one piece of content representative information corresponding to at least one piece of determined content.
The display 130 may display a UI for controlling content transmitted to the external device.
The communicator 140 may communicate with the external device 20 located outside the electronic device 10 according to various types of communication methods. Also, the communicator 140 may transmit information about the contents determined in the processor 190 to the external device 20. In the operation in which the communicator 140 transmits the information on the content to the external device 20, the information on the content may be directly transmitted to the external device 20 or the information on the content may be indirectly transmitted to the external device 20. For example, the communicator 140 may transmit information about the content to the external device 20 through other external devices, Access Points (APs), or base stations.
The processor 190 may determine at least one piece of content executable in the external device 20 from among the pieces of content contained in the web document based on the types of the pieces of content and the function information about the external device 20. Also, the processor 190 may control the communicator 140 to transmit information about the determined content to the external device 20.
According to an embodiment, processor 190 may generate a template file including the at least one determined content, and may transmit the generated template file to external apparatus 20 through communicator 140.
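The "template file" above is not given a concrete format in this description; the sketch below assumes a simple JSON layout purely for illustration, packaging the determined content so an external device could interpret it.

```python
# Illustrative sketch of generating a template file containing the determined
# content. The JSON layout ("version", "contents") is an assumption.
import json

def make_template_file(determined_contents, path):
    """Write the determined content entries to a template file at `path`."""
    template = {
        "version": 1,
        "contents": [
            {"type": c["type"], "link": c["link"]} for c in determined_contents
        ],
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(template, f)
    return template
```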
Processor 190 may generate a content list including content representative information corresponding to the at least one piece of content, and may transmit the generated content list to external device 20 through communicator 140.
In response to selecting a piece of content representative information from the content list, processor 190 may transmit information about content corresponding to the selected content representative information to external device 20 through communicator 140.
Processor 190 may receive another web document related to the received web document through communicator 140. In this case, the processor 190 may determine at least one piece of content executable in the external device 20 from among pieces of content contained in the other web document. Also, the processor 190 may transmit information about the at least one determined piece of content to the external device 20 through the communicator 140.
Fig. 3 is a block diagram illustrating a structure of an electronic device according to another embodiment of the present disclosure.
Referring to fig. 3, the electronic device 10 includes at least one of an image acquisition unit 110, an image processor 120, a display 130, a communicator 140, a memory 150, an audio processor 160, an audio output unit 170, a sensor 180, and a processor 190. The structure of fig. 3 is merely an example for explanation, and the structure of the electronic apparatus 10 is not limited thereto. Accordingly, a part of the components of the electronic device 10 may be omitted or modified, or new elements may be added according to the type and purpose of the electronic device 10.
The image acquisition unit 110 may acquire image data from a variety of sources. For example, the image acquisition unit 110 may receive image data from an external server or device located external to the electronic device 10.
Also, the image acquisition unit 110 may acquire image data by photographing an external environment. For example, the image acquisition unit 110 may be implemented as a camera for photographing an environment external to the electronic device 10. In this case, the image acquisition unit 110 may include a lens through which an image passes and an image sensor sensing the image. The image sensor may be implemented as a Charge Coupled Device (CCD) image sensor or a Complementary Metal Oxide Semiconductor (CMOS) image sensor. The image data acquired by the image acquisition unit 110 may be processed by the image processor 120.
The image processor 120 processes image data acquired by the image acquisition unit 110. The image processor 120 may perform various image processing operations on the image data, such as, for example, decoding, scaling, noise filtering, frame rate conversion, resolution conversion, and so forth.
The display 130 displays a video frame obtained from image data processed by the image processor 120 or various screens generated by the graphic processor 193 in a display area.
Display 130 may be implemented in various sizes. As an example, display 130 may have dimensions of 3 inches, 4 inches, 4.65 inches, 5 inches, 6.5 inches, 8.4 inches, and so on. Display 130 may include a plurality of pixels. In this case, the number of column pixels multiplied by the number of row pixels may be expressed as the resolution. For example, display 130 may have different resolutions, such as 320 × 320, 360 × 480, 720 × 1280, 1280 × 800, 3840 × 2160, and so forth.
The display 130 may be implemented as a flexible display and fixed on at least one of a front area, a side area, and a rear area of the electronic device 10. A flexible display refers to a display that can be bent or rolled, through a bendable substrate as thin as paper, without damage. A flexible display may be manufactured using a plastic substrate instead of the commonly used glass substrate. When the flexible display is composed of a plastic substrate, a low temperature manufacturing process may be used instead of the conventional manufacturing process to avoid damaging the substrate. Also, the glass substrate encapsulating the flexible liquid crystal may be replaced with a plastic film so that the flexible display can be flexibly folded or unfolded. Flexible displays are thin, light, and resistant to shock. Also, a flexible display may be bent, and may be implemented in various forms.
The display 130 may be implemented as a touch screen in a layer structure combined with the touch sensor 181. The touch screen may provide a function of sensing a position of a touch input, a touch range, a pressure of the touch input, and a display function. The touch screen may also provide the function of sensing actual touches and proximity touches. Also, the touch screen may provide a function of sensing a finger touch of a user and various types of pen touches.
The communicator 140 communicates with various types of external devices according to various communication methods. The communicator 140 may include at least one of a Wi-Fi chip 141, a bluetooth chip 142, a wireless communication chip 143, and a Near Field Communication (NFC) chip 144. The processor 190 may perform communication with an external server or various external devices through the communicator 140.
The Wi-Fi chip 141 and the bluetooth chip 142 may perform communication according to a Wi-Fi method and a bluetooth method, respectively. In the case of the Wi-Fi chip 141 and the bluetooth chip 142, various connection information, such as a service set identifier (SSID), a session key, or the like, may first be transmitted and/or received to establish a communication connection, and then various information may be transmitted and/or received. The wireless communication chip 143 refers to a chip that performs communication according to various communication standards including Institute of Electrical and Electronics Engineers (IEEE) standards, Zigbee, third generation (3G), third generation partnership project (3GPP), Long Term Evolution (LTE), and the like. The NFC chip 144 refers to a chip operating according to the NFC method using the 13.56 MHz radio frequency identification (RF-ID) band from among various RF-ID bands including 135 kHz, 13.56 MHz, 433 MHz, 860 to 960 MHz, 2.45 GHz, and the like.
The memory 150 may store various programs and data required for the operation of the electronic device 10. The memory 150 may be implemented as a non-volatile memory, a flash memory, a hard disk drive (HDD), a solid state drive (SSD), and so on. The memory 150 is accessible by the processor 190, and data in the memory 150 may be read, recorded, modified, deleted, or updated by the processor 190. In the embodiments disclosed herein, the term "memory" may refer to the memory 150, a Read Only Memory (ROM) 192 or a Random Access Memory (RAM) 191 in the processor 190, or a memory card (e.g., a micro SD card or a memory stick) mounted on the electronic device 10.
Also, the memory 150 may store programs and data for configuring various screens to be displayed on the display area of the display 130.
Next, the configuration of software in the memory 150 will be described with reference to fig. 4. Referring to fig. 4, the memory 150 may store software including an Operating System (OS)310, a kernel 320, middleware 330, and an application 340.
The OS 310 controls and manages the entire operation of the hardware. In other words, the OS 310 is a hierarchy that performs basic functions (such as hardware management, memory functions, security, etc.).
The kernel 320 serves as a channel for transmitting various signals, including touch signals sensed by the sensor 180, to the middleware 330.
Middleware 330 includes various software modules for controlling the operation of electronic device 10. Referring to FIG. 4, the middleware 330 includes an X11 module 330-1, an APP manager 330-2, a connection manager 330-3, a security module 330-4, a system manager 330-5, a multimedia framework 330-6, a main UI framework 330-7, a window manager 330-8, and a sub UI framework 330-9.
The X11 module 330-1 receives various event signals from different hardware of the electronic device 10. Here, various events may be set, including an event in which a user gesture is sensed, an event in which a system alarm occurs, an event in which a specific program is executed or terminated, and the like.
The APP manager 330-2 manages the execution states of various applications 340 installed in the memory 150. In response to sensing an application execution event in the X11 module 330-1, the APP manager 330-2 calls and executes an application corresponding to the sensed event.
Connection manager 330-3 supports wired and/or wireless network connections. The connection manager 330-3 may include various specific modules such as a DNET module, a universal plug and play (UPnP) module, and the like.
The security module 330-4 supports authentication, permission, secure storage, or the like, for hardware.
The system manager 330-5 monitors the status of each component of the electronic device 10 and provides the monitoring results to the other modules. For example, when remaining battery power is low, an error occurs, or a communication connection is interrupted, the system manager 330-5 may provide the monitoring result to the main UI framework 330-7 or the sub UI framework 330-9 to output an alarm message or an alarm sound.
The multimedia framework 330-6 renders multimedia content stored by the electronic device 10 or provided from an external source. The multimedia framework 330-6 may include a playback module, a camera module, a sound processing module, and the like. Accordingly, the multimedia framework 330-6 can generate and reproduce a screen and a sound by reproducing a variety of multimedia contents.
The main UI framework 330-7 provides various UIs to be displayed in a main display area of the display 130. The sub UI framework 330-9 provides various UIs to be displayed in a sub area of the display 130. The main UI framework 330-7 and the sub UI framework 330-9 may include an image composition module for configuring various UI elements, a coordinate composition module for calculating coordinates at which to display UI objects, a rendering module for rendering the configured UI objects at the calculated coordinates, and a 2D/3D UI toolkit for providing tools for configuring a two-dimensional (2D) UI or a three-dimensional (3D) UI.
The window manager 330-8 may sense a touch event or other input event made using a user's body or a pen. In response to an event being sensed, the window manager 330-8 transmits an event signal to the main UI framework 330-7 or the sub UI framework 330-9 to perform an operation corresponding to the event.
Also, the electronic device 10 may further include a variety of program modules including a writing module drawing a line according to a drag trajectory of the user touch & drag input or an angle calculating module calculating a pitch angle, a roll angle, or a yaw angle based on a sensed value of the motion sensor 182.
The application module 340 includes applications 340-1 to 340-n that support various functions. For example, the application modules 340 may include program modules that provide a variety of services, such as a navigation program module, a game module, an e-book module, a calendar module, an alert management module, and so forth. The application may be installed by default or arbitrarily by the user. In response to the UI object being selected, the main Central Processing Unit (CPU)194 may execute an application corresponding to the selected UI object by using the application module 340.
The configuration of fig. 4 is merely an example, and the configuration of software is not limited thereto. Accordingly, some of the above components may be omitted or modified, or new elements may be added according to the type or purpose of the electronic device 10. For example, the memory 150 may additionally store various programs, such as a sensing module for analyzing signals sensed by various sensors, a communication module including a chat program, a text message program, or an email program, a call information aggregation program module, a voice over internet protocol (VoIP) module, a web browsing module, and the like.
Referring again to fig. 3, the audio processor 160 processes audio data of the image content. The audio processor 160 may perform various processing operations on the audio data, such as, for example, decoding, amplification, or noise filtering. The audio data processed by the audio processor 160 may be output to the audio output unit 170.
The audio output unit 170 outputs various audio data on which processing operations, such as decoding, amplification, or noise filtering, have been performed by the audio processor 160. Specifically, the audio output unit 170 may be implemented as a speaker, but this is just an example, and the audio output unit 170 may be implemented as an output terminal for outputting audio data.
The sensor 180 senses various user interactions. The sensor 180 may sense at least one of various changes of the electronic device 10, such as a posture change, an illuminance change, an acceleration change, etc., and transmit an electronic signal corresponding to the sensing result to the processor 190. That is, the sensor 180 may sense a state change associated with the electronic device 10, generate a sensing signal corresponding to the sensing result, and transmit the generated sensing signal to the processor 190. In the present embodiment, the sensor 180 may include various sensors. When the electronic device 10 is driven, power is supplied to at least one sensor predetermined under the control of the sensor 180 (or based on user settings), so that the sensor 180 may sense a change in the state of the electronic device 10.
The sensor 180 may include a variety of sensors and may include at least one of all types of sensing devices capable of sensing a change in state of the electronic device 10. For example, the sensor 180 may include at least one of a touch sensor, an acceleration sensor, a gyro sensor, an illuminance sensor, a proximity sensor, a pressure sensor, a noise sensor (e.g., a microphone), a video sensor (e.g., a camera module), a pen sensor, a timer, and the like.
The sensor 180 may be divided into a touch sensor 181 and a motion sensor 182 according to purpose, but is not limited thereto. This division is made according to purpose and does not represent a physical classification. The functions of the touch sensor 181 and the motion sensor 182 may be performed by a combination of at least one sensor. Also, a portion of the functionality or structure of the sensor 180 may be included in the processor 190.
The touch sensor 181 may sense a finger input of a user and output a touch event value corresponding to a sensed touch signal. A touch panel of the touch sensor 181 may be installed under the display 130. The touch sensor 181 may sense a finger input of a user according to a capacitive method or a resistive method. The capacitive method refers to a method of calculating touch coordinates by sensing a microcurrent induced through the user's body. The resistive method refers to a method of calculating touch coordinates by sensing a current generated at a touch point in response to contact between an upper electrode plate and a lower electrode plate included in the touch panel.
The touch sensor 181 may acquire an output signal according to a user input. The touch sensor 181 may derive user input information including a touch position, touch coordinates, the number of touches, a touch intensity, a cell ID, a touch angle, a touch dimension, etc., based on the value of the output signal, and determine the type of touch input based on the derived user input information. In this case, the touch sensor 181 may determine the type of touch input by using a touch recognition algorithm and touch pattern data stored in a control board or a memory. In response to determining the type of touch input, the touch sensor 181 may send information regarding the type of touch input to the processor 190. As described above, the touch sensor 181 may sense the position of a proximity touch (or hovering) input by the user.
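The derivation of a touch-input type from sensed touch information can be illustrated with a minimal classifier. The event shape and thresholds below are assumptions for illustration, not values from this disclosure.

```python
# Illustrative sketch: classify a touch input from touch-down and touch-up
# events carrying coordinates and timestamps. Thresholds are assumed values.
LONG_PRESS_MS = 500     # assumed minimum duration of a long press
DRAG_THRESHOLD_PX = 10  # assumed minimum movement for a drag

def classify_touch(down, up):
    """down/up: dicts with 'x', 'y' (pixels) and 't' (milliseconds)."""
    dx, dy = up["x"] - down["x"], up["y"] - down["y"]
    moved = (dx * dx + dy * dy) ** 0.5
    if moved > DRAG_THRESHOLD_PX:
        return "drag"
    if up["t"] - down["t"] >= LONG_PRESS_MS:
        return "long_press"
    return "tap"
```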
In this case, the processor 190 may perform a part of the functions of the touch sensor 181. For example, the touch sensor 181 may send the processor 190 signal values acquired from the touch sensor, or user input information derived from the signal values. The processor 190 may determine the type of touch input based on the received signal values, the user input information, a touch recognition algorithm, or touch pattern data in the memory 150. For example, in response to the phone application being executed, the processor 190 may sense from the user input information or the type of touch input that a call button of the phone application is selected, and transmit a call to a calling party through the communicator 140.
The motion sensor 182 may sense a motion (e.g., a rotational motion, a tilting motion, etc.) of the electronic device 10 using at least one of an acceleration sensor, a tilt sensor, a gyro sensor, and a 3-axis magnetic sensor. Also, the motion sensor 182 may send the generated electrical signals to the processor 190. For example, the motion sensor 182 measures an acceleration that combines a motion acceleration and a gravitational acceleration of the electronic device 10, but only measures the gravitational acceleration when the electronic device 10 is not moving.
As an example, assuming that the motion sensor 182 uses an acceleration sensor, the motion sensor 182 may measure the gravitational acceleration with reference to each of the X-axis, Y-axis, and Z-axis of the electronic device 10. In this case, it is assumed that the upward direction of the front surface of the electronic device 10 represents the positive (+) direction of the gravitational acceleration, and the upward direction of the rear surface of the electronic device 10 represents the negative (-) direction of the gravitational acceleration. In response to the rear surface of the electronic device 10 being placed in contact with a horizontal surface, the X-axis and Y-axis components of the gravitational acceleration sensed by the motion sensor 182 are measured as 0 m/sec², and only the Z-axis component may be measured as a certain positive value (e.g., +9.8 m/sec²). Conversely, in response to the front surface of the electronic device 10 being placed in contact with a horizontal surface, the X-axis and Y-axis components of the gravitational acceleration sensed by the motion sensor 182 are measured as 0 m/sec², and only the Z-axis component is measured as a certain negative value (e.g., -9.8 m/sec²). Also, assuming that the electronic device 10 is placed tilted with respect to a table surface, the component of at least one axis of the gravitational acceleration sensed by the motion sensor 182 may be measured as a value other than 0 m/sec². In this case, the square root of the sum of squares of the three axis components, i.e., the magnitude of the vector sum, may be a certain value (e.g., 9.8 m/sec²). In this case, the motion sensor 182 may sense the acceleration in each direction of the X, Y, and Z axes in the coordinate system. Of course, the individual axes and the corresponding gravitational accelerations may vary depending on the position of the sensor.
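The accelerometer reasoning above can be verified with a short worked example: when the device is at rest, the magnitude of the vector sum of the three axis components equals the gravitational acceleration (about 9.8 m/sec²) regardless of how the device is tilted.

```python
# Worked example of the gravitational-acceleration vector reasoning above.
import math

G = 9.8  # gravitational acceleration, m/sec²

def magnitude(ax, ay, az):
    """Square root of the sum of squares of the three axis components."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def is_at_rest(ax, ay, az, tolerance=0.2):
    """True if the sensed acceleration is consistent with gravity alone."""
    return abs(magnitude(ax, ay, az) - G) < tolerance

# Rear surface flat on a horizontal surface: only the Z component is non-zero.
flat = (0.0, 0.0, 9.8)
# Tilted 45° about the X axis: gravity splits between the Y and Z axes,
# but the magnitude of the vector sum is still about 9.8 m/sec².
tilted = (0.0, G * math.sin(math.radians(45)), G * math.cos(math.radians(45)))
```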
The sensor 180 may also include a pen sensor (e.g., a pen recognition panel). The pen sensor may sense a pen input of a user made using a stylus (e.g., a stylus pen, a digital pen, etc.) and output a pen proximity event value or a pen touch event value. The pen sensor may be implemented according to an electromagnetic resonance (EMR) method, for example. The pen sensor may sense a touch input or a proximity input according to a change in electromagnetic field intensity caused by pen proximity or a pen touch. In particular, the pen recognition panel may include an electromagnetic induction coil sensor of a mesh structure and an electronic signal processor for sequentially transmitting an alternating current signal of a certain frequency to the coils of the electromagnetic induction coil sensor. In response to a pen having an embedded resonant circuit approaching a coil of the pen recognition panel, the magnetic field transmitted from the coil generates a current in the resonant circuit of the pen based on mutual electromagnetic induction. An induced magnetic field is then generated from the coil of the resonant circuit of the pen based on the current. The pen recognition panel may detect the induced magnetic field from the coil in a signal receiving state, thereby sensing the approach position or touch position of the pen.
The microphone may receive a user voice for controlling the electronic device 10 (e.g., an instruction to start, end, or abort a photographing operation) and recognize the user voice through a voice recognition module. Also, the microphone may transmit the recognition result to the processor 190. In this case, the voice recognition module may be located in a part of the processor 190 rather than inside the microphone, or may be located external to the electronic device 10.
Processor 190 may use various programs stored in memory 150 to control the overall operation of electronic device 10.
The processor 190 includes a RAM 191, a ROM 192, a graphic processor 193, a main CPU 194, first to nth interfaces 195-1 to 195-n, and a bus 196. The RAM 191, the ROM 192, the graphic processor 193, the main CPU 194, and the first to nth interfaces 195-1 to 195-n may be interconnected by a bus 196.
The RAM 191 stores an OS and application programs. Specifically, in response to the electronic device 10 being booted, the OS may be stored into the RAM 191, and various applications selected by the user may be stored into the RAM 191.
The ROM 192 stores a command set for booting the system. In response to power being supplied by a power-on command, the main CPU 194 copies the OS in the memory 150 into the RAM 191 according to the commands stored in the ROM 192 and boots the system by executing the OS. When the boot operation is completed, the main CPU 194 copies various application programs in the memory 150 into the RAM 191 and executes the programs copied into the RAM 191 to perform various operations.
The graphic processor 193 generates a screen including various objects such as icons, images, or text by using a calculation unit and a rendering unit. The calculation unit calculates attribute values of the object, such as coordinate values, shape, size, and color, according to the screen layout based on the control command received from the sensor 180. The rendering unit generates screens having various layouts containing the objects based on the attribute values calculated by the calculation unit. The screen generated by the rendering unit may be displayed in a display area of the display 130.
The main CPU 194 accesses the memory 150 and performs a boot operation by using the OS in the memory 150. Also, the main CPU 194 performs various operations using various programs, contents, and data stored in the memory 150.
The first to nth interfaces 195-1 to 195-n connect the above components. One of the interfaces 195-1 to 195-n may be implemented as a network interface that is connected to an external device through a network.
Fig. 5 and 6A to 6C are diagrams of an electronic device sharing content with an external device according to an embodiment of the present disclosure.
Referring to fig. 5 (a), the processor 190 may control the communicator 140 to receive an electronic document 501 (e.g., a web document) including a plurality of pieces of content from a server. Processor 190 then controls display 130 to display the electronic document 501 by parsing the received electronic document 501. In this case, the electronic document 501 may be an electronic document searched for by the user using a keyword.
While viewing the electronic document 501 displayed on the display 130, the user may wish to view the content contained in the electronic document 501 by using the external device 20.
In this case, the sensor 180 senses a user input for displaying the panel 502 of the electronic device 10. The user input may be a user input that touches and drags the screen of the electronic device 10 from a certain side (e.g., the top side) of the electronic device 10 to the center.
In response to the user input, processor 190 controls display 130 to display a panel 502 including a content sharing execution icon 502-1, as shown in fig. 5 (b). The panel may be displayed in a sliding manner in proportion to the number of user touch & drag inputs. The sensor 180 may then sense a user input selecting the content sharing execution icon 502-1.
In response to the user input, processor 190 controls display 130 to display a list 503 of external devices including at least one external device connectable to electronic device 10, as shown in fig. 5 (c). The external device list 503 contains a plurality of pieces of external device identification information 503-1, 503-2, 503-3 corresponding to the at least one external device 20.
For example, the communicator 140 may transmit the power beacon signal to the external device 20 in response to the electronic device 10 communicating with the external device 20 in a bluetooth manner and a bluetooth function of the electronic device 10 being performed. In response to receiving the power beacon signal, external device 20 may send an advertisement signal informing that external device 20 may be connected. Accordingly, a plurality of pieces of external device identification information 503-1, 503-2, 503-3 corresponding to each external device 20 that transmits the advertisement signal may be contained and displayed in the external device list 503. Next, the sensor 180 may sense a user input selecting the external device identification information 503-1 regarding the external device to share the content.
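Building the external device list 503 from received advertisement signals can be sketched as follows. The signal format is an assumption for illustration; actual discovery would go through the platform's bluetooth stack rather than plain dictionaries.

```python
# Simplified sketch: collect identification information from external devices
# that advertised themselves as connectable, in the order signals arrived,
# dropping duplicate advertisements from the same device.
def build_device_list(advertisements):
    seen, device_list = set(), []
    for adv in advertisements:
        if adv.get("connectable") and adv["id"] not in seen:
            seen.add(adv["id"])
            device_list.append(adv["id"])
    return device_list
```

The resulting list would back the identification entries 503-1, 503-2, 503-3 shown to the user.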
According to an embodiment, in (c) of fig. 5, the user may select the external device identification information 503-1 corresponding to the video reproducing device 20-1 (e.g., TV) capable of reproducing video. In this case, as shown in fig. 6A, processor 190 may determine content executable in video reproduction apparatus 20-1 from among the pieces of content of electronic document 501 based on the type of content and the function information about video reproduction apparatus 20-1. As an example, in response to the function information of the video reproducing apparatus 20-1 indicating a video reproducing function, the processor 190 may determine at least one piece of content of a video type from among the pieces of content of the electronic document 501. Specifically, in response to the electronic document 501 being a web document searched for by the keyword, song name "Tears in Heaven", the processor 190 may determine a music video of the song "Tears in Heaven" or a link address of the music video as content executable in the video reproducing apparatus 20-1.
Also, the processor 190 may determine the content executable in the video reproducing apparatus 20-1 among pieces of content contained in another electronic document related to the electronic document 501. For example, the processor 190 may extract a keyword from the electronic document 501, and determine at least one piece of video-type content as content executable in the video reproduction apparatus 20-1 from among the pieces of content of other electronic documents by using the extracted keyword. Also, the processor 190 may determine the content executable in the video reproducing apparatus 20-1 from among the pieces of content of the other electronic document indicated by the link address contained in the electronic document 501 based on the types of the pieces of content and the function information on the video reproducing apparatus 20-1.
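The related-document step above can be sketched as: extract a keyword from the displayed document, then select content of the supported type from other documents matching that keyword. The keyword extraction here is a naive stand-in, and the document/content dictionary shapes are assumptions.

```python
# Hedged sketch of keyword-based selection of executable content from
# related electronic documents, as described above.
def extract_keyword(document):
    # Assumption: the searched keyword is carried in the document metadata.
    return document.get("keyword", "")

def related_executable_content(document, other_documents, supported_type="video"):
    """Return link addresses of content of `supported_type` in related
    documents whose titles match the extracted keyword."""
    keyword = extract_keyword(document).lower()
    matches = []
    for doc in other_documents:
        for content in doc["contents"]:
            if content["type"] == supported_type and keyword in content["title"].lower():
                matches.append(content["link"])
    return matches
```

For the "Tears in Heaven" example, a music video found in a related document would be selected, while audio-type content would be skipped for a video reproduction device.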
According to an embodiment, in response to determining at least one piece of content executable in video reproduction device 20-1 from among the content of at least one of electronic document 501 and other electronic documents, processor 190 controls communicator 140 to transmit information 601 containing the determined video or link address of the video to video reproduction device 20-1.
In response to processor 190 determining at least one piece of content executable in video reproduction device 20-1, processor 190 controls display 130 to display a content list including at least one piece of content representative information corresponding to the at least one determined piece of content. In this case, processor 190 controls communicator 140 to transmit information 601 containing a video corresponding to the content representative information selected by the user from the content list or a video link address of the video to video reproducing apparatus 20-1.
Further, processor 190 can generate a content list including content representative information corresponding to the at least one determined piece of content. For example, in response to the content being a video or a video link address of the video, the content representative information may include at least one of a thumbnail of the video, a portion of the video, a title of the video, a summary of the video, an I-frame of the video, and details of the video (e.g., subtitles, characters, scenes, etc. of the video). For example, the content representative information may be extracted from the electronic document 501 or other electronic documents related to the electronic document 501.
In response to the content list being generated, processor 190 may control communicator 140 to transmit the generated content list to video reproduction device 20-1.
The video reproduction apparatus 20-1 may receive at least one of the content and the content representative information and reproduce the content based on the received content. For example, in response to the received content being a video, video reproduction apparatus 20-1 may reproduce the video. In response to the received content being the video link address of the video, the video reproducing apparatus 20-1 may acquire the video indicated by the video link address by accessing the server and reproduce the acquired video.
Video reproduction device 20-1 may display the received content representative information via display 130. For example, in response to a video being reproduced, video reproduction apparatus 20-1 may display at least one of a title, a subtitle, a character, a thumbnail, a scene, or a producer of the video through display 130.
According to another embodiment, in (c) of fig. 5, the user may select the external device identification information 503-2 corresponding to the audio reproducing device 20-2 (e.g., a stereo system) capable of reproducing audio as the external device 20. In this case, as shown in fig. 6B, the processor 190 may determine a content executable in the audio reproducing apparatus 20-2 among the plurality of pieces of content of the electronic document 501 based on the types of the plurality of pieces of content and the function information on the audio reproducing apparatus 20-2. As an example, in response to the function information about the audio reproducing apparatus 20-2 indicating an audio reproducing function, the processor 190 may determine at least one piece of content of an audio type from among the pieces of content of the electronic document 501. Specifically, in response to the electronic document 501 being a web document searched for by a keyword, song name "Tears in Heaven", the processor 190 may determine the audio contained in the web document or the audio link address of the audio as content executable in the audio reproducing apparatus 20-2. For example, the audio or audio link address may be background music reproduced when the electronic document 501 is displayed.
The processor 190 may determine the content executable in the audio reproducing apparatus 20-2 among pieces of content contained in other electronic documents related to the electronic document 501. For example, the processor 190 may extract a keyword from the electronic document 501 and determine at least one piece of audio-type content as content executable in the audio reproducing apparatus 20-2 from among the pieces of content of other electronic documents by using the extracted keyword. Specifically, in response to electronic document 501 being a web document searched for by the keyword, song name "Tears in Heaven", processor 190 may extract the keyword "Eric Clapton", the singer of the song "Tears in Heaven", from the searched web document. Also, the processor 190 may receive another electronic document searched for by the keyword "Eric Clapton" through the communicator 140. Processor 190 may determine additional content that is executable in audio reproduction device 20-2 among the plurality of pieces of content of the received electronic document. In particular, processor 190 may determine that the songs "Wonderful Tonight" and "Layla" sung by "Eric Clapton" are content executable in audio reproduction device 20-2.
In response to determining at least one piece of content executable in the audio reproduction device 20-2 from among the content of at least one of the electronic document 501 and the other electronic documents, the processor 190 may control the communicator 140 to transmit information 602 containing the audio contained in the electronic document 501 or an audio link address of the audio to the audio reproduction device 20-2.
In response to processor 190 determining at least one piece of content executable in audio reproduction device 20-2, processor 190 controls display 130 to display a content list including at least one piece of content representative information corresponding to the at least one determined piece of content. In this case, processor 190 may control communicator 140 to transmit information 602, which contains audio corresponding to the content representative information selected by the user from the content list or an audio link address of the audio, to audio reproducing apparatus 20-2.
Further, processor 190 can generate a content list including content representative information corresponding to the at least one determined piece of content. For example, in response to the content being audio or an audio link address, the content representative information may be, for example, a title of the audio, lyrics, a thumbnail, a song author, or a singer. The content representative information may be extracted from the electronic document 501 or other electronic documents related to the electronic document 501.
In response to the content list being generated, processor 190 may control communicator 140 to transmit the generated content list to audio reproduction device 20-2.
The audio reproducing device 20-2 may receive at least one of the content and the content representative information and reproduce the content based on the received content. For example, in response to the received content being audio, the audio reproduction device 20-2 may directly reproduce the audio. In response to the received content being the audio link address, the audio reproducing apparatus 20-2 may acquire the audio indicated by the audio link address by accessing the server and reproduce the acquired audio.
When audio reproduction device 20-2 includes display 20-21, audio reproduction device 20-2 may display the received content representative information via display 20-21. For example, when reproducing the audio, the audio reproducing apparatus 20-2 may display at least one of a title, lyrics, thumbnail, song author, or singer of the audio through the display 20-21.
According to another embodiment, in (c) of fig. 5, the user may select the external device identification information 503-3 corresponding to the text display device 20-3 (e.g., a tablet PC or an e-book reader) capable of displaying text as the external device 20.
In this case, as shown in fig. 6C, the processor 190 may determine content executable in the text display device 20-3 among the plurality of pieces of content contained in the electronic document 501 based on the types of the plurality of pieces of content and the function information on the text display device 20-3. As an example, in response to the function information about the text display device 20-3 indicating a text display function, the processor 190 may determine at least one piece of text-type content from among the pieces of content of the electronic document 501. Specifically, in response to the electronic document 501 being a web document searched for by the keyword "Tears in Heaven", the processor 190 may determine text contained in the web document or a text link address of the text as content executable in the text display device 20-3.
Also, the processor 190 may determine the content executable in the text display device 20-3 among pieces of content contained in other electronic documents related to the electronic document 501. For example, the processor 190 may extract a keyword from the electronic document 501 and determine at least one text-type content as a content executable in the text display device 20-3 from among a plurality of pieces of content of the searched electronic document by using the extracted keyword.
As shown in fig. 6C, in response to determining at least one piece of content executable in the text display device 20-3 from among the content of at least one of the electronic document 501 and the other electronic documents, the processor 190 may control the communicator 140 to transmit information 603 containing text or a text link address of the electronic document 501 to the text display device 20-3.
In response to processor 190 determining at least one piece of content executable in text display device 20-3, processor 190 may control display 130 to display a list of content including at least one piece of content representative information corresponding to the at least one determined piece of content. In this case, the processor 190 may control the communicator 140 to transmit information 603, which contains text or a text link address corresponding to the content representative information selected by the user from the content list, to the text display device 20-3.
In response to processor 190 sending information about the content to text display device 20-3 through communicator 140, processor 190 may render the text of electronic document 501 before transmission so that user readability of the text is enhanced. For example, the processor 190 may render at least one of a size, letter spacing, color, and font of the text to be suitable for a screen of the text display device 20-3, and transmit the rendered text to the text display device 20-3.
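The rendering step above (adjusting size, letter spacing, color, and font to suit the target screen) can be sketched as follows. The proportional-scaling rule, the base width, and all attribute names are illustrative assumptions, not the patent's method.

```python
# Hypothetical sketch: scale basic text attributes to a target screen
# width before transmitting the text to the text display device.

def render_text_for_device(text, screen_width_px, base_width_px=1080):
    """Scale text attributes proportionally to the target screen width."""
    scale = screen_width_px / base_width_px
    return {
        "text": text,
        "font_size": round(16 * scale),          # assumed 16 px base font
        "letter_spacing": round(0.5 * scale, 2), # assumed 0.5 px base spacing
        "color": "#222222",                      # fixed high-contrast color
        "font": "serif",
    }

# A tablet twice the base width gets proportionally larger text.
rendered = render_text_for_device("Tears in Heaven ...", screen_width_px=2160)
```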
Further, processor 190 can generate a content list including content representative information corresponding to the at least one determined piece of content. For example, in response to the content being text or a text link address of the text, the content representative information may be a title, an author, a source, a creation date, or a thumbnail of the text. The content representative information may be extracted from the electronic document 501 or other electronic documents related to the electronic document 501.
In response to the content list being generated, processor 190 may control communicator 140 to transmit the generated content list to text display device 20-3.
The text display device 20-3 may receive at least one of text and content representative information and display the text based on the received text. For example, in response to the received content being text, the text display device 20-3 may display the text on the screen. In response to the received content being the text link address, the text display device 20-3 may acquire the text indicated by the text link address by accessing the server and display the acquired text on the screen.
The text display device 20-3 may display the received content representative information through a display. For example, in response to displaying text, text display device 20-3 may display at least one of a title, an author, and a creation date of the text via the display.
Fig. 7 is a diagram illustrating an electronic device sharing content according to another embodiment of the present disclosure.
Referring to fig. 7 (a), the processor 190 controls the communicator 140 to receive an electronic document 701 (e.g., a web document) including a plurality of pieces of content from a server. Also, processor 190 controls display 130 to display electronic document 701 by parsing received electronic document 701.
The processor 190 may determine at least one piece of content executable in the external device 20 among the plurality of pieces of content based on the types of the plurality of pieces of content of the electronic document 701 and the function information on the external device 20.
In response to determining at least one piece of content executable in the external device 20 among the pieces of content, the processor 190 controls the display 130 to display a content list 702 containing at least one piece of content representative information corresponding to the at least one piece of determined content. Next, the sensor 180 senses a user input selecting content representative information 702-1 from the content list 702.
In response to content representative information 702-1 being selected, processor 190 controls communicator 140 to transmit information 703 regarding content corresponding to the selected content representative information 702-1 to external device 20.
In response to the information 703 about the content being transmitted to the external device 20, the external device 20 reproduces the content based on the received content, as shown in (b) of fig. 7.
The processor 190 of the electronic device 10 controls the display 130 to display the UI 705 for controlling the reproduction of the content in the external device 20.
As an example, in response to the content reproduced in the external device 20 being video, the processor 190 may control the display 130 to display a UI 705 containing video control function items such as pause, start of reproduction, stop of reproduction, speed control, subtitle size control, subtitle position control, and the like of the video.
As another example, in response to the content reproduced in the external device 20 being audio, the processor 190 may control the display 130 to display a UI containing audio control function items such as pause of audio, start of reproduction, stop of reproduction, speed control, repetition interval, equalization control, and the like.
As yet another example, in response to the content reproduced in the external device 20 being text, the processor 190 may control the display 130 to display a UI containing text control function items, such as start display, stop display, automatic scrolling, page turning, size control, line space control, font control, and the like, of the text.
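The three examples above pair each content type with a different set of control function items. A minimal sketch of that mapping is shown below; the dictionary structure is an assumption, while the item names come from the description.

```python
# Hypothetical sketch: choose control-UI items for the remote-control UI
# according to the type of content playing on the external device.

CONTROL_ITEMS = {
    "video": ["pause", "start", "stop", "speed",
              "subtitle_size", "subtitle_position"],
    "audio": ["pause", "start", "stop", "speed",
              "repeat_interval", "equalizer"],
    "text":  ["start_display", "stop_display", "auto_scroll",
              "page_turn", "size", "line_spacing", "font"],
}

def control_ui_items(content_type):
    """Return the control items for a content type; empty if unsupported."""
    return CONTROL_ITEMS.get(content_type, [])
```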
In response to information containing the plurality of pieces of content being transmitted to the external device 20, the processor 190 of the electronic device 10 may control the display 130 to display a UI for switching among the corresponding plurality of pieces of content, navigating through them, or searching them.
Fig. 8 is a diagram illustrating an electronic device sharing content according to still another embodiment of the present disclosure.
Referring to fig. 8 (a), the processor 190 controls the communicator 140 to receive an electronic document 801 (e.g., web document) including a plurality of pieces of content from a server. Also, the processor 190 controls the display 130 to display the electronic document 801 by parsing the received electronic document 801.
The processor 190 determines a plurality of pieces of content executable in the external device 20 among the plurality of pieces of content of the electronic document 801 based on the types of the plurality of pieces of content and the function information about the external device 20.
In response to the plurality of pieces of content executable in external device 20 being determined, processor 190 displays content list 802 including a plurality of pieces of content representative information 802-1, 802-2, 802-3, and 802-4 corresponding to the plurality of pieces of determined content. Next, the sensor 180 senses a user input selecting a plurality of pieces of content representative information 802-1, 802-2, and 802-3 from the content list 802. For example, in response to the content list 802 containing a plurality of check boxes corresponding to each of the content representative information 802-1, 802-2, 802-3, and 802-4, the sensor 180 may sense a user input selecting a plurality of pieces of content representative information 802-1, 802-2, and 802-3 through the plurality of check boxes.
In response to the plurality of pieces of content representative information 802-1, 802-2, and 802-3 being selected, processor 190 controls communicator 140 to transmit information 803, which includes a plurality of pieces of content corresponding to the plurality of pieces of content representative information 802-1, 802-2, and 802-3, to external device 20.
In response to the information 803 containing the plurality of pieces of content being transmitted to the external device 20, as shown in fig. 8 (b), the external device 20 may sequentially or randomly reproduce the plurality of pieces of content based on the plurality of pieces of received content.
Fig. 9 is a diagram illustrating an electronic device sharing content according to still another embodiment of the present disclosure.
Referring to fig. 9 (a), processor 190 controls communicator 140 to receive an electronic document 901 (e.g., a web document) containing a plurality of pieces of content from a server. Also, processor 190 controls display 130 to display electronic document 901 by parsing received electronic document 901.
The processor 190 determines a plurality of pieces of content executable in the external device 20 among the plurality of pieces of content of the electronic document 901, based on the types of the plurality of pieces of content and the function information about the external device 20.
In this case, a plurality of pieces of function information about the external device 20 may be provided. For example, in response to the external device being implemented as a TV, the function information about the external device 20 may indicate a video reproduction function, an audio reproduction function, and a text display function.
In response to the plurality of pieces of content executable in the external device 20 being determined, the processor 190 controls the display 130 to display a content list 902 including a plurality of pieces of content representative information 902-1, 902-2, 902-3, 902-4, 902-5, and 902-6 corresponding to the plurality of determined pieces of content. In this case, at least two or more pieces of the plurality of pieces of content may have different types. As an example, some of the plurality of pieces of content may be a video type, some may be an audio type, and some may be a text type.
Next, the sensor 180 senses a user input selecting at least one piece of content representative information in the content list 902.
In response to at least one piece of content representative information being selected, processor 190 controls communicator 140 to transmit information about content corresponding to the selected content representative information to external device 20.
In response to the information on the content being transmitted to the external device 20, as shown in (b) of fig. 9, the external device 20 may reproduce the content based on the received content.
In this case, the external device 20 may change the content reproduction screen depending on the type of the received content.
In particular, the external device 20 may provide a variety of screen modes according to the type of content. For example, in response to the type of content being video, the external device 20 operates in a video mode. For example, in response to the external device 20 operating in the video mode, the screen color temperature may be at least one of 6500K and 5500K.
In response to the content type being text, the external device 20 operates in a text mode. For example, in response to the external device 20 operating in the text mode, the screen color temperature may be at least one of 4000K and 5000K.
In response to the content type being audio, the external device 20 operates in an audio mode. Meanwhile, the external device 20 may provide a plurality of sub-modes in one screen mode. For example, in response to the external device operating in a video mode, the video mode may include a standard screen mode, a motion screen mode, a movie screen mode, and the like.
According to an embodiment, in response to the type of the received content being a video, the external device 20 may determine the current screen mode. In response to the current screen mode being the video mode, the external device 20 may maintain the current screen mode. Conversely, in response to the current screen mode not being the video mode, the external device 20 may convert the current screen mode into the video mode. Accordingly, as shown by reference numeral 911 of fig. 9, the external device 20 may adjust at least one of color, definition, and color temperature of a screen to suit reproduction of the video and reproduce the video.
In response to the received content type being text, external device 20 may determine the current screen mode. In response to the current screen mode being the text mode, the external device 20 may maintain the current screen mode. Conversely, in response to the current screen mode not being the text mode, the external device 20 may convert the current screen mode into the text mode. Accordingly, as shown by reference numeral 912 of fig. 9, the external device 20 may adjust at least one of color, definition, and color temperature of the screen to suit the display of the text and display the text. In this case, the external device 20 may render and display text to fit the screen. As an example, the external device 20 may render at least one of a size, a line interval, a color, and a font of text to fit the screen and display the rendered text on the screen. As another example, in response to the electronic device 10 rendering the text to fit the screen and transmitting the rendered text to the external device 20, the external device 20 may display the rendered text on the screen.
According to yet another embodiment, in response to the received content type being audio, external device 20 may determine the current screen mode. In response to the current screen mode being the audio mode, the external device 20 may maintain the current screen mode. Conversely, in response to the current screen mode not being the audio mode, the external device 20 may convert the current screen mode into the audio mode. Accordingly, as shown by reference numeral 913 of fig. 9, the external device 20 may adjust at least one of the color, definition, and color temperature of the screen to be suitable for display of a thumbnail of the audio or text related to the audio (e.g., lyrics, author, or singer of the audio). Also, the external device 20 may render a thumbnail of audio or text related to audio to fit the screen and display the rendered thumbnail or text on the screen. As another example, in response to the electronic device 10 rendering the thumbnail or text to suit the external device 20 and transmitting it, the external device 20 may display the rendered thumbnail or text on the screen.
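The screen-mode switching described across these embodiments — keep the current mode when it already matches the received content type, otherwise convert — can be sketched as below. The color temperature values (6500K for video, 4000K for text) come from the description; the selection logic and names are assumptions.

```python
# Hypothetical sketch: select the external device's screen mode from the
# received content type, keeping the current mode when it already matches.

MODE_FOR_TYPE = {"video": "video_mode", "text": "text_mode", "audio": "audio_mode"}

# Example color temperatures from the description (one of the stated options).
COLOR_TEMPERATURE_K = {"video_mode": 6500, "text_mode": 4000}

def next_screen_mode(current_mode, content_type):
    """Return the mode to use; unchanged if already matching or type unknown."""
    target = MODE_FOR_TYPE.get(content_type, current_mode)
    return current_mode if current_mode == target else target

# A device in text mode receiving video converts to video mode.
mode = next_screen_mode("text_mode", "video")
```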
Fig. 10 is a diagram illustrating an electronic device sharing content according to another embodiment of the present disclosure.
Referring to fig. 10 (a), the processor 190 controls the communicator 140 to receive an electronic document 1001 (e.g., a web document) from a server (e.g., a Social Network Service (SNS) server). Next, processor 190 controls display 130 to display electronic document 1001 by parsing the received electronic document 1001. The electronic document 1001 may include content 1002 (e.g., a video registered by a third party or a real-time broadcast video) and specific information 1003 of the content 1002 (e.g., comments about the content, a description of the content, content ratings, a virtual keyboard to enter comments on the content, a chat window about the content, etc.).
In this case, the sensor 180 senses a user input selecting the content 1002.
In response to the user input, the processor 190 determines the external device 20 capable of reproducing the selected content 1002 from among the plurality of external devices connected to the electronic device 10, based on the type of the selected content 1002 and the function information on the external device 20.
In response to determining the external device 20, the processor 190 controls the communicator 140 to transmit information 1005 containing the link address of the selected content 1002 to the external device 20.
In response to the information 1005 being transmitted, as shown in (b) of fig. 10, the external device 20 reproduces the received content 1002 based on the link address thereof. When the content is a real-time broadcast video, the external device 20 may receive the video in real time from the server indicated by the video link address and display the video on the screen.
Also, the external device 20 may receive content from the electronic device 10 in real time and display the video on the screen. As an example, the external device 20 may receive content in real time from the electronic device 10 in a mirroring method or a streaming method and may display the video on a screen.
As shown in the diagram of the electronic device 10 of fig. 10 (b), the processor 190 controls the display 130 to continuously display specific information 1003 about the content 1002 on the screen. As an example, processor 190 may control display 130 to continuously display specific information 1003 other than a video while the video is being reproduced in external apparatus 20. In this case, since the video is not displayed on the screen, more specific information can be displayed on the screen. For example, more comments may be displayed.
As described above, a video is displayed on the large screen of the external device 20, and specific information about the video or a screen receiving user input is separately displayed on the screen of the electronic device 10 held by the user. Therefore, the convenience of the user using both the electronic apparatus 10 and the external apparatus 20 can be improved.
Fig. 11 is a diagram illustrating an electronic device providing contents according to another embodiment of the present disclosure.
Referring to fig. 11 (a), the processor 190 controls the communicator 140 to receive an electronic document 1101 (e.g., a web document) containing a plurality of pieces of content from a server. Also, the processor 190 controls the display 130 to display the electronic document 1101 by parsing the received electronic document 1101.
In this case, the sensor 180 senses a user input for extracting pieces of content from the web document. For example, the user input may be a content display mode selection signal received by the sensor 180 from the remote controller 30 in response to a user request to select the content display mode button 31 on the remote controller 30.
In response to the user input, the processor 190 extracts a plurality of pieces of content contained in the electronic document 1101. Also, the processor 190 determines at least one piece of content executable in the electronic device 10 from among the plurality of pieces of extracted content based on the type of the plurality of pieces of extracted content and the function information on the electronic device 10. As an example, processor 190 can determine at least one piece of content of the video type in response to the electronic device being implemented as a video rendering device.
Next, as shown in fig. 11 (b), processor 190 controls display 130 to display a content list 1102 including a plurality of pieces of content representative information corresponding to at least one piece of the determined content. The content list 1102 may be displayed in the form of a list on one side of the screen of the electronic device 10. Also, the content list 1102 may be displayed on the screen of the electronic device 10 in the form of a pop-up window. Also, the content list may be displayed in the form of a line along at least one axis of the screen of the electronic device 10.
In response to the content list 1102 being displayed, the sensor 180 senses a user input moving to one piece of content representative information 1102-1 among a plurality of pieces of content representative information contained in the content list 1102. The user input may be a direction button selection signal received by the sensor 180 from the remote controller 30 in response to a user input selecting the button 32 from among the four-direction buttons of the remote controller.
In response to the user input, processor 190 moves a cursor to, or highlights, the location of content representative information 1102-1 in content list 1102 in accordance with the direction button selection signal. Next, sensor 180 senses a user input selecting the content representative information 1102-1. For example, the user input may be an execution button selection signal received by the sensor 180 from the remote controller 30 in response to the user selecting the execution button 33.
In response to the user input, processor 190 displays an execution screen of the selected content as shown in fig. 11 (c). For example, in response to the selected content being a video, processor 190 may control display 130 to display a video reproduction screen.
Fig. 12 is a diagram illustrating an electronic device 10 providing contents according to another embodiment of the present disclosure.
Referring to fig. 12, the processor 190 controls the communicator 140 to receive an electronic document 1201 (e.g., a web document) including a plurality of pieces of content from a server. Reference numeral 1210 of fig. 12 illustrates source codes of the received electronic document 1201.
In response to a user input to transmit the content to the external device 20, the processor 190 determines a plurality of pieces of content executable in the external device 20 from among the plurality of pieces of content of the electronic document 1201 based on the types of the plurality of pieces of content and the function information about the external device 20. As an example, in response to the function information on the external device 20 indicating the video reproduction function and the image display function, the processor 190 may determine a plurality of pieces of content of the video type and the image type. As another example, as shown by reference numeral 1210 of fig. 12, the processor 190 determines a plurality of image link addresses 1211, 1212, 1213 as contents executable in the external device 20.
According to an embodiment, in response to determining the plurality of pieces of content, processor 190 generates a template file containing the plurality of pieces of content. As an example, the "slideshow" class grammar 1221 of the template file contains a plurality of image link addresses 1211, 1212, 1213. The template file may be implemented in a language such as hypertext markup language (HTML), extensible markup language (XML), extensible hypertext markup language (XHTML), and the like. In response to the template file being parsed, content page 1230 is generated. The content page 1230 may include an image indicated by the image link address.
Next, processor 190 controls communicator 140 to transmit template file 1220 containing the content to external device 20.
The external device 20 receives the template file 1220, parses the received template file 1220, and displays the content page 1230 on the screen. In this case, the external device 20 may display the content page 1230 in the entire area or a partial area of the screen (e.g., a certain side of the screen). Then, in response to a user input selecting a piece of content representative information from the content page 1230, the external device 20 may display an execution screen of the content corresponding to the selected content representative information through the display.
Fig. 13 is a diagram illustrating an electronic device providing contents according to another embodiment of the present disclosure.
Referring to fig. 13, the processor 190 controls the communicator 140 to receive an electronic document 1301 (e.g., a web document) including a plurality of pieces of content from a server. Also, the processor 190 controls the display 130 to display the electronic document 1301 by parsing the received electronic document 1301. Reference numeral 1310 of fig. 13 illustrates the source code of the received electronic document 1301.
In response to a user input to transmit the content to the external device 20, the processor 190 determines a plurality of pieces of content executable in the external device 20 from among the plurality of pieces of content of the electronic document 1301 based on the types of the plurality of pieces of content and the function information about the external device 20. As an example, as shown by reference numeral 1310 of fig. 13, processor 190 determines a plurality of image link addresses 1311, 1312, 1313 as contents executable in external device 20.
According to an embodiment, in response to determining the plurality of pieces of content, processor 190 controls communicator 140 to transmit data 1320 containing the plurality of pieces of content to external device 20.
The external device 20 receives data 1320 containing a plurality of pieces of content and generates a template file 1330 containing a plurality of pieces of content by using the received data 1320. For example, the "slide show" class syntax 1331 of the template file 1330 contains a plurality of image link addresses 1311, 1312, 1313. Next, the external device 20 generates a content page 1340 containing an image indicated by the image link address by parsing the template file. Then, the external device 20 displays the generated content page 1340 on the screen. In this case, the external device 20 may display the content page on the entire screen or a partial area of the screen (e.g., a certain side of the screen). In response to a user input selecting a piece of content representative information from the content page 1340, the external device 20 may display an execution screen of content corresponding to the selected content representative information through the display.
Fig. 14 is a diagram of a server sharing content with an external device according to another embodiment of the present disclosure.
Referring to fig. 14, the server 50 transmits information about content contained in an electronic document (e.g., a web document) stored in the server 50, based on function information and search information received from the external device 20.
The server 50 may be a device that stores or provides a network document. The server 50 may be implemented as one or more servers. By way of example, the server 50 may collect information by operating in conjunction with a plurality of servers as a cloud server and provide the collected information in the form of a web document.
The external device 20 may be a device capable of accessing the server 50. In fig. 1, the external device 20 may include the video reproduction device, the audio reproduction device, the text display device, or other types of devices described above. For example, the external device 20 may be an internet of things (IoT) device, including at least one of various sensors (e.g., a motion sensor, a window opening/closing sensor, a smoke sensor, a power output sensor, etc.), a gas meter, a sprinkler system, a fire alarm, a temperature control system (thermostat), a street lamp, sporting goods, a hot water tank, a heater, a household appliance (e.g., a TV, a refrigerator, an oven, a washing machine, a dryer, etc.), a smart lamp, an electricity meter, a solar power system, a vehicle, a wearable device, a closed-circuit television (CCTV) camera, a writing implement, a keyboard, a mouse, a charger, furniture (e.g., a bed, a mirror, etc.), a door lock, a security system, and the like.
Also, the external device 20 may include at least one of various medical devices (e.g., various portable medical measuring devices (blood glucose meters, heart rate monitors, blood pressure meters, thermometers, or the like), magnetic resonance angiography (MRA) devices, magnetic resonance imaging (MRI) devices, computed tomography (CT) devices, ultrasound devices, and the like), navigation devices, global navigation satellite systems (GNSS), event data recorders (EDRs), flight data recorders (FDRs), in-vehicle infotainment devices, marine electronic devices (e.g., marine navigation systems, gyrocompasses, or the like), avionics, security devices, in-vehicle head units, industrial robots, home robots, drones, automated teller machines (ATMs) in banking facilities, point-of-sale (POS) terminals in stores, and the like.
Next, referring to fig. 14, the external device 20 will be described in detail using, as examples, a TV 21, a printer 22, a medical device 23 (e.g., a blood glucose test device), a home appliance 24 (e.g., an oven), and a wireless speaker 25.
In response to receiving a request for executing content contained in a web document, the external device 20 transmits function information about the external device 20 and search information about the web document to the server 50 in step 1401.
The request for executing the content may be an execution request generated according to a user input received via a UI provided by the external device 20 or the electronic device 10. Also, the request for execution of the contents may be an execution request according to a power-on event, an event reaching a predetermined time, or a trigger event occurring at a certain cycle.
The request for executing the content may be an execution request generated when a Quick Response (QR) code or a barcode is scanned or an NFC tag or a Radio Frequency Identification (RFID) tag is read in the external device 20.
As described above, the function information about the external device 20 may be information indicating a function executable in the external device 20 or a function mainly executed by the external device 20. The external device 20 may transmit identification information about the external device 20 instead of the function information. The identification information about the external device 20 may include at least one of a model name, a serial number, a device type, manufacturer information, and vendor information of the external device 20.
The search information about the web document may be a keyword, a picture, a voice sample, barcode information, or tag information required to search for the electronic document stored in the server 50.
In step 1403, the server 50, having received the function information and the search information, searches for a web document based on the received function information and search information.
In response to finding multiple network documents, the server 50 may prioritize the network documents and select the one with the highest priority. In this case, the priority may be determined by the creation date, the view count, or the number of recommendations of the web document.
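The prioritization step can be sketched as a ranking over the three criteria named above (creation date, views, recommendations). The patent only names the criteria; the lexicographic weighting and field names below are assumptions for illustration.

```python
# Illustrative sketch: rank candidate web documents and pick the one with
# the highest priority. The lexicographic ordering (newest first, then
# most viewed, then most recommended) is an assumed weighting.
from datetime import date

def pick_document(documents):
    """Return the document with the highest priority."""
    return max(
        documents,
        key=lambda d: (d["created"], d["views"], d["recommendations"]),
    )

docs = [
    {"id": "a", "created": date(2016, 1, 5), "views": 120, "recommendations": 4},
    {"id": "b", "created": date(2016, 3, 2), "views": 80, "recommendations": 9},
    {"id": "c", "created": date(2016, 3, 2), "views": 95, "recommendations": 2},
]
# "b" and "c" tie on creation date, so view count breaks the tie.
best = pick_document(docs)
```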
In step 1405, the server 50 determines at least one piece of content executable in the external device 20 from among pieces of content contained in the network document.
Next, the server 50 transmits information about the determined content to the external device 20 in step 1407.
The information on the content may be, for example, a template file containing link information on the content, a content list containing content representative information, the content itself, or the processed content.
The external device 20 executes the information on the content. For example, the external device 20 may reproduce or display the content or may be controlled by the content.
According to an embodiment, the external device 20 may be implemented as a TV 21.
In this case, the TV 21 transmits the content display function to the server 50 as the function information of the TV 21.
Also, as shown in (a) of fig. 15, the TV 21 transmits search information about a web document to the server 50. The search information may be voice information acquired by the TV 21 when the user speaks to the TV 21 or to a remote controller 22-1 communicating with the TV. In this case, the voice information may be the voice itself or text into which the voice has been recognized and converted.
The search information may be meta information about a video reproduced in the TV 21. The meta information of the video may be, for example, at least one of a title, a subtitle, characters, a synopsis, a creation date, or a producer of the video.
Also, the search information may be text information input by the user using a remote controller.
The server 50, having received the function information and the search information, searches for a web document based on the received function information and search information.
Also, the server 50 determines at least one piece of content executable in the TV 21 from among the contents contained in the searched web document.
Next, the server 50 transmits information about the determined content to the TV 21. In this case, the information on the content may be information processed by the server 50. As an example, in response to the content being text, the information about the content may be a template file configured to enlarge the size of the text to fit into a larger screen of the TV 21. In particular, the font size value of the template file may be increased in response to the template file being generated as a markup file.
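The text-enlargement processing described above can be sketched as a rewrite of the font-size values in a markup template. The scale factor and the inline-style markup are assumptions for illustration; the patent only states that the font size value of the template file may be increased.

```python
# Illustrative sketch: enlarge the font-size values of a markup template
# so text fits a larger TV screen. The 2x scale factor and inline-style
# markup shape are assumptions.
import re

def enlarge_fonts(markup, scale=2.0):
    """Return markup with every 'font-size: Npx' value multiplied by scale."""
    def bump(match):
        return f"font-size: {int(int(match.group(1)) * scale)}px"
    return re.sub(r"font-size:\s*(\d+)px", bump, markup)

template = '<p style="font-size: 14px">Breaking news</p>'
tv_template = enlarge_fonts(template)  # 14px becomes 28px
```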
The TV 21 executes the received information on the content. For example, the TV 21 may display the enlarged text on the screen.
According to another embodiment, the external device 20 may be implemented as a printer 22.
In this case, the printer 22 transmits the print function to the server 50 as the function information about the printer 22.
Also, the printer 22 transmits search information on the web document to the server 50. The search information may be, for example, a title of a printout printed in the printer 22. Also, as shown in (b) of fig. 15, the search information may be text information input by the user through an input panel of the printer 22.
The server 50, having received the function information and the search information, searches for a web document based on the received function information and search information. The found web document may be, for example, a web document dedicated to a print job.
Also, the server 50 determines at least one piece of content executable in the printer 22 from among the pieces of content contained in the searched web document. As an example, the determined piece of content may be the entire content of the web document. Next, the server 50 transmits information about the determined content to the printer 22. In this case, the information about the content may be the web document rendered into a print data format that can be output by the printer.
The printer 22 executes the received information on the content. For example, the printer 22 outputs the web document in a print data format.
According to another embodiment, the external device 20 may be implemented as a medical device 23 (e.g., a portable blood glucose testing device or a blood pressure testing device).
In this case, the medical device 23 transmits the diagnosis function and the treatment function to the server 50 as the function information about the medical device 23.
Also, the medical device 23 transmits search information about the web document to the server 50. As shown in (c) of fig. 15, the search information may be, for example, information recognized when the user taps an NFC tag or an RFID tag, provided on a package or a coupon supplied with the medical device 23, against a reader of the medical device 23.
The server 50 that receives the function information and the search information may search for a web document based on the received function information and search information.
Also, the server 50 determines at least one piece of content executable in the medical device 23 from among the pieces of content contained in the searched web document.
In response to the server 50 transmitting information about the determined content to the medical device 23, the medical device 23 executes the information about the content.
By way of example, the medical device 23 may display the diagnosis method, the treatment method, or usage instructions for the device via a display of the medical device 23. Also, the medical device 23 may operate to perform self-diagnosis, measure biometric information about the user, or administer therapy to the user based on the information about the content, according to predetermined options.
According to another embodiment, the external device 20 may be implemented as a household appliance 24 (e.g., an oven or a refrigerator).
In this case, the home appliance 24 transmits the cooking function or the cooling/warming function as the function information on the home appliance 24 to the server 50.
In this case, the home appliance 24 transmits search information on the network document to the server 50. As shown in (a) - (c) of fig. 15, the search information may be, for example, information obtained by recognizing a user voice, text obtained from the user through an input panel, or information identified by scanning a barcode provided when the home appliance 24 is purchased or reading tag information.
The server 50, having received the function information and the search information, searches for a web document based on the received function information and search information.
Also, the server 50 determines at least one piece of content executable in the home appliance 24 from among the pieces of content contained in the searched web document.
In response to the server 50 transmitting the information, the home appliance 24 executes the information about the content. As an example, the home appliance 24 displays the cooking method or usage instructions for the device through a display of the home appliance 24. Also, the home appliance 24 performs a self-test or cooks food based on the information about the content, according to a predetermined option.
Meanwhile, the embodiments in which the external device 20 acquires and transmits search information, and the embodiments in which the external device 20 executes the information about the content, may be applied to the other external devices 20 in the same manner.
Fig. 16 is a diagram illustrating a server sharing content with an external device according to an embodiment of the present disclosure.
In fig. 16, in response to receiving a request for executing content based on a web document, the external device 20 transmits function information about the external device 20 to the server 50 in step 1601.
In response to the electronic device 10 being operated by the user, a network document list 1611 of network documents with an execution history in the electronic device 10 is displayed. When at least one network document 1613 is selected from the network document list 1611, the electronic device 10 transmits information about the selected network document 1613 (for example, link information of the network document 1613) to the server 50 in step 1602.
In step 1603, the server 50, having received the function information and the information about the network document, searches for the network document 1613 again based on the information about the network document. In this case, the network document 1613 may be an updated network document.
The server 50 determines at least one piece of content executable in the external device 20 among the pieces of content contained in the searched web document in step 1604.
Next, the server 50 transmits the searched information on the content to the external device 20 in step 1605.
The external device 20 may execute the received information on the content. For example, the external device 20 may reproduce or display the content or may be controlled by the content.
Fig. 17 is a schematic diagram illustrating a web document according to an embodiment of the present disclosure.
According to an embodiment, the network document 1701 is configured to display a plurality of pieces of device identification information 1702, 1703 indicating devices with which the network document can be used.
As an example, when the network document 1701 contains medical device identification information 1702 as the device identification information, the network document 1701 may be a network document containing descriptive information about the medical device or a network document containing a control command for controlling the medical device. As another example, when the network document 1701 contains TV identification information 1703 as device identification information, the network document may be a network document containing description information on a TV or a network document having a large-sized font suitable for display of a TV. When the network document 1701 contains a printer identification tag as device identification information, the network document 1701 may be a network document containing description information about a printer or a network document in a form suitable for output of a printer.
In response to receiving a request for information about content from the external device 20, the server 50 may first search for a network document containing device identification information about a device of the same type as the external device 20.
Also, the server 50 may determine at least one piece of content executable in the external device 20 among the contents contained in the searched web document and transmit information about the determined content to the external device 20. The server 50 may transmit the searched network document to the external device 20 in response to the searched network document being configured to be suitable for execution by the external device 20.
The external device 20 may execute the received information on the content. For example, the external device 20 may reproduce or display the content or may be controlled by the content.
To display the device identification information 1702, 1703 in the network document 1701, a <device> tag may be added to the markup document structure of the network document 1701. In this case, the type and display position of the device identification information may be determined according to the setting and position of the <device> tag.
According to an embodiment, multiple values may be set in the <device> tag. For example, in response to the <device> tag being set to <device="TV, audio, Medical">, a plurality of pieces of device identification information may be displayed in the network document 1701.
The pieces of device identification information may be prioritized. As an example, the pieces of device identification information may be arranged from left to right or from top to bottom in the order of devices having greater utility for the network document 1701.
Also, the <device> tag may contain an attribute value for determining whether to display the device identification information in the web document 1701. For example, when the attribute value is set to <device="TV; invisible">, the device identification information of the TV is not displayed in the network document 1701, but the server 50 still refers to the TV device identification information to search for a network document suitable for execution in the external device 20.
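The tag handling described above can be sketched as a small parser that extracts the device names and the visibility flag from a <device> tag. The exact attribute grammar is an assumption based on the examples in the text; the regular expression and function name are illustrative only.

```python
# Illustrative sketch: parse a <device> tag's value into device names and
# a visibility flag, e.g. <device="TV; invisible">. The attribute grammar
# is an assumption based on the examples in the text.
import re

def parse_device_tag(markup):
    """Return (device_names, visible) parsed from the first <device> tag."""
    match = re.search(r'<device="([^"]*)"\s*>', markup)
    if not match:
        return [], True
    value = match.group(1)
    visible = True
    if ";" in value:
        value, flag = value.split(";", 1)
        visible = flag.strip().lower() != "invisible"
    devices = [name.strip() for name in value.split(",") if name.strip()]
    return devices, visible

devices, visible = parse_device_tag('<device="TV, audio, Medical">')
hidden_devices, hidden_visible = parse_device_tag('<device="TV; invisible">')
```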
Fig. 18 is a flowchart illustrating a method of sharing content by an electronic device according to an embodiment of the present disclosure.
Referring to fig. 18, the electronic device 10 receives a web document including a plurality of pieces of content at step S1801.
In step S1802, the electronic apparatus 10 determines at least one piece of content executable in the external apparatus 20 among the plurality of pieces of content based on the types of the plurality of pieces of content of the received web document and the function information on the external apparatus 20.
In step S1803, the electronic device 10 transmits information about the at least one piece of content to the external device 20.
The external device 20 receiving the information on the at least one piece of determined content may reproduce or display the content based on the information on the content. In response to the content being the link address, the external device 20 may reproduce or display the video or image indicated by the link address.
Fig. 19 is a flowchart illustrating a method for displaying notification information according to another embodiment of the present disclosure.
Referring to fig. 19, the electronic device 10 receives a web document including a plurality of items of content at step S1901.
In step S1902, the electronic apparatus 10 determines at least one piece of content executable in the external apparatus 20 from among the pieces of content of the received web document based on the types of the pieces of content and the function information on the external apparatus 20.
In step S1903, the electronic device 10 displays a content list containing at least one piece of content representative information corresponding to the at least one determined piece of content.
In step S1904, the electronic device 10 determines whether at least one piece of content representative information is selected from the content list.
In step S1905, when at least one piece of content representative information is selected, the electronic device 10 transmits information on the content corresponding to the selected content representative information to the external device 20.
The external device 20 that receives the information on the content may reproduce or display the content based on the information on the content. In response to the content being the link address, the external device 20 may reproduce or display the video or image indicated by the link address.
For example, an apparatus (e.g., a module or electronic device 10) or a method (e.g., operations) according to the various embodiments described above may be operated on or performed by at least one computer (e.g., processor 190) executing instructions contained in at least one program among programs stored in, for example, a computer-readable storage medium.
In response to execution of the instructions by a computer (e.g., processor 190), the at least one computer may perform functions corresponding to the instructions. In this case, for example, the computer-readable storage medium may be the memory 150.
The program may be stored in a computer readable storage medium such as a hard disk, a floppy disk, a magnetic medium (e.g., tape), an optical medium (e.g., compact disc-ROM (CD-ROM), DVD), a magneto-optical medium (e.g., magneto-optical disk), a hardware device (e.g., ROM, RAM, or flash memory), and so forth. In such a case, the storage medium may be included as part of the components of the electronic device 10. The storage medium may be mounted through a port of the electronic device 10 or may be contained in an external device (e.g., a cloud, server, or other electronic device) located external to the electronic device 10. The program may be stored in a plurality of storage media, in which case at least some of the storage media may be located in a device external to the electronic device 10.
The instructions may include high-level language code that is executable by a computer using an interpreter, as well as machine code generated by a compiler. The hardware device may be configured to execute one or more software modules to perform the operations of the various embodiments described above, and vice versa.
According to the above-described embodiments of the present disclosure, the usability of shared content for the user can be improved.
As an example, the content transmitted to the external device may be automatically determined, and thus, the steps for sharing the content may be reduced.
As another example, a content list of contents executable in the external device may be automatically provided, and the user can quickly select and reproduce specific contents from the content list. Therefore, user satisfaction can be improved.
While the disclosure has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims.

Claims (13)

1. A method of sharing content between an electronic device and an external device, the method comprising:
receiving a plurality of web documents containing a plurality of contents;
Identifying a type of content executable in the external device based on the function information of the external device;
identifying at least one content among a plurality of contents included in the plurality of web documents based on the identified type of the content corresponding to the type of the at least one content; and
transmitting information on the identified at least one content to the external device,
wherein the function information includes at least two functions mainly performed by the external device, and
wherein sending the information comprises:
generating a content list containing content representative information corresponding to the identified at least one of the plurality of contents; and
transmitting the content list to the external device,
wherein the method further comprises:
displaying the generated content list on the electronic device; and
displaying, on the electronic device, device identification information indicating that the identified at least one content can be executed by the external device,
wherein the identifying the type of content comprises:
identifying a function having higher performance in the external device than in other external devices among the at least two functions based on the electronic device being capable of connecting a plurality of external devices; and
identifying the type of content executable in the external device based on the function having higher performance in the external device among the at least two functions.
2. The method of claim 1, wherein transmitting the information comprises:
generating a template file containing the identified at least one content; and
transmitting the template file to the external device.
3. The method of claim 1, comprising:
displaying a content list including at least one content representative information corresponding to the identified at least one of the plurality of contents.
4. The method of claim 3, wherein transmitting the information comprises:
transmitting, in response to selection of at least one piece of content representative information from the content list, information about at least one content corresponding to the selected content representative information to the external device.
5. The method of claim 1, comprising:
receiving other network documents related to the network document;
identifying at least one other content executable in the external device among a plurality of contents contained in the other network documents; and
transmitting information about the at least one other content to the external device.
6. The method of claim 1, comprising:
displaying, on a display of the electronic device, a User Interface (UI) for controlling the plurality of contents at the external device.
7. The method of claim 1, wherein the function information on the external device includes at least one of information indicating a function executable in the external device, information indicating a function mainly executed by the external device, and information indicating a function having relatively higher performance in the external device than other external devices.
8. The method of claim 1, wherein the content comprises at least one of video, audio content, text, images, video link addresses, audio link addresses, text link addresses, image link addresses, video thumbnails, text thumbnails, and image thumbnails.
9. The method of claim 1, comprising:
rendering the identified at least one of the plurality of contents; and
transmitting information about the rendered at least one content to the external device.
10. An electronic device that shares content with an external device, the electronic device comprising:
a communicator including circuitry configured to communicate with an external device;
a display configured to display a plurality of web documents including a plurality of contents; and
a processor configured to:
identifying a type of content executable in the external device based on the function information of the external device,
identifying at least one content among a plurality of contents included in the plurality of web documents based on the identified type of the content corresponding to the type of the at least one content, and
transmitting information on the identified at least one content to the external device through the communicator,
wherein the function information includes at least two functions mainly performed by the external device, and
wherein the processor is further configured to:
generating a content list containing content representative information corresponding to at least one of the identified plurality of contents,
transmitting the content list to the external device through the communicator,
displaying the generated content list on the electronic device,
displaying device identification information indicating that the obtained at least one content can be executed by the external device on the electronic device,
identifying, based on the electronic device being capable of connecting to a plurality of external devices, a function having higher performance in the external device than in the other external devices among the at least two functions, and
identifying the type of content executable in the external device based on the function having higher performance in the external device among the at least two functions.
11. The device of claim 10, wherein the processor is further configured to:
generating a template file containing the identified at least one content, and
transmitting the template file to the external device through the communicator.
12. The device of claim 10, wherein the display is further configured to display a content list containing at least one content representative information corresponding to the identified at least one of the plurality of content.
13. The device of claim 12, wherein in response to selecting at least one piece of content representative information from the list of content, the processor is further configured to transmit, to the external device through the communicator, information about one piece of content corresponding to the at least one selected piece of content representative information.
CN201680060049.3A 2015-10-16 2016-10-17 Electronic device for sharing content with external device and method for sharing content thereof Active CN108141474B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR20150144700 2015-10-16
KR10-2015-0144700 2015-10-16
KR10-2016-0089046 2016-07-14
KR1020160089046A KR20170045101A (en) 2015-10-16 2016-07-14 Electronic device and Method for sharing content thereof
PCT/KR2016/011600 WO2017065582A1 (en) 2015-10-16 2016-10-17 Electronic device sharing content with an external device and method for sharing content thereof

Publications (2)

Publication Number Publication Date
CN108141474A CN108141474A (en) 2018-06-08
CN108141474B true CN108141474B (en) 2021-11-16

Family

ID=58705235

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680060049.3A Active CN108141474B (en) 2015-10-16 2016-10-17 Electronic device for sharing content with external device and method for sharing content thereof

Country Status (3)

Country Link
EP (1) EP3323234A1 (en)
KR (1) KR20170045101A (en)
CN (1) CN108141474B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597951B (en) * 2018-12-05 2021-07-02 广州酷狗计算机科技有限公司 Information sharing method and device, terminal and storage medium
KR102652361B1 (en) 2019-02-08 2024-03-29 삼성전자주식회사 Method for sharing content and electronic device thereof

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103338186A (en) * 2013-06-05 2013-10-02 华为技术有限公司 A content sharing method and an apparatus

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7653001B2 (en) * 2004-04-09 2010-01-26 At&T Mobility Ii Llc Managing differences in user devices when sharing content on mobile devices
CN101227636B (en) * 2007-01-17 2010-10-13 中国移动通信集团公司 Information sharing method
KR101472785B1 (en) * 2008-01-07 2014-12-16 삼성전자주식회사 Method for optimized-sharing multimedia contents and mobile terminal using the same
US8789131B2 (en) * 2010-05-14 2014-07-22 Lg Electronics Inc. Electronic device and method of sharing contents thereof with other devices
US8918645B2 (en) * 2010-09-24 2014-12-23 Amazon Technologies, Inc. Content selection and delivery for random devices
CN102510392B (en) * 2011-10-10 2014-11-05 Tcl集团股份有限公司 Inter-device application sharing method and system, television and mobile terminal
US20130325952A1 (en) * 2012-06-05 2013-12-05 Cellco Partnership D/B/A Verizon Wireless Sharing information
CN103546493B (en) * 2012-07-09 2018-12-28 上海博路信息技术有限公司 A kind of Cross-device communication method
US20150172238A1 (en) * 2013-12-18 2015-06-18 Lutebox Ltd. Sharing content on devices with reduced user actions

Also Published As

Publication number Publication date
EP3323234A4 (en) 2018-05-23
CN108141474A (en) 2018-06-08
KR20170045101A (en) 2017-04-26
EP3323234A1 (en) 2018-05-23

Similar Documents

Publication Publication Date Title
US10021569B2 (en) Theme applying method and electronic device for performing the same
KR102199786B1 (en) Information Obtaining Method and Apparatus
CN105830422B (en) Foldable electronic and its interface alternation method
EP3586316B1 (en) Method and apparatus for providing augmented reality function in electronic device
KR102309175B1 (en) Scrapped Information Providing Method and Apparatus
KR102178892B1 (en) Method for providing an information on the electronic device and electronic device thereof
KR102240279B1 (en) Content processing method and electronic device thereof
CN115097982B (en) Method for processing content and electronic device thereof
KR102285699B1 (en) User terminal for displaying image and image display method thereof
US10990748B2 (en) Electronic device and operation method for providing cover of note in electronic device
US11361148B2 (en) Electronic device sharing content with an external device and method for sharing content thereof
US10853024B2 (en) Method for providing information mapped between a plurality of inputs and electronic device for supporting the same
US20160062648A1 (en) Electronic device and display method thereof
US10311613B2 (en) Electronic device for processing image and method for controlling thereof
KR20160039746A (en) Information sharing method and electronic device thereof
US20150234799A1 (en) Method of performing text related operation and electronic device supporting same
KR102274944B1 (en) Apparatus and method for identifying an object
US20160104226A1 (en) Method and apparatus for providing content service
KR102202896B1 (en) Method for saving and expressing webpage
EP3314874B1 (en) System and method for providing a web service
US20160048498A1 (en) Method for providing alternative service and electronic device thereof
US20180173701A1 (en) Method for contents tagging and electronic device supporting the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant