CN117337572A - Content sharing method and electronic device thereof


Info

Publication number
CN117337572A
Authority
CN
China
Prior art keywords
electronic device
multimedia content
user
information
display
Prior art date
Legal status
Pending
Application number
CN202280034700.5A
Other languages
Chinese (zh)
Inventor
崔善永
郑钟宇
千佳元
姜硕英
俞珍善
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Priority claimed from KR1020210077506A external-priority patent/KR20220154574A/en
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority claimed from PCT/KR2022/002756 external-priority patent/WO2022239939A1/en
Publication of CN117337572A


Abstract

An electronic device according to embodiments disclosed herein may include an ultra-wideband (UWB) communication module, a communication module, and a processor operably connected to the UWB communication module and the communication module. The processor may determine a location of the electronic device based on UWB, determine an area in which the electronic device is located, play multimedia content corresponding to the area, obtain interest level information on a plurality of multimedia contents including the multimedia content, select at least one multimedia content from the plurality of multimedia contents based on the interest level information, generate user multimedia content by using the selected at least one multimedia content, and transmit the user multimedia content to a server through the communication module.

Description

Content sharing method and electronic device thereof
Technical Field
Various embodiments disclosed in this document relate to a method for sharing content with another electronic device based on a user's experience obtained by using the electronic device in a space where an ultra-wideband (UWB) environment is established, and an electronic device thereof.
Background
When moving in a space where a UWB environment is established, a user can experience various contents provided in the space by using an electronic device (e.g., a smart phone). For example, when moving in an art gallery where a UWB environment is established, a user can enjoy a work exhibited in the art gallery by using an electronic device. The electronic device may display images or videos related to the exhibited work and may output guidance regarding the exhibited work through a speaker. In another example, when a user moves in a brand store where a UWB environment is established, the user may obtain information about products sold in the brand store by using an electronic device.
In order for a user to experience various contents by using an electronic device in a space where a UWB environment is established, it may be useful to determine the location of the electronic device, and the location of the electronic device may be precisely measured using UWB. UWB anchors and a UWB communication module may be used to measure the location of the electronic device by utilizing UWB, and the electronic device may measure its location based on UWB signals exchanged between the UWB anchors and the UWB communication module.
Disclosure of Invention
Technical problem
In some cases, the user may temporarily use (e.g., rent) the electronic device. For example, some spaces (e.g., venues) provide for renting or borrowing electronic devices for use in the space. A user may obtain various experiences by using an electronic device that the user borrows or rents for a predetermined time in a space where a UWB environment is established. Information related to various experiences a user obtains by utilizing a borrowed or rented electronic device may not be provided to the user's personal electronic device and may be disposed of after a single use.
Various embodiments disclosed in this document include a method of generating new content by using information that would otherwise not be provided to a user's personal terminal and would be discarded after a single use, and providing the generated content to the user's personal electronic device, and an electronic device thereof.
Solution to the problem
An electronic device according to an embodiment disclosed in the present disclosure may include a UWB communication module, a communication module, and a processor operably connected with the UWB communication module and the communication module. The processor may determine a location of the electronic device based on the UWB signal received through the UWB communication module, may determine an area in which the electronic device is located among the plurality of areas based on a result of comparing the determined location with map information defining the plurality of areas included in a space in which the external device transmitting the UWB signal is installed, may play multimedia contents corresponding to the determined area, may acquire interest information on a plurality of multimedia contents including the multimedia contents played by the electronic device, may select at least one multimedia content from the plurality of multimedia contents based on the interest information, may generate user multimedia contents by using the at least one selected multimedia content, and may transmit the generated user multimedia contents to the server through the communication module.
A method of operating an electronic device according to an embodiment disclosed in the present disclosure may include: determining a location of the electronic device based on UWB signals received by a UWB communication module of the electronic device; determining an area in which the electronic device is located among the plurality of areas based on a result of comparing the determined position with map information defining the plurality of areas included in the space in which the external device transmitting the UWB signal is installed; playing the multimedia content corresponding to the determined area; acquiring interest information on a plurality of multimedia contents including multimedia contents played by an electronic device; selecting at least one multimedia content from a plurality of multimedia contents based on the interest information; generating user multimedia content by using at least one selected multimedia content; and transmitting the generated user multimedia content to a server through a communication module of the electronic device.
A system according to embodiments disclosed in the present disclosure may include an electronic device, a server, and another electronic device. The electronic device of the system may determine a location of the electronic device based on the UWB signal received through the UWB communication module included in the electronic device, may determine an area in which the electronic device is located among the plurality of areas based on a result of comparing the determined location with map information defining the plurality of areas included in a space in which the external device transmitting the UWB signal is installed, may play (which may include a first play or a play at another time after the first time) multimedia content corresponding to the determined area, may acquire interest information on the plurality of multimedia contents including the multimedia content played by the electronic device, may select at least one multimedia content from the plurality of multimedia contents based on the interest information, may generate user multimedia content by using the at least one selected multimedia content, and may transmit the generated user multimedia content to the server through the communication module. The server of the system may store the user multimedia content received from the electronic device and may transmit link information of the user multimedia content to the electronic device. The electronic device of the system may send the link information received from the server to another electronic device. Another electronic device of the system may download from the server the user multimedia content corresponding to the link information received from the electronic device.
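To make the division of roles in this system concrete, the following Kotlin sketch models the upload, link-generation, and download flow with purely hypothetical types and an in-memory stand-in for the server; it is an illustration of the described flow, not an implementation of any actual server API.

```kotlin
// Hypothetical types; this disclosure does not define concrete interfaces.
data class UserContent(val id: String, val payload: ByteArray)
data class LinkInfo(val url: String)

interface ContentServer {
    fun store(content: UserContent): LinkInfo   // store content and return its link information
    fun download(link: LinkInfo): UserContent?  // resolve link information back to the content
}

// In-memory stand-in for the server of the system described above.
class InMemoryServer : ContentServer {
    private val storage = mutableMapOf<String, UserContent>()
    override fun store(content: UserContent): LinkInfo {
        storage[content.id] = content
        return LinkInfo("https://example.com/share/${content.id}") // placeholder URL
    }
    override fun download(link: LinkInfo): UserContent? =
        storage[link.url.substringAfterLast('/')]
}

fun main() {
    val server = InMemoryServer()
    // The rented electronic device generates user multimedia content and uploads it.
    val userContent = UserContent("visit-0421", byteArrayOf(1, 2, 3))
    val link = server.store(userContent)
    // The rented device then sends the link information to the user's personal device,
    // which downloads the user multimedia content corresponding to the link.
    val received = server.download(link)
    println("Downloaded ${received?.id} via ${link.url}")
}
```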
Advantageous effects of the invention
A user may have various experiences by utilizing a user-rented electronic device in a particular space where a UWB environment is established, and information related to the various experiences may be transmitted from the borrowed or rented electronic device to the user's personal electronic device.
Because the user's personal electronic device receives the information related to the various experiences, the user can use personalized information related to those experiences and can experience user-customized content or advertisements even when the user is not in the particular space where the UWB environment is established.
Other various effects that can be grasped directly or indirectly through this document can be provided.
Drawings
The above and other advantages and features of the present disclosure will become more apparent by describing embodiments thereof in more detail with reference to the accompanying drawings in which:
FIG. 1 is a block diagram of an electronic device in a network environment, according to an embodiment;
FIG. 2 is a diagram illustrating a space in which a system for measuring a location of an electronic device by using UWB is established according to an embodiment;
FIG. 3 is a block diagram of an electronic device according to an embodiment;
FIG. 4 is a diagram illustrating an indoor positioning system according to an embodiment;
FIG. 5 is a flowchart of operations for sending user multimedia content by an electronic device to a server, according to an embodiment;
FIG. 6a is a diagram illustrating a space including multiple regions according to an embodiment;
FIG. 6b is a diagram illustrating a space including multiple regions according to an embodiment;
FIG. 7 is a diagram illustrating a user interface displayed according to a location of an electronic device, according to an embodiment;
FIG. 8 is a view showing a user interface displaying information about played content according to an embodiment;
FIG. 9 is a diagram illustrating a user interface displayed according to acquisition of user input for generating user multimedia content, according to an embodiment;
FIG. 10 is a view showing a user interface providing a function of sharing user multimedia content according to an embodiment;
FIG. 11 is a flowchart of operations for determining whether to register a user, according to an embodiment;
FIG. 12 is a flowchart of operations for changing a displayed user interface and played multimedia content according to a change in location of an electronic device, according to an embodiment;
FIG. 13 is a flowchart of operations for sharing user multimedia content, according to an embodiment;
FIG. 14 is a flowchart of operations for sharing user multimedia content, according to an embodiment; and
FIG. 15 is a flowchart of operations for sharing user multimedia content according to an embodiment.
With respect to the explanation of the drawings, the same or similar reference numerals are used for the same or similar components.
Detailed Description
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which various embodiments are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
It will be understood that when an element is referred to as being "on" another element, it can be directly on the other element or intervening elements may be present therebetween. In contrast, when an element is referred to as being "directly on" another element, there are no intervening elements present.
It will be understood that, although the terms "first," "second," "third," etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a "first element," "component," "region," "layer" or "section" discussed below could be termed a second element, component, region, layer, or section without departing from the teachings herein.
As used herein, "about" or "approximately" includes the values recited and, taking into account the measurements in question and the errors associated with the particular amount of measurements (i.e., limitations of the measurement system), means within an acceptable deviation of the particular value as determined by one of ordinary skill in the art. For example, "about" may mean within one or more standard deviations, or within ±30%, 20%, 10% or 5% of the stated value.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Embodiments are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized embodiments. Thus, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, the embodiments described herein should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an area shown or described as flat may generally have rough and/or nonlinear features. Furthermore, the acute angles shown may be rounded. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the precise shape of a region and are not intended to limit the scope of the claims.
Hereinafter, various embodiments will be described with reference to the accompanying drawings. These, however, do not limit the disclosure to the particular embodiments and should be understood to include various modifications, equivalents, and/or alternatives to the embodiments.
Fig. 1 is a block diagram illustrating an electronic device 101 in a network environment 100 according to various embodiments. Referring to fig. 1, an electronic device 101 in a network environment 100 may communicate with an electronic device 102 via a first network 198 (e.g., a short-range wireless communication network) or with at least one of an electronic device 104 or a server 108 via a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 via the server 108. According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identity module (Subscriber Identification Module, SIM) 196, or an antenna module 197. In some embodiments, at least one of the above-described components (e.g., connection terminal 178) may be omitted from electronic device 101, or one or more other components may be added to electronic device 101. In some embodiments, some of the above components (e.g., sensor module 176, camera module 180, or antenna module 197) may be implemented as a single integrated component (e.g., display module 160).
The processor 120 may run, for example, software (e.g., program 140) to control at least one other component (e.g., a hardware component or a software component) of the electronic device 101 that is connected to the processor 120, and may perform various data processing or calculations. According to one embodiment, as at least part of the data processing or calculation, processor 120 may store commands or data received from another component (e.g., sensor module 176 or communication module 190) into volatile memory 132, process commands or data stored in volatile memory 132, and store the resulting data in non-volatile memory 134. According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processor (Central Processing Unit, CPU) or an application processor (Application Processor, AP)) or an auxiliary processor 123 (e.g., a graphics processing unit (Graphic Processing Unit, GPU), a neural processing unit (Neural Processing Unit, NPU), an image signal processor (Image Signal Processor, ISP), a sensor hub processor or a communication processor (Communication Processor, CP)) that is operatively independent or combined with the main processor 121. For example, when the electronic device 101 comprises a main processor 121 and an auxiliary processor 123, the auxiliary processor 123 may be adapted to consume less power than the main processor 121 or to be dedicated to a particular function. The auxiliary processor 123 may be implemented separately from the main processor 121 or as part of the main processor 121.
The auxiliary processor 123 (instead of the main processor 121) may control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) when the main processor 121 is in an inactive (e.g., sleep) state, or the auxiliary processor 123 may control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) together with the main processor 121 when the main processor 121 is in an active state (e.g., running an application). According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 180 or the communication module 190) functionally related to the auxiliary processor 123. According to an embodiment, the auxiliary processor 123 (e.g., a neural processing unit) may include hardware structures dedicated to artificial intelligence model processing. The artificial intelligence model may be generated by machine learning. Such learning may be performed, for example, by the electronic device 101 where artificial intelligence is performed or via a separate server (e.g., server 108). The learning algorithm may include, but is not limited to, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning. The artificial intelligence model may include a plurality of artificial neural network layers. The artificial neural network may be a deep neural network (Deep Neural Network, DNN), a convolutional neural network (Convolutional Neural Network, CNN), a recurrent neural network (Recurrent Neural Network, RNN), a restricted Boltzmann machine (Restricted Boltzmann Machine, RBM), a deep belief network (Deep Belief Network, DBN), a bi-directional recurrent deep neural network (Bidirectional Recurrent Deep Neural Network, BRDNN), or a deep Q network, or a combination of two or more thereof, but is not limited thereto. Additionally or alternatively, the artificial intelligence model may include software structures in addition to hardware structures.
The memory 130 may store various data used by at least one component of the electronic device 101 (e.g., the processor 120 or the sensor module 176). The various data may include, for example, software (e.g., program 140) and input data or output data for commands associated therewith. Memory 130 may include volatile memory 132 or nonvolatile memory 134.
The program 140 may be stored as software in the memory 130, and the program 140 may include, for example, an Operating System (OS) 142, middleware 144, or an application 146.
The input module 150 may receive commands or data from outside the electronic device 101 (e.g., a user) to be used by other components of the electronic device 101 (e.g., the processor 120). The input module 150 may include, for example, a microphone, a mouse, a keyboard, keys (e.g., buttons) or a digital pen (e.g., a stylus).
The sound output module 155 may output a sound signal to the outside of the electronic device 101. The sound output module 155 may include, for example, a speaker or a receiver. Speakers may be used for general purposes such as playing multimedia or playing a record. The receiver may be used to receive an incoming call. Depending on the embodiment, the receiver may be implemented separate from the speaker or as part of the speaker.
The display module 160 may visually provide information to the outside (e.g., user) of the electronic device 101. The display module 160 may include, for example, a display, a holographic device, or a projector, and a control circuit for controlling a corresponding one of the display, the holographic device, and the projector. According to an embodiment, the display module 160 may comprise a touch sensor adapted to detect a touch or a pressure sensor adapted to measure the strength of the force caused by a touch.
The audio module 170 may convert sound into electrical signals and vice versa. According to an embodiment, the audio module 170 may obtain sound via the input module 150, or output sound via the sound output module 155 or headphones of an external electronic device (e.g., the electronic device 102) that is directly (e.g., wired) or wirelessly connected to the electronic device 101.
The sensor module 176 may detect an operational state (e.g., power or temperature) of the electronic device 101 or an environmental state (e.g., a state of a user) external to the electronic device 101 and then generate an electrical signal or data value corresponding to the detected state. According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyroscope sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an Infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
Interface 177 can support one or more specific protocols that will be used to connect electronic device 101 with an external electronic device (e.g., electronic device 102) directly (e.g., wired) or wirelessly. According to an embodiment, the interface 177 may include, for example, a high-definition multimedia interface (High Definition Multimedia Interface, HDMI), a universal serial bus (Universal Serial Bus, USB) interface, a Secure Digital (SD) card interface, or an audio interface.
The connection terminal 178 may include a connector via which the electronic device 101 may be physically connected with an external electronic device (e.g., the electronic device 102). According to an embodiment, the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
The haptic module 179 may convert the electrical signal into a mechanical stimulus (e.g., vibration or motion) or an electrical stimulus that may be recognized by the user via his sense of touch or kinesthetic sense. According to an embodiment, the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrostimulator.
The camera module 180 may capture still images or moving images. According to an embodiment, the camera module 180 may include one or more lenses, an image sensor, an image signal processor, or a flash.
The power management module 188 may manage power supply to the electronic device 101. According to an embodiment, the power management module 188 may be implemented as at least part of, for example, a power management integrated circuit (Power Management Integrated Circuit, PMIC).
Battery 189 may power at least one component of electronic device 101. According to an embodiment, battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
The communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108) and performing communication via the established communication channel. The communication module 190 may include one or more communication processors capable of operating independently of the processor 120 (e.g., an Application Processor (AP)) and supporting direct (e.g., wired) or wireless communication. According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (Global Navigation Satellite System, GNSS) communication module) or a wired communication module 194 (e.g., a local area network (Local Area Network, LAN) communication module or a power line communication (Power-Line Communication, PLC) module). A respective one of these communication modules may communicate with external electronic devices via a first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (Wireless Fidelity, Wi-Fi) Direct, or infrared data association (Infrared Data Association, IrDA)) or a second network 199 (e.g., a long-range communication network such as a conventional cellular network, a 5G network, a next-generation communication network, the internet, or a computer network (e.g., a LAN or wide-area network (Wide Area Network, WAN))). These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multiple components (e.g., multiple chips) separate from each other. The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., international mobile subscriber identity (International Mobile Subscriber Identity, IMSI)) stored in the subscriber identification module 196.
The wireless communication module 192 may support a 5G network following a 4G network as well as next-generation communication technologies (e.g., New Radio (NR) access technologies). NR access technologies may support enhanced mobile broadband (Enhanced Mobile Broadband, eMBB), massive machine type communication (Massive Machine Type Communication, mMTC), or ultra-reliable low-latency communication (URLLC). The wireless communication module 192 may support a high frequency band (e.g., millimeter-wave band) to implement, for example, high data transmission rates. The wireless communication module 192 may support various techniques for ensuring performance over a high frequency band, such as, for example, beamforming, massive multiple-input multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or massive antennas. The wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199). According to an embodiment, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or greater) for implementing eMBB, a loss coverage (e.g., 164 dB or less) for implementing mMTC, or a U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for implementing URLLC.
The antenna module 197 may transmit or receive signals or power to or from the outside of the electronic device 101 (e.g., an external electronic device). According to an embodiment, the antenna module 197 may include an antenna including a radiating element composed of a conductive material or conductive pattern formed in or on a substrate, such as a Printed Circuit Board (PCB). According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., array antennas). In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas, for example, by the communication module 190 (e.g., the wireless communication module 192). Signals or power may then be transmitted or received between the communication module 190 and the external electronic device via the selected at least one antenna. According to an embodiment, further components (e.g., radio frequency integrated circuits (Radio Frequency Integrated Circuit, RFICs)) other than radiating elements may additionally be formed as part of the antenna module 197.
According to various embodiments, antenna module 197 may form a millimeter wave antenna module. According to an embodiment, a millimeter wave antenna module may include a printed circuit board, a Radio Frequency Integrated Circuit (RFIC) disposed on a first surface (e.g., a bottom surface) of the printed circuit board or adjacent to the first surface and capable of supporting a specified high frequency band (e.g., a millimeter wave band), and a plurality of antennas (e.g., array antennas) disposed on a second surface (e.g., a top surface or a side surface) of the printed circuit board or adjacent to the second surface and capable of transmitting or receiving signals of the specified high frequency band.
At least some of the above components may be connected to each other and may communicate signals (e.g., commands or data) between them via an inter-peripheral communication scheme (e.g., a bus, general-purpose input/output (General Purpose Input/Output, GPIO), serial peripheral interface (Serial Peripheral Interface, SPI), or mobile industry processor interface (Mobile Industry Processor Interface, MIPI)).
According to an embodiment, commands or data may be sent or received between the electronic device 101 and the external electronic device 104 via the server 108 connected to the second network 199. Each of the electronic device 102 or the electronic device 104 may be the same type of device as the electronic device 101 or a different type of device from the electronic device 101. According to an embodiment, all or some of the operations to be performed at the electronic device 101 may be performed at one or more of the external electronic device 102, the external electronic device 104, or the server 108. For example, if the electronic device 101 should automatically perform a function or service or should perform a function or service in response to a request from a user or another device, the electronic device 101, instead of or in addition to performing the function or service itself, may request the one or more external electronic devices to perform at least part of the function or service. The one or more external electronic devices that received the request may perform the requested at least part of the function or service, or perform another function or another service related to the request, and transmit a result of the performing to the electronic device 101. The electronic device 101 may provide the result, with or without further processing of the result, as at least part of a reply to the request. For this purpose, for example, cloud computing technology, distributed computing technology, mobile edge computing (Mobile Edge Computing, MEC) technology, or client-server computing technology may be used. The electronic device 101 may provide ultra-low latency services using, for example, distributed computing or mobile edge computing. In another embodiment, the external electronic device 104 may include an internet of things (Internet of Things, IoT) device. Server 108 may be an intelligent server using machine learning and/or neural networks. According to an embodiment, the external electronic device 104 or the server 108 may be included in the second network 199. The electronic device 101 may be applied to smart services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology or IoT-related technology.
The electronic device according to various embodiments may be one of various types of electronic devices. The electronic device may include, for example, a portable communication device (e.g., a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a household appliance. According to the embodiments of the present disclosure, the electronic devices are not limited to those described above.
It should be understood that the various embodiments of the disclosure and the terminology used therein are not intended to limit the technical features set forth herein to the particular embodiments, but rather include various modifications, equivalents or alternatives to the respective embodiments. For the description of the drawings, like reference numerals may be used to refer to like or related elements. It will be understood that a noun in the singular, corresponding to a term, may include one or more things, unless the context clearly indicates otherwise. As used herein, each of the phrases such as "A or B", "at least one of A and B", "at least one of A or B", "A, B or C", "at least one of A, B and C", and "at least one of A, B or C" may include any or all possible combinations of the items listed together with a respective one of the plurality of phrases. As used herein, terms such as "1st" and "2nd" or "first" and "second" may be used to simply distinguish a corresponding component from another component and not to otherwise limit the components (e.g., in importance or order). It will be understood that if an element (e.g., a first element) is referred to, with or without the term "operatively" or "communicatively", as "coupled with", "coupled to", "connected with", or "connected to" another element (e.g., a second element), it means that the element may be coupled with the other element directly (e.g., wired), wirelessly, or via a third element.
As used in connection with various embodiments of the present disclosure, the term "module" may include an element implemented in hardware, software, or firmware, and may be used interchangeably with other terms (e.g., "logic," "logic block," "portion," or "circuitry"). A module may be a single integrated component adapted to perform one or more functions or a minimal unit or portion of the single integrated component. For example, according to an embodiment, a module may be implemented in the form of an Application-specific integrated circuit (ASIC).
The various embodiments set forth herein may be implemented as software (e.g., program 140) comprising one or more instructions stored in a storage medium (e.g., internal memory 136 or external memory 138) that may be read by a machine (e.g., electronic device 101). For example, a processor (e.g., processor 120) of the machine (e.g., electronic device 101) may invoke at least one of the one or more instructions stored in the storage medium and execute it, with or without using one or more other components, under the control of the processor. This enables the machine to be operated to perform at least one function according to the at least one instruction invoked. The one or more instructions may include code generated by a compiler or code capable of being executed by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term "non-transitory" merely means that the storage medium is a tangible device and does not include a signal (e.g., electromagnetic waves), but the term does not distinguish between data being semi-permanently stored in the storage medium and data being temporarily stored in the storage medium.
According to embodiments, methods according to various embodiments of the present disclosure may be included and provided in a computer program product. The computer program product may be used as a product for conducting transactions between sellers and buyers. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disk read-only memory (CD-ROM)), may be distributed (e.g., downloaded or uploaded) online via an application store (e.g., Play Store™), or may be distributed directly between two user devices (e.g., smartphones). If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in a machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
According to various embodiments, each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately provided in different components. According to various embodiments, one or more of the above components may be omitted, or one or more other components may be added. Alternatively or additionally, multiple components (e.g., modules or programs) may be integrated into a single component. In this case, according to various embodiments, the integrated component may still perform the one or more functions of each of the plurality of components in the same or similar manner as the corresponding one of the plurality of components performed the one or more functions prior to integration. According to various embodiments, operations performed by a module, a program, or another component may be performed sequentially, in parallel, repeatedly, or in a heuristic manner, or one or more of the operations may be performed in a different order or omitted, or one or more other operations may be added.
Fig. 2 shows a space 210 in which a system for measuring the position of an electronic device 201 by using UWB is established according to an embodiment.
Referring to fig. 2, a user (e.g., user a or user B) may move an electronic device 201 in a space 210. When moving in the space 210 in which a system for measuring the position of the electronic device 201 (or the user of the electronic device 201) by using UWB is established, the user of the electronic device 201 (e.g., user a or user B) may experience content corresponding to objects (e.g., the first object 211-1, the second object 212-1, the third object 213-1) located in the space 210. For example, when moving in an exhibition hall in which the position of the electronic device 201 (or the user of the electronic device 201) can be measured by using UWB, the user can experience content (e.g., multimedia content) corresponding to the exhibition work exhibited in the exhibition hall. In the present disclosure, it is assumed that the user moves in the space 210 while carrying the electronic device 201, and thus the position of the electronic device 201 and the position of the user can be considered to be substantially the same as each other.
According to an embodiment, an indoor positioning system for measuring a position of an electronic device 201 by using UWB may include a plurality of UWB anchors (not shown), a UWB communication module (e.g., UWB communication module 320 of fig. 3), and a server (not shown) (e.g., server 401 of fig. 4) providing an indoor positioning solution. The indoor positioning system may not be limited to the above-described components. The electronic device 201 comprising a UWB communication module may be understood as an object to be located by using UWB.
Depending on the embodiment, multiple UWB anchors may be installed in the space 210 as shown in fig. 2, or may be installed in any other suitable arrangement. The plurality of UWB anchors may exchange UWB signals with a UWB communication module of the electronic device 201 (e.g., UWB communication module 320 of fig. 3). The plurality of UWB anchors may provide information related to the location of the electronic device 201 to a server (e.g., server 401 of fig. 4) that provides an indoor positioning solution based on the exchanged UWB signals. The information related to the location of the electronic device 201 may include coordinate information corresponding to the location of the electronic device 201. The plurality of UWB anchors may be wired or wirelessly connected to the server and may provide the information related to the location of the electronic device 201 to the server by wire or wirelessly.
According to an embodiment, the server may determine the location of the electronic device 201 and/or the area in which the electronic device 201 is located by comparing information related to the location of the electronic device 201 received from a plurality of UWB anchors with map information about the space 210. The server may send information to the electronic device 201 regarding the determined location of the electronic device 201 and/or the area in which the electronic device 201 is located.
According to an embodiment, based on information received from the server regarding the location of the electronic device 201 and/or the region in which the electronic device 201 is located, the electronic device 201 may play (which may include a first play or a play at another time after the first time) multimedia content corresponding to the region in which the electronic device 201 is located among the plurality of regions included in the space 210. For example, a user a holding the electronic device 201 may move within the space 210 and based on information received from the server regarding the location of the electronic device 201 and/or the region in which the electronic device 201 is located, the electronic device 201 may play (which may include a first play or a play at another time after the first time) first multimedia content corresponding to the first region 211 in which the electronic device 201 (or the user a) is located. The first multimedia content may be understood as content including information about the object 211-1 included in the first area 211. In another example, a user B holding the electronic device 201 may move within the space 210 and based on information received from the server regarding the location of the electronic device 201 and/or information regarding the region in which the electronic device 201 is located, the electronic device 201 may play (which may include first play or play at another time after the first) second multimedia content corresponding to the second region 212 in which the electronic device 201 (or user B) is located. The second multimedia content may be understood as content including information about the object 212-1 included in the second area 212. In another example, a user C holding the electronic device 201 may move within the space 210 and based on information received from the server regarding the location of the electronic device 201 and/or information regarding the region in which the electronic device 201 is located, the electronic device 201 may play (which may include first play or play at another time after the first time) third multimedia content corresponding to a third region 213 in which the electronic device 201 (or the user C) is located. The third multimedia content may be understood as content including information about the object 213-1 included in the third area 213. The multimedia content (e.g., first multimedia content, second multimedia content, third multimedia content) may include voice, text, images, augmented reality (augmented reality, AR), and/or video content, among others.
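As a minimal sketch of how a playable content item might be resolved from the determined position and the map information defining the areas, consider the following Kotlin example; the axis-aligned rectangular region model, coordinates, and content identifiers are assumptions made only for illustration.

```kotlin
// Hypothetical map information: each area of the space is modeled as an
// axis-aligned rectangle associated with a multimedia content identifier.
data class Region(
    val name: String,
    val minX: Double, val minY: Double,
    val maxX: Double, val maxY: Double,
    val contentId: String
) {
    fun contains(x: Double, y: Double) = x in minX..maxX && y in minY..maxY
}

/** Returns the content id of the region containing the given position, if any. */
fun contentForPosition(regions: List<Region>, x: Double, y: Double): String? =
    regions.firstOrNull { it.contains(x, y) }?.contentId

fun main() {
    val mapInfo = listOf(
        Region("first area", 0.0, 0.0, 5.0, 5.0, "content-211"),
        Region("second area", 5.0, 0.0, 10.0, 5.0, "content-212"),
        Region("third area", 0.0, 5.0, 10.0, 10.0, "content-213")
    )
    // Position of the electronic device as estimated from UWB ranging.
    println(contentForPosition(mapInfo, 2.5, 1.0)) // content-211 -> play the first multimedia content
}
```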
According to an embodiment, the electronic device 201 may obtain user input regarding an object (e.g., the first object 211-1, the second object 212-1, the third object 213-1) or multimedia content corresponding to the object. For example, the electronic device 201 may acquire a user input of the user C to photograph the object 213-1 included in the third area 213. User input regarding an object or multimedia content corresponding to the object may be understood as interaction information for evaluating the user's interest level in the object. The interaction information may include information on whether photographing was performed, whether the user had an AR experience, whether the user performed any operation or input, and the time and number of such interactions.
According to an embodiment, the electronic device 201 may be at least one mobile terminal device including the UWB communication module 320, such as a smart phone, a smart tablet, a smart notebook, a smart tag, or a smart watch (smart gear). The electronic device 201 is not limited to the above examples and may include any communication device that can exchange UWB signals with a plurality of UWB anchors by using the UWB communication module 320 and can perform communication with a server.
Fig. 3 is a block diagram of an electronic device 201 according to an embodiment.
Referring to fig. 3, the electronic device 201 may include a processor 310, a UWB communication module 320, a communication module 330, a display 340, and a speaker 350. The components included in the electronic device 201 may not be limited to the components shown in fig. 3 (e.g., the processor 310, UWB communication module 320, communication module 330, display 340, and speaker 350). The components of the electronic device 201 shown in fig. 3 may be replaced with other components or additional components may be added to the electronic device 201. For example, a partial description of the electronic device 101 of fig. 1 may be applied to the electronic device 201 of fig. 3. In another example, the electronic device 201 may also include memory, at least one sensor (e.g., a proximity sensor, a geomagnetic sensor, an illuminance sensor, an IR sensor, a gyro sensor), and/or a camera.
According to an embodiment, the processor 310 may execute instructions stored in the memory to control the operation of the components of the electronic device 201 (e.g., the UWB communication module 320, the communication module 330, the display 340, and the speaker 350). The processor 310 may be electrically and/or operatively connected to the UWB communication module 320, the communication module 330, the display 340, and the speaker 350. The processor 310 may run software to control at least one other component (e.g., UWB communication module 320, communication module 330, display 340, and speaker 350) connected to the processor 310. The processor 310 may acquire commands from components included in the electronic device 201, and may interpret the acquired commands, and may process and/or calculate various data according to the interpreted commands.
According to an embodiment, UWB communication module 320 may support performing communication between the electronic device 201 and an external device (e.g., server 401) by transmitting or receiving UWB signals. The UWB communication module 320 may support communication by the electronic device 201 using an ultra-wide frequency band of approximately 500 MHz or more. The electronic device 201 may exchange ultra-wideband signals (e.g., UWB signals) with UWB anchors installed in the space 210 through the UWB communication module 320.
According to an embodiment, the communication module 330 may support performing communication between the electronic device 201 (e.g., a smart phone) and an external device (e.g., the server 401 or another electronic device 230) by using wired communication or wireless communication (e.g., bluetooth (BT), bluetooth low energy (Bluetooth low energy, BLE), wi-Fi). For example, the electronic device 201 may send multimedia content to the server 401 through the communication module 330. In another example, the electronic device 201 may receive linking information about multimedia content from the server 401 through the communication module 330. In another example, the electronic device 201 may send link information about the multimedia content to another electronic device 230 through the communication module 330.
According to an embodiment, the display 340 may visually display (provide or output) content. For example, the display 340 may visually display multimedia content (e.g., content in the form of text, images, and/or video). In another example, the display 340 may display a User Interface (UI) related to link information about the multimedia content. The UI related to the link information may include QR code or URL information.
According to an embodiment, the display 340 may comprise a deformable display that is flexible, foldable, bendable, rollable, or stretchable. For example, the display 340 may include a flexible display, a foldable display, or a stretchable display.
According to an embodiment, the display 340 may include at least one sensor. For example, the display 340 may include a touch sensor or a pressure sensor. The electronic device 201 may obtain user input (e.g., touch input) by using a touch sensor via the display 340.
According to an embodiment, the speaker 350 may output audio or sound corresponding to audio data. For example, the electronic device 201 may output audio or sound corresponding to audio data of the multimedia content through the speaker 350.
Fig. 4 illustrates an indoor positioning system 400 according to an embodiment.
Referring to fig. 4, an indoor positioning system 400 may include an electronic device 201, a server 401, and a UWB anchor 403. The components of the indoor positioning system 400 are not limited to the above examples.
According to an embodiment, the electronic device 201 may exchange UWB signals with the UWB anchor 403 through the UWB communication module 320. UWB refers to a short-range wireless communication technology that uses a frequency band approximately 500 MHz or more wide and can measure distance by using pulses about 2 nanoseconds in length.
According to an embodiment, UWB anchor 403 may determine a distance between UWB anchor 403 and electronic device 201 based on the UWB signal exchanged with electronic device 201. UWB anchor 403 may send information about the determined distance to server 401.
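The disclosure does not specify how the distance is derived from the exchanged UWB signals. As one common possibility, the following Kotlin sketch illustrates a single-sided two-way ranging estimate from a round-trip time and a known reply delay; the timestamp values are hypothetical, and effects such as clock drift and antenna delay are ignored.

```kotlin
const val SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

/**
 * Single-sided two-way ranging: the time of flight is half of the round-trip
 * time minus the responder's known reply delay. Times are in seconds.
 */
fun twoWayRangeMeters(roundTripSeconds: Double, replyDelaySeconds: Double): Double {
    val timeOfFlight = (roundTripSeconds - replyDelaySeconds) / 2.0
    return timeOfFlight * SPEED_OF_LIGHT_M_PER_S
}

fun main() {
    // Hypothetical numbers: a 200 us reply delay plus about 33.4 ns of round-trip
    // flight time corresponds to roughly 5 m between the anchor and the device.
    val distance = twoWayRangeMeters(
        roundTripSeconds = 200.0e-6 + 33.36e-9,
        replyDelaySeconds = 200.0e-6
    )
    println("Estimated distance: %.2f m".format(distance))
}
```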
According to an embodiment, the server 401 may provide an indoor positioning solution by using UWB. The server 401 may receive information about the distance from the UWB anchor 403. The server 401 may receive information from the plurality of UWB anchors regarding respective distances between the electronic device 201 and the plurality of UWB anchors, although this is not shown in fig. 4. The server 401 may determine an area (e.g., the first area 211, the second area 212, the third area 213 of fig. 2) in the space 210 in which the electronic device 201 is located based on a result of comparing the information about the distances with the map information provided by the map solution 411. The map information provided by the map solution 411 may include map information defining a plurality of areas included in the space 210.
According to an embodiment, the server 401 may provide the electronic device 201 with multimedia content corresponding to the determined area in which the electronic device 201 is located. For example, when the electronic device 201 is located in a first area (e.g., the first area 211 of fig. 2), the server 401 may transmit the first multimedia content corresponding to the first area 211 to the electronic device 201. In another example, when the electronic device 201 is located in a second region (e.g., the second region 212 of fig. 2) or a third region (e.g., the third region 213 of fig. 2), the server 401 may send the second multimedia content or the third multimedia content, respectively, to the electronic device 201.
According to one embodiment, the electronic device 201 may play (which may include playing for the first time or at another time after the first time) the multimedia content received from the server 401. For example, the electronic device 201 may display visual information (e.g., text, images, video) about the multimedia content through the display 340 and may output audible information (e.g., guide voice) about the multimedia content through the speaker 350. For example, the electronic device 201 may display information about an exhibition work (e.g., an image of the work, an explanation about the work) exhibited in an exhibition hall through the display 340, and may output information about the exhibition work (e.g., guide voice) in an audio form through the speaker 350.
According to an embodiment, the ADMIN management 412 may provide a function of controlling settings related to multimedia content provided from the server 401 to the electronic device 201 or map information about the space 210. For example, the ADMIN management 412 may provide a function that enables an administrator to determine the type of multimedia content corresponding to the area in which the electronic device 201 is located, and may provide a function that enables an administrator to determine whether to automatically or manually change the multimedia content to another multimedia content according to the changed location of the electronic device 201.
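A minimal sketch of the kind of settings such an administrator function might expose is shown below in Kotlin; the configuration fields, names, and values are assumptions for illustration and not a format defined by this disclosure.

```kotlin
// Hypothetical representation of administrator-controlled settings.
enum class ContentType { VOICE, TEXT, IMAGE, AR, VIDEO }
enum class ChangePolicy { AUTOMATIC, MANUAL }

data class AreaContentSetting(
    val areaName: String,
    val contentType: ContentType,   // which kind of multimedia content the area serves
    val changePolicy: ChangePolicy  // change content automatically when the device moves, or only on request
)

fun main() {
    val settings = listOf(
        AreaContentSetting("first area", ContentType.AR, ChangePolicy.AUTOMATIC),
        AreaContentSetting("second area", ContentType.VIDEO, ChangePolicy.MANUAL)
    )
    settings.forEach { println(it) }
}
```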
Fig. 5 shows a flowchart of operations for transmitting user multimedia content by the electronic device 201 to the server 401, according to an embodiment.
The series of operations explained below may be performed by the electronic device 201 simultaneously or in a different order, and some operations may be omitted or added.
In operation 501, the electronic device 201 may determine a location of the electronic device 201 based on UWB signals exchanged with UWB anchors (e.g., UWB anchor 403 of fig. 4, which is an example of one of the plurality of UWB anchors of fig. 2) through the UWB communication module 320. When the electronic device 201 is located in the space 210 where the UWB anchor 403 is installed, the electronic device 201 may exchange UWB signals with the UWB anchor 403 through the UWB communication module 320. The UWB anchor 403 may determine a distance between the electronic device 201 and the UWB anchor 403 based on the UWB signal received from the electronic device 201. The UWB anchor 403 may send information about the determined distance between the electronic device 201 and the UWB anchor 403 to the server 401. The server 401 may receive information about respective distances between the electronic device 201 and the plurality of UWB anchors from the plurality of UWB anchors including UWB anchor 403. The server 401 may determine the location of the electronic device 201 by using the information about the distances. The server 401 may determine the location of the electronic device 201 in real time and may update it. The server 401 may determine the area in the space 210 in which the electronic device 201 is located based on the determined location of the electronic device 201 and the map information provided by the map solution 411. The server 401 may send information about the location of the electronic device 201 or the area in the space 210 where the electronic device 201 is located to the electronic device 201.
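As an illustration of how a position could be computed from the per-anchor distances, the following Kotlin sketch performs a simple 2D least-squares trilateration; the anchor coordinates are hypothetical, and the disclosure does not prescribe any particular positioning algorithm.

```kotlin
import kotlin.math.sqrt

data class Point(val x: Double, val y: Double)

/**
 * Estimates a 2D position from distances to three or more fixed UWB anchors by
 * linearizing the circle equations against the first anchor and solving the
 * resulting 2x2 normal equations.
 */
fun trilaterate(anchors: List<Point>, distances: List<Double>): Point {
    require(anchors.size >= 3 && anchors.size == distances.size)
    val (x0, y0) = anchors[0]
    val d0 = distances[0]
    var a11 = 0.0; var a12 = 0.0; var a22 = 0.0; var b1 = 0.0; var b2 = 0.0
    for (i in 1 until anchors.size) {
        val ax = 2 * (anchors[i].x - x0)
        val ay = 2 * (anchors[i].y - y0)
        val rhs = d0 * d0 - distances[i] * distances[i] +
                anchors[i].x * anchors[i].x - x0 * x0 +
                anchors[i].y * anchors[i].y - y0 * y0
        a11 += ax * ax; a12 += ax * ay; a22 += ay * ay
        b1 += ax * rhs; b2 += ay * rhs
    }
    val det = a11 * a22 - a12 * a12
    return Point((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
}

fun main() {
    // Hypothetical anchors at three corners of a 10 m x 10 m room.
    val anchors = listOf(Point(0.0, 0.0), Point(10.0, 0.0), Point(0.0, 10.0))
    val truth = Point(3.0, 4.0)
    val distances = anchors.map {
        sqrt((it.x - truth.x) * (it.x - truth.x) + (it.y - truth.y) * (it.y - truth.y))
    }
    println(trilaterate(anchors, distances)) // ≈ Point(x=3.0, y=4.0)
}
```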
According to an embodiment, the electronic device 201 may determine the location of the electronic device 201 by using at least one of the UWB communication module 320, at least one sensor, or a camera. The at least one sensor may include a proximity sensor, a geomagnetic sensor, an illuminance sensor, or an IR sensor.
According to an embodiment, the electronic device 201 may receive information about a distance between the electronic device 201 and the UWB anchor 403 and map information about the space 210 from the server 401. The electronic device 201 may determine the location of the electronic device 201 based on the received information about the distance. The electronic device 201 may determine an area in which the electronic device 201 is located based on a result of comparing the location with the map information. The electronic device 201 may determine the location of the electronic device 201 and the area in which the electronic device 201 is located in real time.
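As an illustration of the positioning step described above, the following Python sketch estimates the device position from the distances reported by several UWB anchors using a simple least-squares multilateration, and then compares the estimate with map information to determine the area in which the device is located. The anchor coordinates, measured distances, and rectangular area boundaries are hypothetical example values, not values from this document.

```python
import numpy as np

def estimate_position(anchors, distances):
    """Least-squares multilateration from anchor coordinates and measured distances."""
    # Linearize the range equations by subtracting the first anchor's equation from the others.
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return tuple(pos)

def find_area(position, area_map):
    """Return the name of the rectangular area containing the position, if any."""
    x, y = position
    for name, (x_min, y_min, x_max, y_max) in area_map.items():
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return name
    return None

# Hypothetical anchor layout, UWB ranging results (metres), and map information.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
distances = [5.0, 8.1, 6.7, 9.2]
area_map = {"first_area": (0, 0, 5, 5),      # (x_min, y_min, x_max, y_max)
            "second_area": (5, 0, 10, 5),
            "third_area": (0, 5, 10, 10)}

position = estimate_position(anchors, distances)
print(position, find_area(position, area_map))  # roughly (3, 4), inside "first_area"
```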
In operation 503, the electronic device 201 may play (which may include playing for the first time or at another time after the first time) multimedia content corresponding to the region in which the electronic device 201 is located. For example, when the electronic device 201 is located in a first region (e.g., the first region 211 of fig. 2), the electronic device 201 may play the first multimedia content corresponding to the first region.
In operation 505, the electronic device 201 may acquire interest information corresponding to elements for determining the user's interest level in content played by the electronic device 201. The interest information may include user location information, content play information, and interaction information according to table 1 presented below. For example, the interest information may include first information regarding a time that the electronic device 201 stays in a specific area, second (play) information related to the play of the content, and/or third information related to the user's interaction with the content.
TABLE 1

Category | Example elements
User location information | Time at which the user enters a specific area, time at which the user leaves the area, time for which the user stays in the area
Play information | Whether the multimedia content is played, play count, play time, ratio of the actual play time to the total play time
Interaction information | Whether an interaction (e.g., search, camera capture, AR experience) is performed, execution time, number of executions
According to an embodiment, the location information of the user may be understood to be substantially the same as the location information of the electronic device 201. According to an embodiment, the electronic device 201 may determine the user's interest level in the multimedia content played by the electronic device 201 based on the interest information. For example, the electronic device 201 may measure the time the user stays in the specific area by measuring the time the user enters the specific area and the time the user leaves the specific area. The electronic device 201 may determine that the longer the user stays, the higher the user's interest in the multimedia content corresponding to the particular region. According to an embodiment, the electronic device 201 may determine that the user has a higher interest in the played multimedia content than the non-played multimedia content. According to an embodiment, the electronic device 201 may determine multimedia content having a long play time or a high play percentage as multimedia content of high user interest. According to an embodiment, the electronic device 201 may determine the multimedia content on which the interaction is performed as multimedia content of high user interest. The electronic device 201 may determine the multimedia content on which the interaction is performed for a long time (e.g., relative to another interaction) or the manipulation is performed a plurality of times as the multimedia content of high user interest.
According to an embodiment, based on a real-time or near real-time determined location of the electronic device 201, the electronic device 201 may determine a first time at which the electronic device 201 enters a particular region (e.g., the first region 211, the second region 212, the third region 213) included in the space 210 and a second time at which the electronic device exits therefrom. The electronic device 201 may determine a time (referred to as a "dwell time") for which the electronic device 201 is staying in a particular area based on a difference between the first time and the second time. The electronic device 201 may determine a user's interest level in an object included in a specific area or multimedia content corresponding to the object based on the determined stay time. The user's interest level in the object located in the specific area and the multimedia content corresponding to the object may be proportional to the time the electronic device 201 stays in the specific area. For example, when the electronic device 201 remains in the first region 211 for a longer time than the electronic device 201 remains in the second region 212, the electronic device 201 may determine that the user's interest in the object included in the first region 211 is higher than the user's interest in the object included in the second region 212.
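A minimal sketch of the dwell-time calculation described above, in Python: the dwell time is the difference between the time of entering a region and the time of leaving it, and the region with the longest stay is treated as the one of highest interest. The region names and timestamps are hypothetical examples.

```python
from datetime import datetime

def dwell_time_seconds(enter_time: datetime, exit_time: datetime) -> float:
    """Dwell time = second time (exit) minus first time (entry)."""
    return (exit_time - enter_time).total_seconds()

# Hypothetical entry/exit timestamps for two regions of the space.
visits = {
    "first_region":  (datetime(2022, 3, 1, 14, 0, 0), datetime(2022, 3, 1, 14, 6, 30)),
    "second_region": (datetime(2022, 3, 1, 14, 6, 30), datetime(2022, 3, 1, 14, 8, 0)),
}

dwell = {region: dwell_time_seconds(t_in, t_out) for region, (t_in, t_out) in visits.items()}
# A longer dwell time is interpreted as higher interest in the content of that region.
print(dwell, max(dwell, key=dwell.get))  # first_region has the longer stay, hence higher interest
```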
According to an embodiment, the play information may include information on whether the multimedia content corresponding to the plurality of areas included in the space 210 is played, a play time, and information on a ratio of the actual play time (i.e., how long the content was actually played) to the total play time (i.e., the total duration of the multimedia content).
According to an embodiment, the electronic device 201 may determine the user's interest level in the object included in the specific area or the content corresponding to the object based on the play information. The electronic device 201 may determine that the user's interest in the played content is higher than the user's interest in the non-played content. Similarly, the electronic device 201 may determine that the user's interest in content played multiple times is higher than the user's interest in content not played or played once. The electronic device 201 may determine that the user's interest in content having a long play time is higher than the user's interest in content having a short play time. The electronic device 201 may determine that the ratio of the actual play time of the content to the total play time is proportional to the user's interest level in the content. For example, the electronic device 201 may determine that the user's interest in content whose ratio of the actual play time to the total play time is high is higher than the user's interest in content whose ratio is low.
According to an embodiment, the user's interactions may refer to various actions of the user with respect to the content or the object corresponding to the content. For example, the interaction may include searching multimedia content regarding the object (e.g., the first object 211-1, the second object 212-1, the third object 213-1 of fig. 2), capturing the object by a camera (e.g., the camera module 180), or experiencing Augmented Reality (AR) content related to the object. The information related to the user interaction may include information about whether the user performs the interaction, the execution time, and/or the number of executions.
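A heuristic sketch of how the play information and interaction information described above could be combined into a single interest score per content item. The ContentStats fields and the weights are illustrative assumptions only; the document does not prescribe a particular scoring formula.

```python
from dataclasses import dataclass

@dataclass
class ContentStats:
    played: bool             # whether the content was played at all
    play_count: int          # number of times it was played
    play_time_s: float       # actual play time
    total_time_s: float      # total duration of the content
    interaction_count: int   # searches, camera captures, AR experiences, ...
    interaction_time_s: float

def interest_score(s: ContentStats) -> float:
    """Played content, high play ratio, and frequent or long interaction rank higher."""
    if not s.played:
        return 0.0
    play_ratio = s.play_time_s / s.total_time_s if s.total_time_s else 0.0
    return (1.0 + 0.5 * s.play_count + 2.0 * play_ratio
            + 0.3 * s.interaction_count + s.interaction_time_s / 60.0)

stats = {
    "content_a": ContentStats(True, 2, 110, 120, 3, 45),
    "content_b": ContentStats(True, 1, 30, 120, 0, 0),
    "content_c": ContentStats(False, 0, 0, 120, 0, 0),
}
ranked = sorted(stats, key=lambda k: interest_score(stats[k]), reverse=True)
print(ranked)  # content_a ranks above content_b; unplayed content_c ranks last
```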
In operation 507, the electronic device 201 may generate user multimedia content based on the interest information. The electronic device 201 may generate the user multimedia content based on the interest information in response to obtaining user input on a display (e.g., display 861 of fig. 8) for generating the user multimedia content.
Referring to table 2, a method of generating user multimedia content by the electronic device 201 when there is one space 210 or when there are two or more spaces 210 will be described.
TABLE 2

Number of played multimedia contents | Number of spaces | Generation of user multimedia content
Less than the specified number (e.g., 5) | Regardless of the number of spaces | Generation is omitted; a notification recommending the play of multimedia contents is displayed
Greater than or equal to the specified number, but the number of contents played substantially to completion is less than the specified number | Regardless of the number of spaces | The specified number of contents having the highest play percentages are selected and used
Greater than or equal to the specified number | One space | The specified number of contents are randomly selected from the played contents and used
Greater than or equal to the specified number | Two or more spaces | At least one content is selected in each space; the total number of selected contents equals the specified number
According to an embodiment, when the number of multimedia contents played by the electronic device 201 is less than a designated number (e.g., 5), the electronic device 201 may omit the generation of the user multimedia content regardless of the number of spaces 210, and may display a notification recommending the play of multimedia contents. According to an embodiment, when the number of multimedia contents played by the electronic device 201 is greater than or equal to the designated number and the number of multimedia contents played substantially to completion (e.g., substantially 100%, or another suitable threshold play amount, such as substantially 80%) is less than the designated number, the electronic device 201 may calculate the play percentage of each played multimedia content, may select the designated number of multimedia contents having the highest play percentages, and may generate the user multimedia content by using the selected multimedia contents, regardless of the number of spaces 210. The electronic device 201 may calculate the play percentage of a multimedia content by dividing the actual play time of the multimedia content by the total play time. According to an embodiment, when the number of multimedia contents played by the electronic device 201 is greater than or equal to the designated number and one space 210 exists, the electronic device 201 may randomly select the designated number of multimedia contents from among the multimedia contents played by the electronic device 201 and may generate the user multimedia content by using the selected multimedia contents. According to an embodiment, when the number of multimedia contents played by the electronic device 201 is greater than or equal to the designated number and there are two or more spaces 210, the electronic device 201 may select at least one multimedia content in each space. The sum of the numbers of multimedia contents selected in the respective spaces may be equal to the designated number.
According to an embodiment, the electronic device 201 may select at least one content from a plurality of contents including multimedia content played by the electronic device 201 based on the interest information. For example, the electronic device 201 may select, from the played content, as many contents of high user interest as the number (e.g., 5) set by the user, based on the interest information. The electronic device 201 (or another suitable device, such as the server 108, the electronic device 104, etc.) may generate user multimedia content by using the at least one selected content. The user multimedia content may be understood as multimedia content generated for the user. For example, there may be pieces of information corresponding to the contents played by the electronic device 201, and the user multimedia content may be understood as image or video content that continuously displays images containing the pieces of information. The information may include an image of the object corresponding to the played content, a title, a subtitle, a production date, a location, and a time or date at which the content was played. According to an embodiment, the electronic device 201 may determine whether to generate the user multimedia content according to the number of contents played by the electronic device 201. For example, when the number of played contents is smaller than the set number (e.g., 5), the electronic device 201 may omit the generation of the user multimedia content. When the number of played contents is smaller than the set number, the electronic device 201 may display, through the display 340, a user interface (e.g., user interface 901a of fig. 9) that displays guide information indicating that the user multimedia content is not generated. In another example, the electronic device 201 may generate the user multimedia content when the number of played contents is greater than the set number (e.g., 5). When the number of played contents is greater than the set number, the electronic device 201 (or another suitable device, such as the server 108, the electronic device 104, etc.) may generate user multimedia content that continuously displays, via the display 340, images (e.g., images displayed on the user interfaces 902, 903, 904 of fig. 9) corresponding to the played contents. The electronic device 201 may generate a data file by performing multimedia generation and capturing or recording with respect to the generated user multimedia content.
According to an embodiment, when the number of played contents is greater than the set number (e.g., 5), the electronic device 201 may determine a method of selecting the at least one content used to generate the user multimedia content based on the number of contents played for their total play time (i.e., substantially 100% played contents). For example, when the number of played contents is greater than the set number (e.g., 5) and the number of contents played for their total play time is smaller than the set number (e.g., 5), the electronic device 201 may select the set number of contents (e.g., 5) having the highest play ratios, obtained by dividing the play time of each played content by its total play time. The electronic device 201 (or another suitable device, such as the server 108, the electronic device 104, etc.) may generate user multimedia content by using the selected set number of contents.
According to the embodiment, when the number of contents played is greater than the set number and the number of contents actually played for the total play time is greater than the set number, the electronic device 201 may randomly select the set number of contents from the contents actually played for the total play time.
According to the embodiment, when the number of contents played is greater than the set number and the number of contents actually played for the total play time is greater than the set number, the electronic device 201 may select the set number of contents having a high interest level based on the interest information. The electronic device 201 (or another suitable device, such as the server 108, the electronic device 104, etc.) may generate user multimedia content by using the selected content.
According to the embodiment, when a place where indoor positioning can be performed by using the UWB method includes a plurality of spaces, for example, when there are a plurality of spaces 210 shown in fig. 2, and when the number of contents actually played for the total play time is greater than a set number, the electronic device 201 may select at least one content corresponding to each of the plurality of spaces based on the interest information. For example, when the venue includes a first space and a second space, the electronic device 201 may select at least one content corresponding to the first space and may select at least one content corresponding to the second space in order to generate the user multimedia content. The sum of the numbers of contents respectively selected for the plurality of spaces may be the same as the set number. For example, when the set number is 5, the electronic device 201 may select three contents corresponding to the first space, and may select two contents corresponding to the second space. The electronic device 201 (or another suitable device, such as the server 108, the electronic device 104, etc.) may generate user multimedia content by using the selected content.
According to an embodiment, when a place where indoor positioning can be performed by the UWB method includes a plurality of spaces and the number of contents played by the electronic device 201 is smaller than a set number (e.g., 5), the electronic device 201 may omit generation of the user multimedia contents.
According to the embodiment, when a place where indoor positioning can be performed by using the UWB method includes a plurality of spaces, and when the number of contents played is greater than a set number (e.g., 5) and the number of contents played for a total play time is less than the set number, the electronic device 201 can select the content of the set number (e.g., 5) having a high play ratio obtained by dividing the play time of the played contents by the total play time. The electronic device 201 (or another suitable device, such as the server 108, the electronic device 104, etc.) may generate user multimedia content by using the selected content.
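The selection branches described above (and summarized in Table 2) could be sketched as follows. The threshold of 5 follows the examples in the text; the dictionary layout, helper names, and the use of random selection within a space are assumptions (the per-space choice could equally be made by interest level, as described above).

```python
import random

MIN_COUNT = 5  # the "set number" / "specified number" used in the examples above

def select_contents(played, spaces, min_count=MIN_COUNT):
    """played: list of dicts with 'id', 'space', 'play_time', 'total_time'.
    Returns the contents used to generate the user multimedia content, or None to skip generation."""
    if len(played) < min_count:
        return None  # too few played contents: recommend playing more instead of generating

    fully_played = [c for c in played if c["play_time"] >= c["total_time"]]

    if len(fully_played) < min_count:
        # Rank by play percentage (actual play time / total play time) and take the top ones.
        return sorted(played, key=lambda c: c["play_time"] / c["total_time"], reverse=True)[:min_count]

    if len(spaces) <= 1:
        # One space: pick the specified number at random from the fully played contents.
        return random.sample(fully_played, min_count)

    # Several spaces: pick at least one content per space, with the total equal to min_count.
    per_space = {s: [c for c in fully_played if c["space"] == s] for s in spaces}
    selection = [random.choice(cs) for cs in per_space.values() if cs]
    remaining = [c for c in fully_played if c not in selection]
    selection += random.sample(remaining, max(0, min_count - len(selection)))
    return selection[:min_count]

played = [{"id": i, "space": "space_1" if i < 4 else "space_2",
           "play_time": 60, "total_time": 60} for i in range(7)]
print(select_contents(played, spaces=["space_1", "space_2"]))  # five contents spanning both spaces
```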
In operation 509, the electronic device 201 may transmit the generated user multimedia content to the server 401. For example, the electronic device 201 may generate a data file by performing multimedia generation and capturing or recording with respect to the generated user multimedia content. The electronic device 201 may send the generated data file to the server 401.
Fig. 6a illustrates a space (e.g., space 210) including multiple regions according to an embodiment. Fig. 6b illustrates a space (e.g., space 210) including multiple regions according to an embodiment.
The view 620 of fig. 6a shows the areas (e.g., the first target area 610, the second target area 611, the third target area 612, the fourth target area 613) divided according to distance from the center of the target 601, and the view 621 of fig. 6b shows third target areas arranged in the second target area 611 in various ways.
Referring to view 620 of fig. 6a, a target 601 may exist in a space (e.g., space 210 of fig. 2) providing a system capable of measuring a position of an electronic device 201 by using UWB, and may be referred to as an object (e.g., first object 211-1, second object 212-1, third object 213-1 of fig. 2) having corresponding multimedia content. For example, target 601 may exist in an exhibition hall that provides a system capable of measuring the location of electronic device 201 using UWB, and may be understood as an exhibition work with corresponding multimedia content. In another example, the target 601 may be present in a brand store that provides a system capable of measuring the location of the electronic device 201 using UWB, and may be understood as a brand product (e.g., clothing, shoes, or bags) with corresponding multimedia content.
According to an embodiment, the plurality of regions may be determined according to a distance from the target 601, and the distance may be determined according to a setting of an administrator.
According to an embodiment, an administrator of space 210 may determine an area a first distance from target 601 as fourth target area 613 through ADMIN management 412 of server 401. For example, the administrator may set the fourth target area 613 such that when the electronic device 201 is located in the fourth target area 613, the electronic device 201 plays the content corresponding to the target 601 or displays a screen to display detailed explanation information about the target 601. When the electronic device 201 is located in the fourth target area 613, the electronic device 201 may play content corresponding to the target 601, or may display a screen (e.g., the fourth user interface 704a and/or the fifth user interface 705a of fig. 7) to display detailed explanation information about the target 601.
According to an embodiment, the administrator may determine an area from the first distance to the second distance to the target 601 as the third target area 612. For example, the administrator may set the third target area 612 such that when the electronic device 201 is located in the third target area 612, the electronic device 201 displays a screen to display an area of the space 210 (e.g., the first area 211, the second area 212, the third area 213 of fig. 2). When the electronic device 201 is located in the third target area 612, the electronic device 201 may display a screen (e.g., the third user interface 703a of fig. 7) to display an area of the space 210.
According to an embodiment, the administrator may determine the area from the second distance to the third distance to the target 601 as the second target area 611. For example, the administrator may set the second target area 611 such that when the electronic device 201 is located in the second target area 611, the electronic device 201 displays a screen to display common information of the space 210. When the electronic device 201 is located in the second target area 611, the electronic device 201 may display a screen (e.g., the second user interface 702a of fig. 7) to display common information of the space 210.
According to an embodiment, the administrator may determine an area farther than the third distance from the target 601 as the first target area 610. For example, the administrator may set the first target area 610 such that the electronic device 201 displays a home screen with respect to the space 210 when the electronic device 201 is located in the first target area 610. When the electronic device 201 is located in the first target area 610, the electronic device 201 may display a home screen (e.g., the first user interface 701a of fig. 7) of the space 210.
According to an embodiment, the second distance may be longer than the first distance, and the third distance may be longer than the second distance.
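A minimal sketch of the distance-based area mapping described above: the device's distance from the target decides which of the four target areas applies and, by extension, which screen is shown. The concrete distance thresholds are hypothetical values set by an administrator.

```python
import math

def target_area(device_xy, target_xy, d1, d2, d3):
    """Map the distance from the target to one of the four target areas (d1 < d2 < d3)."""
    dist = math.dist(device_xy, target_xy)
    if dist <= d1:
        return "fourth_target_area"   # play content / show the detailed explanation screen
    if dist <= d2:
        return "third_target_area"    # show the areas of the space
    if dist <= d3:
        return "second_target_area"   # show common information of the space
    return "first_target_area"        # show the home screen of the space

print(target_area((1.0, 1.0), (0.0, 0.0), d1=2.0, d2=5.0, d3=10.0))  # fourth_target_area
```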
According to an embodiment, the electronic device 201 may provide different user interfaces through the display 340 depending on the target area in which the electronic device 201 is located. This will be described in detail with reference to fig. 7.
According to an embodiment, when the electronic device 201 is located in the fourth target area 613, the electronic device 201 may play the multimedia content corresponding to the target 601.
Referring to the view 621 of fig. 6B, the second target region 611 may include two third target regions 612A, 612B that do not overlap, or may include two third target regions 612-1, 612-2 that overlap each other. The arrangement relation between the target areas is not limited to the shape shown in fig. 6b, and the target areas may be arranged in various shapes.
Fig. 7 shows a user interface displayed according to the location of the electronic device 201 according to an embodiment.
Hereinafter, user interfaces respectively displayed in response to the target areas when the electronic device 201 is located in the target areas shown in view 620 of fig. 6a will be described.
The view 701 of fig. 7 shows a first user interface 701a displayed when the electronic device 201 is located in the first target area 610 of fig. 6 a. The first user interface 701a may be understood as a basic screen of the application. An application may be understood as an application for providing information about the location of the electronic device 201 and the content corresponding to the location.
The view 702 of fig. 7 shows a second user interface 702a displayed when the electronic device 201 is located in the second target area 611 of fig. 6a or 6 b. According to the change in the position of the electronic device 201, the electronic device 201 may change the first user interface 701a to the second user interface 702a, and may display the second user interface. For example, when the position of the electronic device 201 moves from the first target area 610 to the second target area 611 and a specified time (e.g., N seconds) elapses, the electronic device 201 may change the first user interface 701a to the second user interface 702a and may display the second user interface.
According to an embodiment, the second user interface 702a may include a first portion 711 displaying information about the location of the electronic device 201, a second portion 713 displaying a map to display the current location 721 of the electronic device 201, a third portion 715 displaying an image of an object located near the electronic device 201 and corresponding to reproducible content, and a fourth portion 717 displaying text and played voice corresponding to a guide suitable for each step and each context.
According to an embodiment, the first portion 711 may display information about the location of the electronic device 201 in the form of text. For example, the first portion 711 may display information indicating that the electronic device 201 is at floor 4 of the art museum 1.
According to an embodiment, the second portion 713 may display a map to display the current location 721 of the electronic device 201.
The view 703 of fig. 7 shows a third user interface 703a displayed when the electronic device 201 is located in the third target area 612 of fig. 6a. In accordance with the change in the position of the electronic device 201, the electronic device 201 may change the second user interface 702a to the third user interface 703a and display the third user interface. For example, when the position of the electronic device 201 moves from the second target area 611 to the third target area 612 and a specified time (e.g., N seconds) elapses, the electronic device 201 may change the second user interface 702a to the third user interface 703a and may display the third user interface.
According to an embodiment, the third user interface 703a may include a first portion 711, a second portion 713, a third portion 715, and a fourth portion 717. The area of the second portion 713 included in the third user interface 703a may be smaller than the area of the second portion 713 included in the second user interface 702a. The area of the third portion 715 included in the third user interface 703a may be greater than the area of the third portion 715 included in the second user interface 702a. The third portion 715 included in the third user interface 703a may display images of objects located near the electronic device 201 and corresponding to reproducible content. The electronic device 201 may display an image 741 corresponding to the closest object 731 among the objects located near the electronic device 201 on the third portion 715 of the third user interface 703a. The electronic device 201 may display the image 741 corresponding to the object 731 closest to the electronic device 201 as relatively larger than the images corresponding to the other objects.
The view 704 of fig. 7 shows a fourth user interface 704a displayed when the electronic device 201 is located in the fourth target area 613 of fig. 6 a. In accordance with the change in the position of the electronic device 201, the electronic device 201 may change the third user interface 703a to the fourth user interface 704a and display the fourth user interface. For example, when the position of the electronic device 201 moves from the third target area 612 to the fourth target area 613 and a specified time (e.g., N seconds) elapses, or in response to a user input to select an image corresponding to an object located near the electronic device 201, the electronic device 201 may change the third user interface 703a to the fourth user interface 704a and may display the fourth user interface.
According to an embodiment, the fourth user interface 704a may include a first portion 711, a third portion 715, and a fourth portion 717. The fourth user interface 704a may include a second portion 713, although this is not shown, and the area of the second portion 713 included in the fourth user interface 704a may be smaller than the area of the second portion 713 included in the third user interface 703 a. The third portion 715 included in the fourth user interface 704a may include an image 741 corresponding to the object 731 that is closest to the electronic device 201 and information 742 (e.g., title and subtitle) related to the object 731. The fourth user interface 704a may include a play bar 719, the play bar 719 indicating a current play point in a total play time of video data of the multimedia content corresponding to the object 731. The fourth user interface 704a may include a display 720, the display 720 providing a screen displaying detailed explanation information related to the object 731. In response to acquiring a drag input that moves upward on the screen from the display 720, the electronic device 201 may display a fifth user interface 705a that includes detailed explanation information related to the object 731.
According to an embodiment, the fifth user interface 705a may include content (e.g., text, image, video) displaying detailed explanation information related to the object 731, and may include a play bar 719, the play bar 719 indicating a current play point in a total play time of voice data corresponding to the content. The electronic device 201 may impart a visual effect to text corresponding to a play point displayed on the play bar 719 among the displayed text, and may display the text.
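A simplified sketch of the screen-switching rule used in the views above: the displayed user interface changes only after the device has stayed in the newly detected target area for the specified time (N seconds). The UiController class, the hold time of 3 seconds, and the interface names are illustrative assumptions.

```python
import time

UI_FOR_AREA = {
    "first_target_area":  "first_user_interface",
    "second_target_area": "second_user_interface",
    "third_target_area":  "third_user_interface",
    "fourth_target_area": "fourth_user_interface",
}

class UiController:
    def __init__(self, hold_seconds=3.0):
        self.hold_seconds = hold_seconds   # the "N seconds" mentioned above
        self.current_area = None
        self.pending_area = None
        self.pending_since = None
        self.current_ui = None

    def on_area_update(self, area, now=None):
        """Call whenever the positioning step reports the target area the device is in."""
        now = now if now is not None else time.monotonic()
        if area != self.pending_area:
            self.pending_area, self.pending_since = area, now
        if area != self.current_area and now - self.pending_since >= self.hold_seconds:
            self.current_area = area
            self.current_ui = UI_FOR_AREA[area]   # switch the displayed user interface
        return self.current_ui

ctrl = UiController(hold_seconds=3.0)
print(ctrl.on_area_update("second_target_area", now=0.0))  # None: not held long enough yet
print(ctrl.on_area_update("second_target_area", now=3.5))  # 'second_user_interface'
```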
Fig. 8 illustrates a user interface displaying information about play content according to an embodiment.
The view 801 of fig. 8 shows a fifth user interface 801a corresponding to the third user interface 703a of fig. 7. When the electronic device 201 is located in the third target area 612 of fig. 6a, the electronic device 201 may display a fifth user interface 801a. The fifth user interface 801a may include a first portion 811 displaying information related to the location of the electronic device 201, a second portion 813 displaying a map to display the current location 821 of the electronic device 201 and nearby objects 831, a third portion 815 displaying an image of objects located nearby the electronic device 201 and corresponding to reproducible content, and a fourth portion 817 displaying text corresponding to played speech.
According to an embodiment, the second portion 813 of the fifth user interface 801a may display a map to display the current location 821 of the electronic device 201 and the object 831 located near the electronic device 201.
According to an embodiment, the third portion 815 of the fifth user interface 801a may display images corresponding to objects located in the vicinity of the electronic device 201, and the image 841 corresponding to the object closest to the electronic device 201 may be displayed relatively larger than the images corresponding to the other objects.
View 802 of fig. 8 shows a sixth user interface 802a corresponding to fourth user interface 704a of fig. 7. When the electronic device 201 is located in the fourth target area 613 of fig. 6a, the electronic device 201 may display the sixth user interface 802a. The sixth user interface 802a may include a first portion 811, a third portion 815, and a fourth portion 817. The sixth user interface 802a may include a second portion 813, although this is not shown, and the area of the second portion 813 included in the sixth user interface 802a may be smaller than the area of the second portion 813 included in the fifth user interface 801 a.
According to an embodiment, the third portion 815 of the sixth user interface 802a may display an image 841 associated with the object 831 that is closest to the electronic device 201.
According to an embodiment, the fourth portion 817 of the sixth user interface 802a may display text 820 corresponding to the currently played voice, and may display a play bar 819 indicating the current play point of the total play time.
According to an embodiment, the electronic device 201 may display a seventh user interface 803a in response to a user input to identify history information of content played by the electronic device 201. The seventh user interface 803a may display information about the content played by the electronic device 201. For example, the seventh user interface 803a may display an identification image 853 (e.g., work image) and identification text 855 (e.g., work title, artist name) corresponding to the played content. The seventh user interface 803a may display an identification image and identification text with respect to the currently played content 851.
According to an embodiment, the seventh user interface 803a may display a display 861 for generating user multimedia content. The electronic device 201 may generate the user multimedia content in response to user input on the display 861 for generating the user multimedia content.
Fig. 9 illustrates a user interface displayed according to acquisition of user input for generating user multimedia content according to an embodiment.
View 901 of fig. 9 shows a user interface 901a displayed by the electronic device 201 when a condition for generating user multimedia content is not satisfied in response to acquiring a user input for generating user multimedia content. For example, when the number of contents played by the electronic device 201 is smaller than the set number (e.g., 5), the electronic device 201 may determine that the condition for generating the user multimedia content is not satisfied, and may display the user interface 901a.
According to an embodiment, views 902 through 904 of fig. 9 may show images corresponding to content played by the electronic device 201. For example, the first image 902 may correspond to first content, the second image 903 may correspond to second content, and the third image 904 may correspond to third content. The image may include information about an object corresponding to the played content. For example, the image may include an image of the display work viewed by the user, a title, a subtitle associated with the display work, a location where the display work is located, a date the user visits the display hall, and a brand logo of the gallery.
According to an embodiment, the user multimedia content may be multimedia content generated by using at least one content selected from played contents based on interest information. For example, the user multimedia content may be video content that sequentially and/or consecutively displays the first image 902, the second image 903, and the third image 904.
Fig. 10 illustrates a user interface providing functionality for sharing user multimedia content according to an embodiment.
View 1001 of fig. 10 shows a user interface 1001a displayed in response to generating user multimedia content. The electronic device 201 may generate the user multimedia content in response to obtaining user input for generating the user multimedia content and may display the user interface 1001a. The user interface 1001a may include a display 1011 for sharing user multimedia content. The electronic device 201 may obtain user input to select a display 1011 for sharing the user's multimedia content.
View 1002 of fig. 10 shows a user interface 1002a displayed in response to selecting display 1011 for sharing user multimedia content. The electronic device 201 may display a user interface 1012, the user interface 1012 including an image 1013 (e.g., QR code, URL information) corresponding to the generated link information of the user multimedia content. In response to obtaining a user input to select a display 1011 for sharing user multimedia content, the electronic device 201 may display the user interface 1012 while gradually moving the user interface 1012 including an image 1013 (e.g., QR code, URL information) corresponding to the link information from the lower end to the upper end of the user interface 1002a.
Fig. 11 shows a flowchart of an operation for determining whether to register a user according to an embodiment.
The electronic device 201 may perform a series of operations described below simultaneously or in a different order, and some operations may be omitted or added.
In operation 1101, the electronic device 201 may obtain user input to select an application. The electronic device 201 may run the application in response to obtaining the user input. An application may be understood as an application that displays location information of the electronic device 201 and provides a function of playing multimedia content corresponding to an object located near the electronic device 201. For example, the application may display location information of the electronic device 201 moving in an art museum and may provide functionality to play multimedia content corresponding to an exhibition work located in the vicinity of the electronic device 201.
In operation 1103, the electronic device 201 may determine whether to register the user of the application. The electronic device 201 may display a screen for selecting whether to register a user of the application in response to running the application. The electronic device 201 may obtain user input to perform user registration. In response to obtaining user input to perform user registration, the electronic device 201 may perform user registration of the application. According to an embodiment, when performing user registration, the electronic device 201 may perform operation 1105. The electronic device 201 may obtain user input to reject performing user registration. The electronic device 201 may omit user registration of the application in response to obtaining user input to refuse to perform user registration. According to an embodiment, when the user registration is not performed, the electronic device 201 may perform operation 1107.
When the user of the application is registered, the electronic device 201 may acquire user identification information in operation 1105. For example, the electronic device 201 may obtain user identification information through user input. The user identification information may include a user nickname, email, telephone number, and SNS address.
In operation 1107, when the user who executes the application is not registered, the electronic device 201 may omit the operation of acquiring the user identification information.
In operation 1109, the electronic device 201 may determine whether there is the same user identification information as the acquired user identification information in the database of the application. According to an embodiment, the electronic device 201 may perform operation 1111 when there is the same user identification information as the acquired user identification information in the database of the application. According to an embodiment, when there is no user identification information identical to the acquired user identification information, the electronic device 201 may perform operation 1113.
In operation 1111, the electronic device 201 may acquire data corresponding to the user identification information stored in the database from the database of the application. For example, the electronic device 201 may acquire interest information of the user corresponding to the user identification information from the database of the application.
In operation 1113, the electronic device 201 may begin storing new data related to the user of the application.
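A compact sketch of operations 1101 to 1113: optional user registration, then either reuse of the data stored for matching identification information or the start of a new record. The in-memory dictionary standing in for the application database, and the field choices, are assumptions.

```python
def start_application(user_wants_registration, provided_identification, database):
    """database maps identification info (e.g. an e-mail address) to previously stored data."""
    if not user_wants_registration:
        # Registration refused: no identification info is acquired, start an unregistered session.
        return {"user_id": None, "interest_info": []}

    identification = provided_identification  # nickname, e-mail, telephone number, SNS address, ...
    if identification in database:
        # Matching identification info exists: reuse the stored data (operation 1111).
        return {"user_id": identification, "interest_info": database[identification]}

    # No matching record: begin storing new data for this user (operation 1113).
    database[identification] = []
    return {"user_id": identification, "interest_info": database[identification]}

db = {"user@example.com": ["previously stored interest information"]}
print(start_application(True, "user@example.com", db))   # returns the stored data
print(start_application(True, "new@example.com", db))    # creates a new, empty record
print(start_application(False, None, db))                # unregistered session
```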
Fig. 12 shows a flowchart of operations for changing a displayed user interface and played multimedia content according to a change in location of the electronic device 201 according to an embodiment.
The electronic device 201 may perform a series of operations explained below simultaneously or in a different order, and some operations may be omitted or added.
In operation 1201, the electronic device 201 may play multimedia content based on the location of the electronic device 201. Operation 1201 of fig. 12 may correspond to operation 503 of fig. 5.
In operation 1203, the electronic device 201 may determine whether the area in which the electronic device 201 is located changes. For example, the electronic device 201 may determine whether the region in which the electronic device 201 is located changes from a first region (e.g., the first region 211 of fig. 2) to a second region (e.g., the second region 212 of fig. 2). According to an embodiment, the electronic device 201 may perform operation 1201 when the area in which the electronic device 201 is located is unchanged. According to an embodiment, when the area in which the electronic device 201 is located changes, the electronic device 201 may perform operations 1205 and 1207.
In operation 1205, in response to the change in the area in which the electronic device 201 is located, the electronic device 201 may display a user interface corresponding to the changed area. For example, when the region in which the electronic device 201 is located is changed from a first region (e.g., the first region 211 of fig. 2) to a second region (e.g., the second region 212 of fig. 2), the electronic device 201 may change the user interface displaying information about the first multimedia content corresponding to the first region to the user interface displaying information about the second multimedia content corresponding to the second region. The information related to the multimedia content may include, for example, an image 741 and detailed explanation information 742 displayed on the fourth user interface 704a of fig. 7.
In operation 1207, the electronic device 201 may play multimedia content corresponding to the changed region in response to the change of the region in which the electronic device 201 is located. For example, when the region in which the electronic device 201 is located is changed from a first region (e.g., the first region 211 of fig. 2) to a second region (e.g., the second region 212 of fig. 2), the electronic device 201 may change the first multimedia content corresponding to the first region to the second multimedia content corresponding to the second region, and may play the second multimedia content.
According to the embodiment, when the area in which the electronic device 201 is located changes from the first area 211 to the second area 212 while the electronic device 201 plays the first multimedia content, the electronic device 201 may complete the playing of the first multimedia content and may play the second multimedia content.
According to an embodiment, when the region in which the electronic device 201 is located changes from the first region 211 to the second region 212 while the electronic device 201 plays the first multimedia content, the electronic device 201 may display a notification that the second multimedia content is reproducible on a specific region of the displayed user interface.
According to the embodiment, when the region in which the electronic device 201 is located is changed from the first region 211 to the second region 212 while the electronic device 201 plays the first multimedia content, the electronic device 201 may change the first multimedia content to the second multimedia content when the region is changed, and may play the second multimedia content.
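The three behaviours described above for a region change while content is playing (finish the current content first, only show a notification, or switch immediately) could be selected with a policy flag, as in this hypothetical sketch. The ContentPlayer class and its method names are placeholders, not an API defined in the document.

```python
class ContentPlayer:
    """Minimal stand-in for the playback and UI component."""
    def show_user_interface(self, region):
        print(f"UI now shows information for {region}")

    def play(self, region):
        print(f"Playing multimedia content for {region}")

    def queue_after_current(self, region):
        print(f"Will play content for {region} after the current content finishes")

    def show_notification(self, region):
        print(f"Notification: content for {region} is reproducible")

def on_region_change(player, new_region, policy="finish_current"):
    """Sketch of operations 1205 and 1207 for a detected region change."""
    player.show_user_interface(new_region)        # operation 1205: UI for the changed region
    if policy == "finish_current":
        player.queue_after_current(new_region)    # finish the first content, then play the second
    elif policy == "notify":
        player.show_notification(new_region)      # only announce that the second content is available
    elif policy == "switch_now":
        player.play(new_region)                   # operation 1207: switch to the second content now

on_region_change(ContentPlayer(), "second_region", policy="switch_now")
```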
Fig. 13 shows a flowchart of an operation of sharing user multimedia content according to an embodiment.
The electronic device 201 may perform a series of operations explained below simultaneously or in a different order, and some operations may be omitted or added.
In operation 1301, the electronic apparatus 201 may receive link information regarding user multimedia content from the server 401 through the communication module 330. The link information may include URL information or QR code information.
In operation 1303, the electronic device 201 may display an image related to the received link information, or may transmit the link information to another electronic device (e.g., a smart phone). For example, when a user who executes an application registers, the electronic device 201 may transmit link information to another electronic device corresponding to user identification information. In another example, when a user who executes an application is not registered, the electronic device 201 may display an image (e.g., QR code) related to the link information received from the server 401.
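A small sketch of the branch in operation 1303: a registered user's link is sent to the other device associated with the user identification information, while an unregistered user is shown an on-screen image (e.g. a QR code) that the other device can scan. The print calls stand in for the real display and communication paths, and the URL is a hypothetical example.

```python
def share_link(link_info, user_registered, registered_contact=None):
    """Decide how to hand the link information over to another electronic device."""
    if user_registered and registered_contact:
        # e.g. transmitted through the communication module to the device tied to the user's contact info
        print(f"Sending {link_info['url']} to {registered_contact}")
    else:
        # e.g. rendered on the display so the other device can scan it with its camera
        print(f"Displaying a QR code that encodes {link_info['url']}")

share_link({"url": "https://example.com/user-content/abc123"}, user_registered=False)
share_link({"url": "https://example.com/user-content/abc123"}, user_registered=True,
           registered_contact="user@example.com")
```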
Fig. 14 shows a flowchart of an operation of sharing user multimedia content according to an embodiment.
In operation 1401, the electronic device 201 may play (which may include playing for the first time or at another time after the first time) the multimedia content based on the location of the electronic device 201. The electronic device 201 may play multimedia content corresponding to an area in which the electronic device 201 is located. Operation 1401 of fig. 14 may correspond to operation 503 of fig. 5.
In operation 1403, the electronic device 201 may acquire interest information about content including the played multimedia content. The electronic device 201 may determine an interest level in the content based on the interest information. Operation 1403 of fig. 14 may correspond to operation 505 of fig. 5.
In operation 1405, the electronic device 201 may generate user multimedia content based on the interest information. The electronic device 201 may generate the user multimedia content based on the interest information in response to obtaining user input on a display (e.g., display 861 of fig. 8) for generating the user multimedia content. The electronic device 201 may generate a data file by performing multimedia capturing or recording with respect to the generated content. Operation 1405 of fig. 14 may correspond to operation 507 of fig. 5.
In operation 1407, the electronic device 201 may transmit the user multimedia content or the data file to the server 401 through the communication module 330. Operation 1407 of fig. 14 may correspond to operation 509 of fig. 5. According to an embodiment, the operation of the electronic device 201 transmitting the user multimedia content or the data file to the server 401 through the communication module 330 may be performed while the operation 1415 is performed.
In operation 1409, the server 401 may receive the user multimedia content or the generated data file from the electronic device 201. The server 401 may store the user multimedia content or the generated data file during a specified period of time (e.g., N days). When the period of storing the user multimedia content or the data file exceeds the specified period, the server 401 may delete the user multimedia content or the generated data file.
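A server-side sketch of the retention behaviour in operation 1409: stored user multimedia content is kept for the specified period and deleted afterwards. The 7-day retention period is an arbitrary stand-in for the unspecified "N days", and the in-memory store is an assumption.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 7  # stand-in for the specified period ("N days")

class ContentStore:
    """Minimal sketch of server-side storage with a retention period."""
    def __init__(self, retention_days=RETENTION_DAYS):
        self.retention = timedelta(days=retention_days)
        self.items = {}  # content_id -> (stored_at, data)

    def store(self, content_id, data, now=None):
        self.items[content_id] = (now or datetime.utcnow(), data)

    def purge_expired(self, now=None):
        """Delete user multimedia content whose storage period exceeds the specified period."""
        now = now or datetime.utcnow()
        expired = [cid for cid, (stored_at, _) in self.items.items()
                   if now - stored_at > self.retention]
        for cid in expired:
            del self.items[cid]
        return expired

store = ContentStore()
store.store("abc123", b"...user multimedia content...", now=datetime(2022, 3, 1))
print(store.purge_expired(now=datetime(2022, 3, 20)))  # ['abc123']: past the retention period
```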
In operation 1411, the server 401 may transmit link information corresponding to the user multimedia content or the data file to the electronic device 201. The link information may include URL information or QR code information.
In operation 1413, the electronic device 201 may receive the link information from the server 401 through the communication module 330. The electronic device 201 may display an image (e.g., a QR code) related to the link information. For example, when a user who executes an application is not registered, the electronic device 201 may display an image related to the link information.
In operation 1415, the electronic device 201 may transmit the link information to another electronic device 230 through the communication module 330. For example, when a user who executes an application is not registered, the other electronic device 230 may acquire link information by scanning an image related to the link information displayed by the electronic device 201 with a camera of the other electronic device 230. In another example, when a user executing an application registers, the electronic device 201 may send the link information to another electronic device 230 corresponding to the user identification information.
According to an embodiment, the electronic device 201 may send the user multimedia content or data file to another electronic device 230 through the communication module 330.
In operation 1417, another electronic device 230 may access the server 401 based on the link information.
According to an embodiment, during a period in which the server 401 stores user multimedia content, another electronic device 230 may access the server 401 based on the link information. When another electronic device 230 accesses the server 401 based on the link information while the server 401 stores the user multimedia content, the other electronic device 230 may play (which may include a first play or a play at another time after the first time) the user multimedia content.
According to an embodiment, after a period in which the server 401 stores the user multimedia content, another electronic device 230 may access the server 401 based on the link information. When another electronic device 230 accesses the server 401 after a period in which the server 401 stores the user multimedia content, the other electronic device 230 may receive the server multimedia content from the server 401. The server multimedia content may be multimedia content including information provided by the server 401. For example, the server multimedia content may include advertisements related to the multimedia content or information about the space 210 in which the object corresponding to the multimedia content is located (e.g., next exhibition planning information).
In operation 1419, when the other electronic device 230 accesses the server 401 during a period in which the server 401 stores the user multimedia content, the other electronic device 230 may download the user multimedia content stored in the server 401 from the server 401 based on the received link information.
In operation 1421, another electronic device 230 may play (which may include playing for the first time or at another time after the first time) the user multimedia content. Another electronic device 230 may play (which may include playing for the first time or at another time after the first time) the downloaded user multimedia content.
According to an embodiment, when another electronic device 230 accesses the server 401 based on the link information during a period in which the server 401 stores the user multimedia content, the other electronic device 230 may play a stream of the user multimedia content.
According to an embodiment, when another electronic device 230 accesses the server 401 based on the link information after a period in which the server 401 stores the user multimedia content, the other electronic device 230 may play the server multimedia content provided from the server 401.
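The access branch described above could be sketched from the server's point of view as follows: while the user multimedia content is still within its storage period, the link resolves to that content (for streaming or download); afterwards, the link resolves to server multimedia content such as an advertisement or next exhibition planning information. The retention period, dictionary layout, and content payloads are assumptions.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=7)  # assumed stand-in for the storage period ("N days")
SERVER_CONTENT = {"type": "server_multimedia", "body": "next exhibition planning information"}

def resolve_link(link_id, store, now):
    """Return what another electronic device receives when it follows the link."""
    entry = store.get(link_id)
    if entry and now - entry["stored_at"] <= RETENTION:
        return {"type": "user_multimedia", "body": entry["data"]}  # stream or download and play
    return SERVER_CONTENT                                          # served after the storage period

store = {"abc123": {"stored_at": datetime(2022, 3, 1), "data": b"...video..."}}
print(resolve_link("abc123", store, now=datetime(2022, 3, 3))["type"])  # user_multimedia
print(resolve_link("abc123", store, now=datetime(2022, 4, 1))["type"])  # server_multimedia
```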
Fig. 15 shows a flowchart of an operation of sharing user multimedia content according to an embodiment.
In operation 1501, the electronic device 201 may play (may include playing for the first time or another time after the first time) multimedia content based on the location of the electronic device 201. The electronic device 201 may play multimedia content corresponding to an area in which the electronic device 201 is located. For example, in a case where the electronic device 201 is located in a specific area (e.g., the first area 211 of fig. 2) among a plurality of areas included in the space 210, the electronic device 201 may play multimedia content corresponding to the specific area. Operation 1501 of fig. 15 may correspond to operation 1401 of fig. 14 and operation 503 of fig. 5.
In operation 1503, the electronic device 201 may acquire interest information about the played multimedia content. The electronic device 201 may obtain interest information regarding each of a plurality of played multimedia content. For example, when the electronic device 201 plays the first multimedia content, the electronic device 201 may obtain first interest information about the first multimedia content. In another example, when the electronic device 201 plays (which may include playing for the first time or at another time after the first time) the second multimedia content, the electronic device 201 may obtain second interest information regarding the second multimedia content. The electronic device 201 may determine the user's interest level in each of the played multimedia contents based on the acquired interest information (e.g., the first interest information or the second interest information). Operation 1503 of fig. 15 may correspond to operation 1403 of fig. 14 and operation 505 of fig. 5.
In operation 1505, the electronic device 201 may transmit the acquired interest information to the server 401. For example, the electronic device 201 may send first interest information about the first multimedia content and/or second interest information about the second multimedia content to the server 401.
According to an embodiment, when the electronic device 201 acquires a plurality of pieces of interest information, the electronic device 201 may transmit all or some of the plurality of pieces of interest information to the server 401. For example, when the electronic device 201 acquires 10 pieces of interest information, the electronic device 201 may transmit the 10 pieces of interest information to the server 401. In another example, when the electronic device 201 acquires 10 pieces of interest information, the electronic device 201 may transmit the first 5 pieces of interest information determined to have a high interest level in the played multimedia content among the 10 pieces of interest information to the server 401.
According to an embodiment, when the electronic device 201 acquires a plurality of pieces of interest information, the electronic device 201 may transmit all or some of the plurality of pieces of interest information to the server 401 based on an input of the user. When the electronic device 201 acquires a plurality of pieces of interest information, the electronic device 201 may acquire a user input to specify the number of pieces of interest information to be transmitted to the server 401. The electronic device 201 may transmit the specified number of pieces of interest information among the plurality of pieces of interest information to the server 401 based on the acquired user input. For example, when the electronic device 201 acquires 10 pieces of interest information, the electronic device 201 may acquire a user input to specify the number of pieces (e.g., 5 pieces) of interest information to be transmitted to the server 401. The electronic device 201 may transmit the specified number (e.g., 5) of pieces of interest information among the plurality of pieces of interest information to the server 401.
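A short sketch of the transmission choice described above: the device can send every acquired piece of interest information, or only a user-specified number of the pieces with the highest interest level. The 'level' field and the example values are assumptions.

```python
def pieces_to_send(interest_pieces, user_specified_count=None):
    """Return all pieces, or only the top-ranked pieces when the user specifies a number."""
    ranked = sorted(interest_pieces, key=lambda p: p["level"], reverse=True)
    if user_specified_count is None:
        return ranked                      # send all acquired pieces of interest information
    return ranked[:user_specified_count]   # send only the specified number of top-ranked pieces

levels = [0.9, 0.2, 0.7, 0.4, 0.8, 0.1, 0.6, 0.3, 0.5, 0.95]
pieces = [{"content": f"content_{i}", "level": lvl} for i, lvl in enumerate(levels)]
print(len(pieces_to_send(pieces)))                          # 10: all pieces
print([p["content"] for p in pieces_to_send(pieces, 5)])    # the 5 pieces with the highest level
```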
In operation 1507, the electronic device 201 may send a request for generating the user multimedia content to the server 401. The electronic device 201 may obtain user input to send a request to generate user multimedia content. In response to obtaining the user input, the electronic device 201 may send a request to the server 401 for generating the user multimedia content. Operation 1505 of fig. 15 and operation 1507 of fig. 15 may change order or may be performed substantially simultaneously. For example, in response to the electronic device 201 sending a request to the server 401 for generating user multimedia content, the electronic device 201 may send interest information to the server 401. In another example, in response to the electronic device 201 obtaining user input for generating user multimedia content, the electronic device 201 may send a request for generating user multimedia content and interest information to the server 401 substantially simultaneously.
According to an embodiment, the electronic device 201 may automatically or manually send a request for generating user multimedia content and multimedia content for generating user multimedia content and/or content related to the multimedia content (e.g., photo content, video content, or AR content) to the server 401. For example, the electronic device 201 may automatically or manually send multimedia content played by the electronic device 201 and content generated through interactions related to the played multimedia content to the server 401, while sending a request to the server 401 for generating user multimedia content. The content generated through the interaction may include a photo, a video, or AR content related to an object created by photographing the object corresponding to the multimedia content.
In operation 1509, the server 401 may generate user multimedia content based on the interest information received from the electronic device 201. The server 401 may generate the user multimedia content based on a plurality of pieces of interest information (e.g., first interest information, second interest information) received from the electronic device 201. Operation 1509 of fig. 15 may substantially correspond to operation 1405 of fig. 14 and operation 507 of fig. 5, differing only in the entity that performs the operation.
In operation 1511, the server 401 may store the generated user multimedia content. The server 401 may store the generated user multimedia content during a specified period of time (e.g., N days). When the period of storing the user multimedia content exceeds a specified period, the server 401 may delete the user multimedia content. In operation 1513, the server 401 may transmit link information corresponding to the user multimedia content to the electronic device 201. The link information may include URL information or QR code information.
In operation 1515, the electronic device 201 may receive the link information from the server 401 through the communication module 330. The electronic device 201 may display a UI (e.g., QR code) related to the link information received from the server 401. For example, when a user who executes the application is not registered, the electronic device 201 may display the UI related to the link information through the display 340.
In operation 1517, the electronic device 201 may transmit the link information to another electronic device 230 through the communication module 330. For example, when the electronic device 201 does not perform user registration of an application, another electronic device may acquire link information by scanning a UI (e.g., QR code) related to the link information displayed by the electronic device 201 through a camera of the other electronic device 230. In another example, when the electronic device 201 performs user registration of an application, the electronic device 201 may transmit link information to another electronic device 230 corresponding to user identification information for user registration of the application.
In operation 1519, the other electronic device 230 may access the server 401 based on the link information received from the electronic device 201. For example, the other electronic device 230 may receive a user input selecting the received link information. In response to obtaining the user input selecting the received link information, the other electronic device 230 may access the server 401.
According to an embodiment, when another electronic device 230 accesses the server 401 during a period in which the server 401 stores the user multimedia content, the other electronic device 230 may play the user multimedia content. For example, when another electronic device 230 accesses the server 401 while the server 401 stores the user multimedia content, the other electronic device 230 may play a stream of the user multimedia content, or may download the user multimedia content from the server 401 and may play the user multimedia content.
According to an embodiment, when another electronic device 230 accesses the server 401 after a period in which the server 401 stores the user multimedia content, the other electronic device 230 may receive the server multimedia content from the server 401. The server multimedia content may be multimedia content separately provided by the server 401. For example, the server multimedia content may include advertisements related to the multimedia content or information about the space 210 in which the object corresponding to the multimedia content is located (e.g., next exhibition planning information).
In operation 1521, the other electronic device 230 may download the user multimedia content corresponding to the link information from the server 401 that it has accessed based on the link information. For example, when the other electronic device 230 accesses the server 401 during the period in which the server 401 stores the user multimedia content, the other electronic device 230 may download the user multimedia content corresponding to the link information from the server 401.
In operation 1523, another electronic device 230 may play the user multimedia content. According to an embodiment, another electronic device 230 may play a stream of user multimedia content. For example, when the other electronic device 230 accesses the server 401 based on the link information during a period in which the server 401 stores the user multimedia content, the other electronic device 230 may play a stream of the user multimedia content corresponding to the link information.
According to an embodiment, another electronic device 230 may play the user multimedia content downloaded from the server 401. For example, when another electronic device 230 accesses the server 401 based on the link information after a period in which the server 401 stores the user multimedia content, the other electronic device 230 may download the user multimedia content from the server 401. Another electronic device 230 may play the downloaded user multimedia content.
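The branching behavior of the other electronic device 230 in operations 1519 to 1523 may be summarized by the following sketch; the function names are hypothetical and only illustrate whether user multimedia content or server multimedia content is obtained.

```python
def resolve_link(content_id, fetch_user_content, get_server_content):
    """Decide what the other electronic device plays for a given link."""
    content = fetch_user_content(content_id)
    if content is not None:
        # Within the storage period: stream or download and play the user multimedia content.
        return {"type": "user_multimedia_content", "data": content}
    # After the storage period: the server provides its own content instead, e.g.
    # advertisements or next-exhibition planning information.
    return {"type": "server_multimedia_content", "data": get_server_content()}
```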
According to an embodiment, the electronic device 201 may include a UWB communication module 320, a communication module 330, and a processor 310 operably connected to the UWB communication module 320 and the communication module 330. The processor 310 may determine the location of the electronic device 201 based on UWB signals received through the UWB communication module 320. The processor 310 may determine an area in which the electronic device 201 is located among the plurality of areas based on a result of comparing the location with map information defining the plurality of areas included in the space in which the external device transmitting the UWB signal is installed. The processor 310 may play the multimedia content corresponding to the region (the playing may be an initial playback or a subsequent playback after the initial playback). The processor 310 may obtain interest information regarding a plurality of multimedia contents including the multimedia content played by the electronic device 201. The processor 310 may select at least one multimedia content from the plurality of multimedia contents based on the interest information. The processor 310 may generate the user multimedia content by using the at least one selected multimedia content. The processor 310 may transmit the user multimedia content to the server 401 through the communication module 330.
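For illustration, a minimal sketch of the area determination described above is given below, assuming rectangular areas in the map information; the shapes, names, and coordinates are assumptions and not part of the disclosed embodiments.

```python
from dataclasses import dataclass

@dataclass
class Area:
    """A rectangular area defined by the map information (shape is an assumption)."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def determine_area(position, map_info):
    """Return the area of map_info that contains the UWB-derived position, if any."""
    x, y = position
    for area in map_info:
        if area.contains(x, y):
            return area
    return None

# Example: a space divided into two areas, each of which has its own multimedia content.
map_info = [Area("first area", 0, 0, 5, 5), Area("second area", 5, 0, 10, 5)]
current = determine_area((6.2, 1.8), map_info)   # -> the Area named "second area"
```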
According to an embodiment, the interest information may include first information about a time when the electronic device 201 stays in an area in which the electronic device is located, second information about the playing of the plurality of multimedia contents, and/or third information about user interactions.
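One possible in-memory representation of such interest information is sketched below; the field names mirror the first, second, and third information described above but are otherwise assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class InterestInfo:
    stay_time_s: float = 0.0        # first information: time spent in the area
    play_time_s: float = 0.0        # second information: how long the content was played
    total_play_time_s: float = 0.0  # second information: total length of the content
    interactions: list = field(default_factory=list)  # third information: user interactions (e.g. photos, AR)
```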
According to an embodiment, in response to the number of multimedia content played by the electronic device 201 satisfying the specified number, the processor 310 may determine a value by dividing the play time of the played multimedia content by the total play time. The processor 310 may select at least one multimedia content from the played plurality of multimedia contents based on the magnitude of the value. The processor 310 may generate the user multimedia content by using at least one selected multimedia content.
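The selection rule described above may be illustrated with the following sketch, in which the specified number and the number of selected contents are assumed values.

```python
def select_contents(play_records, specified_number=5, top_k=3):
    """play_records: list of (content_id, play_time_s, total_play_time_s) tuples."""
    if len(play_records) < specified_number:
        return []                                            # the specified number is not yet satisfied
    scored = [
        (content_id, play_time / total_play_time)
        for content_id, play_time, total_play_time in play_records
        if total_play_time > 0
    ]
    scored.sort(key=lambda item: item[1], reverse=True)      # a larger ratio indicates higher interest
    return [content_id for content_id, _ in scored[:top_k]]

# Example: content "b" was played almost in full and is therefore selected first.
records = [("a", 30, 120), ("b", 110, 120), ("c", 15, 60), ("d", 60, 90), ("e", 5, 120)]
selected = select_contents(records)   # -> ["b", "d", "a"]
```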
According to an embodiment, when the electronic device 201 is located in the first space, the processor 310 may acquire interest information about a plurality of first multimedia contents played by the electronic device. The processor 310 may select a third multimedia content from the plurality of first multimedia contents based on the interest information. When the electronic device 201 is located in the second space, the processor 310 may acquire interest information about a plurality of second multimedia contents played by the electronic device 201. The processor 310 may select fourth multimedia content from the plurality of second multimedia content based on the interest information. The processor 310 may generate the user multimedia content by using the selected third multimedia content and fourth multimedia content.
According to an embodiment, the electronic device 201 may include a display 340 and a speaker 350. The processor 310 may display visual information corresponding to the multimedia content through the display 340. The processor 310 may output audio data corresponding to the multimedia content through the speaker 350.
According to an embodiment, while playing the multimedia content, the processor 310 may determine whether the area in which the electronic device 201 is located is changed based on the UWB signal. In response to determining that the area in which the electronic device 201 is located has changed, the processor 310 may display, through the display 340, a notification indicating that the played multimedia content is changed to the multimedia content corresponding to the changed area.
According to an embodiment, while playing the multimedia content, the processor 310 may determine whether the area in which the electronic device 201 is located is changed based on the UWB signal. In response to determining that the region in which the electronic device 201 is located has changed, the processor 310 may change the played multimedia content to multimedia content corresponding to the changed region and may play the multimedia content.
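The area-change handling described above may be illustrated as follows; the content mapping and the callback used for the notification are assumptions for illustration.

```python
def monitor_area_changes(positions, area_of, content_by_area, on_change=None):
    """positions: iterable of UWB-derived (x, y) samples; area_of: maps a position to an
    area name (for example, the determine_area sketch above); yields content to play."""
    current_area = None
    for position in positions:
        area = area_of(position)
        if area is not None and area != current_area:
            if current_area is not None and on_change is not None:
                on_change(current_area, area)        # e.g. display the change notification first
            current_area = area
            yield content_by_area.get(area)          # play the content mapped to the new area
```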
According to an embodiment, the processor 310 may receive link information corresponding to user multimedia content from the server 401. The link information may include at least one of URL information or QR code information.
According to an embodiment, the processor 310 may transmit link information corresponding to the user multimedia content received from the server 401 through the communication module 330 to another electronic device 230.
According to an embodiment, the processor 310 may display a user interface corresponding to the region through the display 340.
According to an embodiment, in response to determining that the area in which the electronic device 201 is located is a first area, the processor 310 may display a first user interface 701a corresponding to the first area through the display 340. In response to determining that the region in which the electronic device 201 is located is a second region, the processor 310 may display a second user interface 702a corresponding to the second region through the display 340.
According to an embodiment, the user interface may include a first portion to display information related to the location of the electronic device 201, a second portion to display an image of an object located near the electronic device, and a third portion to display text corresponding to the played voice.
According to an embodiment, the processor 310 may determine whether the location of the electronic device is changed based on the UWB signal. In response to determining that the location of the electronic device 201 has changed, the processor 310 may change the size of the first portion and the second portion.
According to an embodiment, when the electronic device 201 is located farther than a specified distance from an object located in the vicinity of the electronic device 201, the processor 310 may display a third user interface including a second portion having a first size and a third portion having a second size.
According to an embodiment, when the electronic device 201 is within a specified distance from an object located in proximity to the electronic device 201, the processor 310 may display a fourth user interface including a second portion having a third size smaller than the first size and a third portion having a fourth size larger than the second size.
According to an embodiment, the processor 310 may determine whether the location of the electronic device 201 is within a specified distance from an object located in the vicinity of the electronic device 201 based on the UWB signal. When the location of the electronic device 201 is within a specified distance, the processor 310 may display a notification indicating that multimedia content corresponding to the object is played.
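The distance-dependent user interfaces described above may be illustrated with the following sketch; the specified distance and the portion sizes are assumed values.

```python
SPECIFIED_DISTANCE_M = 2.0   # the specified distance; the value is assumed

def choose_layout(distance_to_object_m):
    """Return the relative heights of the second and third portions for the given distance."""
    if distance_to_object_m > SPECIFIED_DISTANCE_M:
        # Third user interface: second portion at the first size, third portion at the second size.
        return {"second_portion": 0.6, "third_portion": 0.2, "notify": False}
    # Fourth user interface: a smaller image (second) portion, a larger text (third) portion,
    # and a notification that the content corresponding to the object is being played.
    return {"second_portion": 0.4, "third_portion": 0.4, "notify": True}

print(choose_layout(3.5))   # farther than the specified distance
print(choose_layout(1.0))   # within the specified distance
```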
According to an embodiment, a method of operation of the electronic device 201 may include: determining a location of the electronic device 201 based on UWB signals received by the UWB communication module 320 of the electronic device 201; determining an area in which the electronic device is located among the plurality of areas based on a result of comparing the determined position with map information defining the plurality of areas included in the space in which the external device transmitting the UWB signal is installed; playing the multimedia content corresponding to the region; acquiring interest information on a plurality of multimedia contents including multimedia contents played by the electronic device 201; selecting at least one multimedia content from a plurality of multimedia contents based on the interest information; generating user multimedia content by using at least one selected multimedia content; and transmitting the user multimedia content to the server 401 through the communication module 330 of the electronic device 201.
According to an embodiment, a method of operation of the electronic device 201 may include: determining whether an area in which the electronic device 201 is located is changed based on the UWB signal while the multimedia content is played; and in response to determining that the region in which the electronic device 201 is located is changed, changing the played multimedia content to multimedia content corresponding to the changed region, and playing the multimedia content.
According to an embodiment, a method of operation of the electronic device 201 may include: in response to determining that the area in which the electronic device 201 is located is a first area, displaying, by the display 340, a first user interface corresponding to the first area; and in response to determining that the region in which the electronic device 201 is located is a second region, displaying, by the display 340, a second user interface corresponding to the second region.
According to an embodiment, the positioning system 400 may include the electronic device 201, the server 401, and an external device (e.g., UWB anchor 403 of fig. 4). The electronic device 201 may determine the location of the electronic device 201 based on UWB signals received through the UWB communication module 320 included in the electronic device 201, may determine an area in which the electronic device 201 is located among a plurality of areas based on a result of comparing the location with map information defining the plurality of areas included in the space in which the external device transmitting the UWB signal is installed, may play multimedia content corresponding to the area, may acquire interest information regarding a plurality of multimedia contents including the multimedia content played by the electronic device 201, may select at least one multimedia content from the plurality of multimedia contents based on the interest information, may generate user multimedia content by using the at least one selected multimedia content, and may transmit the generated user multimedia content to the server 401 through the communication module 330. The server 401 may store the user multimedia content received from the electronic device 201 and may transmit link information of the user multimedia content to the electronic device 201. The electronic device 201 may transmit the link information received from the server 401 to the other electronic device 230, and the other electronic device 230 may download the user multimedia content corresponding to the link information received from the electronic device 201 from the server 401.
The effects achieved by the present disclosure are not limited to those mentioned above, and other effects not mentioned above can be clearly understood by those skilled in the art based on the description provided below.
The methods based on the claims or embodiments disclosed in this disclosure may be implemented in hardware, software, or a combination of both.
When implemented in software, a computer-readable storage medium may be provided for storing one or more programs (software modules). One or more programs stored in the computer-readable storage medium are configured to be executed by one or more processors in the electronic device. The one or more programs include instructions for allowing the electronic device to perform a method based on the claims or embodiments disclosed in the present disclosure.
Programs (software modules or software) may be stored in random access memory (RAM), non-volatile memory including flash memory, read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), a magnetic disk storage device, a compact disc ROM (CD-ROM), a digital versatile disc (DVD) or another form of optical storage device, or a magnetic cassette. Alternatively, the program may be stored in a memory configured as a combination of some or all of these storage media. Further, a plurality of such memories may be configured.
Further, the program may be stored in an attachable storage device that can access the electronic device through a communication network such as the Internet, an intranet, a local area network (LAN), a wide area network (WAN), or a storage area network (SAN), or through a communication network configured by combining these networks. The storage device may access a device that performs an embodiment of the present disclosure via an external port. Further, a separate storage device on the communication network may access a device that performs an embodiment of the present disclosure.
In the above-described embodiments of the present disclosure, elements included in the present disclosure are expressed in singular or plural form according to the embodiments. However, for convenience of explanation, singular or plural forms are appropriately selected according to suggested cases, and the present disclosure is not limited to a single element or a plurality of elements. Elements expressed in plural may be arranged in the singular or elements expressed in the singular may be arranged in plural.
Although specific embodiments have been described in the detailed explanation of the present disclosure, various changes in form and detail may be made without departing from the scope of the disclosure. Accordingly, the scope of the present disclosure is defined not by the above-described embodiments but by the appended claims and equivalents thereof.

Claims (15)

1. An electronic device, comprising:
an ultra wideband UWB communication module;
a communication module; and
a processor operatively connected to the UWB communication module and the communication module,
wherein the processor is configured to:
determining a location of the electronic device based on UWB signals received by the UWB communication module;
determining an area in which the electronic device is located among a plurality of areas based on a result of comparing the location with map information defining the plurality of areas included in a space in which the external device transmitting the UWB signal is installed;
playing the multimedia content corresponding to the region;
acquiring interest information on a plurality of multimedia contents including multimedia contents played by the electronic device;
selecting at least one multimedia content from the plurality of multimedia contents based on the interest information;
generating user multimedia content by using at least one selected multimedia content; and
transmitting the user multimedia content to a server through the communication module.
2. The electronic device of claim 1, wherein the interest information comprises: first information about a time when the electronic device stays in an area in which the electronic device is located, second information about playback of the plurality of multimedia contents, and third information related to user interaction.
3. The electronic device of claim 1, wherein the processor is configured to:
determining a value by dividing a play time of the multimedia content by a total play time in response to the number of multimedia contents played by the electronic device satisfying a specified number;
selecting at least one multimedia content from the plurality of multimedia contents based on the magnitude of the value; and
the user multimedia content is generated by using at least one selected multimedia content.
4. The electronic device of claim 1, wherein the processor is configured to:
acquiring interest information about a plurality of first multimedia contents played by the electronic device when the electronic device is located in a first space;
selecting a third multimedia content from the plurality of first multimedia contents based on the interest information;
acquiring interest information about a plurality of second multimedia contents played by the electronic device when the electronic device is located in a second space;
selecting fourth multimedia content from the plurality of second multimedia content based on the interest information; and
the user multimedia content is generated by using the selected third multimedia content and fourth multimedia content.
5. The electronic device of claim 1, comprising a display and a speaker,
wherein the processor is configured to:
displaying visual information corresponding to the multimedia content through the display; and
audio data corresponding to the multimedia content is output through the speaker.
6. The electronic device of claim 1, wherein the processor is configured to:
determining whether an area in which the electronic device is located changes based on the UWB signal when the multimedia content is played; and
in response to determining that the region in which the electronic device is located has changed, displaying, via a display, a notification indicating that the played multimedia content has changed to multimedia content corresponding to the changed region.
7. The electronic device of claim 1, wherein the processor is configured to:
determining whether an area in which the electronic device is located changes based on the UWB signal when the multimedia content is played; and
in response to determining that the region in which the electronic device is located has changed, changing the played multimedia content to multimedia content corresponding to the changed region, and playing the multimedia content.
8. The electronic device of claim 1, wherein the processor is configured to receive link information corresponding to the user multimedia content from the server, and
wherein the link information includes at least one of URL information or QR code information.
9. The electronic device of claim 1, wherein the processor is configured to send the link information corresponding to the user multimedia content received from the server through the communication module to another electronic device.
10. The electronic device of claim 1, wherein the processor is configured to display, via a display, a user interface corresponding to the region.
11. The electronic device of claim 1, wherein the processor is configured to:
responsive to determining that the region in which the electronic device is located is a first region, displaying, by a display, a first user interface corresponding to the first region; and
in response to determining that the region in which the electronic device is located is a second region, a second user interface corresponding to the second region is displayed through the display.
12. The electronic device of claim 10, wherein the user interface comprises: a first portion to display information related to the location of the electronic device, a second portion to display an image of an object located in proximity to the electronic device, and a third portion to display text corresponding to the played voice.
13. The electronic device of claim 12, wherein the processor is configured to:
determining whether the location of the electronic device changes based on the UWB signal; and
in response to determining that the location of the electronic device has changed, changing a size of the first portion and a size of the second portion.
14. The electronic device of claim 12, wherein the processor is configured to display a third user interface including the second portion having a first size and the third portion having a second size when the electronic device is located farther than a specified distance from an object located in proximity to the electronic device.
15. The electronic device of claim 14, wherein the processor is configured to display a fourth user interface including the second portion having a third size smaller than the first size and the third portion having a fourth size larger than the second size when the electronic device is within the specified distance.
CN202280034700.5A 2021-05-13 2022-02-25 Content sharing method and electronic device thereof Pending CN117337572A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2021-0062199 2021-05-13
KR10-2021-0077506 2021-06-15
KR1020210077506A KR20220154574A (en) 2021-05-13 2021-06-15 Method and electronic device for sharing content
PCT/KR2022/002756 WO2022239939A1 (en) 2021-05-13 2022-02-25 Content sharing method and electronic device therefor

Publications (1)

Publication Number Publication Date
CN117337572A true CN117337572A (en) 2024-01-02

Family

ID=89283503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280034700.5A Pending CN117337572A (en) 2021-05-13 2022-02-25 Content sharing method and electronic device thereof

Country Status (1)

Country Link
CN (1) CN117337572A (en)

Similar Documents

Publication Publication Date Title
CN108415705B (en) Webpage generation method and device, storage medium and equipment
US10931880B2 (en) Electronic device and method for providing information thereof
CN111026992B (en) Multimedia resource preview method, device, terminal, server and storage medium
CN108536295B (en) Object control method and device in virtual scene and computer equipment
US20230179856A1 (en) Electronic device including plurality of cameras and method of operating the same
US20210263639A1 (en) Electronic device sharing at least one object and method for controlling the same
CN112131473B (en) Information recommendation method, device, equipment and storage medium
US20230315156A1 (en) Electronic device for moving and displaying at least one object according to expansion of flexible display, and method for controlling same
US20230260280A1 (en) Electronic device for displaying ar object and method thereof
US20230360342A1 (en) Method for providing content creation function and electronic device supporting same
US20220345638A1 (en) Electronic device and method for controlling screen thereof
US20230030320A1 (en) Electronic device displaying user interface and method for operating the same
CN117337572A (en) Content sharing method and electronic device thereof
EP4287627A1 (en) Content sharing method and electronic device therefor
US20220383598A1 (en) Method and apparatus for displaying augmented reality object
KR20220154574A (en) Method and electronic device for sharing content
US20230164515A1 (en) Electronic device and method for recommending user action based on location
US20230244351A1 (en) Audio playback and screen display method and device therefor
US20240020084A1 (en) Screen sharing method and electronic device therefor
US11694441B2 (en) Electronic device correcting meta information of image and operating method thereof
US20230144615A1 (en) Electronic device comprising flexible display
US20220261435A1 (en) Method and electronic device for providing personalized media content
US11153018B2 (en) Electronic device and method for controlling electronic device
EP4365721A1 (en) Electronic device including flexible display and method for operating same
CN113409235B (en) Vanishing point estimation method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination