WO2021101006A1 - Electronic device for providing content based on the position of a reflection image of an external object, and method of operating the electronic device - Google Patents

Electronic device for providing content based on the position of a reflection image of an external object, and method of operating the electronic device

Info

Publication number
WO2021101006A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
content
user
display
external object
Prior art date
Application number
PCT/KR2020/008628
Other languages
English (en)
Korean (ko)
Inventor
이미영
박찬웅
배주윤
강동구
김현진
유임경
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2021101006A1


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/806Video cameras

Definitions

  • Various embodiments of the present invention relate to an electronic device and a method of operating the electronic device, and in particular to an electronic device that provides content based on the position of a reflection image of an external object reflected by the display of the electronic device, and to a method of operating the electronic device.
  • A mirror display, which is a type of display, can display together a screen generated from data stored in the memory of the electronic device and a reflection image of an external object in front of the display, produced by the display reflecting that object.
  • Using augmented reality (AR), a reflection image of the user and information related to the user may be presented together.
  • To do so, an electronic device including a mirror display requires the coordinates on the display at which the reflected image is displayed.
  • However, the electronic device cannot recognize the reflected image, and therefore cannot determine the coordinates on the display at which the reflected image is displayed.
  • An electronic device according to various embodiments includes a display, a camera, and a processor. The processor may be configured to control the display to display first content; to detect, while the first content is displayed, movement of an external object related to a visual object included in the first content; in response to confirming that the position of the visual object matches the position of the reflection image of the external object reflected and displayed by the display, to obtain position information of the reflection image based on the position information of the visual object; and to determine the position at which second content is to be displayed based on the position information of the reflection image.
  • A method of operating an electronic device according to various embodiments may include: displaying first content; detecting movement of an external object related to a visual object included in the first content while the first content is displayed; in response to confirming that the position of the visual object matches the position of the reflection image of the external object reflected and displayed by the display, obtaining position information of the reflection image based on the position information of the visual object; and determining the position at which second content is to be displayed based on the position information of the reflection image.
  • An electronic device and a method of operating the electronic device according to various embodiments display first content including at least one visual object, and when the position of the visual object and the position of the reflection image coincide, the position of the reflection image can be accurately obtained based on the position of the visual object.
  • An electronic device and a method of operating the electronic device according to various embodiments can obtain the position of a reflection image, so the position of second content related to the reflection image can be determined based on the position of the reflection image. Accordingly, a service that presents a reflection image and second content together in the form of augmented reality can be provided.
  • An electronic device and a method of operating the electronic device according to various embodiments can take into account the distance between an external object and the display, so an accurate position of the reflection image can be obtained.
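  • Why the distance matters follows from plane-mirror geometry: the viewer sees the reflection of a point where the line from the viewer's eye to the point's virtual image (mirrored behind the display plane) intersects that plane. The following sketch assumes an ideal plane mirror at z = 0 with the eye and the object in front of it (z > 0); the patent's actual computation is not specified here.

```python
def reflection_on_screen(eye, point):
    """Return (x, y) on the display plane where `eye` sees the reflection of `point`.

    `eye` and `point` are (x, y, z) tuples with z > 0 measured from the
    display plane (an ideal plane mirror at z = 0).
    """
    ex, ey, ez = eye
    px, py, pz = point
    # The virtual image of `point` sits at (px, py, -pz). Parametrize the line
    # E + t * (P' - E) and solve for the z = 0 crossing: t = ez / (ez + pz).
    t = ez / (ez + pz)
    return (ex + t * (px - ex), ey + t * (py - ey))
```

For example, with the eye at (0, 1.6, 1.0) and a hand at (0.5, 1.0, 1.0), the reflection appears on screen at (0.25, 1.3); moving the hand closer to the display, to z = 0.5, shifts it to about (0.33, 1.2). The on-screen position thus changes with the object-to-display distance, which is why a camera-only estimate of the object's position is not sufficient.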
  • An electronic device and a method of operating the electronic device according to various embodiments can identify the characteristics of an external object in the process of obtaining the position of its reflection image, and provide second content suited to those characteristics. Accordingly, a content-providing service suited to the user can be implemented.
  • An electronic device and a method of operating the electronic device according to various embodiments can identify the characteristics of the user's body in the process of obtaining the position of the user's reflection image, and provide exercise content suited to those characteristics. Accordingly, a service that provides exercise content appropriate to the user's body can be implemented.
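  • One way such a service could select exercise content is sketched below. The body metrics, thresholds, and content names are all invented for illustration; the patent does not specify them.

```python
def select_exercise_content(height_cm: float, arm_span_cm: float) -> str:
    """Pick a workout program from rough body proportions (illustrative only)."""
    ratio = arm_span_cm / height_cm
    if ratio > 1.05:
        # Reach is long relative to height: favor upper-body work.
        return "upper-body stretching routine"
    if ratio < 0.95:
        return "lower-body strength routine"
    return "full-body routine"
```

For instance, a user measured at 170 cm tall with a 182 cm arm span would be offered the upper-body routine under these invented thresholds.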
  • FIG. 1 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram of a program according to various embodiments.
  • FIGS. 3A and 3B are diagrams illustrating an electronic device according to various embodiments of the present disclosure.
  • FIG. 4 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIGS. 5A, 5B, 5C, 5D, 5E, 5F, 5G, 5H, and 5I are diagrams illustrating an embodiment of displaying second content while first content is displayed in an electronic device according to various embodiments of the present disclosure.
  • FIGS. 6A and 6B are diagrams illustrating an embodiment of displaying first content and a visual object in an electronic device according to various embodiments of the present disclosure.
  • FIGS. 7A, 7B, 7C, and 7D are diagrams illustrating an embodiment of additionally obtaining body information of a user in an electronic device according to various embodiments of the present disclosure.
  • FIGS. 8A, 8B, and 8C are diagrams illustrating second content displayed based on the user's body information in an electronic device according to various embodiments of the present disclosure.
  • FIGS. 9A, 9B, 9C, and 9D are diagrams illustrating second content displayed based on the user's body information in an electronic device according to various embodiments of the present disclosure.
  • FIGS. 10A, 10B, 10C, and 10D are diagrams illustrating second content displayed based on a result of identifying another external object in an electronic device according to various embodiments of the present disclosure.
  • FIGS. 11A, 11B, 11C, and 11D are diagrams illustrating second content displayed according to the distance between the electronic device and an external object in an electronic device according to various embodiments of the present disclosure.
  • FIG. 12 is an operation flowchart illustrating a method of operating an electronic device according to various embodiments of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network).
  • the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components may be omitted or one or more other components may be added to the electronic device 101.
  • some of these components may be implemented as one integrated circuit.
  • In some embodiments, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be embedded in the display device 160 (e.g., a display).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from another component (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the nonvolatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and a secondary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121. Additionally or alternatively, the secondary processor 123 may be set to use less power than the main processor 121, or to be specialized for a designated function. The secondary processor 123 may be implemented separately from the main processor 121 or as a part of it.
  • The secondary processor 123 may, for example, control at least some of the functions or states associated with at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state. According to an embodiment, the secondary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as a part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, software (eg, the program 140) and input data or output data for commands related thereto.
  • the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
  • the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
  • the input device 150 may receive a command or data to be used for a component of the electronic device 101 (eg, the processor 120) from outside (eg, a user) of the electronic device 101.
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • The sound output device 155 may output a sound signal to the outside of the electronic device 101.
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls.
  • the receiver may be implemented separately from the speaker or as part of the speaker.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • According to an embodiment, the display device 160 may include touch circuitry set to sense a touch, or a sensor circuit (e.g., a pressure sensor) set to measure the strength of a force generated by a touch.
  • The audio module 170 may convert sound into an electrical signal, or conversely, convert an electrical signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or output sound through the sound output device 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101, or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used for the electronic device 101 to connect directly or wirelessly with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • The connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (e.g., the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that a user can perceive through tactile or motor sensations.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture a still image and a video.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101.
  • The power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support the establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel.
  • the communication module 190 operates independently of the processor 120 (eg, an application processor) and may include one or more communication processors supporting direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, the corresponding communication module may communicate with an external electronic device through the first network 198 (e.g., a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • The wireless communication module 192 may verify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive from the outside.
  • According to an embodiment, the antenna module 197 may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
  • According to an embodiment, the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for the communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190.
  • the signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • According to some embodiments, components other than the radiator (e.g., an RFIC) may additionally be formed as a part of the antenna module 197.
  • At least some of the components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the electronic devices 102 and 104 may be a device of the same or different type as the electronic device 101.
  • all or part of the operations executed by the electronic device 101 may be executed by one or more of the external electronic devices 102, 104, or 108.
  • According to an embodiment, when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or service instead of, or in addition to, executing the function or service itself. The one or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as it is or additionally and provide it as at least part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • The program 140 may include an operating system 142 for controlling one or more resources of the electronic device 101, middleware 144, or an application 146 executable in the operating system 142.
  • The operating system 142 may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™.
  • At least some of the programs 140 may, for example, be preloaded on the electronic device 101 at the time of manufacture, or downloaded or updated from an external electronic device (e.g., the electronic device 102 or 104, or the server 108) when used by a user.
  • the operating system 142 may control management (eg, allocation or retrieval) of one or more system resources (eg, process, memory, or power) of the electronic device 101.
  • Additionally or alternatively, the operating system 142 may include one or more driver programs for driving other hardware devices of the electronic device 101, for example, the input device 150, the sound output device 155, the display device 160, the audio module 170, the sensor module 176, the interface 177, the haptic module 179, the camera module 180, the power management module 188, the battery 189, the communication module 190, the subscriber identification module 196, or the antenna module 197.
  • the middleware 144 may provide various functions to the application 146 so that a function or information provided from one or more resources of the electronic device 101 can be used by the application 146.
  • The middleware 144 may include, for example, an application manager 201, a window manager 203, a multimedia manager 205, a resource manager 207, a power manager 209, a database manager 211, a package manager 213, a connectivity manager 215, a notification manager 217, a location manager 219, a graphic manager 221, a security manager 223, a telephony manager 225, or a voice recognition manager 227.
  • the application manager 201 may manage the life cycle of the application 146, for example.
  • the window manager 203 may manage one or more GUI resources used on a screen, for example.
  • The multimedia manager 205 may, for example, identify one or more formats required for the playback of media files, and encode or decode a corresponding media file using a codec suitable for that format.
  • the resource manager 207 may manage the source code of the application 146 or a memory space of the memory 130, for example.
  • The power manager 209 may, for example, manage the capacity, temperature, or power of the battery 189, and determine or provide related information necessary for the operation of the electronic device 101 using the corresponding information. According to an embodiment, the power manager 209 may interwork with a basic input/output system (BIOS) (not shown) of the electronic device 101.
  • the database manager 211 may create, search, or change a database to be used by the application 146, for example.
  • the package manager 213 may manage installation or update of an application distributed in the form of, for example, a package file.
  • the connectivity manager 215 may manage, for example, a wireless connection or a direct connection between the electronic device 101 and an external electronic device.
  • the notification manager 217 may provide a function for notifying the user of the occurrence of a designated event (eg, incoming call, message, or alarm), for example.
  • the location manager 219 may manage location information of the electronic device 101, for example.
  • the graphic manager 221 may manage, for example, one or more graphic effects to be provided to a user or a user interface related thereto.
  • the security manager 223 may provide, for example, system security or user authentication.
  • the telephony manager 225 may manage, for example, a voice call function or a video call function provided by the electronic device 101.
  • The voice recognition manager 227 may, for example, transmit a user's voice data to the server 108 and receive from the server 108 a command corresponding to a function to be performed in the electronic device 101 based at least in part on the voice data, or text data converted based at least in part on the voice data.
  • According to an embodiment, the middleware 144 may dynamically delete some existing components or add new components.
  • at least a part of the middleware 144 may be included as a part of the operating system 142 or implemented as separate software different from the operating system 142.
  • The application 146 may include, for example, a home 251, dialer 253, SMS/MMS 255, instant message (IM) 257, browser 259, camera 261, alarm 263, contacts 265, voice recognition 267, email 269, calendar 271, media player 273, album 275, watch 277, health 279 (e.g., biometric information measurement), or environmental information 281 (e.g., air pressure, humidity, or temperature measurement) application. According to an embodiment, the application 146 may further include an information exchange application (not shown) capable of supporting information exchange between the electronic device 101 and an external electronic device.
  • the information exchange application may include, for example, a notification relay application configured to deliver specified information (eg, a call, a message, or an alarm) to an external electronic device, or a device management application configured to manage an external electronic device.
  • The notification relay application may, for example, transmit notification information corresponding to a specified event (e.g., mail reception) generated by another application (e.g., the email application 269) of the electronic device 101 to an external electronic device. Additionally or alternatively, the notification relay application may receive notification information from an external electronic device and provide it to the user of the electronic device 101.
  • The device management application may, for example, control the power (e.g., turn-on or turn-off) of an external electronic device that communicates with the electronic device 101, or of some components thereof (e.g., the display device 160 or the camera module 180), or control a function (e.g., the brightness, resolution, or focus) of the external electronic device or its components.
  • the device management application may additionally or alternatively support installation, deletion, or update of an application operating in an external electronic device.
  • FIGS. 3A and 3B are diagrams illustrating an electronic device according to various embodiments of the present disclosure.
  • an electronic device 300 may include a display 310 and a camera 320.
  • According to various embodiments, the display 310 may display various screens under the control of a processor (e.g., the processor 120 of FIG. 1).
  • According to various embodiments, a reflection image of an external object (e.g., a user of the electronic device 300, or various objects carried by the user) in front of the display 310 may be reflected by the display 310 and displayed on the display 310.
  • the display 310 may display a screen corresponding to the data transmitted by the processor 120 while a reflection image of an external object is displayed.
  • the user of the electronic device 300 may check the screen and the reflection image displayed on the display 310 together.
  • According to various embodiments, the display 310 may be implemented as a mirror display to which a configuration for increasing the reflectance of light incident on the display 310 is added, in order to clearly display the reflection image of an external object. For example, the mirror display may increase the reflectance of light incident on the display 310 by attaching a mirror film to a polarizing plate among the components of the display 310.
  • the camera 320 may capture various external objects existing in front of the display 310.
  • the electronic device 300 may determine the location of the external object based on the captured image and provide various contents based on the location of the external object.
  • the electronic device 300 may display content using augmented reality (AR) on the display 310.
  • the electronic device 300 may provide content using augmented reality based on a location of a reflection image of an external object.
  • When the electronic device 300 fails to obtain the position information of the reflection image of an external object, the electronic device 300 cannot take the position of the reflection image into account in determining where to display content. In this case, the electronic device 300 may display content at an inappropriate position (e.g., a position different from that of the reflection image of the external object).
  • FIG. 3B is a diagram illustrating an embodiment in which an electronic device displays a reflection image and content according to various embodiments of the present disclosure.
  • a reflection image 330 of an external object that is displayed by reflecting an external object existing in front of the display 310 by the display 310 may be displayed on the display 310.
  • the display 310 may display the content 340 provided by the electronic device 300 on the display 310 while the reflection image 330 of the external object is displayed.
  • a position at which the reflection image 330 is displayed and a position at which the content 340 is displayed may be different from each other.
  • the phenomenon in which the location where the reflection image 330 is displayed and the location where the content 340 is displayed differ from each other may mean that the position determined for displaying the content 340, based on the location information of the external object 350 obtained by the electronic device 300 using the camera 320, differs from the position at which the reflection image 330 perceived by the actual user is displayed.
  • the phenomenon in which the location where the reflection image 330 is displayed and the location where the content 340 is displayed differ from each other may make it difficult to provide a service that must display the content 340 at the location where the reflection image 330 is displayed (e.g., an augmented reality-based service or a fitness coaching service).
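The mismatch arises because where a reflection appears on a mirror depends on the viewer's eye position as well as the object's position, which a single camera estimate does not capture. A minimal plane-mirror sketch of this geometry (the coordinate convention and function name are illustrative, not taken from the disclosure):

```python
def reflection_point_on_mirror(eye, obj):
    """Where on the mirror plane (z = 0) a viewer at `eye` sees the
    reflection of `obj`. Both points are (x, y, z) with z > 0 in front
    of the mirror; the virtual image of `obj` sits at z = -obj_z."""
    ex, ey, ez = eye
    ox, oy, oz = obj
    virtual = (ox, oy, -oz)      # mirror image behind the glass
    # Parameter t where the eye -> virtual-image ray crosses z = 0.
    t = ez / (ez + oz)
    x = ex + t * (virtual[0] - ex)
    y = ey + t * (virtual[1] - ey)
    return (x, y)

# A user 2 m from the mirror with eyes at 1.6 m, looking at their own
# waist (1.0 m): the waist's reflection appears at about 1.3 m on the
# glass, not 1.0 m -- so an object's position alone is not enough.
print(reflection_point_on_mirror((0.0, 1.6, 2.0), (0.0, 1.0, 2.0)))
```

This is why the camera-only estimate of the reflection's on-screen location can disagree with what the user actually sees.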
  • FIG. 4 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • an electronic device 400 may include a display (eg, the display 310 of FIG. 3) 410, a processor (eg, the processor 120 of FIG. 1) 420, and at least one camera (eg, the camera 320 of FIG. 3) 430.
  • the display 410 may display various screens based on the control of the processor 420.
  • the display 410 may be implemented as a mirror display with a configuration that increases the reflectance of the display.
  • the camera 430 may measure a distance between the camera 430 and an external object positioned in front of the display 410.
  • the camera 430 may be implemented as a depth camera or a vision sensor capable of measuring a distance between an external object and the camera 430.
  • the camera 430 implemented as a depth camera or a vision sensor may capture a depth image including an external object.
  • the depth image may be an image including distance information between the object and the camera 430.
  • the processor 420 may check the distance between the external object and the camera 430 based on the depth image captured by the camera 430.
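Reading a distance out of such a depth image might look like the following pure-Python sketch, which takes the median depth over the object's region to resist missing pixels (the data layout and the zero-means-invalid convention are assumptions; a real device would use its camera SDK's depth frames):

```python
from statistics import median

def object_distance(depth_image, bbox):
    """Median depth (e.g. in millimetres) inside the object's bounding
    box. `depth_image` is rows of per-pixel depth values; zeros mark
    pixels the depth camera could not measure and are ignored."""
    x0, y0, x1, y1 = bbox
    samples = [
        depth_image[r][c]
        for r in range(y0, y1)
        for c in range(x0, x1)
        if depth_image[r][c] > 0
    ]
    if not samples:
        raise ValueError("no valid depth samples in region")
    return median(samples)

depth = [
    [0,    2100, 2080, 0],
    [2120, 2090, 0,    2110],
    [0,    2095, 2105, 0],
]
print(object_distance(depth, (0, 0, 4, 3)))  # median of the valid pixels
```

The median is preferred over the mean here because depth sensors commonly return dropout (zero) pixels and outliers at object edges.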
  • the processor 420 may be operatively connected to the display 410 or the camera 430 to control the operation of the display 410 or the camera 430.
  • the processor 420 may calibrate or adjust the location of the content displayed on the display 410. Specifically, the processor 420 may calibrate the location at which content is to be displayed so that the location of the reflection image of the external object, displayed by the display 410 reflecting the external object in front of the display 410, matches the location of the content displayed under the control of the processor 420.
  • a specific embodiment of adjusting the position of the displayed content will be described.
  • the processor 420 may control the display 410 to display the first content.
  • the processor 420 may display the first content in response to confirming that an external object (eg, a user) having a specific shape (eg, a person's appearance) exists on the front surface of the display 410.
  • the first content may be content provided by the processor 420 to obtain information on an external object.
  • the first content may be content provided to obtain various pieces of information for calibrating the location of the content to be displayed.
  • the various pieces of information for calibrating the location of the content to be displayed by the processor 420 are variables required to determine the location (or coordinates on the display) of the area in which the content is to be displayed, and may include location information, on the display 410, of a reflection image of the external object.
  • the first content may be content provided to obtain information on characteristics of an external object (eg, a person) (eg, a height of a user, a radius of an arm, and a location of a waist as a characteristic of a body).
  • the first content may be content related to a warm-up exercise (eg, stretching, shaking an arm, or bending a knee) of a user of the electronic device 400.
  • the user may correspond to an external object.
  • the user may perform the warm-up exercise while being provided with content related to the user's warm-up exercise.
  • the processor 420 may determine the location of the reflection image of the external object based on a user input received while the user performs the warm-up exercise.
  • the second content may mean various contents displayed on a location obtained by calibrating the location where the first content is displayed, based on the location of a reflection image of an external object acquired while the electronic device 400 displays the first content.
  • the second content may be content for a user's exercise provided to the user.
  • the processor 420 may determine a location to display the first content based on location information of an external object acquired using the camera 430.
  • the processor 420 may control the display 410 to display the first content on the determined location.
  • the processor 420 may obtain a position of a reflection image of at least one feature point (eg, a part of a person's body including a wrist, a shoulder, and a head) of an external object (eg, a person).
  • at least one visual object may be displayed in an area corresponding to a feature point among areas in which the first content is to be displayed.
  • the visual object may be included in the first content.
  • the position at which the visual object is displayed may be a position that exists within a preset distance from the area corresponding to the feature point.
  • the processor 420 may acquire location information on the display 410 of a reflection image of an external object in various ways.
  • the processor 420 may display a visual object in a specific area on the display 410, and receive a user input indicating that the position of the reflection image of the external object matches the position of the visual object.
  • User input may be received in various forms.
  • the user input may be an input implemented with a voice indicating that the location of the reflection image of the external object and the location of the visual object match.
  • the user input may be an input implemented in the form of a gesture indicating that the position of the reflection image of the external object and the position of the visual object match.
  • the processor 420 may determine the location of the reflection image of the external object as the location of the visual object.
  • the processor 420 may determine (or calibrate) a location in which the second content is to be displayed based on location information of the visual object and location information of the external object corresponding to the visual object. According to an embodiment, the processor 420 may calibrate the position where the first content is displayed based on the difference between the position of the visual object and the position of the external object corresponding to the visual object, and determine the calibrated coordinates as the position where the second content is to be displayed.
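The calibration described here can be sketched as accumulating, per feature point, the difference between the camera-estimated position and the user-confirmed visual-object position, then shifting future content coordinates by the averaged offset (the class and coordinate names are illustrative, not from the disclosure):

```python
class ReflectionCalibrator:
    """Accumulates (estimated, confirmed) screen positions and applies
    the mean offset to future content coordinates."""
    def __init__(self):
        self._offsets = []

    def record(self, estimated, confirmed):
        ex, ey = estimated
        cx, cy = confirmed
        self._offsets.append((cx - ex, cy - ey))

    def adjust(self, position):
        if not self._offsets:
            return position
        dx = sum(o[0] for o in self._offsets) / len(self._offsets)
        dy = sum(o[1] for o in self._offsets) / len(self._offsets)
        return (position[0] + dx, position[1] + dy)

cal = ReflectionCalibrator()
cal.record(estimated=(100, 200), confirmed=(100, 230))  # left wrist
cal.record(estimated=(300, 200), confirmed=(300, 250))  # right wrist
print(cal.adjust((200, 400)))  # second content shifted by the mean offset
```

Averaging over several feature points (wrists, shoulders, head) smooths out noise in any single confirmed position.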
  • the processor 420 may acquire characteristics of an external object while displaying the first content.
  • the characteristics of the external object may include the user's height, the user's arm activity radius, and the user's knee position.
  • the processor 420 may provide third content, which is additional content, for increasing the accuracy of the characteristic of the partial region of the external object in response to confirming that the accuracy of the characteristic of the partial region of the external object is less than or equal to a preset value.
  • the third content may be content for acquiring characteristics of a partial area of an external object.
  • the processor 420 may display the third content including a warm-up exercise for the shoulder on the display 410. While displaying the third content, the processor 420 may photograph, using the camera 430, the user performing the warm-up exercise for the shoulder, and check characteristics of the shoulder based on the captured image.
  • the processor 420 may determine a location in which the second content is to be displayed based on a distance between the display 410 and an external object.
  • the processor 420 determines the distance between the camera 430 and the external object based on the depth image collected by the camera 430, and the display 410 and the external object based on the distance between the camera 430 and the external object. You can determine the distance.
  • the distance between the camera 430 and the external object may be the same as or similar to the distance between the display 410 and the external object.
  • the processor 420 may move the display position of the second content to the lower portion of the display 410.
  • the processor 420 may move the display position of the second content to the upper portion of the display 410.
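The distance-based repositioning in the two preceding items can be sketched as a small policy function; the threshold values and the lower/upper mapping are illustrative assumptions, since the disclosure does not state the exact conditions:

```python
def content_vertical_anchor(distance_m, near=1.2, far=2.5):
    """Pick a vertical region of the display for the second content
    based on how far the user stands from the display. The thresholds
    and the lower/upper policy here are illustrative assumptions."""
    if distance_m <= near:
        return "lower"    # reflection fills the screen; keep content low
    if distance_m >= far:
        return "upper"
    return "center"

print(content_vertical_anchor(1.0))  # close user
print(content_vertical_anchor(3.0))  # distant user
```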
  • the processor 420 may acquire characteristic information of an external object based on an image captured using the camera 430 while displaying the first content.
  • the characteristic information of the external object is characteristic information of the user's body, and may include the user's gender, the user's age, the user's height, a waist position, a knee position, and an arm radius.
  • the processor 420 may select or determine the second content to be displayed based on the collected characteristic information of the external object.
  • a plurality of second contents may be stored on a memory (for example, the memory 130 of FIG. 1) of the electronic device 400.
  • the second content stored in the memory may be received from an external electronic device (eg, the server 108 of FIG. 1 or the electronic device 104 of FIG. 1 ), or may be stored in advance at the time of manufacture of the electronic device 400.
  • Each of the stored second contents may include metadata for selection.
  • the metadata may include characteristics of the guide voice included in the second content (eg, a soft voice), properties of the background music included in the second content, and characteristics of the movement included in the second content (eg, exercise difficulty, exercise speed).
  • the processor 420 may check the user's body information, and select or determine the second content to be displayed based on the user's body information and the metadata.
  • for example, the processor 420 may check the user's body information (eg, male, in his 20s), and select second content corresponding to the body information (eg, content including exercise using a relatively heavy exercise device such as a dumbbell).
  • for another example, the processor 420 may check the user's body information (eg, female, in her 50s), and select second content corresponding to the body information (eg, content including exercise using a relatively light exercise device such as a foam roller).
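The metadata-driven selection in the examples above could be sketched as a simple scoring function; the metadata keys, age ranges, and weights below are illustrative assumptions, not from the disclosure:

```python
def select_second_content(profile, catalog):
    """Pick the catalog entry whose metadata best matches the user
    profile. Keys and scoring weights are illustrative assumptions."""
    def score(item):
        meta = item["metadata"]
        s = 0
        if profile["age"] in range(*meta["age_range"]):
            s += 2  # age match weighted highest
        if profile.get("equipment") == meta.get("equipment"):
            s += 1
        return s
    return max(catalog, key=score)

catalog = [
    {"name": "dumbbell set",
     "metadata": {"age_range": (15, 40), "equipment": "dumbbell"}},
    {"name": "foam-roller set",
     "metadata": {"age_range": (40, 80), "equipment": "foam-roller"}},
]
print(select_second_content({"age": 25, "equipment": "dumbbell"}, catalog)["name"])
print(select_second_content({"age": 55, "equipment": "foam-roller"}, catalog)["name"])
```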
  • the processor 420 may select the second content using the user's body information acquired using the camera 430 while displaying the first content and the user's body information received from other electronic devices 104 and 108 connected to the electronic device 400.
  • the user's body information received from the other electronic devices 104 and 108 may be body information that is difficult for the electronic device 400 to obtain through image analysis (eg, the user's disease history, the user's weight, the user's blood pressure, or the user's blood glucose level).
  • the processor 420 may additionally consider the user's body information that is difficult to obtain using the camera 430 and may select or determine the second content, thereby providing more appropriate content to the user.
  • the processor 420 may check whether another external object exists and the type of the other external object based on a result of analyzing an image acquired using the camera 430 while displaying the first content.
  • Other external objects may include exercise equipment (eg, foam rollers, dumbbells) provided by the user.
  • the processor 420 may select or determine the second content based on the identification result of the external object. For example, in response to confirming that the user has the dumbbell, the processor 420 may select or determine the second content including exercise content using the dumbbell.
  • the processor 420 may change the order of the second content to be displayed on the display 410 in response to confirming that another external object exists. For example, in response to confirming that the user has a dumbbell, the processor 420 may provide second content including exercise content using a dumbbell earlier than other second content.
  • the processor 420 may determine the second content based on a distance between the external object and the display 410. For example, in response to confirming that the distance between the user and the display 410 exceeds (or is equal to or greater than) a preset value, the processor 420 may select or determine the second content related to a full-body exercise. This is because, when the distance between the user and the display 410 is greater than or equal to the preset value, the user's whole body may be reflected on the display 410. For another example, in response to confirming that the distance between the user and the display 410 is less than (or equal to or less than) the preset value, the processor 420 may select or determine the second content related to a part of the user's body. When the distance between the user and the display 410 is less than or equal to the preset value, a specific body part of the user may be reflected on the display 410, so that exercise content related to a part of the body may be appropriate.
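The distance-based choice between full-body and partial-body content can be sketched as follows; the threshold value is an illustrative assumption:

```python
def exercise_scope(distance_m, threshold_m=2.0):
    """Whole-body content when the user stands far enough for the full
    reflection to fit on the display, otherwise content for a specific
    body part. The threshold is an illustrative assumption."""
    return "full_body" if distance_m >= threshold_m else "partial_body"

print(exercise_scope(2.5))  # user far from the display
print(exercise_scope(1.2))  # user close to the display
```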
  • the processor 420 may select at least one or more second contents to be displayed based on the user's body information, and display a list including the selected second contents.
  • the processor 420 may display the second content based on a user input for selecting at least one or more of the second content included in the list.
  • according to the method described above, the electronic device 400 may acquire the user's body information and information on other external objects held by the user while determining the location where the second content is to be displayed, and may provide the user with second content including appropriate exercise by using the acquired information, so that the usability of the electronic device 400 may be increased.
  • FIGS. 5A, 5B, 5C, 5D, 5E, 5F, 5G, 5H, and 5I are diagrams illustrating an embodiment in which an electronic device determines a location to display second content while displaying first content, according to various embodiments of the present disclosure.
  • an electronic device (eg, the electronic device 400 of FIG. 4) may use a camera (eg, the camera 430 of FIG. 4) to check whether an external object 511 exists in front of a display (eg, the display 410 of FIG. 4).
  • the electronic device 400 may activate the camera 430 at every preset period and check whether the external object 511 exists based on the image collected by the camera 430.
  • in order to clearly display the reflection image 513 of the external object 511, the display 410 may be implemented as a mirror display with an additional configuration that increases the reflectance of light traveling toward the display 410.
  • the electronic device 400 may display the first content.
  • the first content may be content provided by the electronic device 400 to obtain information on an external object.
  • the first content may be content provided to obtain various pieces of information for calibrating the position of the second content to be displayed.
  • Various pieces of information for calibrating the location of the second content to be displayed are variables required to determine the location (or coordinates on the display) of the area where the content is to be displayed. It may include location information on the display 410.
  • the first content may be content related to a warm-up exercise (eg, stretching, shaking an arm, or bending a knee) of a user of the electronic device 400.
  • the first content may include video or audio content for inducing user interaction (e.g., "I will wear an item to help with stretching; please align your wrist to the location of the item displayed on the screen") and at least one visual object 515 and 517 for acquiring the positions of the reflection images of at least one feature point of the user (e.g., the left wrist and the right wrist).
  • the display 410 may display at least one or more visual objects 515 and 517 while the user's reflection image 513 is displayed.
  • the electronic device 400 may determine a location to display the first content based on location information of an external object acquired using the camera 430.
  • the electronic device 400 may control the display 410 to display the first content on the determined location.
  • the electronic device 400 may detect a user's interaction (eg, an interaction for fitting a wrist to an area where a visual object is displayed) 519 existing in front of the display 410.
  • the user may perform the warm-up exercise while being provided with content related to the user's warm-up exercise.
  • the electronic device 400 may determine a location of a reflection image of an external object based on a user input collected while the user performs the warm-up exercise.
  • the electronic device 400 may display the visual objects 515 and 517 on the display 410, and receive a user input indicating that the position of the reflection image 519 of the external object matches the position of the visual objects 515 and 517.
  • User input may be received in various forms.
  • the user input may be an input implemented with a voice indicating that the location of the reflection image of the external object and the location of the visual object match.
  • the user input may be an input implemented in the form of a gesture indicating that the position of the reflection image of the external object and the position of the visual object are identical (eg, a gesture of holding and holding a hand).
  • the electronic device 400 may determine the location of the reflection image of the external object as the location of the visual object.
  • 5D and 5E are diagrams illustrating an embodiment in which the electronic device 400 determines a location where a second content is to be displayed by calibrating a location where a first content is displayed according to various embodiments of the present disclosure.
  • FIG. 5D shows the estimated position 525, on the display 410, of a reflection image of an external object acquired by the electronic device 400 using the camera 430, and the positions of the visual objects 521 and 523.
  • the electronic device 400 may determine the actual position 527, on the display 410, of a reflection image of a partial area (eg, a wrist) of the external object based on the positions of the visual objects 521 and 523.
  • the electronic device 400 may check the position difference between the estimated position 525 and the visual objects 521 and 523 (or the difference between the estimated position 525 and the actual position 527 of the reflection image), perform calibration based on the difference value, and determine the position at which the second content is displayed.
  • referring to FIGS. 5D and 5E, in response to confirming that the actual location 527 is higher than the estimated location 525, the electronic device 400 may calibrate the location where the first content is displayed, and determine the calibrated coordinates as the position in which the second content is to be displayed.
  • FIGS. 5F to 5I are diagrams illustrating an embodiment in which the electronic device 400 acquires characteristics (eg, body characteristics) of an external object (eg, a user) while displaying the first content, according to various embodiments of the present disclosure.
  • while displaying the first content, the electronic device 400 may not only determine the location in which the second content is to be displayed, but also acquire characteristics (eg, body characteristics) of an external object (eg, a user).
  • the electronic device 400 may check characteristics of an external object based on an image acquired using a camera (eg, the camera 430 of FIG. 4 ).
  • the electronic device 400 may identify parts of the user's face and body (e.g., shoulders, wrists) based on an image acquired using the camera 430 implemented using an RGB camera method or a depth camera method.
  • the electronic device 400 may receive a characteristic of an external object from an external electronic device connected to the electronic device 400 (for example, a server storing user information (eg, the server 108 of FIG. 1) or an external electronic device (eg, the electronic device 102 of FIG. 1)).
  • the electronic device 400 may determine or select the second content to be displayed based on the acquired characteristics of the external object.
  • 5F is a diagram illustrating an example in which the electronic device 400 measures a user's arm radius.
  • the electronic device 400 may output voice (eg, "shake your arm up and down as wide as possible") or video content, included in the first content, that induces a gesture for measuring the radius of the user's arm.
  • the electronic device 400 may detect a user input 532 moving a left arm or a user input 533 moving a right arm, and check the radius of the user's arm based on the user inputs 532 and 533.
  • the electronic device 400 may determine the second content to be displayed based on the radius of the user's arm.
  • the electronic device 400 may display an operation of moving the visual objects 534 and 535 to correspond to the movement of the arm on the display 410.
  • the display 410 may display the visual objects 534 and 535 while the user's reflection image 531 is displayed.
  • 5G is a diagram illustrating an example in which the electronic device 400 measures a user's height.
  • the electronic device 400 may output voice (for example, "stop approaching when the line 537 is positioned at the end of your head") or video content, included in the first content, that induces a gesture for measuring the user's height.
  • the electronic device 400 may detect a motion approaching the user and measure a distance between the user and the display 410.
  • the electronic device 400 may measure the height of the user based on the distance between the user and the display and the height of the line corresponding to the coordinates of the line displayed on the display 410.
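One plausible way to combine the measured distance with an on-screen/image coordinate is simple pinhole-camera geometry; the camera parameters (`cy`, `fy`) and mounting height below are illustrative assumptions, not values from the disclosure:

```python
def estimate_height(camera_height_m, distance_m, head_pixel_y, cy, fy):
    """Pinhole-camera estimate: a head top seen (cy - head_pixel_y)
    pixels above the optical centre, at a known distance, sits
    distance * (cy - head_pixel_y) / fy metres above the camera."""
    return camera_height_m + distance_m * (cy - head_pixel_y) / fy

# Camera mounted at 1.0 m, user 2.0 m away, head top 200 px above the
# optical centre with focal length 500 px: 1.0 + 2.0 * 200/500 = 1.8 m
print(estimate_height(1.0, 2.0, 40, 240, 500))
```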
  • 5H is a diagram illustrating an example in which the electronic device 400 measures the position of a user's waist.
  • the electronic device 400 may output voice or video content, included in the first content, that induces a gesture for measuring the position of the user's waist (eg, an operation of placing an item 539 on the knee and holding it for several seconds).
  • the display 410 may display the item 539 while the user's reflection image 531 is displayed.
  • the item 539 displayed on the display may be a still image or a virtually movable image output by the electronic device 400, and the electronic device 400 may measure the position of the user's waist based on the position of the item 539.
  • the electronic device 400 may determine the location of the user's waist based on the location information of the item 539.
  • FIG. 5H illustrates an embodiment of measuring the position of the user's waist, but the positions of various body parts of the user may be measured through the method described in FIG. 5H.
  • 5I is a diagram illustrating an example in which the electronic device 400 measures a moving radius of a user.
  • the electronic device 400 may output, on the display 410, voice (eg, "place a specific exercise tool (eg, a ball or a dumbbell) in a designated location") or video content, included in the first content, that induces a gesture for measuring the moving radius of the user.
  • a reflection image 531 of a user and a reflection image 541 of a specific exercise tool may be displayed on the display 410.
  • the electronic device 400 may detect a user input of moving a specific exercise tool so that the specific exercise tool is placed on a designated area on the display 410, and measure the moving radius of the user.
  • the electronic device 400 may determine the second content to be displayed based on the user's moving radius.
  • 6A and 6B are diagrams illustrating an embodiment of displaying a first content and a visual object in an electronic device according to various embodiments of the present disclosure.
  • an electronic device (eg, the electronic device 400 of FIG. 4) (or a processor (eg, the processor 420 of FIG. 4)) may display at least one visual object on a display (eg, the display 410 of FIG. 4) at a position corresponding to a specific part of an external object.
  • the location information of the visual object may include location information corresponding to at least one area (eg, a body part) of an external object (eg, a person).
  • the electronic device 400 may determine that the position of the reflection image of the part of the external object corresponds to the position of the visual object, in response to receiving a user input (eg, a voice input or a gesture input) confirming that the position of the reflection image of the part of the external object matches the position of the visual object on the display 410.
  • 6A is a diagram illustrating an embodiment of displaying a visual object to check a position of a reflection image of a part of a user's body (eg, head, shoulders, knees, or feet).
  • the electronic device 400 may check a location of a reflection image 610 of a part of a user's body using a visual object 621.
  • the visual object 621 is a movable image, and in FIG. 6A, it can be moved in a vertical direction.
  • the electronic device 400 may output first content including audio content (eg, move your hand when a picture arrives on your head, shoulders, knees, or feet) and a visual object.
  • the electronic device 400 may be in a state in which a camera (eg, the camera 430 of FIG. 4) is activated to receive a user's gesture.
  • the electronic device 400 may detect a user gesture (eg, a user gesture when the visual object 621 reaches the head) using the camera 430.
  • the electronic device 400 may determine the location of the reflection image of the user's head based on the location (or coordinates) of the visual object 621 corresponding to the time at which the user gesture is received.
  • the electronic device 400 may detect a user gesture (eg, a user gesture when the visual object 621 reaches a shoulder) using the camera 430.
  • the electronic device 400 may determine the location of the reflection image of the user's shoulder based on the location (or coordinates) of the visual object 621 corresponding to the time at which the user gesture is received.
  • the electronic device 400 may detect a user gesture (eg, a user gesture when the visual object 621 reaches a knee) using the camera 430.
  • the electronic device 400 may determine the location of the reflection image of the user's knee based on the location (or coordinates) of the visual object 621 corresponding to the time at which the user gesture is received.
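The head/shoulder/knee procedure above can be sketched as mapping each gesture's timestamp to the moving visual object's position at that instant (the class name, units, and constant-speed sweep are illustrative assumptions):

```python
class MovingMarkerScan:
    """A visual object sweeps down the screen at a constant speed; the
    y-coordinate at the moment of each user gesture is recorded as the
    position of the reflection of the named feature point."""
    def __init__(self, start_y=0.0, speed_px_s=50.0):
        self.start_y = start_y
        self.speed = speed_px_s
        self.features = {}

    def marker_y(self, t_s):
        # Marker position at elapsed time t (seconds).
        return self.start_y + self.speed * t_s

    def on_gesture(self, name, t_s):
        self.features[name] = self.marker_y(t_s)

scan = MovingMarkerScan()
scan.on_gesture("head", 2.0)      # gesture as the marker passes the head
scan.on_gesture("shoulder", 3.5)
print(scan.features)              # recorded feature-point y-coordinates
```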
  • the electronic device 400 may check whether another external object exists and the type of the other external object (eg, a dumbbell or a ball) based on a result of analyzing an image acquired using the camera 430 while displaying the first content.
  • the electronic device 400 may change the characteristics of the visual object based on the identification result of the external object.
  • the characteristics of the visual object may include the size and shape of the visual object.
  • the electronic device 400 displays first content including a visual object 630 for measuring a user's body characteristics (eg, a user's height, a user's moving radius). can do.
  • a visual object 630 and a reflection image 631 of a user may be displayed on the display 410.
  • the electronic device 400 may identify an external object (for example, a ball) other than the user based on the result of analyzing the image acquired using the camera 430, and change the characteristics of the visual object 630.
  • in response to identifying the other external object, the electronic device 400 may display a visual object 640 in which the characteristics (eg, size, shape) of the visual object 630 that has been displayed are changed.
  • a visual object 640, a reflection image 633 of a user, and a reflection image 635 of another external object may be displayed on the display 410.
  • FIGS. 7A, 7B, 7C, and 7D are diagrams illustrating an embodiment of additionally obtaining body information of a user in an electronic device according to various embodiments of the present disclosure.
  • an electronic device (eg, the electronic device 400 of FIG. 4) may acquire characteristics of an external object (eg, the external object 511 of FIG. 5A) while displaying the first content.
  • the characteristics of the external object 511 may include the user's height, the user's arm activity radius, and the user's knee position. The embodiments described below assume that the external object 511 is the user's body.
  • the electronic device 400 may determine that the accuracy of a characteristic of a part of the user 511 is equal to or less than a preset value.
  • FIG. 7A is a diagram showing a result of the electronic device 400 checking the characteristics of the external object 511, and the electronic device 400 is a partial area of the user's body (eg, the user's leg, wrist, waist, stomach) Although the characteristics of (711) were checked, other parts of the user's body (e.g., the user's forearm, armpit) 715 or the characteristics of the user's arm movement radius 713 were not confirmed, or the movement of the arm It can be seen that the accuracy of the confirmed characteristic of the radius 713 or other partial region 715 is less than or equal to a preset value. Although the screen shown in FIG. 7A is shown to be displayed on the display 410, the screen including the result of checking the characteristics of the user's body may not be displayed on the display 410.
• in response to confirming that the accuracy of the characteristics of the portions 713 and 715 of the user 511 is less than or equal to a preset value, the electronic device 400 may provide third content, which is additional content for increasing the accuracy.
  • the third content may be content for acquiring characteristics of a partial area of an external object.
  • the electronic device 400 may display a screen 721 for providing third content on the display 410.
  • the electronic device 400 may display third content including a warm-up exercise for an arm on the display 410.
• the electronic device 400 may output a voice (eg, "shake your arm up or down") or video content that induces a gesture for measuring the user's arm.
• the electronic device 400 may detect a user input 731 moving the left arm or a user input 733 moving the right arm, and check the movement radius of the user's arm based on the user inputs 731 and 733. While checking the movement radius of the arm, the electronic device 400 may display, on the display 410, visual objects 735 and 737 that move to correspond to the movement of the arm.
  • the visual objects 735 and 737 may be displayed on the display 410 together with the user's reflection image 739.
• in response to completing the measurement of the characteristics of the arm portion or the movement radius of the arm, the electronic device 400 may display, on the display 410, an indicator 741 indicating the end of the warm-up exercise.
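• As a sketch of how the arm measurement above might work: the warm-up gesture yields a stream of tracked wrist positions, and the movement radius can be taken as the largest shoulder-to-wrist distance observed. The function names, the coordinate convention, and the 0.8 accuracy threshold are illustrative assumptions, not taken from the patent.

```python
import math

def arm_movement_radius(shoulder, wrist_samples):
    # Largest shoulder-to-wrist distance seen while the user shakes the arm.
    return max(math.dist(shoulder, w) for w in wrist_samples)

def needs_third_content(accuracy, threshold=0.8):
    # Offer the additional warm-up (third content) when the measured
    # accuracy is at or below the preset value.
    return accuracy <= threshold
```

With `shoulder=(0, 0)` and wrist samples `[(30, 0), (0, 50), (40, 30)]`, the radius evaluates to 50.0, and an accuracy of 0.5 would trigger the third content.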
• FIGS. 8A, 8B, and 8C are diagrams illustrating second content to be displayed based on user's body information in an electronic device according to various embodiments of the present disclosure.
  • a plurality of second contents may be stored on a memory (eg, the memory 130 of FIG. 1) of an electronic device (eg, the electronic device 400 of FIG. 4 ).
  • the second content stored in the memory may be received from an external electronic device (eg, the server 108 of FIG. 1 or the electronic device 104 of FIG. 1 ), or may be stored in advance at the time of manufacture of the electronic device 400.
  • Each of the stored second contents may include metadata for selection.
• Metadata may include characteristics of the guide voice included in the second content (eg, a soft voice), properties of the background music included in the second content, and characteristics of the movement included in the second content (eg, exercise difficulty, exercise speed, exercise).
• the electronic device 400 may check the user's body information, and select or determine the second content to be displayed based on the user's body information and the metadata. For example, in response to confirming that the user's height is less than a preset value (eg, 130 cm) based on the user's body information, the electronic device 400 may provide second content that can be used by children. For another example, in response to confirming that the user's age is equal to or greater than a preset value (eg, 65 years old) based on the user's body information, the electronic device 400 may provide second content that can be used by the elderly.
• the electronic device 400 may check the user's body information (eg, having strong muscle strength), and select second content corresponding to the body information (eg, content including an exercise using a relatively heavy exercise device, such as a dumbbell).
• the electronic device 400 may check the user's body information (eg, having weak muscle strength), and select second content corresponding to the body information (eg, content including an exercise using a relatively light exercise device, such as a foam roller).
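• The selection described above can be sketched as a metadata filter over the stored second contents. The field names (`height_cm`, `age`, `strength`, `for_children`, `for_elderly`, `equipment`) are hypothetical; the patent does not fix a metadata schema.

```python
def select_second_content(contents, body_info):
    # Keep only stored workouts whose metadata suits the user's body info.
    def suitable(meta):
        if body_info.get("height_cm", 999) < 130 and not meta.get("for_children"):
            return False  # short users (children) get child-friendly content only
        if body_info.get("age", 0) >= 65 and not meta.get("for_elderly"):
            return False  # elderly users get elderly-friendly content only
        if body_info.get("strength") == "weak" and meta.get("equipment") == "dumbbell":
            return False  # users with weak muscle strength skip heavy equipment
        return True
    return [c for c in contents if suitable(c["meta"])]
```

For an adult of average height with weak muscle strength, a dumbbell workout would be filtered out while a bare-hand or foam-roller workout would remain.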
  • the electronic device 400 may display the selected second content on the display 410.
  • FIG. 8A is a diagram illustrating an embodiment in which second content is displayed on the display 410, and the second content may refer to content that guides an exercise performed by a user with an exercise tool (eg, a ball).
  • the second content may include an area 811 displaying data related to the user's exercise state, an area 813 displaying an image related to the user's exercise, and a visual guide 817 related to the user's exercise.
• the visual guide 817 may be implemented in the form of augmented reality, and the visual guide 817 may be overlaid and displayed on the reflection image 815 of the user.
  • the electronic device 400 may change the layout of elements constituting the second content based on the user's body information.
• the electronic device 400 may change the layout of the components of the second content by changing the position or size of the area 811 displaying data related to the user's exercise state and the area 813 displaying an image related to the user's exercise.
• FIG. 8B is a diagram illustrating an embodiment in which second content is displayed on the display 410.
  • the electronic device 400 may display a guide 817 related to movement of an exercise tool (eg, a ball) on the display 410.
  • the display 410 may display a reflection image 815 of a user, a reflection image 819 of an exercise tool, and a guide 817.
• the electronic device 400 may track the movement of an exercise tool using the camera 430, and calculate a score related to the user's exercise based on characteristics of the movement of the exercise tool (for example, the degree to which the movement of the exercise tool matches the guide 817), the type of exercise, the movement distance of the exercise tool, the movement speed of the exercise tool, or the type of exercise tool.
  • the exercise-related score may be measured each time a user performs one of a plurality of movements included in the exercise.
  • the calculated score may be displayed on the area 811 displaying data related to the user's exercise state.
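• One way the listed factors could be combined into a per-movement score is a weighted sum; the weights and the normalization caps (2.0 m of tool travel, 1.5 m/s of tool speed) are illustrative assumptions, since the patent only enumerates the factors.

```python
def movement_score(guide_match, distance_m, speed_mps):
    # guide_match: 0..1 degree to which the tool's motion matched the guide.
    # Distance and speed are normalized against illustrative caps.
    w_match, w_dist, w_speed = 60, 25, 15
    return round(w_match * guide_match
                 + w_dist * min(distance_m / 2.0, 1.0)
                 + w_speed * min(speed_mps / 1.5, 1.0))
```

A movement that fully matches the guide at the capped distance and speed scores 100; a half match at half the caps scores 50.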
  • FIG. 8C is a diagram illustrating an embodiment in which another second content shown in FIG. 8B is displayed on the display 410.
• the second content may include an area 811 displaying data related to the user's exercise state, an area 813 displaying an image related to the user's exercise, and a visual guide 821 related to the user's exercise.
• the visual guide 821 related to the user's exercise may mean a guide that visually displays (eg, as a dotted line, solid line, plane, or animation) the expected path of the movement of the user or exercise tool that occurs while the user performs the exercise.
  • the electronic device 400 may adjust the position or size of the visual guide 821 based on the user's body information acquired while displaying the first content.
  • the visual guide 821 shown in FIG. 8C shows a moving path of the user that occurs while the user performs stretching using an arm.
  • the electronic device 400 may set the length of the visual guide 821 based on the moving radius of the user acquired while displaying the first content.
• FIGS. 9A, 9B, 9C, and 9D are diagrams illustrating second content to be displayed based on user's body information in an electronic device according to various embodiments of the present disclosure.
• the second content may include an area displaying data related to the user's exercise state (eg, 811 of FIG. 8A), an area displaying an image related to the user's exercise (eg, 813 of FIG. 8A), and a visual guide related to the user's exercise (eg, 817 of FIG. 8A).
• the electronic device 400 may generate an exercise goal and a visual guide related to the exercise based on the user's body information acquired using the camera 430 while displaying the first content and/or the user's body information received from other electronic devices 104 and 108 connected to the electronic device 400.
• FIG. 9A is a diagram illustrating data related to an exercise state of a user included in second content (eg, 811 of FIG. 8A).
• the electronic device 400 may set an exercise goal based on the user's body information acquired using the camera 430 and/or the user's body information received from other electronic devices 104 and 108 connected to the electronic device 400. By setting an exercise goal in consideration of the user's body information, different exercise goals may be set for each user.
• the exercise goal may include at least one of: a target score related to the user's exercise according to at least one of the degree to which the movement of the exercise tool coincides with the guide 817, the type of exercise, the movement distance of the exercise tool, the movement speed of the exercise tool, and the type of exercise tool; a target calorie amount related to the calories consumed by the user through exercise; a target exercise time related to the time the user exercised; and a target heart rate related to the user's heart rate measured while the user performs the exercise.
  • the electronic device 400 may display information related to an exercise goal and an exercise state on an area 811 displaying data related to an exercise state.
• the area 811 displaying data related to the exercise state may include a score area 901 displaying the user's exercise score and a target score, a calorie area 903 displaying the calories actually consumed by the user and the target calories, an exercise time area 905 displaying the time the user actually exercised and the target exercise time, and a heart rate area 907 displaying the user's heart rate and target heart rate.
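• A sketch of per-user goal setting from body information follows. The formulas (a per-kilogram calorie budget, the 220-minus-age heart-rate estimate, the age cutoff for session length) are illustrative assumptions; the patent states only that goals differ per user.

```python
def exercise_goals(body_info):
    # Derive per-user targets (score, calories, minutes, heart rate).
    age = body_info["age"]
    weight = body_info["weight_kg"]
    return {
        "target_score": 80,                             # fixed illustrative target
        "target_kcal": round(weight * 4),               # rough per-session budget
        "target_minutes": 30 if age < 65 else 20,       # shorter sessions for elderly users
        "target_heart_rate": round(0.7 * (220 - age)),  # ~70% of estimated max heart rate
    }
```

For a 40-year-old, 70 kg user this yields a 280 kcal budget, a 30-minute session, and a 126 bpm target heart rate.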
• FIGS. 9B and 9C are diagrams illustrating a visual guide related to exercise included in the second content.
• the electronic device 400 may generate a visual guide based on the user's body information acquired using the camera 430 and/or the user's body information received from other electronic devices 104 and 108 connected to the electronic device 400.
  • the visual guide may include an appropriate posture and movement range in consideration of the user's body information for each exercise operation.
  • the visual guide may include at least one exercise reference point generated based on the user's body information.
  • the exercise reference point may mean an indicator that displays the position or posture of the user's body part corresponding to the reference point.
  • the visual guide may be implemented as a line connecting at least one or more exercise reference points.
  • the electronic device 400 may display a visual guide 911 provided to a specific user on the display 410.
  • the visual guide 911 and the user's reflection image 912 may be displayed together on the display 410.
  • the electronic device 400 may display a visual guide 913 provided to another user on the display 410.
  • the display 410 may display the visual guide 913 while the user's reflection image 914 is displayed.
• the electronic device 400 may generate and display different visual guides for each user. For example, when a specific user has relatively good athletic ability compared to other users, the electronic device 400 may provide the visual guide for that user with a relatively high standard (eg, movement posture, movement speed, movement range).
• the visual guide 911 illustrated in FIG. 9B may have a relatively higher standard than the visual guide 913 illustrated in FIG. 9C.
• FIG. 9D is a diagram illustrating guide information displayed on the display 410 by the electronic device 400 according to various embodiments of the present disclosure.
• the electronic device 400 may generate, based on the user's body information, an exercise guide 930 including at least one exercise reference point 924, 925, 926, 927, 928, 929, and 930 generated based on the body information.
  • the exercise reference points 924, 925, 926, 927, 928, 929, and 930 may refer to an indicator that displays the position or posture of the user's body part corresponding to the exercise reference point.
  • the visual guide 930 may be implemented as a line connecting at least one or more exercise reference points 924, 925, 926, 927, 928, 929, and 930.
• based on an image collected using the camera 430, the electronic device 400 may determine the user's exercise score based on the number of exercise reference points, among the exercise reference points 924, 925, 926, 927, 928, 929, and 930, that match the position of the user's body.
  • the electronic device 400 may adjust the number of exercise reference points according to the difficulty of the exercise. For example, even in the same motion, the electronic device 400 may increase the number of exercise reference points as the difficulty of the exercise increases.
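• The reference-point matching and difficulty scaling described above might look like the following sketch; the normalized screen coordinates and the 0.05 matching tolerance are assumptions.

```python
import math

def matched_points(reference_points, body_points, tolerance=0.05):
    # Count reference points that some tracked body point lands on
    # (within a tolerance, in normalized screen coordinates).
    return sum(
        any(math.dist(r, b) <= tolerance for b in body_points)
        for r in reference_points
    )

def points_for_difficulty(base_points, difficulty):
    # Same motion, denser checkpoints at higher difficulty.
    return base_points * difficulty
```

A body point at (0.5, 0.52) matches a reference point at (0.5, 0.5) but not one at (0.2, 0.8), giving a match count of 1; doubling the difficulty doubles the checkpoint count.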
  • the electronic device 400 may display an image 921 related to a user's exercise together with the exercise guide 930.
• the display 410 may display the exercise guide 930 in the area where the user's reflection image is displayed while the reflection image is displayed, so that the user can exercise while simultaneously checking the exercise guide 930 and the reflection image.
• FIGS. 10A, 10B, 10C, and 10D are diagrams illustrating second content to be displayed based on a result of identifying another external object in an electronic device according to various embodiments of the present disclosure.
• an electronic device (eg, the electronic device 400 of FIG. 4) may check whether another external object exists, and the type of the external object, based on a result of analyzing an image acquired using the camera 430 while displaying the first content. Other external objects may include exercise equipment (eg, foam rollers, dumbbells) provided by the user.
  • the electronic device 400 may select or determine the second content based on the identification result of the external object.
  • the electronic device 400 may select second content that can use a specific external object and display the selected second content on a display (eg, the display 410 of FIG. 4 ).
• based on the result of analyzing the image acquired using the camera 430, the electronic device 400 may confirm that a ball exists, and select and display second content that can use the ball.
• the electronic device 400 may display, on the display 410, a visual guide including a movement point 1005 of the ball related to an exercise using the ball and an exercise reference point 1007 indicating a movement point of the body related to the exercise using the ball.
  • the display 410 may display a visual guide while displaying a reflection image 1001 of a user performing an exercise and a reflection image 1003 of a ball.
• the electronic device 400 may calculate an exercise score based on the number of exercise reference points, among the at least one exercise reference point 1007, that match the user's body and the degree to which the ball is close to the movement point 1005 of the ball, and provide feedback related to the exercise score to the user through various means (eg, voice).
• the electronic device 400 may confirm that a dumbbell exists based on a result of analyzing an image acquired using the camera 430, and select and display second content that can use the dumbbell.
• the electronic device 400 may display, on the display 410, a visual guide including a movement point 1015 of the dumbbell related to an exercise using the dumbbell and an exercise reference point 1017 indicating a movement point of the body related to the exercise using the dumbbell.
  • the display 410 may display a visual guide while displaying a reflection image 1011 of a user performing an exercise and a reflection image 1013 of a dumbbell.
• the electronic device 400 may calculate an exercise score based on the number of exercise reference points, among the at least one exercise reference point 1017, that coincide with the user's body and the degree of proximity of the dumbbell to the movement point 1015 of the dumbbell, and provide feedback related to the exercise score to the user through various means (eg, voice).
• an electronic device (eg, the electronic device 400 of FIG. 4) may confirm that no other external object exists, based on a result of analyzing an image acquired using the camera 430 while displaying the first content.
  • the electronic device 400 may select second content related to exercise (eg, bare hand exercise) without using a specific exercise device, and display the selected second content on the display 410.
• FIGS. 10C and 10D are diagrams illustrating an embodiment in which the electronic device 400 displays second content related to exercise without using a specific exercise device.
• the electronic device 400 may display, on the display 410, a visual guide including an exercise reference point 1025 indicating a movement point of the body related to an exercise (eg, a squat) and an indicator 1023 indicating a movement direction of the body generated based on the exercise reference point 1025.
  • the display 410 may display a visual guide while a reflection image 1021 of a user performing an exercise is displayed.
• the electronic device 400 may calculate an exercise score based on the number of exercise reference points, among the at least one exercise reference point 1025, that match the user's body, and provide feedback related to the exercise score or exercise posture (eg, "keep your thighs and knees level") to the user through various means (eg, voice).
• the electronic device 400 may check the user's posture based on a result of analyzing an image acquired using the camera 430, and display, on the display 410, information for comparing the user's posture with the visual guide.
• the electronic device 400 may display, on the display 410, a visual guide including an exercise reference point 1035 indicating a movement point of the body related to an exercise (eg, down leg) and an indicator 1033 indicating a movement direction of the body generated based on the exercise reference point 1035.
  • the display 410 may display a visual guide while a reflection image 1031 of a user performing an exercise is displayed.
• the electronic device 400 may calculate an exercise score based on the number of exercise reference points, among the at least one exercise reference point 1035, that match the user's body, and provide feedback related to the exercise score or exercise posture (for example, "keep the soles of your feet on the floor, your knees high, and your back fully open") to the user through various means (eg, voice).
• the electronic device 400 may check the user's posture based on a result of analyzing an image acquired using the camera 430, and display, on the display 410, information for comparing the user's posture with the visual guide.
• FIGS. 11A, 11B, 11C, and 11D are diagrams illustrating second content to be displayed according to a distance between the electronic device and an external object in an electronic device according to various embodiments of the present disclosure.
• an electronic device (eg, the electronic device 400 of FIG. 4) may determine the location where second content is to be displayed based on a distance between a display (eg, the display 410 of FIG. 4) and an external object.
• the electronic device 400 may determine the distance between the camera 430 and the external object based on a depth image collected by the camera (eg, the camera 430 of FIG. 4), and determine the distance between the display 410 and the external object based on the distance between the camera 430 and the external object.
  • the distance between the camera 430 and the external object may be the same as or similar to the distance between the display 410 and the external object.
• the electronic device 400 may set a different display method for the second content based on the distance between the display 410 and the external object.
• as the distance between the display 410 and the external object increases, the electronic device 400 may decrease the amount of displayed information (eg, exercise reference points, visual guide) and increase the display size of the information. Conversely, the electronic device 400 may increase the amount of displayed information as the distance between the display 410 and the external object decreases.
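• A minimal sketch of this distance-dependent layout rule: fewer, larger elements far away; more, smaller elements up close. The 2.5 m threshold, the element counts, and the font scales are illustrative assumptions.

```python
def layout_for_distance(distance_m, far_threshold=2.5):
    # Far away: less information, drawn larger. Up close: more, smaller.
    if distance_m >= far_threshold:
        return {"reference_points": 3, "font_scale": 1.5}
    return {"reference_points": 7, "font_scale": 1.0}
```

A user standing 3 m from the display would see only three large reference points; at 1 m the layout switches to seven smaller ones.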
• FIGS. 11A and 11B are diagrams illustrating a visual guide displayed differently according to the distance between the display 410 and an external object.
• FIG. 11A shows the visual guide 1110 displayed when the distance between the display 410 and the external object is larger than in FIG. 11B, and FIG. 11B shows the visual guide 1120 displayed when the distance between the display 410 and the external object is smaller than in FIG. 11A.
• referring to FIG. 11A, the electronic device 400 may minimize the amount of displayed information (eg, exercise reference points 1111, 1113, and 1115) while displaying the displayed information in a large size.
• referring to FIG. 11B, the electronic device 400 may increase the amount of displayed information (eg, exercise reference points 1121, 1122, 1123, 1124, and 1125). Increasing the amount of displayed information may provide more detailed information to the user, and furthermore, an accurate exercise posture may be provided to the user. As the distance between the display 410 and the external object decreases, the electronic device 400 may move the display position of the second content to the upper portion of the display 410. As the distance between the external object and the display 410 decreases, the electronic device 400 may not provide the second content at the specific location 1126; only a reflection image of the external object may be displayed at the specific location 1126.
• the electronic device 400 may determine the second content to be displayed based on the distance between the external object and the display 410. For example, in response to confirming that the distance between the user and the display 410 is equal to or greater than (or exceeds) a preset value, the electronic device 400 may select or determine second content related to a full-body exercise. This is because, when the distance between the user and the display 410 is greater than or equal to the preset value, the user's whole body may be reflected on the display 410. Referring to FIG. 11C, in response to confirming that the distance between the user and the display 410 is greater than or equal to the preset value, the electronic device 400 may display, on the display 410, second content including a visual guide 1130 related to a full-body exercise.
• in response to confirming that the distance between the user and the display 410 is less than the preset value, the electronic device 400 may select or determine second content related to a part of the user's body.
• the electronic device 400 may display, on the display 410, second content including a visual guide 1140 related to an upper-body exercise or a visual guide 1150 related to a lower-body exercise.
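• The full-body versus partial-body decision described above reduces to a threshold test on the measured distance. The 2.0 m value is an assumption; the patent speaks only of a preset value.

```python
def content_scope(distance_m, threshold_m=2.0):
    # Full-body workout content only when the whole body can be
    # reflected on the display; otherwise upper/lower-body content.
    return "full_body" if distance_m >= threshold_m else "partial_body"
```

At 2.5 m the device would offer full-body content; at 1.2 m it would fall back to upper- or lower-body content.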
• An electronic device according to various embodiments may include a display, a camera, and a processor, and the processor may be set to control the display to display first content, detect movement of an external object related to a visual object included in the first content while the first content is displayed, obtain, in response to confirming that the position of the visual object matches the position of a reflection image of the external object reflected and displayed by the display, location information of the reflection image based on location information of the visual object, and determine the location where second content is to be displayed based on the location information of the reflection image.
  • the processor may be configured to adjust the position of the second content so that the position of the reflection image and the position of the second content coincide with each other.
• in response to confirming that the accuracy of information on at least part of the external object is less than or equal to a preset value, the processor may be set to display, on the display, third content including a visual object for obtaining the information on the at least part of the external object, and to obtain the information on the at least part of the external object using the camera in response to the user's interaction with the visual object.
• the processor may be set to obtain distance information between the external object and the display using the camera, and set the location of the content to be displayed based on the distance information.
• the processor may be set to determine the location where the first content is to be displayed based on location information of the external object acquired using the camera.
• the external object may include a user of the electronic device, and the processor may be set to obtain the user's body information based on a result of analyzing an image obtained using the camera while the first content is displayed, and to determine the second content to be displayed based on the body information.
• the processor may be set to identify another external object by analyzing the image acquired using the camera, and to determine the second content based on an identification result of the other external object.
• the processor may be set to obtain distance information between the external object and the display using the camera and determine the second content based on the distance information.
• the processor may be set to display second content related to the user's entire body in response to confirming that the distance between the external object and the display is equal to or greater than a preset value, and to display second content related to a part of the user's body in response to confirming that the distance between the external object and the display is less than the preset value.
• the first content may include content related to the user's warm-up exercise, and the second content may be content for the user's exercise performed after the warm-up exercise.
  • FIG. 12 is a flowchart illustrating a method 1200 of operating an electronic device according to various embodiments of the present disclosure.
  • the electronic device may display the first content on the display (eg, the display 410 of FIG. 4 ).
  • the first content may be content provided by the electronic device 400 to obtain information on an external object.
  • the first content may be content provided to obtain various pieces of information for calibrating the location of the content to be displayed.
• the various pieces of information for determining the location where the electronic device 400 is to display the second content are variables required to determine the location (or coordinates on the display) of the area in which the content is to be displayed, and may include location information on the display 410.
• in the first content, at least one visual object may be displayed in an area corresponding to the feature point.
  • the visual object may be included in the first content.
  • the position at which the visual object is displayed may be a position that exists within a preset distance from the area corresponding to the feature point.
  • the first content may be content related to a warm-up exercise (eg, stretching, shaking an arm, or bending a knee) of a user of the electronic device 400.
  • the user may correspond to an external object.
  • the user may perform the warm-up exercise while being provided with content related to the user's warm-up exercise.
  • the electronic device 400 may check whether the position of the visual object and the position of the reflection image of the external object match.
• the electronic device 400 may display a visual object in a specific area on the display 410, and receive a user input indicating that the position of the reflection image of the external object matches the position of the visual object.
  • User input may be received in various forms.
  • the user input may be an input implemented with a voice indicating that the location of the reflection image of the external object and the location of the visual object match.
  • the user input may be an input implemented in the form of a gesture indicating that the position of the reflection image of the external object and the position of the visual object match.
  • the electronic device 400 may determine that the location of the visual object and the location of the reflection image of the external object match.
  • the electronic device 400 may obtain location information of the reflection image in response to confirming that the location of the visual object matches the reflection image of the external object.
  • the electronic device 400 may determine the position of the reflection image as the position of the visual object.
  • the electronic device 400 may determine the display position of the second content based on the position information of the reflected image.
  • the electronic device 400 may calibrate the position where the first content is displayed to match the position of the reflection image, and determine the calibrated coordinates as the position where the second content is to be displayed.
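• The calibration step above can be sketched as computing the offset between where a visual object was drawn and where the reflection image was confirmed to appear, then applying that offset to future second-content coordinates. The pixel-coordinate convention and function names are assumptions.

```python
def make_placer(visual_obj_xy, reflection_xy):
    # Offset from the drawn position to the observed reflection position.
    dx = reflection_xy[0] - visual_obj_xy[0]
    dy = reflection_xy[1] - visual_obj_xy[1]
    def place(content_xy):
        # Shift second-content coordinates by the calibrated offset.
        return (content_xy[0] + dx, content_xy[1] + dy)
    return place
```

If the visual object was drawn at (100, 200) but the reflection matched at (110, 190), content intended for (300, 400) would be placed at (310, 390).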
• A method of operating an electronic device according to various embodiments may include: displaying first content; detecting movement of an external object related to a visual object included in the first content while the first content is displayed; obtaining, in response to confirming that the location of the visual object matches the location of a reflection image of the external object reflected and displayed by the display, location information of the reflection image based on location information of the visual object; and determining the location where second content is to be displayed based on the location information of the reflection image.
• the operation of determining the location where the second content is to be displayed may include an operation of adjusting the position of the second content so that the position of the reflection image and the position of the second content coincide with each other.
• the method may further include: displaying, on the display, third content including a visual object for acquiring information on at least part of the external object; and acquiring the information on the at least part of the external object using the camera in response to the user's interaction with the visual object.
• the method of operating the electronic device may further include: obtaining distance information between the external object and the display using the camera; and adjusting the location of the second content based on the distance information.
• the method may further include an operation of determining the location where the first content is to be displayed based on the location information of the external object acquired using the camera.
• the external object may include a user of the electronic device, and the method of operating the electronic device may further include: acquiring body information of the user based on a result of analyzing an image including the user acquired using the camera while the first content is displayed; and determining the second content to be displayed based on the body information.
• the method of operating the electronic device may further include: identifying another external object by analyzing the image acquired using the camera while the first content is displayed; and determining the second content based on the identification result of the other external object.
  • the determining of the second content may include: obtaining distance information between the external object and the display using the camera; and determining the second content based on the distance information.
  • the determining of the second content may include: in response to confirming that the distance between the external object and the display is equal to or greater than a preset value, determining to display second content related to the user's entire body; or, in response to confirming that the distance between the external object and the display is less than the preset value, determining to display second content related to a part of the user's body.
  • the first content may include content related to a user's warm-up exercise.
  • the second content may include content for an exercise of the user performed after the warm-up exercise.
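The operations above can be illustrated with a brief sketch. This is a hypothetical, minimal model and not the patented implementation: the function names, the coordinate convention (display pixels), the distance-based scaling model, and the 1.5 m / 2.0 m values are all assumptions chosen for the example.

```python
# Hypothetical sketch of the mirror-display method described above: when the
# user's reflection overlaps a guide object shown in the first content, the
# guide's coordinates are adopted as the reflection's coordinates, and later
# content is chosen and placed relative to that position and the user's
# distance from the display.

def calibrate_reflection(visual_object_pos):
    """Return the reflected image's position once alignment is confirmed.

    Per the operations above, the reflection's location is taken to be the
    visual object's location at the moment the two coincide.
    """
    return visual_object_pos

def place_second_content(reflection_pos, distance_m, reference_distance_m=2.0):
    """Anchor the second content at the reflection, scaled by distance.

    Assumed model: content tracks the calibrated reflection position and is
    scaled relative to a reference distance so it stays visually aligned as
    the user moves closer to or farther from the display.
    """
    scale = reference_distance_m / max(distance_m, 0.1)  # avoid divide-by-zero
    x, y = reflection_pos
    return (x, y, scale)

def choose_second_content(distance_m, threshold_m=1.5):
    """Far from the display -> full-body content; close -> body-part content."""
    return "full_body" if distance_m >= threshold_m else "body_part"
```

For instance, a user who aligned with a guide object at display coordinates (540, 960) and then stood two metres away would be shown full-body content anchored at that calibrated point.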
  • An electronic device according to various embodiments may be one of various types of devices.
  • the electronic device may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • the electronic device according to embodiments of this document is not limited to the above-described devices.
  • terms such as "first" and "second" may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • when a (e.g., first) component is referred to as "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the component may be connected to the other component directly or via a third component.
  • the term "module" used in this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, part, or circuit.
  • a module may be an integrally formed component, or a minimum unit of the component, or a part thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., the internal memory 136 or the external memory 138) readable by a machine (e.g., the electronic device 101).
  • the processor (e.g., the processor 120) of the device (e.g., the electronic device 101) may call at least one of the one or more instructions stored in the storage medium and execute it. This enables the device to perform at least one function according to the at least one instruction called.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • "non-transitory" only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where it is temporarily stored.
  • a method according to various embodiments disclosed in the present document may be provided by being included in a computer program product.
  • Computer program products can be traded between sellers and buyers as commodities.
  • A computer program product may be distributed in the form of a device-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • At least a part of the computer program product may be temporarily stored in, or temporarily generated in, a device-readable storage medium such as a server of a manufacturer, a server of an application store, or the memory of a relay server.
  • each of the above-described components (e.g., a module or program) may include a single entity or a plurality of entities.
  • one or more of the above-described components or operations may be omitted, or one or more other components or operations may be added.
  • a plurality of components (e.g., modules or programs) may be integrated into a single component.
  • the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as they were performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In an electronic device and a method of operating the electronic device according to various embodiments of the invention, the electronic device may comprise: a display; a camera; and a processor, the processor being configured to: control the display to display first content; while the first content is being displayed, in response to identifying that the location of a visual object included in the first content coincides with the location of a reflected image of an external object, reflected and displayed by the display, determine location information of the reflected image on the basis of location information of the visual object; and determine a location at which second content is to be displayed, on the basis of the location information of the reflected image. Various other embodiments are possible.
PCT/KR2020/008628 2019-11-19 2020-07-02 Electronic device for providing content based on the location of a reflected image of an external object, and operating method of the electronic device WO2021101006A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0148827 2019-11-19
KR1020190148827A KR20210061062A (ko) 2019-11-19 2019-11-19 Electronic device for providing content based on the location of a reflected image of an external object, and operating method of the electronic device

Publications (1)

Publication Number Publication Date
WO2021101006A1 true WO2021101006A1 (fr) 2021-05-27

Family

ID=75980685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/008628 WO2021101006A1 (fr) 2019-11-19 2020-07-02 Electronic device for providing content based on the location of a reflected image of an external object, and operating method of the electronic device

Country Status (2)

Country Link
KR (1) KR20210061062A (fr)
WO (1) WO2021101006A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7201850B1 (ja) 2022-01-24 2023-01-10 三菱ケミカルグループ株式会社 Information processing device, method, and program

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102510048B1 (ko) * 2022-07-25 2023-03-15 주식회사 컴플렉시온 Control method of an electronic device that outputs augmented reality data according to exercise motions
KR102528108B1 (ko) * 2022-11-02 2023-05-02 홍미영 Digital content output system using mirror media

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180054293A (ko) * 2016-11-15 2018-05-24 박종찬 Smart training device
KR102000763B1 (ko) * 2018-02-07 2019-07-16 주식회사 인프라웨어테크놀러지 Smart mirror device providing a personal training service
KR20190101827A (ko) * 2018-02-23 2019-09-02 삼성전자주식회사 Electronic device for providing second content according to the movement of an external object with respect to first content displayed on a display, and operating method thereof
KR20190113265A (ko) * 2018-03-28 2019-10-08 주식회사 스탠스 Augmented reality display device for healthcare and healthcare system using same
KR20190117099A (ko) * 2018-04-06 2019-10-16 엘지전자 주식회사 Training system


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7201850B1 (ja) 2022-01-24 2023-01-10 三菱ケミカルグループ株式会社 Information processing device, method, and program
WO2023139944A1 (fr) * 2022-01-24 2023-07-27 三菱ケミカルグループ株式会社 Information processing device, method, and program
JP2023107347A (ja) 2022-01-24 2023-08-03 三菱ケミカルグループ株式会社 Information processing device, method, and program

Also Published As

Publication number Publication date
KR20210061062A (ko) 2021-05-27

Similar Documents

Publication Publication Date Title
WO2020171582A1 Method for determining a watch face image, and electronic device therefor
WO2020085789A1 Foldable electronic device for controlling a user interface, and operating method thereof
WO2021101006A1 Electronic device for providing content based on the location of a reflected image of an external object, and operating method of the electronic device
WO2020246727A1 Foldable electronic device and method for displaying information in a foldable electronic device
WO2020130691A1 Electronic device and method for providing information thereon
WO2021162435A1 Electronic device and method for activating a fingerprint sensor
WO2018216868A1 Electronic device and input method of an input device
WO2020022780A1 Method and apparatus for establishing a device connection
WO2021025272A1 Foldable electronic device for detecting a folding angle, and operating method thereof
WO2018208093A1 Method for providing haptic feedback and electronic device for implementing same
WO2020050636A1 Method and apparatus for gesture recognition based on user intention
EP3808097A1 Method and apparatus for establishing a device connection
WO2020171563A1 Electronic device and method for controlling the operation of a display in the device
WO2020032720A1 Electronic device including a button and method for operation in the electronic device
WO2020218848A1 Electronic device and method for performing a biometric authentication function and an intelligent agent function using a user input in the electronic device
WO2021080360A1 Electronic device and method thereof for controlling operation of the display device
WO2020101435A1 Method for detecting an external shock, and electronic device therefor
WO2021080171A1 Method and device for detecting wearing using an inertial sensor
WO2019135548A1 Method for compensating the pressure value of a force sensor and electronic device using same
WO2020171342A1 Electronic device for providing a visualized artificial intelligence service based on information about an external object, and operating method for the electronic device
WO2019039729A1 Method for changing the size of content displayed on a display device, and electronic device therefor
WO2020130301A1 Electronic device for tracking a user's activity, and operating method thereof
WO2022158692A1 Electronic device for identifying touch force, and operating method thereof
WO2021133123A1 Electronic device including a flexible display, and operating method thereof
WO2019103350A1 Apparatus and method for adaptive configuration of a user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20890889

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20890889

Country of ref document: EP

Kind code of ref document: A1