WO2022177105A1 - Electronic device having a transparent display and method of operating the same - Google Patents

Electronic device having a transparent display and method of operating the same

Info

Publication number
WO2022177105A1
WO2022177105A1 (PCT Application No. PCT/KR2021/017477)
Authority
WO
WIPO (PCT)
Prior art keywords
transparent display
electronic device
effect
input
user
Prior art date
Application number
PCT/KR2021/017477
Other languages
English (en)
Korean (ko)
Inventor
정순천
권지혜
왕태호
정희영
김재형
서용
이관희
이지혜
Original Assignee
삼성전자 주식회사
Priority date
Filing date
Publication date
Application filed by 삼성전자 주식회사
Publication of WO2022177105A1

Classifications

    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/005 Input arrangements through a video camera
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/013 Eye tracking input arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/147 Digital output to display device; using display panels
    • G06T11/00 2D [Two Dimensional] image generation

Definitions

  • Various embodiments of the present disclosure relate to an electronic device including a transparent display and, more particularly, to a method of distinguishing and displaying objects that can receive input on either side of the transparent display, and of using such objects.
  • The components included in electronic devices are also diversifying.
  • The types of cameras included in electronic devices are diversifying, and the number of cameras is gradually increasing.
  • The types of displays included in electronic devices are also diversifying.
  • For example, a flexible display, a rollable display, or a transparent display may be included in an electronic device, and technologies using such displays are being developed.
  • Since the display provides the interface between an electronic device and its user, the user interface may diversify as the types of displays diversify.
  • When a transparent display is used, the electronic device may receive user input on both sides of the transparent display. If an object that can receive the user's input on one side of the transparent display and an object that can receive the user's input on the other side are displayed identically, the user may be confused about which side of the transparent display to use when providing input to a given object.
  • A malfunction may also occur due to the hand gripping the transparent display and/or an erroneous input by the user.
  • An electronic device according to various embodiments includes a transparent display and a processor. The processor determines the front and rear surfaces of the transparent display in consideration of the user's gaze, checks whether any of the objects to be displayed on the transparent display are to receive input through the front or rear surface of the transparent display, applies a first effect to an object confirmed to receive input through the front surface and displays it on the transparent display, and applies a second effect to an object confirmed to receive input through the rear surface and displays it on the transparent display. The first effect and the second effect may be different from each other.
  • A method of operating an electronic device according to various embodiments includes determining the front and rear surfaces of a transparent display in consideration of the user's gaze, checking whether any of the objects to be displayed on the transparent display are to receive input through the front or rear surface of the transparent display, applying a first effect to an object confirmed to receive input through the front surface and displaying it on the transparent display, and applying a second effect to an object confirmed to receive input through the rear surface and displaying it on the transparent display. The first effect may be different from the second effect.
  • According to various embodiments, the electronic device may display objects distinctly so that the user can intuitively determine whether an object displayed on the transparent display is presented for the front surface or the rear surface.
  • According to various embodiments, when the transparent display is flipped, the effects applied to the objects displayed on the transparent display are swapped so that the user can still easily provide input to each object.
  • the electronic device may distinguish the front and rear surfaces of the transparent display without a user setting.
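  • The following is a minimal, illustrative Kotlin sketch of the flow summarized above; it is not part of the disclosure, and names such as `SurfaceSide`, `DisplayObject`, `classifyAndStyle`, and the concrete effect strings are assumptions chosen for illustration only:

```kotlin
// Hypothetical sketch: decide which surface the user faces, then tag each
// input-receiving object with a surface-specific visual effect.
enum class SurfaceSide { FRONT, REAR }

data class DisplayObject(
    val id: String,
    val inputSide: SurfaceSide,      // surface on which this object accepts touch
    val effect: String = "none"      // visual effect applied before rendering
)

fun classifyAndStyle(objects: List<DisplayObject>, userFacing: SurfaceSide): List<DisplayObject> {
    val firstEffect = "outer-shadow"   // objects touchable on the user-facing surface
    val secondEffect = "inner-shadow"  // objects touchable on the opposite surface
    return objects.map { obj ->
        obj.copy(effect = if (obj.inputSide == userFacing) firstEffect else secondEffect)
    }
}

fun main() {
    val objects = listOf(
        DisplayObject("keypad", SurfaceSide.FRONT),
        DisplayObject("emoji", SurfaceSide.REAR)
    )
    // With the user in front, the keypad gets the first effect, the emoji the second.
    println(classifyAndStyle(objects, userFacing = SurfaceSide.FRONT))
}
```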
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram of a display device according to various embodiments of the present disclosure.
  • FIGS. 3 and 4 show an example of an electronic device including a transparent display.
  • FIG. 5 is a diagram illustrating an object that can be touched only on one surface of a transparent display, according to various embodiments of the present disclosure.
  • FIG. 6 is a diagram illustrating an example in which a touchable object is disposed on both the front and rear surfaces of the transparent display, according to various embodiments of the present disclosure.
  • FIG. 7 is a diagram illustrating an example in which touchable objects are disposed so as to overlap on the front and rear surfaces of the transparent display, according to various embodiments of the present disclosure.
  • FIGS. 8A and 8B are diagrams illustrating an example of determining the front and rear surfaces of a large transparent display, according to various embodiments.
  • FIG. 9 is a diagram illustrating an example of using a pop-up window in an electronic device including a transparent display, according to various embodiments of the present disclosure.
  • FIG. 10 is a diagram illustrating an example in which a game controller is implemented using an electronic device including a transparent display, according to various embodiments of the present disclosure.
  • FIG. 11 is a diagram illustrating an example of using a camera application in an electronic device including a transparent display, according to various embodiments of the present disclosure.
  • FIGS. 12A and 12B are diagrams illustrating an example of using a health application in an electronic device including a transparent display, according to various embodiments.
  • FIG. 13 is a flowchart of an electronic device according to various embodiments of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with an electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with at least one of an electronic device 104 and a server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added.
  • In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in a volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in a non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one component of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), either on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • An artificial intelligence model may be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself in which the artificial intelligence model operates, or may be performed through a separate server (e.g., the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to these examples.
  • The artificial intelligence model may additionally or alternatively include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • The display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or a WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network following a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine-type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • The wireless communication module 192 may support a high-frequency band (e.g., the mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may use various techniques for securing performance in a high-frequency band, for example, beamforming, massive multiple-input multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements defined in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), a loss coverage for realizing mMTC (e.g., 164 dB or less), or a U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, a component other than the radiator (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top surface or a side surface) of the printed circuit board and capable of transmitting or receiving signals of the designated high-frequency band.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of the operations performed by the electronic device 101 may be executed by one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or a service, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least part of the function or the service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • the server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the display device 160 may include a display 210 and a display driver IC (DDI) 230 for controlling the display 210 .
  • the display 210 may be a transparent display.
  • the DDI 230 may include an interface module 231 , a memory 233 (eg, a buffer memory), an image processing module 235 , or a mapping module 237 .
  • For example, the DDI 230 may receive, from another component of the electronic device 101 through the interface module 231, image data or image information including an image control signal corresponding to a command for controlling the image data.
  • According to an embodiment, the image information may be received from the processor 120 (e.g., the main processor 121, such as an application processor) or from the auxiliary processor 123, which operates independently of the function of the main processor 121.
  • the DDI 230 may communicate with the touch circuit 250 or the sensor module 176 through the interface module 231.
  • The DDI 230 may store at least a portion of the received image information in the memory 233, for example, in units of frames. The image processing module 235 may, for example, perform pre-processing or post-processing (e.g., resolution, brightness, or size adjustment) on at least a portion of the image data based at least on the characteristics of the image data or the characteristics of the display 210.
  • The mapping module 237 may generate a voltage value or a current value corresponding to the image data pre-processed or post-processed by the image processing module 235.
  • According to an embodiment, the generation of the voltage value or the current value may be performed based at least in part on properties of the pixels of the display 210 (e.g., the pixel arrangement (RGB stripe or PenTile structure) or the size of each sub-pixel).
  • At least some pixels of the display 210 may be driven, for example, based at least in part on the voltage value or the current value, so that visual information (e.g., text, an image, or an icon) corresponding to the image data is displayed through the display 210.
  • the display device 160 may further include a touch circuit 250 .
  • the touch circuit 250 may include a touch sensor 251 and a touch sensor IC 253 for controlling the touch sensor 251 .
  • The touch sensor 251 may be disposed on the front surface, on the rear surface, or on both surfaces of the display 210.
  • the touch sensor IC 253 may control the touch sensor 251 to sense a touch input or a hovering input for a specific position of the display 210 , for example.
  • The touch sensor IC 253 may detect a touch input or a hovering input by measuring a change in a signal (e.g., a voltage, a light amount, a resistance, or an electric charge amount) at a specific position of the display 210.
  • The touch sensor IC 253 may provide information (e.g., position, area, pressure, or time) regarding the detected touch input or hovering input to the processor 120.
  • According to an embodiment, at least a part of the touch circuit 250 (e.g., the touch sensor IC 253) may be included as a part of the display driver IC 230 or the display 210, or as a part of another component (e.g., the auxiliary processor 123) disposed outside the display device 160.
  • the display device 160 may further include at least one sensor (eg, a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module 176 , or a control circuit therefor.
  • the at least one sensor or a control circuit therefor may be embedded in a part of the display device 160 (eg, the display 210 or the DDI 230 ) or a part of the touch circuit 250 .
  • For example, when the sensor module 176 embedded in the display device 160 includes a biometric sensor (e.g., a fingerprint sensor), the biometric sensor may acquire biometric information (e.g., a fingerprint image) related to a touch input through a partial area of the display 210.
  • the pressure sensor may acquire pressure information related to a touch input through a part or the entire area of the display 210 .
  • the touch sensor 251 or the sensor module 176 may be disposed between pixels of the pixel layer of the display 210 , or above or below the pixel layer.
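  • As an illustration only (not part of the disclosure), a touch event produced by one of the two sensor layers of a dual-sided transparent display can be mapped to the surface it came from before any further handling; the types `TouchEvent`, `DualSidedTouchRouter`, and the sensor IDs below are hypothetical:

```kotlin
// Illustrative routing: a touch event is tagged with the sensor that produced it,
// and the router decides whether it belongs to the front or rear surface.
enum class Surface { FRONT, REAR }

data class TouchEvent(val sensorId: Int, val x: Float, val y: Float)

class DualSidedTouchRouter(
    private val frontSensorId: Int,
    private val rearSensorId: Int
) {
    fun surfaceOf(event: TouchEvent): Surface? = when (event.sensorId) {
        frontSensorId -> Surface.FRONT
        rearSensorId -> Surface.REAR
        else -> null  // event from an unrelated sensor (e.g., a separate main display)
    }
}

fun main() {
    val router = DualSidedTouchRouter(frontSensorId = 1, rearSensorId = 2)
    println(router.surfaceOf(TouchEvent(sensorId = 2, x = 10f, y = 20f)))  // REAR
}
```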
  • FIGS. 3 and 4 show an example of an electronic device including a transparent display.
  • an electronic device may include a cover, and the cover may include a transparent display 320 .
  • the electronic device 101 may include a main display 310 in addition to the transparent display 320 .
  • The main display 310 may be an opaque display.
  • FIG. 3A is a front view of the electronic device in which the cover includes the transparent display 320, and FIG. 3B is a side view of the electronic device in which the cover includes the transparent display 320.
  • the cover and the electronic device 101 may be coupled to each other by a connection part 330 .
  • connection part 330 may be, for example, a pogo-pin that is a cylinder-shaped spring pin.
  • the cover may be rotated 360 degrees by the connection part 330 .
  • An object displayed on one surface of the transparent display 320 may be visible through the other surface.
  • According to various embodiments, the main display 310 may include a touch sensor 360 on one surface and thus receive input for an object through that one surface, whereas the transparent display 320 may include touch sensors 340 and 350 on both surfaces and thus receive input for an object through either surface.
  • The electronic device 101 may deactivate the touch sensor 340 or 350 on one surface of the transparent display 320, or on a part of it, when a touch input is not required there.
  • According to various embodiments, the display 400 of the electronic device may include a transparent display 420 and an opaque display 410.
  • FIG. 4A is a front view of the display 400 including the transparent display 420 and the opaque display 410, and FIG. 4B is a side view of the display 400 including the transparent display 420 and the opaque display 410.
  • For example, the upper portion of the display of the electronic device may be configured as the transparent display 420, and the lower portion may be configured as the opaque display 410.
  • An object displayed on one surface of the transparent display 420 may be visible through the other surface.
  • According to various embodiments, the opaque display 410 may include a touch sensor 450 on one surface, and the transparent display 420 may include touch sensors 430 and 440 on both surfaces.
  • The opaque display 410 may receive the user's input through the one surface including the touch sensor 450, and the transparent display 420 may receive the user's input through both surfaces.
  • the electronic device 101 may deactivate the touch sensor 430 or 440 of one surface or a part of the transparent display 420 .
  • The electronic device 101 may disable the touch sensor of an unneeded region to prevent a function from being performed due to an erroneous operation (e.g., a touch by the gripping hand or an unintended touch on the other surface of the transparent display); a sketch of such region-based deactivation follows below.
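  • A minimal, hypothetical sketch of this deactivation, assuming a simple rectangle-based filter (the names `TouchFilter`, `disableRegion`, and the coordinates are illustrative, not from the patent):

```kotlin
// Touches that land inside a deactivated rectangle for a given surface
// (e.g., where the gripping hand rests) are simply dropped.
enum class Surface { FRONT, REAR }

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

class TouchFilter {
    private val disabled = mutableMapOf<Surface, MutableList<Rect>>()

    fun disableRegion(surface: Surface, region: Rect) {
        disabled.getOrPut(surface) { mutableListOf() }.add(region)
    }

    /** Returns true if the touch should be delivered, false if it must be ignored. */
    fun accept(surface: Surface, x: Float, y: Float): Boolean =
        disabled[surface].orEmpty().none { it.contains(x, y) }
}

fun main() {
    val filter = TouchFilter()
    filter.disableRegion(Surface.REAR, Rect(0f, 0f, 1080f, 2400f))  // whole rear surface off
    println(filter.accept(Surface.REAR, 100f, 100f))   // false: treated as a grip/erroneous touch
    println(filter.accept(Surface.FRONT, 100f, 100f))  // true
}
```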
  • Although the description above concerns an electronic device (e.g., the electronic device 101 of FIG. 1) in which only a part of the display is a transparent display, various embodiments of the present disclosure are not limited thereto.
  • Even when all displays included in the electronic device 101 are transparent displays, the various embodiments of the present disclosure may be applied.
  • Various embodiments described in the present disclosure may be applied to everything from a smart phone held in a user's hand to an electronic device including a large display such as a signage installed in a public or commercial space.
  • the electronic device 101 may display an object on a transparent display.
  • An object displayed on the transparent display may receive an input from one side of the transparent display or an input from both sides of the transparent display.
  • According to an embodiment, the electronic device 101 may process inputs received from the two surfaces of the transparent display as the same input.
  • According to another embodiment, the electronic device 101 may process inputs received from the two surfaces of the transparent display as different inputs.
  • According to another embodiment, the electronic device 101 may process input so that it is received only through one surface of the transparent display.
  • FIG. 5 is a diagram illustrating an object that can be touched only on one surface of a transparent display, according to various embodiments of the present disclosure
  • the electronic device may distinguish the front and rear surfaces of the transparent display in consideration of the user's gaze (or the user's position with respect to the transparent display).
  • the electronic device 101 may determine (or distinguish) the front and rear surfaces of the transparent display by using a camera or a sensor module included in the electronic device.
  • the electronic device 101 may determine the front and rear surfaces of the transparent display by determining the number and/or area of the fingers gripped using the touch sensor of the transparent display.
  • the electronic device 101 may determine a direction in which the electronic device 101 is placed using a gyro sensor to determine the front and rear surfaces of the transparent display.
  • The electronic device 101 may determine the position of the user in focus using infrared (IR) sensors mounted on both surfaces of the transparent display, and may determine the front and rear surfaces of the transparent display based on the user's position; a sketch of one way to combine such cues follows below.
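  • A hypothetical fusion of the cues mentioned above (face detection, grip contact area, IR proximity); the weights and thresholds in `determineUserFacingSurface` are assumptions for illustration, not values from the patent:

```kotlin
// Combine several side-determination cues into a single front/rear decision.
enum class Surface { FRONT, REAR }

data class SideCues(
    val faceSeenByFrontCamera: Boolean,  // camera on surface A captured a face
    val gripContactAreaFront: Float,     // grip contact area reported per surface (cm^2)
    val gripContactAreaRear: Float,
    val frontIrProximity: Boolean        // IR sensor on surface A detects a nearby user
)

fun determineUserFacingSurface(cues: SideCues): Surface {
    var frontScore = 0
    if (cues.faceSeenByFrontCamera) frontScore += 2
    if (cues.frontIrProximity) frontScore += 1
    // A large grip contact area usually belongs to the palm on the far side.
    if (cues.gripContactAreaRear > cues.gripContactAreaFront) frontScore += 1
    return if (frontScore >= 2) Surface.FRONT else Surface.REAR
}

fun main() {
    val cues = SideCues(true, gripContactAreaFront = 1.2f, gripContactAreaRear = 6.5f, frontIrProximity = false)
    println(determineUserFacingSurface(cues))  // FRONT
}
```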
  • FIGS. 5A and 5B illustrate examples in which the electronic device 101 distinguishes the front and rear surfaces of the transparent display using the user's gaze.
  • the object displayed on the transparent display may be an object that can receive an input from the front side.
  • FIG. 5A may show a state in which the gaze of the user 510 is positioned in front of the transparent display 520 .
  • An input to the object 530 may be received from the front of the transparent display 520 , so that the electronic device 101 may apply the first effect to the object 530 .
  • the first effect may be, for example, an outer shadow.
  • FIG. 5( b ) may show a state in which the user's 540 gaze is positioned on the back of the transparent display 550 .
  • An input for the object 560 can still be received only from the front surface of the transparent display 550, so when the gaze of the user 540 is positioned at the rear of the transparent display 550, the electronic device 101 may apply the second effect to the object 560.
  • The second effect may be, for example, an inner shadow.
  • the first effect and the second effect may be opposite to each other, such as an outer shadow and an inner shadow. According to another embodiment, the first effect and the second effect may be different from each other, or the degree of the same effect may be different.
  • The effect that can be applied to an object may be at least one of a shadow, opacity, saturation, blur, color, scale, and motion.
  • The first effect and the second effect may be applied to different degrees.
  • For example, the first effect may have high saturation and no blur, and the second effect may have low saturation and blur; a sketch of representing such effect parameters follows below.
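  • An illustrative representation of the effect parameters listed above; the patent does not prescribe concrete values, so the `Effect` fields and numbers below are assumptions:

```kotlin
// Effect parameters and a simple rule for choosing the first or second effect.
data class Effect(
    val shadow: String,     // "outer", "inner", or "none"
    val opacity: Float,     // 0.0 (transparent) .. 1.0 (opaque)
    val saturation: Float,  // 0.0 .. 1.0
    val blurRadiusPx: Int,
    val scale: Float = 1.0f
)

val FIRST_EFFECT = Effect(shadow = "outer", opacity = 1.0f, saturation = 1.0f, blurRadiusPx = 0)
val SECOND_EFFECT = Effect(shadow = "inner", opacity = 0.8f, saturation = 0.3f, blurRadiusPx = 4)

// An object touchable on the surface the user faces gets the first effect,
// otherwise the second effect.
fun effectFor(objectInputSide: String, userFacingSide: String): Effect =
    if (objectInputSide == userFacingSide) FIRST_EFFECT else SECOND_EFFECT

fun main() {
    println(effectFor(objectInputSide = "front", userFacingSide = "rear"))  // SECOND_EFFECT
}
```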
  • FIG. 6 is a diagram illustrating an example in which a touchable object is disposed on both the front and rear surfaces of the transparent display, according to various embodiments of the present disclosure
  • Both a first object that can be touched on one surface (e.g., the front) of the transparent display and a second object that can be touched on the other surface (e.g., the rear) may be displayed on the transparent display. If the first object and the second object are displayed identically, it may be difficult for the user to determine on which surface of the transparent display each object can be touched.
  • For convenience of explanation, one surface of the transparent display is referred to here as the front and the other surface as the rear.
  • a first object 620-1 and a second object 630-1 may be displayed on the front surface 610-1 of the transparent display.
  • the first object 620-1 may be a touchable object from the front surface of the transparent display
  • the second object 630-1 may be a touchable object from the rear surface of the transparent display.
  • The electronic device (e.g., the electronic device 101 of FIG. 1) may apply a different effect to each object in order to distinguish the first object 620-1, which can be touched from the front of the transparent display, from the second object 630-1, which can be touched from the rear. For example, an outer shadow effect may be applied to the first object 620-1, and an inner shadow effect may be applied to the second object 630-1.
  • The transparent display can be turned over. When it is turned over, the rear surface of the transparent display faces up, and the user's gaze is positioned on the rear surface of the transparent display.
  • FIG. 6B shows the rear surface 610-2 of the flipped transparent display. Referring to FIG. 6B, the first object 620-2 and the second object 630-2 may be displayed on the rear surface 610-2 of the transparent display. Because the transparent display has been flipped, it may again be difficult for the user to determine on which surface each object can be touched. Since the transparent display has been turned over, the electronic device 101 may apply to each object the effect opposite to that of FIG. 6A. For example, the electronic device 101 may apply an inner shadow effect to the first object 620-2 and an outer shadow effect to the second object 630-2; a sketch of swapping the effects on a flip event follows below.
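  • A hypothetical flip handler for the behavior described above; the effect strings and the `onDisplayFlipped` name are illustrative assumptions:

```kotlin
// Once the opposite surface becomes the one the user faces,
// the first and second effects applied to each object are swapped.
enum class Surface { FRONT, REAR }

data class StyledObject(val id: String, val inputSide: Surface, var effect: String)

fun onDisplayFlipped(objects: List<StyledObject>) {
    for (obj in objects) {
        obj.effect = when (obj.effect) {
            "outer-shadow" -> "inner-shadow"
            "inner-shadow" -> "outer-shadow"
            else -> obj.effect
        }
    }
}

fun main() {
    val objects = listOf(
        StyledObject("620", Surface.FRONT, "outer-shadow"),
        StyledObject("630", Surface.REAR, "inner-shadow")
    )
    onDisplayFlipped(objects)
    println(objects.map { "${it.id}:${it.effect}" })  // [620:inner-shadow, 630:outer-shadow]
}
```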
  • FIG. 7 is a diagram illustrating an example in which a touchable object is disposed on the transparent display to be overlapped on the front and rear surfaces of the transparent display, according to various embodiments of the present disclosure
  • According to various embodiments, a first object that can be touched on one surface (e.g., the front) of the transparent display and a second object that can be touched on the other surface (e.g., the rear) may be displayed so as to overlap on the transparent display.
  • For convenience of explanation, one surface of the transparent display is referred to here as the front and the other surface as the rear.
  • Referring to FIG. 7A, the electronic device (e.g., the electronic device 101 of FIG. 1) may apply, to the first object 730-1 that can be touched on the front surface 710-1 of the transparent display, an effect that makes it appear to be placed in front of the second object 720-1.
  • For example, the electronic device 101 may apply transparency to the first object 730-1, which can be touched from the front, so that the user can recognize that there is also a second object 720-1 that can be touched from the rear.
  • The transparent display can be turned over; when it is, the rear surface of the transparent display faces up. FIG. 7B shows the rear surface 710-2 of the flipped transparent display. The first object 730-2 and the second object 720-2 may be displayed overlapping on the rear surface 710-2 of the transparent display. Because the transparent display has been flipped, it may be difficult for the user to determine on which surface each object can be touched, so the electronic device 101 may apply the effects to the objects differently. Referring to FIG. 7B, the electronic device 101 may apply, to the second object 720-2 that can be touched on the rear surface 710-2 of the transparent display, an effect (e.g., transparency) that makes it appear to be placed in front of the first object 730-2.
  • For example, the electronic device 101 may apply transparency to the second object 720-2, which can be touched from the rear, so that the user can recognize that there is also a first object 730-2 that can be touched from the front; a sketch of ordering overlapped objects by touchable surface follows below.
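  • An illustrative z-ordering rule for overlapped objects, under the assumption that the object touchable on the user-facing surface is drawn on top with partial transparency (the names `Overlapped`, `DrawCommand`, and the alpha values are hypothetical):

```kotlin
// The object touchable on the surface the user currently faces is drawn last
// (on top) and made semi-transparent so the object behind it stays visible.
enum class Surface { FRONT, REAR }

data class Overlapped(val id: String, val inputSide: Surface)

data class DrawCommand(val id: String, val zOrder: Int, val alpha: Float)

fun layoutOverlapped(objects: List<Overlapped>, userFacing: Surface): List<DrawCommand> =
    objects.map { obj ->
        if (obj.inputSide == userFacing)
            DrawCommand(obj.id, zOrder = 1, alpha = 0.6f)  // on top, semi-transparent
        else
            DrawCommand(obj.id, zOrder = 0, alpha = 1.0f)  // behind, fully drawn
    }.sortedBy { it.zOrder }  // painter's order: behind first, on-top last

fun main() {
    val objs = listOf(Overlapped("730", Surface.FRONT), Overlapped("720", Surface.REAR))
    println(layoutOverlapped(objs, userFacing = Surface.FRONT))
}
```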
  • FIGS. 8A and 8B are diagrams illustrating an example of determining the front and rear surfaces of a large transparent display, according to various embodiments.
  • A small electronic device such as a smartphone allows the user to easily turn the device over.
  • A monitor or signage, however, is large, so the user may not be able to flip it easily; nevertheless, the display of the monitor or signage may also be a transparent display, and the user's touch input may be received on both surfaces of the display.
  • An electronic device including a large transparent display may not need to recognize a particular user.
  • Even though the electronic device 101 does not need to recognize the user, the electronic device 101 may still distinguish the two surfaces of the large transparent display as front and rear in order to display touch-input-capable objects distinctly according to the surface on which they can receive input.
  • the electronic device 101 including a large transparent display may include at least one of a camera, an IR sensor, and a sensor module.
  • the electronic device 101 may recognize the surface detected by the user as the front surface of the large transparent display using at least one of a camera, an IR sensor, and a sensor module.
  • The electronic device 101 including the large transparent display may also recognize the front and rear surfaces of the large transparent display in other ways.
  • For example, the electronic device 101 including a large transparent display may recognize the most recently used surface, or the currently used surface, as the front surface of the large transparent display.
  • When an input is received on the other surface, the electronic device 101 may recognize that surface as the front. If the surface on which an input is received differs from the surface on which the previous input was received, the electronic device 101 may swap the recognized front and rear surfaces of the large transparent display; a sketch of this last-used-surface heuristic follows below.
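  • A hypothetical sketch of the last-used-surface heuristic for a large fixed display (the class name `LastUsedSideTracker` and surface labels are assumptions):

```kotlin
// The surface that most recently received a touch is treated as the front;
// when input arrives on the other surface, front and rear are swapped.
enum class Surface { A, B }

class LastUsedSideTracker(initialFront: Surface = Surface.A) {
    var front: Surface = initialFront
        private set
    val rear: Surface get() = if (front == Surface.A) Surface.B else Surface.A

    fun onTouch(surface: Surface) {
        if (surface != front) {
            front = surface  // swap: the newly used surface becomes the front
        }
    }
}

fun main() {
    val tracker = LastUsedSideTracker()
    tracker.onTouch(Surface.B)
    println("front=${tracker.front}, rear=${tracker.rear}")  // front=B, rear=A
}
```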
  • FIG. 8A may show a user 820 - 1 gazing at one surface 810 - 1 of a large transparent display.
  • Among the objects displayed on the large transparent display, the electronic device 101 may apply different effects to the keypad 830-1, which can receive input on the one surface 810-1 of the large transparent display, and the emoji 840-1, which can receive input on the other surface 810-2, in order to distinguish them.
  • For example, the electronic device 101 may apply an outer shadow effect to the keypad 830-1 and an inner shadow effect to the emoji 840-1.
  • FIG. 8B may show the user 820 - 2 gazing at the other side 810 - 2 of the large transparent display.
  • Among the objects displayed on the large transparent display, the electronic device 101 may apply different effects to the emoji 840-2, which can receive input on the other surface 810-2 of the large transparent display, and the keypad 830-2, which can receive input on the one surface 810-1, in order to distinguish them.
  • In contrast to FIG. 8A, the electronic device 101 may apply an inner shadow effect to the keypad 830-2 and an outer shadow effect to the emoji 840-2.
  • FIG. 9 is a diagram illustrating an example of using a pop-up window in an electronic device including a transparent display, according to various embodiments of the present disclosure.
  • an electronic device 910 including a transparent display may provide a specific application as a pop-up screen.
  • the user may select 'View as a pop-up screen' 930 by executing the phone application 920 - 1 on the main screen.
  • When this is selected, the phone application 920-2 may be displayed as a pop-up screen, and the main screen may be displayed behind the pop-up screen of the phone application 920-2. Since the transparent display can receive touch input from the rear surface where the front is covered by the pop-up screen, the electronic device 910 may also display the icon 940 of the main screen that is overlapped by the pop-up screen of the phone application 920-2. The electronic device 910 may receive an input for the icon 940 overlapped by the pop-up screen of the phone application 920-2 from the rear surface of the transparent display.
  • To indicate that the icon 940 overlapped by the pop-up screen of the phone application 920-2 exists, the electronic device 910 may apply different effects to the pop-up screen of the phone application 920-2 and to the icons overlapped by it. The electronic device 910 may also apply different effects to the icon 950 that is not overlapped by the pop-up screen and the icon 940 that is overlapped by the pop-up screen; a sketch of routing rear-surface input to an overlapped icon follows below.
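  • An illustrative hit-test for the pop-up case: a front-surface touch at a point goes to the pop-up screen, while a rear-surface touch at the same point goes to the overlapped main-screen icon. The `Region` type and identifiers are hypothetical:

```kotlin
// Among the regions that overlap at a touch point, pick the one that accepts
// input on the surface the touch came from.
enum class Surface { FRONT, REAR }

data class Region(val id: String, val acceptsOn: Surface)

fun hitTest(surface: Surface, overlappingRegions: List<Region>): Region? =
    overlappingRegions.firstOrNull { it.acceptsOn == surface }

fun main() {
    val atPoint = listOf(
        Region("phone-popup-920-2", Surface.FRONT),
        Region("main-screen-icon-940", Surface.REAR)
    )
    println(hitTest(Surface.REAR, atPoint)?.id)   // main-screen-icon-940
    println(hitTest(Surface.FRONT, atPoint)?.id)  // phone-popup-920-2
}
```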
  • FIG. 10 is a diagram illustrating an example in which a game controller is implemented using an electronic device including a transparent display, according to various embodiments of the present disclosure
  • FIG. 10A shows a game controller 1000, and FIG. 10B shows an electronic device 1030 (e.g., the electronic device 101 of FIG. 1) including a transparent display that performs the function of the game controller.
  • the game controller 1000 may include four direction keys 1020 and two buttons 1010 for controlling a mode of the game controller.
  • For example, the four direction keys 1020 may be easily controlled using the thumbs of both hands, and the two buttons 1010 may be easily controlled using the index fingers of both hands.
  • the electronic device 1030 may perform a function of the game controller 1000 .
  • a game controller-related user interface may be displayed on the electronic device 1030 .
  • the electronic device 1030 may be an electronic device including a transparent display.
  • On the electronic device 1030, direction keys 1050 corresponding to the four direction keys 1020 of the game controller 1000 and button keys 1040 corresponding to the two buttons 1010 of the game controller 1000 may be displayed.
  • an input to the direction key 1050 may be received from the front surface of the transparent display, and an input to the button key 1040 may be received from the rear surface of the transparent display.
  • The electronic device 1030 may display the direction keys 1050, which receive input on the front surface of the transparent display, with an outer shadow effect applied, and the button keys 1040, which receive input on the rear surface of the transparent display, with an inner shadow effect applied.
  • According to another embodiment, the electronic device 1030 may display the direction keys 1050, which receive input on the front surface of the transparent display, and the button keys 1040, which receive input on the rear surface, with a height effect produced by a difference in color.
  • According to various embodiments, the region that can receive touch input on the front surface of the transparent display may be deactivated on the rear surface, and the region that can receive touch input on the rear surface may be deactivated on the front surface; a sketch of binding each control to a surface follows below.
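  • A hypothetical binding of the game-controller layout described above, in which each control accepts input on exactly one surface and its region is disabled on the opposite surface (the `Control` and `ControllerLayout` names are illustrative):

```kotlin
// Direction keys accept input only on the front surface, button keys only on the
// rear; each control's touch region is deactivated on the opposite surface.
enum class Surface { FRONT, REAR }

data class Control(val name: String, val activeOn: Surface)

class ControllerLayout(private val controls: List<Control>) {
    /** Controls whose touch region should be disabled on the given surface. */
    fun disabledOn(surface: Surface): List<String> =
        controls.filter { it.activeOn != surface }.map { it.name }
}

fun main() {
    val layout = ControllerLayout(
        listOf(
            Control("direction-keys-1050", Surface.FRONT),
            Control("button-keys-1040", Surface.REAR)
        )
    )
    println(layout.disabledOn(Surface.FRONT))  // [button-keys-1040]
    println(layout.disabledOn(Surface.REAR))   // [direction-keys-1050]
}
```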
  • the electronic device 1030 may determine the front and rear surfaces of the transparent display by using a camera or a sensor module included in the electronic device 1030 .
  • the electronic device 1030 may determine the front and rear surfaces of the transparent display by determining the number and/or area of gripped fingers using a touch sensor of the transparent display.
  • the electronic device 1030 may determine a direction in which the electronic device 1030 is placed using a gyro sensor to determine the front and rear surfaces of the transparent display.
  • the front and rear surfaces of the transparent display may be determined using an object recognized by the camera.
  • the electronic device 1030 may identify a focused position of the user using infrared (IR) sensors mounted on both sides of the transparent display.
  • the electronic device 1030 may determine the front and rear surfaces of the electronic device 1030 based on the user's location.
  • FIG. 11 is a diagram illustrating an example of using a camera application in an electronic device including a transparent display, according to various embodiments of the present disclosure
  • the user may take a picture or a video using a camera application installed in the electronic device (eg, the electronic device 101 of FIG. 1 ).
  • a user may select a camera to switch a shooting mode, or turn over the electronic device to switch a shooting mode.
  • FIG. 11 may be an example in which the user selects a camera mounted on one side (e.g., the rear) of the electronic device using the camera application and turns the electronic device over to change the shooting mode.
  • FIG. 11A illustrates one surface 1110-1 (e.g., the front) of the transparent display; the user can take a picture in the normal mode using the camera 1130 mounted on the other side (e.g., the rear) of the electronic device.
  • The user may touch the button 1120-1 disposed on the one surface of the transparent display to take the picture.
  • When the electronic device is turned over, the mode of the camera application may be switched from the normal mode to the selfie mode.
  • FIG. 11B shows the other surface 1110-2 of the transparent display; the user may touch the button 1120-2 disposed on one surface of the transparent display to take a picture.
  • In this situation the user may not be able to tell whether the button for photographing is disposed on one surface of the transparent display or on the other surface.
  • The electronic device may apply an effect to the buttons so that the user can intuitively determine whether the buttons 1120-1 and 1120-2 for photographing are disposed on one surface of the transparent display or on the other surface.
  • For example, the electronic device may apply an outer shadow effect to the button 1120-1, which can be touched on one surface of the transparent display as shown in FIG. 11A, and an inner shadow effect to the button 1120-2 as shown in FIG. 11B; a sketch of switching the shooting mode and button effect on a flip follows below.
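  • An illustrative state change when the device is flipped while the camera application is running; the `ShootingMode`, `CameraUiState`, and effect strings are assumptions, not APIs of any real camera application:

```kotlin
// On a flip, toggle between normal and selfie mode and switch the shutter
// button's effect so it still reads as touchable on the user-facing surface.
enum class ShootingMode { NORMAL, SELFIE }

data class CameraUiState(val mode: ShootingMode, val shutterEffect: String)

fun onDeviceFlipped(state: CameraUiState): CameraUiState =
    CameraUiState(
        mode = if (state.mode == ShootingMode.NORMAL) ShootingMode.SELFIE else ShootingMode.NORMAL,
        shutterEffect = if (state.shutterEffect == "outer-shadow") "inner-shadow" else "outer-shadow"
    )

fun main() {
    val before = CameraUiState(ShootingMode.NORMAL, "outer-shadow")
    println(onDeviceFlipped(before))  // CameraUiState(mode=SELFIE, shutterEffect=inner-shadow)
}
```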
  • FIG. 13 is a flowchart of an electronic device according to various embodiments of the present disclosure.
  • the electronic device may include a transparent display.
  • the electronic device 101 may receive a user's input using both sides of the transparent display.
  • the electronic device 101 may perform different operations depending on which of the two surfaces of the transparent display receives the user's input.
  • the electronic device 101 may deactivate reception of the user's input for a partial region of the transparent display that does not require the user's input.
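  • As a rough illustration of dual-side input handling, the sketch below routes a touch to a different action depending on the surface it arrives on and ignores touches in regions where input has been deactivated. All names here (DualSideInputRouter, Touch, Rect) are assumptions made for the example; the disclosure does not specify such an API.

```kotlin
// Hypothetical sketch: dispatch the same (x, y) touch differently depending on
// whether it is received on the front or the rear surface, and drop touches
// that fall inside deactivated regions.

enum class Side { FRONT, REAR }

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
}

data class Touch(val x: Int, val y: Int, val side: Side)

class DualSideInputRouter(
    private val deactivatedRegions: Map<Side, List<Rect>> = emptyMap()
) {
    fun handle(touch: Touch): String? {
        val blocked = deactivatedRegions[touch.side].orEmpty()
            .any { it.contains(touch.x, touch.y) }
        if (blocked) return null // this region does not require the user's input
        return when (touch.side) {
            Side.FRONT -> "frontAction(${touch.x}, ${touch.y})"
            Side.REAR -> "rearAction(${touch.x}, ${touch.y})"
        }
    }
}

fun main() {
    val router = DualSideInputRouter(mapOf(Side.REAR to listOf(Rect(0, 0, 100, 100))))
    println(router.handle(Touch(50, 50, Side.FRONT))) // frontAction(50, 50)
    println(router.handle(Touch(50, 50, Side.REAR)))  // null (deactivated region)
}
```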
  • the electronic device 101 may determine the front and rear surfaces of the transparent display in consideration of the user's gaze.
  • a camera may be attached to one surface of the electronic device 101, and the electronic device 101 may determine the front and rear surfaces of the transparent display using the camera. For example, when the camera captures the user's face or a part of the face (e.g., eyes, nose, or mouth), the electronic device 101 may determine that the surface on which the camera is disposed is the front of the transparent display; when the user's face or a part of the face (e.g., eyes, nose, or mouth) is not captured, that surface may be determined to be the rear of the transparent display.
  • the electronic device 101 may determine the front and rear surfaces of the transparent display again in consideration of the user's gaze. For example, when the transparent display is turned over, the electronic device 101 may no longer recognize the user's gaze through the camera. The electronic device 101 may then redetermine the front and rear surfaces of the transparent display according to whether the user's gaze is recognized.
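  • The gaze-based determination can be summarized as: the surface whose camera currently sees the user's face (or part of it) is treated as the front; when no face is seen, that surface is treated as the rear, and the determination is redone after the device is flipped. The sketch below is a minimal, hypothetical model; FaceDetector and OrientationTracker are assumptions, not a real camera API.

```kotlin
// Hypothetical sketch: decide which role the camera-equipped surface plays
// from whether the user's face (e.g., eyes, nose, or mouth) is in frame.

enum class Surface { FRONT, REAR }

fun interface FaceDetector {
    fun faceVisible(): Boolean // true if the user's face or part of it is captured
}

class OrientationTracker(private val cameraSideDetector: FaceDetector) {
    /** Role currently played by the surface the camera is attached to. */
    fun cameraSurfaceRole(): Surface =
        if (cameraSideDetector.faceVisible()) Surface.FRONT else Surface.REAR
}

fun main() {
    var userFacingCamera = true
    val tracker = OrientationTracker { userFacingCamera }
    println(tracker.cameraSurfaceRole()) // FRONT - the camera sees the user's face
    userFacingCamera = false             // the device is turned over, gaze is lost
    println(tracker.cameraSurfaceRole()) // REAR - redetermined after the flip
}
```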
  • the electronic device 101 may check whether, among the objects to be displayed on the transparent display, there is an object to receive an input using the front surface or the rear surface of the transparent display.
  • some applications may not have an object to receive the user's input using the rear surface of the transparent display. According to another embodiment, some other applications may not have an object to receive the user's input using the front surface of the transparent display.
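  • Before rendering, the device can therefore partition the objects it is about to display by the surface(s) on which they accept input. The sketch below assumes a hypothetical DisplayObject type carrying that information; it is not part of the disclosure.

```kotlin
// Hypothetical sketch: group displayed objects by the surfaces they accept
// input from, so the device knows whether the front or rear is used at all.

enum class Surface { FRONT, REAR }

data class DisplayObject(val name: String, val inputSurfaces: Set<Surface>)

data class InputSurvey(
    val frontInput: List<DisplayObject>,
    val rearInput: List<DisplayObject>,
) {
    val rearUnused get() = rearInput.isEmpty()   // e.g., an app with no rear-touch objects
    val frontUnused get() = frontInput.isEmpty() // e.g., an app with no front-touch objects
}

fun survey(objects: List<DisplayObject>) = InputSurvey(
    frontInput = objects.filter { Surface.FRONT in it.inputSurfaces },
    rearInput = objects.filter { Surface.REAR in it.inputSurfaces },
)

fun main() {
    val objects = listOf(
        DisplayObject("shutter-front", setOf(Surface.FRONT)),
        DisplayObject("shutter-rear", setOf(Surface.REAR)),
        DisplayObject("label", emptySet()),
    )
    val result = survey(objects)
    println(result.frontInput.map { it.name }) // [shutter-front]
    println(result.rearInput.map { it.name })  // [shutter-rear]
    println(result.rearUnused)                 // false
}
```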
  • the electronic device 101 may apply a first effect to an object identified as receiving an input using the front surface of the transparent display and display the object on the transparent display.
  • the electronic device 101 may apply a second effect to an object identified as receiving an input using the rear surface of the transparent display and display the object on the transparent display.
  • the first effect and the second effect may be opposite to each other. According to another embodiment, the first effect and the second effect may be different from each other, or the same effect may be applied to different degrees.
  • the effect that can be applied to an object may be at least one of shadow, opacity, saturation, blur, color, scale, and motion.
  • the first effect and the second effect may be applied with different magnitudes.
  • for example, the first effect may have high saturation and no blur, while the second effect may have low saturation and blur.
  • when the front and rear surfaces of the transparent display are determined to have changed, the electronic device 101 may exchange the first effect and the second effect applied to the objects.
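  • One way to read the paragraphs above is as a mapping from each input surface to an effect, with the mapping exchanged when the front/rear determination changes. The sketch below is only an illustration under assumed parameter values (the saturation and blur numbers are invented for the example).

```kotlin
// Hypothetical sketch: assign the first effect to front-input objects and the
// second effect to rear-input objects, then exchange the two when the display
// is determined to have been flipped.

enum class Surface { FRONT, REAR }

data class Effect(val saturation: Float, val blurRadius: Float)

val FIRST_EFFECT = Effect(saturation = 1.0f, blurRadius = 0f)  // e.g., high saturation, no blur
val SECOND_EFFECT = Effect(saturation = 0.4f, blurRadius = 4f) // e.g., low saturation, blurred

data class DisplayObject(val name: String, val inputSurface: Surface)

fun effectsFor(objects: List<DisplayObject>, flipped: Boolean): Map<String, Effect> =
    objects.associate { obj ->
        val effectiveSurface = when {
            !flipped -> obj.inputSurface
            obj.inputSurface == Surface.FRONT -> Surface.REAR
            else -> Surface.FRONT
        }
        obj.name to if (effectiveSurface == Surface.FRONT) FIRST_EFFECT else SECOND_EFFECT
    }

fun main() {
    val objects = listOf(
        DisplayObject("ok-button", Surface.FRONT),
        DisplayObject("back-button", Surface.REAR),
    )
    println(effectsFor(objects, flipped = false)) // ok-button -> first, back-button -> second
    println(effectsFor(objects, flipped = true))  // assignments exchanged after the flip
}
```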
  • An electronic device according to various embodiments includes a transparent display and a processor, wherein the processor may determine a front surface and a rear surface of the transparent display in consideration of a user's gaze, check whether, among objects to be displayed on the transparent display, there is an object to receive an input using the front and rear surfaces of the transparent display, apply a first effect to an object confirmed to receive an input using the front surface of the transparent display and display it on the transparent display, and apply a second effect to an object confirmed to receive an input using the rear surface of the transparent display and display it on the transparent display, and the first effect and the second effect may be different from each other.
  • the electronic device may further include a camera attached to one surface of the electronic device, and the processor may determine the front and rear surfaces of the transparent display using the camera.
  • the processor of the electronic device may further determine the front and rear surfaces of the transparent display in consideration of the user's gaze, and when it is determined that the front and rear surfaces of the transparent display have changed, may display the first effect and the second effect interchanged.
  • the first effect and the second effect may be effects expressed differently in at least one of shadow, opacity, saturation, blur, color, scale, and motion.
  • the processor of the electronic device may receive user inputs from both the front surface and the rear surface of the transparent display.
  • the processor of the electronic device may execute a function based on whether the user's input is received on the front surface or the rear surface of the transparent display.
  • the electronic device may further include a sensor module, and the processor may further determine the front and rear surfaces of the transparent display using the sensor module.
  • the electronic device may further include cameras attached to both surfaces of the electronic device, and the processor may determine the front and rear surfaces of the transparent display using the cameras.
  • when an object confirmed to receive an input using the front surface of the transparent display and an object confirmed to receive an input using the rear surface of the transparent display at least partially overlap, the processor of the electronic device may display all of the objects using transparency.
  • when the processor of the electronic device determines that, among the objects to be displayed on the transparent display, there is an object to receive an input using only one of the front and rear surfaces of the transparent display, the processor may deactivate touch input on the surface that has no object to receive an input.
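  • The two processor behaviours just described (drawing overlapping front-input and rear-input objects with transparency, and disabling touch on a surface that has no input object) can be sketched as below. Everything here, including the 0.5 alpha value, is an assumption made for illustration.

```kotlin
// Hypothetical sketch: disable touch on any surface with no input objects, and
// draw objects translucently when front-input and rear-input objects overlap.

enum class Surface { FRONT, REAR }

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun overlaps(o: Rect) =
        left < o.right && o.left < right && top < o.bottom && o.top < bottom
}

data class DisplayObject(val name: String, val bounds: Rect, val inputSurface: Surface?)

fun surfacesToDisable(objects: List<DisplayObject>): Set<Surface> =
    Surface.values().filter { s -> objects.none { it.inputSurface == s } }.toSet()

fun alphaFor(obj: DisplayObject, all: List<DisplayObject>): Float {
    val overlapsOtherSide = all.any {
        it !== obj && it.inputSurface != null && obj.inputSurface != null &&
            it.inputSurface != obj.inputSurface && it.bounds.overlaps(obj.bounds)
    }
    return if (overlapsOtherSide) 0.5f else 1.0f // both objects stay visible
}

fun main() {
    val objects = listOf(
        DisplayObject("front-slider", Rect(0, 0, 100, 40), Surface.FRONT),
        DisplayObject("rear-dial", Rect(50, 10, 150, 60), Surface.REAR),
    )
    println(surfacesToDisable(objects))         // [] - both surfaces receive input
    println(alphaFor(objects[0], objects))      // 0.5 - overlapping, drawn translucent
    println(surfacesToDisable(objects.take(1))) // [REAR] - rear touch can be deactivated
}
```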
  • An operation method of an electronic device according to various embodiments includes an operation of determining the front and rear surfaces of a transparent display in consideration of a user's gaze, an operation of checking whether, among objects to be displayed on the transparent display, there is an object to receive an input using the front and rear surfaces of the transparent display, an operation of applying a first effect to an object confirmed to receive an input using the front surface of the transparent display and displaying it on the transparent display, and an operation of applying a second effect to an object confirmed to receive an input using the rear surface of the transparent display and displaying it on the transparent display, wherein the first effect may be different from the second effect.
  • in the method of operating an electronic device, the operation of determining the front and rear surfaces of the transparent display in consideration of the user's gaze may be an operation of determining the front and rear surfaces of the transparent display using a camera attached to one surface of the transparent display.
  • the method may further include an operation of displaying the first effect and the second effect interchanged.
  • the first effect and the second effect may be effects expressed differently in at least one of shadow, opacity, saturation, blur, color, scale, and motion.
  • the method of operating an electronic device may further include an operation of receiving user inputs from both the front surface and the rear surface of the transparent display.
  • the method of operating an electronic device may further include an operation of performing a different function based on whether the user's input is received on the front surface or the rear surface of the transparent display.
  • the operation of determining the front and rear surfaces of the transparent display in consideration of the user's gaze may be an operation of further determining the front and rear surfaces of the transparent display using a sensor module.
  • the operation of determining the front and rear surfaces of the transparent display in consideration of the user's gaze may be an operation of determining the front and rear surfaces of the transparent display using cameras attached to both surfaces of the electronic device.
  • when an object confirmed to receive an input using the front surface of the transparent display and an object confirmed to receive an input using the rear surface of the transparent display at least partially overlap, the method may further include an operation of displaying all of the objects using transparency.
  • when it is determined that, among the objects to be displayed on the transparent display, there is an object to receive an input using only one of the front and rear surfaces, the method may further include an operation of deactivating touch input on the surface that has no object to receive the input.
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as "first" and "second" may simply be used to distinguish an element from other elements in question and do not limit the elements in other aspects (e.g., importance or order). When one (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be an integrally formed part, or a minimum unit or a portion of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software including one or more instructions stored in a storage medium (e.g., internal memory 136 or external memory 138) readable by a machine (e.g., the electronic device 101). For example, the processor (e.g., the processor 120) of the device (e.g., the electronic device 101) may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored therein.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online, either via an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • in the case of online distribution, at least a portion of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (e.g., a module or a program) of the above-described components may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (e.g., a module or a program) may be integrated into a single component.
  • in this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how the corresponding component among the plurality of components performed them prior to the integration.
  • according to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device according to various embodiments of the present disclosure comprises a transparent display and a processor. The processor: determines the front and rear surfaces of the transparent display in consideration of the user's gaze; checks whether, among objects to be displayed on the transparent display, there is an object to receive an input using the front and rear surfaces of the transparent display; applies a first effect to an object confirmed to receive an input using the front surface of the transparent display and displays it on the transparent display; and applies a second effect to an object confirmed to receive an input using the rear surface of the transparent display and displays it on the transparent display, wherein the first effect and the second effect may be different from each other.
PCT/KR2021/017477 2021-02-19 2021-11-25 Dispositif électronique à affichage transparent et procédé de fonctionnement dudit dispositif WO2022177105A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210022564A KR20220118751A (ko) 2021-02-19 2021-02-19 투명 디스플레이를 포함하는 전자 장치 및 이의 동작 방법
KR10-2021-0022564 2021-02-19

Publications (1)

Publication Number Publication Date
WO2022177105A1 true WO2022177105A1 (fr) 2022-08-25

Family

ID=82930772

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/017477 WO2022177105A1 (fr) 2021-02-19 2021-11-25 Dispositif électronique à affichage transparent et procédé de fonctionnement dudit dispositif

Country Status (2)

Country Link
KR (1) KR20220118751A (fr)
WO (1) WO2022177105A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130113997A (ko) * 2012-04-07 2013-10-16 삼성전자주식회사 투명 디스플레이를 포함하는 디바이스에서 오브젝트 제어 방법 및 그 디바이스와 기록 매체
US20140347267A1 (en) * 2012-03-28 2014-11-27 Sony Corporation Display apparatus and display control method
KR20150009204A (ko) * 2013-07-16 2015-01-26 엘지전자 주식회사 휴대 단말기 및 그 제어 방법
KR101750896B1 (ko) * 2010-10-12 2017-07-03 엘지전자 주식회사 이동 단말기 및 그 제어방법
KR102105460B1 (ko) * 2013-06-14 2020-06-01 엘지전자 주식회사 이동 단말기 및 그것의 제어방법


Also Published As

Publication number Publication date
KR20220118751A (ko) 2022-08-26

Similar Documents

Publication Publication Date Title
WO2021075786A1 (fr) Dispositif électronique et procédé de traitement d'une fenêtre surgissante utilisant une multi-fenêtre de celui-ci
WO2020060218A1 (fr) Dispositif électronique d'amélioration du phénomène de reconnaissance visuelle dans une zone partielle d'affichage
WO2022060041A1 (fr) Dispositif électronique pliable permettant la génération de contenu et son procédé de fonctionnement
WO2022114416A1 (fr) Dispositif électronique pour fournir une multifenêtre en utilisant un écran extensible
WO2022124734A1 (fr) Dispositif électronique comprenant un afficheur souple, son procédé de fonctionnement et support de stockage
WO2022030804A1 (fr) Dispositif électronique pliable pour commander la rotation d'un écran, et son procédé de fonctionnement
WO2022177299A1 (fr) Procédé de commande de fonction d'appel et dispositif électronique le prenant en charge
WO2022030921A1 (fr) Dispositif électronique, et procédé de commande de son écran
WO2022119311A1 (fr) Dispositif électronique comprenant un écran flexible, et procédé de fonctionnement associé
WO2022103021A1 (fr) Dispositif électronique à affichage flexible et procédé de commande dudit dispositif
WO2022080883A1 (fr) Dispositif électronique et procédé de fonctionnement de dispositif électronique
WO2022177105A1 (fr) Dispositif électronique à affichage transparent et procédé de fonctionnement dudit dispositif
WO2022014836A1 (fr) Procédé et appareil d'affichage d'objets virtuels dans différentes luminosités
WO2024101704A1 (fr) Dispositif pouvant être porté et procédé d'identification d'entrée tactile et support de stockage lisible par ordinateur non transitoire
WO2024019311A1 (fr) Dispositif électronique et procédé de traitement de contact d'objet externe sur un écran d'affichage
WO2022065844A1 (fr) Procédé d'affichage d'image de prévisualisation et appareil électronique le prenant en charge
WO2022108402A1 (fr) Procédé de fonctionnement d'écran souple, et dispositif électronique
WO2023287057A1 (fr) Dispositif électronique permettant de rapidement mettre à jour un écran lorsqu'une entrée est reçue en provenance d'un dispositif périphérique
WO2023003156A1 (fr) Procédé et dispositif électronique pour réaliser une fonction d'authentification d'utilisateur à l'aide d'udc
WO2023063752A1 (fr) Dispositifs électroniques permettant de déplacer un emplacement d'un objet visuel situé dans une zone pliée, et leurs procédés de commande
WO2021251655A1 (fr) Dispositif électronique et procédé de synchronisation basé sur un signal de commande d'affichage dans un dispositif électronique
WO2022181949A1 (fr) Dispositif électronique pour fournir un environnement de ra/rv et son procédé de fonctionnement
WO2022050627A1 (fr) Dispositif électronique comprenant un affichage souple et procédé de fonctionnement de celui-ci
WO2022119412A1 (fr) Dispositif électronique comprenant un afficheur flexible et une caméra
WO2022186495A1 (fr) Dispositif électronique comprenant une pluralité d'objectifs et procédé de commande dudit dispositif

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21926903

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21926903

Country of ref document: EP

Kind code of ref document: A1