EP4635772A1 - Method to be executed by user equipment, apparatus and computer program - Google Patents
Method to be executed by user equipment, apparatus and computer program
- Publication number
- EP4635772A1 (application EP24170423.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- user equipment
- display device
- data
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
Definitions
- Embodiments relate to the field of customization of functionalities. Embodiments relate to a method to be executed by user equipment, an apparatus and a computer program.
- Customization of functions may be important to improve the user experience. For example, passengers seated in the back of a vehicle may want to access different functionalities of the vehicle than passengers seated in the front. Passengers seated in the back may want to adjust a window blind, for example. Thus, there may be a need to improve the adjustment of functions.
- a functionality of a software running on user equipment can be adjusted based on a position of a user of the user equipment relative to a display device.
- the position of the user relative to the display device may make it possible to adjust the software such that a functionality related to the position of the user can be activated and/or a functionality unrelated to the position of the user can be deactivated.
- Examples provide a method to be executed by user equipment.
- the method comprises capturing image data indicating an image of a structure displayed on a display device and determining, based on the image data, perspective data indicating a perspective of the structure. Further, the method comprises determining, based on the perspective data, position data indicating a position of a user of the user equipment relative to the display device and adjusting, based on the position data, a functionality of a software running on the user equipment. Adjusting the functionality of the software running on the user equipment may make it possible to customize the functions provided by the software. In this way, the user can be given a choice of functions that the user can control. The customization of the functions, i.e., the adjustment of the functionality of the software, is based on the position of the user relative to the display device. Using the structure displayed on the display device may simplify determining the position of the user relative to the display device.
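The claimed sequence of capturing, determining a perspective, determining a position and adjusting a functionality can be sketched in code. Everything below (the corner-point convention, the skew threshold, the seat labels and the feature map) is an illustrative assumption, not taken from the examples:

```python
# Hypothetical sketch of the claimed pipeline: captured structure ->
# perspective data -> position data -> adjusted functionality. A real
# implementation would start from a camera frame; here the "image data"
# is reduced to the four detected corner points of the structure.

def determine_perspective(quad):
    """Perspective skew from the corner points (top-left, top-right,
    bottom-left, bottom-right): a positive value means the right edge
    appears taller, i.e. the user sits to the right of the display."""
    (_, y_tl), (_, y_tr), (_, y_bl), (_, y_br) = quad
    left = abs(y_bl - y_tl)
    right = abs(y_br - y_tr)
    return (right - left) / max(left, right)

def determine_position(skew, threshold=0.1):
    """Map the skew to one of three predefined rear-seat positions."""
    if skew > threshold:
        return "rear_right"
    if skew < -threshold:
        return "rear_left"
    return "rear_middle"

# Assumed mapping from seat to the functions the software exposes there.
FEATURES = {
    "rear_left": {"climate_rear", "window_blind_left"},
    "rear_middle": {"climate_rear"},
    "rear_right": {"climate_rear", "window_blind_right"},
}

def adjust_functionality(quad):
    seat = determine_position(determine_perspective(quad))
    return seat, FEATURES[seat]

# Structure scanned from the right-hand rear seat: right edge appears taller.
seat, features = adjust_functionality([(0, 0), (90, -5), (0, 100), (90, 110)])
```

The skew sign convention and the 0.1 threshold are placeholders; a real system would calibrate them per vehicle.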
- the position data may indicate a predefined position of the user of the user equipment relative to the display device. That is, the functionality of the software may be linked to a predefined position of the user relative to the display device. For example, users at different positions relative to the display device may have different functions available to them. Using the predefined position, the determination of the functions that the user can control can be improved.
- the predefined position may be a seat occupied by the user of the user equipment within a vehicle and the adjustment of the functionality of the software may be based on the seat occupied by the user within the vehicle.
- the user can be given a choice of functions that the user can control for the current position in the vehicle.
- different seats of the vehicle may offer different functionalities, e.g., climate control for the back seats instead of climate control for the front seats.
- the adjustment of the functionality of the software can allow the customization of a function of the vehicle offered to the user.
- the method may further comprise obtaining vehicle data indicating a vehicle in which the user equipment is located and determining the position data based on the vehicle data.
- vehicle data may make it possible to determine the seat occupied by the user with improved reliability.
- the vehicle data may indicate a seat configuration and display configuration of the vehicle.
- obtaining the vehicle data may make it possible to determine a possible seat occupied by the user based on the seat configuration and display configuration of the vehicle. In this way, the determination of the position of the user relative to the display device can be improved.
- the vehicle data is obtained by applying image processing to the image data. Applying image processing may make it possible to determine the vehicle data in a simple manner.
- information about the seat configuration and/or display configuration may be encoded in the structure displayed on the display device. That is, the vehicle data can be obtained by post-processing the captured image data.
- the method may further comprise obtaining setting data of an optical imaging sensor of the user equipment used to capture the image data.
- the setting data indicates zoom information of the optical imaging sensor.
- the position data may be determined based on the setting data.
- the setting data may make it possible to determine a distance between the user equipment and the display device. Based on the distance, the position of the user can be determined with improved reliability. For example, the distance can be used to distinguish whether the user occupies a front seat, a back seat in a first row or a back seat in a second row.
- the software running on the user equipment is for adjusting a setting of a device of the vehicle. That is, the software can be used by the user to adjust a setting of the device of the vehicle.
- the software, e.g., a graphical user interface of the software, can be used to present the user with a choice of functions that the user can control.
- the method may further comprise displaying the software with the adjusted functionality, receiving a user input and generating, based on the user input, control data indicating an adjustment of a setting of a device of a vehicle. Further, the method may comprise transmitting the control data for adjusting the setting of the device of the vehicle. In this way, the user can use the user equipment to adjust a function of the device of the vehicle based on the adjusted functionality of the software. For example, a specific function may be activated for the user based on the position of the user relative to the display device.
- Examples relate to an apparatus comprising interface circuitry configured to communicate with at least one of a communication device, user equipment or a backend, and processing circuitry configured to perform a method as described above.
- Examples further relate to a computer program having a program code for performing the method described above, when the computer program is executed on a computer, a processor, or a programmable hardware component.
- the term “or” refers to a non-exclusive or, unless otherwise indicated (e.g., “or else” or “or in the alternative”).
- words used to describe a relationship between elements should be broadly construed to include a direct relationship or the presence of intervening elements unless otherwise indicated. For example, when an element is referred to as being “connected” or “coupled” to another element, the element may be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Similarly, words such as “between”, “adjacent”, and the like should be interpreted in a like fashion.
- Fig. 1 shows an example of a method 100 to be executed by user equipment.
- the method 100 comprises capturing 110 image data indicating an image of a structure displayed on a display device.
- the image data may be captured using an optical imaging sensor of the user equipment.
- a front camera or back camera may be used to capture the image data.
- the structure may be any kind of displayed information suitable for encoding information into a format that can be easily scanned and read by devices with scanning capabilities.
- the structure may be a QR code, a barcode or a snap tag.
- the method 100 comprises determining 120 perspective data indicating a perspective of the structure.
- the perspective data is determined based on the image data.
- the user of the user equipment may be seated on the right side of the display device. This means that the structure captured with the user equipment can be deformed due to the perspective of the user equipment (see Figs. 2 and 3). That is, a size of the structure may depend on the position of the user equipment relative to the display device.
- the user equipment can determine the perspective data based on the size or the outer dimension of the structure.
- the user equipment can determine the perspective data by comparing the captured structure with a reference structure.
- the user equipment may determine the perspective data by determining a deviation between the reference structure and the captured structure.
- the method 100 comprises determining 130 position data indicating a position of a user of the user equipment relative to the display device.
- the position data is determined based on the perspective data.
- Determining 130 the position data may comprise determining a distance and/or an angle-dependent position of the user relative to the display device. That is, the user equipment may determine a distance and/or a viewing angle of the user equipment relative to the display device.
- the user equipment may determine 130 the position data by determining a predefined position occupied by the user of the user equipment. For example, the user equipment may compare the outer dimension of the captured structure with reference structures to determine the predefined position. In this case, the user equipment may retrieve different reference structures from a storage device for comparison with the captured structure. In this way, the position data can be determined with less computational effort.
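A minimal sketch of such a comparison follows. The per-seat reference outlines, the deviation metric and the seat labels are invented for illustration and are not taken from the examples:

```python
# Sketch: choose the predefined position whose stored reference outline of
# the structure best matches the captured one. Reference quadrilaterals are
# given as corner points (top-left, top-right, bottom-left, bottom-right).

def quad_deviation(a, b):
    """Sum of squared corner-to-corner distances between two outlines."""
    return sum((ax - bx) ** 2 + (ay - by) ** 2
               for (ax, ay), (bx, by) in zip(a, b))

# Hypothetical reference outlines of the structure as seen from each seat.
REFERENCES = {
    "rear_left": [(0, 10), (80, 0), (0, 90), (80, 100)],
    "rear_middle": [(0, 0), (80, 0), (0, 100), (80, 100)],
    "rear_right": [(0, 0), (80, 10), (0, 100), (80, 90)],
}

def classify_position(captured):
    """Predefined position with the smallest deviation from the capture."""
    return min(REFERENCES, key=lambda s: quad_deviation(captured, REFERENCES[s]))

best = classify_position([(1, 1), (79, 9), (1, 99), (81, 91)])
```

Because only a handful of stored references are compared, this keeps the computational effort low, which matches the stated motivation for using predefined positions.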
- the method 100 comprises adjusting 140 a functionality of a software running on the user equipment.
- the functionality of the software is adjusted based on the position data.
- the software running on the user equipment can be customized to a current position of the user.
- the software may provide different functions depending on the position of the user relative to the display device.
- the adjustment of the software may make it possible to provide the user a choice of functions that the user can control. That is, the adjustment of the software may be used to customize functions of devices in the user's surroundings. In this way, the user experience can be improved.
- the method 100 may make it possible to provide the user with software adjusted to specific user needs.
- the method 100 may comprise image processing on the captured image data while the user is scanning the structure displayed on the display device. That is, the image data may be captured during scanning of the structure using the user equipment.
- the user of the user equipment may scan the structure displayed on the display device for pairing the user equipment with another electronic device, e.g., part of the display device.
- the structure may be a QR code.
- the user can scan the QR code using the user equipment.
- the image of the QR code may be processed by image processing while the user is scanning it, not only to identify the QR code itself, but also to identify the orientation of the QR code with respect to the user equipment.
- a specific location, e.g., a specific seat location inside the vehicle, can be assigned to the user equipment.
- the assigned specific location can be used to adjust the features offered within an application, e.g., an application of a vehicle manufacturer, installed on the user equipment accordingly.
- the position data may indicate a predefined position of the user of the user equipment relative to the display device. That is, the user equipment may determine the position data by determining the predefined position that the user most likely occupies.
- Multiple predefined positions may be related to the display device the structure is displayed on.
- one display device may be related to back seats of a vehicle. The back seats may comprise a left seat, a middle seat and a right seat.
- the position data may indicate one of these three predefined positions.
- Using the predefined position may increase the reliability of the determination of the position of the user and/or reduce the computational load required to determine the position data.
- the user equipment can be used in conjunction with any kind of display device such as a television or a monitor of a computer. That is, the method can be applied to the display device of the vehicle, a television and/or a monitor of a computer, for example.
- the predefined position may be a seat occupied by the user of the user equipment within a vehicle and the adjustment of the functionality of the software may be based on the seat occupied by the user within the vehicle.
- the back seats of the vehicle may offer different functionalities to control different functions or devices of the vehicle.
- Window seats of the back seats may provide control of a window blind, for example.
- the functionality of the software can be adjusted such that the window blind can only be controlled by a passenger occupying a window seat. In this way, a user of the vehicle can be given a choice of functions that the user can control for the currently occupied seat in the vehicle.
- the method 100 may further comprise obtaining vehicle data indicating a vehicle in which the user equipment is located and determining the position data based on the vehicle data.
- the vehicle data may indicate a vehicle type or specific vehicle identifier, for example.
- the vehicle data may be obtained by receiving the vehicle data, e.g., from an electronic control unit of the vehicle.
- the vehicle data may be determined from the captured image data, e.g., by applying image processing.
- the vehicle data can be used to determine the position data.
- the vehicle data may indicate all seats of the vehicle with respect to the display devices of the vehicle.
- the vehicle data may indicate the vehicle type, and the user equipment can retrieve information about the seats of the vehicle type from a storage device. In this way, information about the predefined positions can be determined in a simple manner.
- the vehicle data is obtained by applying image processing to the image data.
- the structure, such as a QR code, may comprise information about the vehicle in which the display device is located.
- the vehicle data can be determined based on the captured image data.
- obtaining the vehicle data can be facilitated.
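As an illustration of vehicle data carried in the structure's payload, one could imagine the decoded QR code containing a vehicle type and a display identifier, which the user equipment uses to look up candidate seats. The payload format (`vehicle_type;display_id`) and the configuration table are assumptions, not from the source:

```python
# Sketch: the decoded payload of the structure carries vehicle data, which
# the user equipment uses to look up the seat/display configuration.

# Hypothetical per-vehicle-type seat and display configuration.
VEHICLE_CONFIGS = {
    "type_A": {
        "rear_display": ["rear_left", "rear_middle", "rear_right"],
        "front_display": ["front_passenger"],
    },
}

def parse_payload(payload):
    """Split an assumed 'vehicle_type;display_id' string decoded from the QR code."""
    vehicle_type, display_id = payload.split(";")
    return vehicle_type, display_id

def candidate_seats(payload):
    """Seats associated with the display the structure was scanned from."""
    vehicle_type, display_id = parse_payload(payload)
    return VEHICLE_CONFIGS[vehicle_type][display_id]

seats = candidate_seats("type_A;rear_display")
```

Narrowing the search to the seats facing the scanned display is what makes the later perspective comparison cheap and reliable.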
- the method 100 may further comprise obtaining setting data of an optical imaging sensor of the user equipment used to capture the image data.
- the setting data indicates zoom information of the optical imaging sensor.
- the position data may be determined based on the setting data.
- the setting data may make it possible to determine a distance between the user equipment and the display device.
- the display device may be associated with multiple seats at different distances from the display device.
- the user equipment can distinguish the position of the user not only from the size or the outer dimension of the structure displayed on the display device, but also from the setting data, e.g., respective zoom meta-information, of the optical imaging sensor of the user equipment.
- the size of the structure shown in the captured image may be correlated with the zoom of the optical imaging sensor set in the application used to capture the image.
- a distance of the user equipment from the display device can be determined. Based on the distance, the position of the user can be determined with improved reliability.
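Numerically, a simple pinhole-camera estimate could combine the apparent size of the structure with the zoom meta-information. The focal length, the structure's physical size and the row thresholds below are assumed values chosen only to make the sketch concrete:

```python
# Sketch: distance from apparent size plus zoom meta-information.
# Pinhole model: apparent_px = zoom * focal_px * real_size_m / distance_m,
# so digital zoom must be divided out before inferring the distance.

def estimate_distance(apparent_px, zoom=1.0, focal_px=1000.0, real_size_m=0.08):
    """Solve the pinhole relation for the camera-to-display distance (m)."""
    return zoom * focal_px * real_size_m / apparent_px

def seat_row(distance_m):
    """Assumed cabin layout: thresholds distinguish the front row and two
    rear rows relative to the display."""
    if distance_m < 0.9:
        return "front"
    if distance_m < 1.8:
        return "rear_row_1"
    return "rear_row_2"
```

With these assumed numbers, a 100 px wide structure at zoom 1 maps to the front row, while the same 100 px at zoom 2 implies the user is farther away, in the first rear row.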
- the software running on the user equipment is for adjusting a setting of a device of the vehicle. That is, the software can be used by the user to adjust a setting of the device of the vehicle.
- the software, e.g., a graphical user interface of the software, can be used to present the user with a choice of functions that the user can control. Adjusting the functionality of the software may comprise activating and/or deactivating certain icons in the software.
- the graphical user interface of the software can be adjusted to provide the user with an overview of the functions that the user can control.
- the method 100 may further comprise displaying the software with the adjusted functionality, receiving a user input and generating, based on the user input, control data indicating an adjustment of a setting of a device of a vehicle. Further, the method 100 may comprise transmitting the control data for adjusting the setting of the device of the vehicle. In this way, the user can use the user equipment to adjust a function of the device of the vehicle based on the adjusted functionality of the software. For example, a specific function may be activated for the user based on the position of the user relative to the display device.
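The display-input-transmit loop above can be sketched as follows. The JSON message shape and the idea of gating the request on the enabled feature set are assumptions for illustration, not the patent's actual protocol:

```python
import json

# Sketch: a user input on the adjusted software is turned into control data
# and handed to the transport that transmits it to the vehicle.

def build_control_data(seat, device, setting, value):
    """Serialize the requested adjustment of a vehicle device setting."""
    return json.dumps({"seat": seat, "device": device,
                       "setting": setting, "value": value})

def handle_user_input(seat, enabled_features, device, setting, value):
    """Only emit control data for devices the adjusted software exposes
    at this seat; other requests are refused."""
    if device not in enabled_features:
        raise PermissionError(f"{device} not available from seat {seat}")
    return build_control_data(seat, device, setting, value)

# A rear-left passenger adjusting the window blind exposed for that seat.
msg = handle_user_input("rear_left", {"window_blind_left"},
                        "window_blind_left", "position_percent", 50)
```

The gating step mirrors the adjusted functionality: a function deactivated for the current position never produces control data.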
- the image of the structure displayed on the display device may indicate a position of the display device. That is, the image of the structure may be assigned to a specific seat of the vehicle. For example, the image of the structure may be assigned to a passenger seat in the front.
- the method 100 may comprise determining a functionality of the display device based on the position of the display device.
- the display device can only be used by a user seated on the passenger seat or by multiple users of the vehicle excluding the driver. That is, usage of the display device by an unintended user, e.g., a driver, can be blocked.
- the driver may scan the structure displayed on the display device from his perspective. In this case, the perspective data would indicate a perspective of the driver. Therefore, the method 100 may adjust 140 the functionality of the software running on the user equipment by blocking an interaction with the display device based on the position of the user of the user equipment relative to the display device. In this way, use of the display device by the driver can be prevented.
- access to the display device could also be blocked for other users in the car, for example, passengers seated in the back of the vehicle. In this way, the display device can only be accessed by an intended user. That is, the functionality of the display device can be determined based on the position of the display device and/or the position of a user of the user equipment relative to the display device.
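The driver-blocking case can be sketched in a few lines. The left-hand-drive assumption, the skew sign convention (negative means viewed from the left) and the threshold are all illustrative choices:

```python
# Sketch: for a front passenger display, a scan whose perspective maps to
# the driver seat yields an empty feature set, i.e. interaction is blocked.

def position_for_front_display(skew, threshold=0.15):
    """Assumed convention: a strong view from the left of a front display
    means the scan came from the driver's seat (left-hand-drive vehicle)."""
    return "driver" if skew < -threshold else "front_passenger"

# Hypothetical functions offered to the intended (passenger) user.
PASSENGER_FEATURES = {"navigation", "media"}

def adjust_for_front_display(skew):
    """Blocking is modelled as granting no functions to unintended users."""
    pos = position_for_front_display(skew)
    return set() if pos == "driver" else set(PASSENGER_FEATURES)
```

The same pattern extends to other unintended users, e.g. rear passengers scanning a front display.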
- user equipment may be a device that is capable of communicating wirelessly.
- the user equipment may be a mobile user equipment, e.g., user equipment that is suitable for being carried around by a user.
- the user equipment may be a user terminal or user equipment within the meaning of the respective communication standards being used for mobile communication.
- the user equipment may be a mobile phone, such as a smartphone, or another type of mobile communication device, such as a smartwatch, a laptop computer, a tablet computer, or autonomous augmented-reality glasses.
- Fig. 1 may comprise one or more optional additional features corresponding to one or more aspects mentioned in connection with the proposed concept or one or more examples described below (e.g., Fig. 2 (comprising Figs. 2a and 2b ) - 4).
- Figs. 2a and 2b show an exemplary inside of a vehicle comprising a display device 240 for the back seats.
- Fig. 2a shows user equipment 230 located on the left side.
- the user of the user equipment 230 is scanning the structure 250.
- the structure 250 is a QR code.
- the user occupies the left back seat.
- the QR code 252a is deformed due to the perspective of the user equipment to the display device. That is, the outer dimension of the QR code is not rectangular.
- Fig. 2b shows user equipment 230 located on the right side.
- the user of the user equipment 230 is scanning the QR code 250.
- the QR code 252b is deformed in a different way than the QR code 252a. That is, the perspective of the user equipment 230 relative to the display device 240 can be determined based on the size and/or the outer dimension of the QR code 252a, 252b.
- Fig. 2 discloses just an example where the QR code 250 is illustrated on the display device 240 for the back seats.
- the concept is equally valid for any kind of display device such as a front display device.
- Fig. 2 may comprise one or more optional additional features corresponding to one or more aspects mentioned in connection with the proposed concept or one or more examples described above (e.g., Fig. 1 ) and/or below (e.g., Fig. 3 (comprising Figs. 3a and 3b ) - 4).
- Figs. 3a and 3b show different examples of QR codes displayed on a display device of user equipment 330.
- Fig. 3a shows an example in which the QR code is not deformed. That is, the outer dimension of the QR code may have a rectangular shape.
- the guiding lines 340a, which can be used to highlight the perspective of the user equipment, are parallel. This is the case when the user equipment 330 is directly in front of the center of the display device and not displaced to the side.
- Fig. 3b shows an example in which the QR code is deformed, e.g., the user equipment 330 is displaced to the right side of the display device.
- the guiding lines 340b are not parallel. Based on the angle between the guiding lines 340b, the perspective of the user equipment 330 relative to the display device can be determined.
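The angle between the guiding lines can be quantified with basic geometry. The coordinates below are invented endpoints of a head-on view and a side view, purely to show the computation:

```python
import math

# Sketch: quantify the perspective from the divergence of the guiding
# lines along the top and bottom edges of the QR code. Zero divergence
# corresponds to head-on viewing (parallel guiding lines, as in Fig. 3a).

def line_angle(p, q):
    """Angle of the line through points p and q, in degrees."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def guiding_line_divergence(top, bottom):
    """Absolute angle between the two guiding lines."""
    return abs(line_angle(*top) - line_angle(*bottom))

# Parallel guiding lines: user equipment centered in front of the display.
head_on = guiding_line_divergence(((0, 0), (100, 0)), ((0, 80), (100, 80)))
# Converging guiding lines: user equipment displaced to one side.
from_side = guiding_line_divergence(((0, 0), (100, 10)), ((0, 80), (100, 70)))
```

A divergence threshold (to be calibrated) would then separate centered from displaced viewing positions.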
- Fig. 3 may comprise one or more optional additional features corresponding to one or more aspects mentioned in connection with the proposed concept or one or more examples described above (e.g., Fig. 1-2 ) and/or below (e.g., Fig. 4 ).
- Fig. 4 shows a block diagram of an example of an apparatus 30, e.g., of user equipment.
- the apparatus 30 comprises interface circuitry 32 configured to communicate with another electronic device and processing circuitry 34 configured to perform a method as described above, e.g., the method to be executed by the user equipment as described with reference to Fig. 1.
- the apparatus 30 may be part of user equipment.
- the respective interface circuitry 32 is coupled to the respective processing circuitry 34 at the apparatus 30.
- the processing circuitry 34 may be implemented using one or more processing units, one or more processing devices, any means for processing, such as a processor, a computer or a programmable hardware component being operable with accordingly adapted software. Similarly, the described functions of the processing circuitry 34 may as well be implemented in software, which is then executed on one or more programmable hardware components. Such hardware components may comprise a general-purpose processor, a Digital Signal Processor (DSP), a micro-controller, etc.
- the processing circuitry 34 is capable of controlling the interface circuitry 32, so that any data transfer that occurs over the interface circuitry 32 and/or any interaction in which the interface circuitry 32 may be involved may be controlled by the processing circuitry 34.
- the apparatus 30 may comprise a memory and at least one processing circuitry 34 operably coupled to the memory and configured to perform the method described above.
- the interface circuitry 32 may correspond to any means for obtaining, receiving, transmitting or providing analog or digital signals or information, e.g. any connector, contact, pin, register, input port, output port, conductor, lane, etc. which allows providing or obtaining a signal or information.
- the interface circuitry 32 may be wireless or wireline and it may be configured to communicate, e.g., transmit or receive signals, information with further internal or external components.
- the apparatus 30 may be a computer, processor, control unit, (field) programmable logic array ((F)PLA), (field) programmable gate array ((F)PGA), graphics processing unit (GPU), application-specific integrated circuit (ASIC), integrated circuit (IC) or system-on-a-chip (SoC) system.
- Fig. 4 may comprise one or more optional additional features corresponding to one or more aspects mentioned in connection with the proposed concept or one or more examples described above (e.g., Fig. 1 - 3 ).
- Examples may further be or relate to a (computer) program including a program code to execute one or more of the above methods when the program is executed on a computer, processor or other programmable hardware component.
- steps, operations or processes of different ones of the methods described above may also be executed by programmed computers, processors or other programmable hardware components.
- Examples may also cover program storage devices, such as digital data storage media, which are machine-, processor- or computer-readable and encode and/or contain machine-executable, processor-executable or computer-executable programs and instructions.
- Program storage devices may include or be digital storage devices, magnetic storage media such as magnetic disks and magnetic tapes, hard disk drives, or optically readable digital data storage media, for example.
- Other examples may also include computers, processors, control units, (field) programmable logic arrays ((F)PLAs), (field) programmable gate arrays ((F)PGAs), graphics processing units (GPUs), application-specific integrated circuits (ASICs), integrated circuits (ICs) or system-on-a-chip (SoC) systems programmed to execute the steps of the methods described above.
- aspects described in relation to a device or system should also be understood as a description of the corresponding method.
- a block, device or functional aspect of the device or system may correspond to a feature, such as a method step, of the corresponding method.
- aspects described in relation to a method shall also be understood as a description of a corresponding block, a corresponding element, a property or a functional feature of a corresponding device or a corresponding system.
Landscapes
- Engineering & Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- User Interface Of Digital Computer (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP24170423.8A EP4635772A1 (fr) | 2024-04-16 | 2024-04-16 | Procédé destiné à être exécuté par un équipement utilisateur, appareil et programme informatique |
| PCT/EP2025/052042 WO2025218939A1 (fr) | 2024-04-16 | 2025-01-28 | Procédé à exécuter par un équipement utilisateur, appareil et programme informatique |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP24170423.8A EP4635772A1 (fr) | 2024-04-16 | 2024-04-16 | Procédé destiné à être exécuté par un équipement utilisateur, appareil et programme informatique |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4635772A1 (fr) | 2025-10-22 |
Family
ID=90735381
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP24170423.8A (pending) | EP4635772A1 (fr) | 2024-04-16 | 2024-04-16 |
Country Status (2)
| Country | Link |
|---|---|
| EP (1) | EP4635772A1 (fr) |
| WO (1) | WO2025218939A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO1996022660A1 (fr) * | 1995-01-20 | 1996-07-25 | Reveo, Inc. | Intelligent system and associated method for creating and presenting multiplexed stereo images in virtual-reality environments |
| WO2012132201A1 (fr) * | 2011-03-31 | 2012-10-04 | Sony Corporation | Information processing apparatus, image display apparatus and information processing method |
| US9269177B2 (en) * | 2011-06-10 | 2016-02-23 | Lg Electronics Inc. | Method for processing image and apparatus for processing image |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025218939A1 (fr) | 2025-10-23 |
Similar Documents
| Publication | Title |
|---|---|
| US20210225019A1 | Electronic device and method for applying image effect to images obtained using image sensor |
| EP3968625B1 | Digital photographing apparatus and method of operating the same |
| CN111741274B | Ultra-high-definition video monitoring method supporting local picture magnification and roaming |
| US20200014864A1 | Electronic device including camera module and method for controlling electronic device |
| EP3422287B1 | Information processing apparatus, information processing method, and program |
| US20150296145A1 | Method of displaying images and electronic device adapted thereto |
| US9942483B2 | Information processing device and method using display for auxiliary light |
| US20120281022A1 | Electronic apparatus and image display method |
| US11610285B2 | Display method and device |
| EP4635772A1 | Method to be executed by user equipment, apparatus and computer program |
| US8467572B2 | Method and apparatus for detecting object using perspective plane |
| US10440283B2 | Electronic device and method for controlling the same |
| US8786710B2 | Test system and method for testing motherboard of camera |
| US11709645B2 | Wearable terminal device, control method, and system |
| CN106155316A | Control method, control device and electronic device |
| US9747235B2 | Information processing method and electronic device |
| US11979529B2 | Information processing apparatus, information processing method, and non-transitory recording medium for reading aloud content for visually impaired users |
| CN109196860A | Control method for multi-view images and related device |
| EP3261057A1 | Head-mounted display and transmission control method |
| US11144273B2 | Image display apparatus having multiple operation modes and control method thereof |
| KR20180024151A | Apparatus for changing cluster resolution |
| CN117057995B | Image processing method and device, chip, electronic device and storage medium |
| US9665338B2 | Display apparatus, video system, display method and projector |
| US20240098356A1 | Image capturing apparatus, analysis method, and storage medium |
| US11733843B2 | Information processing apparatus, information processing method, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |