WO2024078686A1 - Computer software module arrangement, circuit arrangement, arrangement and method for improved human perception in XR systems - Google Patents


Info

Publication number
WO2024078686A1
Authority
WO
WIPO (PCT)
Application number
PCT/EP2022/078090
Other languages
English (en)
Inventor
Tobias WIDMARK
Andreas Kristensson
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Priority to PCT/EP2022/078090 priority Critical patent/WO2024078686A1/fr
Publication of WO2024078686A1 publication Critical patent/WO2024078686A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Definitions

  • the present invention relates to an arrangement, an arrangement comprising computer software modules, an arrangement comprising circuits, and a method for providing an improved manner of assisting human perception, or rather human vision, in extended reality systems.
  • When it comes to visual information specifically, it becomes a trade-off between the amount of information to provide and what the user is able to focus on.
  • most AR devices available provide a static viewport bounded by the display; information is often locked to a specific location either on the display or in the virtual world representation, with little regard for where the user is directly looking.
  • the human field of vision is divided into at least two fields of vision, namely the main field of vision and the peripheral field of vision, also referred to as peripheral vision.
  • the inventors have realized that, as human vision is not the same in the main field of vision as in the peripheral field of vision, a user will not only have difficulties perceiving and processing any visual information provided therein but may actually have difficulties seeing the visual information.
  • the inventors are therefore proposing a solution that is not static in its display of information but adapts the information to be displayed based on its location relative to the user's gaze, thereby ensuring, or at least increasing the chances, that the visual information is actually seen by the user.
  • An object of the present teachings is therefore to overcome or at least reduce or mitigate the problems of the prior art by providing a manner of tracking the gaze of a user, displaying objects in the main field of view as the natural objects or through a basic virtual object, and displaying objects of interest in the peripheral field of view as a virtual object with an adapted representation.
  • an AR viewing arrangement comprises a circuit for receiving image data and a circuit for processing the image, wherein the circuit for receiving image data is configured to receive an image of a real-world view, and wherein the circuit for processing the image is configured to determine a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data; detect an object in the image data; determine if the object is in the peripheral field-of-view (PFOV), and if so provide a graphical representation for the object in a second format, and if not provide a graphical representation for the object in a first format, wherein the first format is different from the second format.
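  • The format-selection logic described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; all names, the normalized coordinate system, and the 0.3 MFOV radius are hypothetical choices.

```python
from dataclasses import dataclass

@dataclass
class Box:
    # Detected object position, in normalized image coordinates (0..1).
    x: float
    y: float

def in_pfov(obj: Box, gaze_x: float, gaze_y: float, mfov_radius: float = 0.3) -> bool:
    """An object is in the peripheral field-of-view (PFOV) if it lies
    outside a radius around the gaze point (the main field-of-view, MFOV)."""
    dx, dy = obj.x - gaze_x, obj.y - gaze_y
    return (dx * dx + dy * dy) ** 0.5 > mfov_radius

def representation_format(obj: Box, gaze_x: float, gaze_y: float) -> str:
    # Second format (e.g. black and white) for peripheral objects,
    # first format (e.g. original colors) for objects in the MFOV.
    return "second" if in_pfov(obj, gaze_x, gaze_y) else "first"
```

  • In this sketch the two formats are just labels; a real implementation would attach the corresponding rendering style to each detected object.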
  • the solution may be implemented as a software solution, a hardware solution or a mix of software and hardware components.
  • the second format is in black and white, and wherein the first format is in color.
  • the first format is the original image of the object.
  • the circuit for receiving image data is configured to receive image data corresponding to a rear-view of the AR viewing arrangement, and wherein the circuit for receiving image data is further configured to detect an eye (E) in the image data corresponding to the rear-view and to determine a gaze (G) based on the detected eye (E), and to determine the main field-of-view (MFOV) and the peripheral field-of-view (PFOV) in the image data based on the gaze (G).
  • the circuit for image processing is further configured to detect an event for the detected object and provide the graphical representation to indicate the event. This enables the arrangement to increase the chances of the user seeing the event, such as a change.
  • the circuit for image processing is further configured to detect an event for the detected object in the peripheral field-of-vision and provide a further graphical representation for the event.
  • the circuit for image processing is further configured to detect that the detected object has moved, and in response thereto provide the graphical representation in the other format.
  • the circuit for image processing is further configured to detect that the gaze (G) has moved and in response thereto determine an updated main field of view and peripheral field of view; determine if any detected object in the peripheral field of vision falls within the updated main field of vision, and in response thereto provide the graphical representation in the first format; and determine if any detected object in the main field of vision falls within the updated peripheral field of vision, and in response thereto provide the graphical representation in the second format.
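  • The gaze-update behaviour can be illustrated with a short sketch (hypothetical names; it assumes objects are classified by Euclidean distance from the gaze point in normalized coordinates, and reports only the objects whose format changes):

```python
def update_formats(objects, old_gaze, new_gaze, mfov_radius=0.3):
    """Reclassify detected objects after the gaze G has moved.

    `objects` maps an object id to its (x, y) position. Returns a dict
    mapping each object id that crossed the MFOV/PFOV border to its new
    format: "first" (now in the MFOV) or "second" (now in the PFOV).
    """
    def classify(pos, gaze):
        d = ((pos[0] - gaze[0]) ** 2 + (pos[1] - gaze[1]) ** 2) ** 0.5
        return "first" if d <= mfov_radius else "second"

    changes = {}
    for oid, pos in objects.items():
        before, after = classify(pos, old_gaze), classify(pos, new_gaze)
        # Only objects that crossed the MFOV/PFOV border change format.
        if before != after:
            changes[oid] = after
    return changes
```

  • Objects that stay on the same side of the border keep their current graphical representation, which matches the behaviour described above.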
  • the graphical representation for the detected object in the peripheral field-of-view comprises a color-scale ranging from full color to black and white, wherein the amount of color is based on a distance from a center (G) of the field-of-view.
  • the graphical representation for the detected object in the peripheral field-of-view comprises an intensity, wherein the amount of intensity is based on a distance from a center (G) of the field-of-view, wherein the intensity grows with the distance.
  • the graphical representation for the detected object in the peripheral field-of-view comprises a changing component.
  • the AR viewing arrangement further comprises a circuit for displaying image data configured to display the image of the real-world and to display the graphical representation over the detected object.
  • the graphical representation for the detected object in the peripheral field-of-view comprises a graphical object arranged to cover the detected object as displayed.
  • the graphical representation for the detected object in the peripheral field-of-view comprises a graphical object arranged to frame the detected object as displayed.
  • the graphical representation for the detected object in the peripheral field-of-view comprises an identifier for the detected object, wherein the detected object is a searched-for object.
  • the AR viewing arrangement is a head-worn device.
  • the AR viewing arrangement is a head-up display device.
  • the AR viewing arrangement is a tablet computer device.
  • the AR viewing arrangement is a smartphone.
  • the AR viewing arrangement further comprises a front-facing image sensor and a rear-facing image sensor, such as a front-facing camera and a rear-facing camera.
  • the AR viewing arrangement is comprised in a camera or other image sensor device.
  • the AR viewing arrangement is a display possibly to be used with another device or in another device.
  • the AR viewing arrangement is arranged to be used in image retrieval, industrial use, robotic vision and/or video surveillance.
  • the proposed solution achieves the provision of visual information in a non-intrusive, lightweight way, leveraging AR and Object Recognition / Object Detection to provide a greater degree of control over the notifications and to scan the environment for relevant objects.
  • the proposed solution provides a new way to be aware of one's surroundings by providing notifications with limited distraction in the visual field.
  • the AR viewing arrangement is thus enabled to provide indicators for certain objects and events in the peripheral vision, relying on motion and certain coloring of indicators to signify importance / type of event.
  • the AR viewing arrangement according to the teachings herein is thus also enabled to provide visual AR indicators that respond dynamically to real-time updates in the environment, to ensure that the user is being alerted by relevant and current events.
  • the AR viewing arrangement is thus also enabled to adapt its peripheral vision indicators to the user's gaze, to ensure that the indicators do not intrude on the main field of vision and cause unnecessary distractions or visual impairments.
  • a method for use in an AR viewing arrangement comprising receiving image data and determining a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data.
  • the method further comprises detecting an object in the image data, and determining if the object is in the peripheral field-of-view (PFOV), and if so providing a graphical representation for the object in black and white, being an example of a second format.
  • the method further comprises detecting a change in gaze and adapting or updating the peripheral and main fields of vision accordingly, as well as any graphical representations if needed.
  • a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an AR viewing arrangement enables the AR viewing arrangement to implement a method according to the teachings herein.
  • a software component arrangement comprising: a software component for receiving image data and a software component for determining a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data.
  • the software component arrangement further comprises a software component for detecting an object in the image data, and a software component for determining if the object is in the peripheral field-of-view (PFOV), and a software component for providing a graphical representation for the object in a second format if so.
  • a software component for providing a graphical representation for the object in a first format wherein the first format is different from the second format, if the object is in the main field of view.
  • the software component arrangement further comprises a software component for detecting a change in gaze and adapting or updating the peripheral and main fields of vision accordingly, as well as any graphical representations if needed.
  • the software component arrangement further comprises software component(s) for performing any of the functionalities discussed herein.
  • an arrangement comprising circuitry, wherein the arrangement comprising circuitry comprises: circuitry for receiving image data and circuitry for determining a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data.
  • the arrangement further comprises circuitry for detecting an object in the image data, and circuitry for determining if the object is in the peripheral field-of-view (PFOV), and circuitry for providing a graphical representation for the object in a second format if so.
  • the arrangement further comprises circuitry for detecting a change in gaze and adapting or updating the peripheral and main fields of vision accordingly, as well as any graphical representations if needed.
  • Figure 1A shows a schematic view of an AR viewing arrangement according to an embodiment of the present invention.
  • Figure 1B shows a schematic view of an AR viewing arrangement according to an embodiment of the present invention.
  • Figure 1C shows a schematic view of an AR viewing arrangement according to an embodiment of the present invention.
  • Figures 2A, 2B, 2C, 2D and 2E each show a schematic view of an AR viewing arrangement according to one embodiment of the teachings herein.
  • Figure 3 shows a flowchart of a general method according to an embodiment of the present invention.
  • Figure 4 shows a component view for a software component arrangement according to an embodiment of the teachings herein.
  • Figure 5 shows a component view for an arrangement comprising circuits according to an embodiment of the teachings herein.
  • Figure 6 shows a schematic view of a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an arrangement enables the arrangement to implement an embodiment of the present invention.
  • the teachings herein describe a notification system for an AR device that uses gaze tracking to determine which part of the device's display is in the user's peripheral vision, and then notifies the user of events through graphics that move in the user's periphery. These events are based on real-world information gathered through Object Detection (and/or Object Recognition) that would otherwise be difficult to perceive in the user's peripheral vision. As some or all processing may be performed in a separate device from the actual display device, the device will hereafter be referred to as an arrangement, which may comprise one or more devices or be connected to one or more devices.
  • Augmented Reality is a term for technology that involves combining the real world with virtual environments, enhancing, or changing a user's perception of the real world.
  • Virtual objects are represented by using displays, speakers, haptics, or other mediums, intersecting with the actual physical real-world environment.
  • An example would be a pair of glasses that has an integrated heads-up display that can show for example a virtual model of a building in the place of where another building is already standing.
  • Peripheral vision is the part of human vision that occurs outside a person's point of fixation, i.e., what a person is capable of seeing, outside the main field of vision, without looking directly at it. While the exact definition differs depending on usage, it is commonly referred to as the vision available 30 degrees outside an eye's focus point, the main field of vision being the 30 degrees from the eye's focus point, bounded by the edges of said eye's vision (100-110 degrees on each side horizontally, 65 degrees vertically).
  • a defining feature of peripheral vision is that visual acuity and color perception are greatly reduced when moving out from the center. While color perception is reduced, peripheral vision is capable of motion detection, and in low-light conditions it excels over central vision at detecting light.
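  • Using the commonly cited figures above, a rough angular classification might look like the following sketch. The 30-degree, 100-110-degree, and 65-degree bounds are the approximate values quoted above, not exact physiology, and the function name is hypothetical:

```python
def classify_direction(h_deg, v_deg):
    """Classify a viewing direction, given as horizontal and vertical
    angles (in degrees) from the eye's focus point, using the
    approximate field-of-vision bounds cited above."""
    if abs(h_deg) <= 30 and abs(v_deg) <= 30:
        return "main"        # within ~30 degrees of the focus point
    if abs(h_deg) <= 110 and abs(v_deg) <= 65:
        return "peripheral"  # out to ~100-110 deg horizontally, ~65 deg vertically
    return "outside"         # beyond the edge of vision
```

  • A real arrangement would map display pixels to such angles using the device's optics before applying a test like this.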
  • FIG. 1A shows a schematic view of an AR viewing arrangement 100 according to an embodiment of the present invention.
  • the AR viewing arrangement comprises a controller 101, a memory 102, an image data receiving device 103, such as for example a camera or image sensor, an image streaming device (such as a communication interface) or an image data reading device arranged to read image data from the memory 102.
  • the controller 101 is configured to receive at least one image data file, corresponding to at least an image, from the image data receiving device 103, and to perform object detection (to be understood as being an alternative to or including any of image recognition or image classification or segmentation) on the image data.
  • the image data receiving device 103 may be comprised in the AR viewing arrangement 100 by being housed in a same housing as the AR viewing arrangement, or by being connected to it, by a wired connection or wirelessly.
  • the AR viewing arrangement 100 may comprise a single device or may be distributed across several devices and apparatuses.
  • the controller 101 is also configured to control the overall operation of the AR viewing arrangement 100.
  • the controller 101 is a graphics controller.
  • the controller 101 is a general-purpose controller.
  • the controller 101 is a combination of a graphics controller and a general-purpose controller.
  • the controller comprises several circuits for performing various tasks, processing or sub-processing as discussed herein.
  • the controller comprises circuit 101A for receiving image data and a circuit 101B for processing the image.
  • As a skilled person would understand, there are many alternatives for how to implement a controller, such as using Field-Programmable Gate Array circuits, ASICs, GPUs, etc. in addition or as an alternative. For the purpose of this application, all such possibilities and alternatives will be referred to simply as the controller 101.
  • the memory 102 is configured to store graphics data and computer-readable instructions that when loaded into the controller 101 indicates how the AR viewing arrangement 100 is to be controlled.
  • the memory 102 may comprise several memory units or devices, but they will be perceived as being part of the same overall memory 102. There may be one memory unit for a display arrangement storing graphics data, one memory unit for image capturing device storing settings, one memory for the communications interface (see below) for storing settings, and so on. As a skilled person would understand there are many possibilities of how to select where data should be stored and a general memory 102 for the AR viewing arrangement 100 is therefore seen to comprise any and all such memory units for the purpose of this application.
  • the memory 102 may comprise non-volatile memory circuits, such as EEPROM memory circuits, and/or volatile memory circuits, such as RAM memory circuits. All such alternatives will be referred to simply as the memory 102.
  • the teachings herein find use in AR viewing arrangements in many areas of computer vision, including object detection in mixed or augmented reality systems, image retrieval, industrial use, robotic vision and video surveillance where a basic AR viewing arrangement 100 such as in figure 1A may be utilized.
  • the AR viewing arrangement 100 is a digital camera or other image sensor device (or comprised in such device).
  • the AR viewing arrangement 100 is connected to a digital camera or other image sensor device.
  • Figure 1B shows a schematic view of an AR viewing arrangement being a viewing device 100 according to an embodiment of the present invention.
  • the viewing device 100 is a smartphone or a tablet computer.
  • the viewing device further comprises a display arrangement 110, which may be a touch display, and the image data receiving device 103 may be a series of cameras of the smartphone or tablet computer.
  • the controller 101 is configured to receive an image from the camera (or other image receiving device) 103, detect objects in the image and display the image on the display arrangement 110 along with virtual content indicating or being associated with the detected object(s).
  • the display 110 is seen to comprise a display circuit configured to display graphics as received from and possibly processed by the controller 101.
  • one camera 103A is arranged on a backside (opposite side of the display 110, as is indicated by the dotted contour of the camera 103A) of the AR viewing arrangement 100 for enabling real life objects behind the AR viewing arrangement 100 to be captured and shown to a user (not shown in figure 1B) on the display 110 along with any displayed virtual content.
  • one camera 103B is arranged on a frontside (same side as the display 110) of the AR viewing arrangement 100 for tracking the user's gaze.
  • the displayed virtual content may be information and/or graphics indicating and/or giving information on detected objects.
  • An AR viewing device such as in figure 1B may be worn in a headset whereby it becomes a see-through device as discussed in relation to figure 1C below.
  • Figure 1C shows a schematic view of an AR viewing arrangement being or being part of an optical see-through (OST) viewing device 100 according to an embodiment of the present invention.
  • the viewing device 100 is a see-through device, where a user looks in through one end, and sees the real-life objects in the line of sight at the other end of the viewing device 100.
  • the viewing device 100 is in some embodiments a virtual reality device.
  • the viewing device 100 is a head-mounted viewing device 100 to be worn by a user (not shown explicitly in figure 1C) for looking through the viewing device 100.
  • the viewing device 100 is arranged as glasses, or other eye wear including goggles, to be worn by a user.
  • the viewing device 100 is in some embodiments arranged to be hand-held, whereby a user can hold up the viewing device 100 to look through it.
  • the viewing device 100 is in some embodiments arranged to be mounted on for example a tripod, whereby a user can mount the viewing device 100 in a convenient arrangement for looking through it.
  • the viewing device 100 may be mounted on a dashboard of a car or other vehicle.
  • the viewing device comprises a display arrangement 110 for presenting virtual content to a viewer and an image data receiving device 103 for receiving image data.
  • the image data receiving device 103 may be remote and comprised in the AR viewing arrangement through a connection to the AR viewing arrangement 100.
  • the image data receiving device is one or more cameras 103, of which at least one is arranged to perceive a forward field of view for capturing image data of the real world looked at by the user, and at least one is arranged to capture a rear field of view for capturing an image of the user's eye E to enable tracking of the user's gaze G.
  • the AR viewing arrangement 100 may further comprise a communication interface (not shown explicitly but taken to be part of the controller 101).
  • the communication interface may be wired and/or wireless.
  • the communication interface may comprise several interfaces.
  • the communication interface comprises a USB (Universal Serial Bus) interface. In some embodiments the communication interface comprises a HDMI (High Definition Multimedia Interface) interface. In some embodiments the communication interface comprises a Display Port interface. In some embodiments the communication interface comprises an Ethernet interface. In some embodiments the communication interface comprises a MIPI (Mobile Industry Processor Interface) interface. In some embodiments the communication interface comprises an analog interface, a CAN (Controller Area Network) bus interface, an I2C (Inter-Integrated Circuit) interface, or other interface.
  • the communication interface comprises a radio frequency (RF) communications interface.
  • the communication interface comprises a Bluetooth™ interface, a WiFi™ interface, a ZigBee™ interface, an RFID™ (Radio Frequency IDentifier) interface, a Wireless Display (WiDi) interface, a Miracast interface, and/or other RF interface commonly used for short-range RF communication.
  • the communication interface comprises a cellular communications interface such as a fifth generation (5G) cellular communication interface, an LTE (Long Term Evolution) interface, a GSM (Global System for Mobile communications) interface and/or other interface commonly used for cellular communication.
  • the communications interface is configured to communicate using the UPnP (Universal Plug and Play) protocol.
  • the communications interface is configured to communicate using the DLNA (Digital Living Network Alliance) protocol.
  • the communications interface is configured to enable communication through more than one of the example technologies given above.
  • a wired interface such as MIPI could be used for establishing an interface between the display arrangement, the controller and the user interface
  • a wireless interface for example WiFiTM could be used to enable communication between the AR viewing arrangement 100 and an external host device (not shown).
  • the communications interface may be configured to enable the AR viewing arrangement 100 to communicate with other devices, such as other AR viewing arrangements 100 and/or smartphones, Internet tablets, computer tablets or other computers, media devices, such as television sets, gaming consoles, video viewer or projectors (not shown), or image capturing devices for receiving the image data streams.
  • devices such as other AR viewing arrangements 100 and/or smartphones, Internet tablets, computer tablets or other computers, media devices, such as television sets, gaming consoles, video viewer or projectors (not shown), or image capturing devices for receiving the image data streams.
  • a user interface 104 may be comprised in the AR viewing arrangement 100 (only shown in figure 1B). Additionally or alternatively, (at least a part of) the user interface 104 may be comprised remotely in the AR viewing arrangement 100 through the communication interface, the user interface then (at least a part of it) not being a physical means in the AR viewing arrangement 100, but implemented by receiving user input through a remote device (not shown) through the communication interface.
  • a remote device is a game controller, a mobile phone handset, a tablet computer or a computer.
  • the teachings herein aim to provide visual notifications in a way that allows the user's central vision to remain clear while the AR device is in use, as well as enabling a user to see (visually perceive, as opposed to cognitively perceive) objects in the peripheral vision. Since the notifications follow the user's periphery dynamically, there is no risk of the user finding their vision blocked when they change the direction they look in. When the user looks directly at the object they are being notified of, the notification can seamlessly disappear from the device display. As human color perception deteriorates the further out along the periphery one goes, using motion such as blinking or slight movement can be a good visual indicator. As the eye is also more sensitive to light and light changes, using a starker contrast in the peripheral vision is also a good visual indicator. In some embodiments, the starker contrast is achieved through using black and white, which ensures that a desired contrast is visually perceived by a user regardless of that user's ability to perceive colors in the peripheral field of vision.
  • Figure 2A is a schematic view of an AR viewing arrangement as in any of figures 1A, 1B or 1C.
  • a user is looking at a scene where there are two objects 220A, 220B that are detected by the controller of the AR viewing arrangement 100.
  • the objects 220 are traffic lights, but may be any type of object as would be understood.
  • one object 220A is in the main field of view MFOV (centered along a center of gaze, reference G), whereas the other object 220B is in a peripheral field of view PFOVA.
  • the peripheral vision surrounds the main field of vision and will thus be illustrated as two peripheral fields of vision, one on each side of the main field of view.
  • an implementation may not have such a partition into a first and a second field of view; they may be regarded as one field of view, and no difference will be made between any such peripheral fields of view in the description herein, unless specifically indicated.
  • the AR viewing arrangement 100 is thus configured to determine the main field of view and the peripheral field of view based on the center of the gaze.
  • the gaze G is detected through the rear field of vision as received or captured by the image receiving device 103 or as received by the image receiving circuit.
  • the gaze may be detected in many different and alternative manners, as would be understood by a skilled person, based on image analysis of a rear field of view (towards the user) image capture.
  • the AR viewing arrangement 100 is thus, in some embodiments, further configured to receive image data corresponding to a rear-view of the AR viewing arrangement 100 (through the circuit for receiving image data 101A), to detect an eye E in the image data corresponding to the rear-view and determine a gaze G based on the detected eye E, and to determine the main field-of-view MFOV and the peripheral field-of-view PFOV in the image data based on the gaze G (through the circuit for processing the image 101B).
  • the gaze may be a default gaze as in the center of the AR viewing arrangement 100.
  • the peripheral field of view is determined as the view outside the main field of view, and is in some embodiments the field of view outside a view angle (20, 25, 30 or 35 degrees) from the center of gaze G.
  • the view angle is set by a user through user selection.
  • the view angle is set by the controller through noting successful visual perception by a user. Visual perception can be determined to be successful based on whether a corresponding action is taken, wherein if an action is taken for a same event at one viewing angle, but not at another, the peripheral field of vision can be adapted to ensure successful interaction.
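  • The adaptive behaviour described above could be sketched like so. All names and the 5-degree step are hypothetical; the sketch assumes the controller logs, per recent event, the viewing angle at which the event was shown and whether the user acted on it:

```python
def adapt_view_angle(current_angle, reactions, step=5, min_angle=20, max_angle=35):
    """Adapt the MFOV/PFOV border angle based on perception success.

    `reactions` is a list of (angle_deg, acted_upon) pairs for recent
    events. If events shown just inside the current border are missed,
    shrink the main field so that zone is treated as peripheral (and gets
    the more noticeable second format); if they are all seen, widen it.
    """
    near_border = [acted for angle, acted in reactions
                   if current_angle - step <= angle <= current_angle]
    if near_border and not any(near_border):
        # Events near the border were missed: treat that zone as peripheral.
        return max(min_angle, current_angle - step)
    if near_border and all(near_border):
        # Events near the border were all acted upon: the main field may be wider.
        return min(max_angle, current_angle + step)
    return current_angle
```

  • The 20-35 degree clamp mirrors the example view angles mentioned above (20, 25, 30 or 35 degrees).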
  • the first object 220A, the object in the main field of view, is detected and displayed in a basic or first format. This is illustrated in figure 2A by the visual representation 220RA on the display 110 having the same coloring scheme as the real-life object 220A. That the object is displayed "as is" (which is to be understood as without any visual manipulation such as being overlaid with a virtual representation) is one example of a basic or first format. Other examples include being overlaid with a graphical representation, such as a virtual object, which has not been modified. More examples will be discussed below.
  • a second object 220B is also shown in figure 2A, the object in the peripheral field of view PFOVA, which is also detected, determined to be in the peripheral field of vision and then displayed in an amended or adapted format on the display 110 of the AR viewing arrangement 100.
  • This is illustrated in figure 2A by the visual representation 220RB on the display 110 not having the same coloring scheme as the second real-life object 220B. The object is displayed in an adapted or second format, which is to be understood as being displayed with visual manipulation, such as being overlaid with an adapted virtual representation.
  • a graphical representation such as a virtual object, where colors are exchanged for a grey scale, where colors are exchanged for black or white, where the virtual representation includes a changing (such as blinking or changing color including grey scale) component, and/or where the virtual representation includes a moving (such as shaking) component. More examples will be discussed below.
  • the first format is simply to display the object (or its virtual object) in colors and the second format is to display the virtual object in black and white.
• the first format is simply to display the object (or its virtual object) as the original image of the object.
  • the graphical representation 220RB for the detected object 220B in the peripheral field-of-view comprises a color-scale ranging from full color to black and white, wherein the amount of color is based on a distance from a center (G) of the field-of-view.
  • the graphical representation 220RB for the detected object 220B in the peripheral field-of-view comprises an intensity, wherein the amount of intensity is based on a distance from a center (G) of the field-of-view, wherein the intensity grows with the distance.
• the second format may thus include an intensity which in some embodiments is a brightness level, in some embodiments a contrast level, in some embodiments a special color scale where a more striking color is used further from the center (for example, the further away from the center, the more black the representation will be), in some embodiments a size, in some embodiments a boldness, in some embodiments a speed of blinking, in some embodiments a speed of moving, or any combination of these examples.
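The distance-dependent color scale of the bullets above — full color at the center of the field-of-view, grading toward grey scale with distance — can be sketched as a simple linear blend. The linear interpolation and the function names are assumptions; the luminance weights are the common Rec. 601 coefficients, which the description does not prescribe.

```python
def peripheral_color(rgb, distance, max_distance):
    """Blend a color toward grey scale based on distance from the
    center (G) of the field-of-view.

    At distance 0 the full color is kept; at `max_distance` the color
    has been fully replaced by its grey-scale value. Illustrative only:
    the linear blend is an assumption, not taken from the description.
    """
    r, g, b = rgb
    grey = 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma
    t = min(max(distance / max_distance, 0.0), 1.0)  # 0 = full color, 1 = grey
    blend = lambda c: round((1.0 - t) * c + t * grey)
    return (blend(r), blend(g), blend(b))
```

An analogous ramp could drive any of the other intensity examples listed above (brightness, contrast, blink speed), with the value growing rather than fading as the distance from the center increases.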
  • the graphical representation 220RB for the detected object 220B in the peripheral field-of-view comprises a changing component.
  • the second format may thus include a moving or blinking component as discussed in the above.
• the AR viewing arrangement 100 is configured to display the image of the real-world and to display the graphical representation 220RB over the detected object 220B.
  • the second format may thus include a virtual object over or around the detected object to highlight the detected object.
• the graphical representation 220RB for the detected object 220B in the peripheral field-of-view comprises a graphical object arranged to cover the detected object 220B as displayed (i.e. to cover the image of the object).
  • the second format may thus include a virtual object that covers the object to alter its appearance.
• the graphical representation 220RB for the detected object 220B in the peripheral field-of-view comprises a graphical object arranged to frame the detected object 220B as displayed.
  • the second format may thus include a framing component.
  • any display of an object on a display 110 is providing a graphical representation and does not necessarily mean that a virtual object or other object is provided.
  • the AR viewing arrangement 100 is configured to receive a search query from a user, and in response thereto determine if any, some or all of the detected objects correspond to the search query and if so, provide an identifier for that object.
  • the AR viewing arrangement 100 is configured to do this for all detected objects matching the search query, but in some embodiments the AR viewing arrangement 100 is configured to do this only for the detected objects matching the search query in the peripheral field of view. This may be for all search queries or specific to one search query.
  • the identifier may be provided as a further graphical representation 210.
  • the identifier may be provided as indicating the search query, for example through text.
  • the AR viewing arrangement 100 is thus, in some embodiments, further configured to provide the graphical representation 220RB for the detected object 220B in the peripheral field- of-view comprising an identifier for the detected object, wherein the detected object is a searched-for object.
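The search-query behaviour described above — matching detected objects against a user's query and providing an identifier, optionally only for objects in the peripheral field of view — can be sketched as follows. The dict layout, field names and the case-insensitive substring match are placeholder assumptions for whatever matching the object detector supports.

```python
def find_searched_objects(detected_objects, query, peripheral_only=True):
    """Return identifiers for detected objects matching a search query.

    Illustrative only: `detected_objects` is assumed to be a list of
    dicts with an `id`, a `label` (e.g. from an object classifier) and
    a `fov` field ("main" or "peripheral"). With `peripheral_only`
    set, only objects in the peripheral field-of-view are matched, as
    in the embodiment described above.
    """
    matches = []
    for obj in detected_objects:
        if peripheral_only and obj["fov"] != "peripheral":
            continue  # skip objects in the main field-of-view
        if query.lower() in obj["label"].lower():
            matches.append(obj["id"])
    return matches
```

Each returned identifier could then be rendered as a further graphical representation 210, for example as text indicating the search query.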
  • the AR viewing arrangement 100 is thus configured to receive an image of a real-world view (through the circuit for receiving image data 101A), and determine a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data; detect an object 220A, 220B in the image data; determine if the object 220A, 220B is in the peripheral field-of-view (PFOV), and if so provide a graphical representation 220RB for the object 220B in a second format, and if not provide a graphical representation 220RA for the object 220A in a first format (through the circuit for processing the image 101B), wherein the first format is different from the second format.
  • MFOV main field-of-view
  • PFOV peripheral field-of-view
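The overall per-frame behaviour summarized above — receive an image, detect objects, test each against the peripheral field-of-view, and render in the first or second format accordingly — can be sketched as a small pipeline. All function and parameter names here are stand-ins for the detector, the gaze-based field-of-view test and the two display formats; none of them come from the source.

```python
def process_frame(image, detect_objects, classify_fov,
                  render_first_format, render_second_format):
    """One frame of the AR pipeline sketched by the description.

    Illustrative only: `detect_objects(image)` yields detected objects,
    `classify_fov(obj)` returns "main" or "peripheral", and the two
    render callbacks produce a graphical representation in the basic
    (first) or adapted (second) format.
    """
    representations = []
    for obj in detect_objects(image):
        if classify_fov(obj) == "peripheral":
            representations.append(render_second_format(obj))  # adapted format
        else:
            representations.append(render_first_format(obj))   # basic format
    return representations
```

In the arrangement described, the detection and classification would run in the circuit for processing the image 101B, with the resulting representations shown on the display 110.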
• Figure 2B shows a schematic view of an AR viewing arrangement 100 as in any of figures 1A, 1B, 1C and/or 2A wherein a change is detected in any, some or all real-life objects 220.
  • the change may be any type of change, and in some embodiments, the AR viewing arrangement 100 is further configured to classify the change as an event for the object 220.
  • the AR viewing arrangement 100 is configured to provide a notification of the event (or change).
  • the event or change is that the traffic light changes to indicate STOP - i.e. red light.
  • the AR viewing arrangement 100 is then, in some embodiments configured to provide the graphical representation 220R to indicate the change or event.
• the event for the first object 220A in the main field of view is indicated in the first format, which may be to show the object "as is" or by providing a graphical representation 220RA that highlights the event.
  • the graphical representation 220RA that highlights the event for the first object 220A is provided in the first format.
  • the graphical representation 220RA that highlights the event will be provided in an alternative first format as compared to before the event.
  • the event for the second object 220B in the peripheral field of view is indicated in the second format, which may be to provide an altered graphical representation 220RB that highlights the event or to provide a further graphical representation 210 that highlights the event.
  • the graphical representation 220RB or the further graphical representation 210 that highlights the event for the second object 220B is provided in the second format.
  • the graphical representation 220RB that highlights the event will be provided in an alternative second format as compared to before the event.
  • the graphical representation 220RB of the second object 220B is provided with the red light lamp indicated in a starker contrast, or alternatively or additionally, by a further graphical representation 210 which is also in a starker contrast.
• an overlaid further representation 210 is provided in order to increase the chances of visual perception, especially for embodiments where the second format includes a blinking or otherwise changing component, wherein the underlying graphical representation 220RB will still be intermittently visible.
  • the (overlaid) further graphical representation is, in some embodiments, only partially overlaying the graphical representation 220RB.
  • the further graphical representation 210 overlays the graphical representation completely or almost completely.
• the AR viewing arrangement 100 is thus, in some embodiments, further configured to detect an event for the detected object 220A, 220B and provide the graphical representation 220RA, 220RB to indicate the event (through the circuit for image processing 101B). In some embodiments, the AR viewing arrangement 100 is further configured to detect an event for the detected object 220B in the peripheral field-of-view and provide a further graphical representation 210 for the event (through the circuit for image processing 101B).
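The event-handling behaviour around figure 2B — an event for a main-field object indicated in (an alternative) first format, an event for a peripheral object indicated in an alternative second format, optionally with a further overlay 210 — can be sketched as a small decision rule. The format labels and dict fields are illustrative names, not from the source.

```python
def represent_event(obj):
    """Choose the representation for a detected object, taking events
    (e.g. a traffic light changing to red) into account.

    Illustrative only: `obj` is assumed to be a dict with a `fov`
    field ("main" or "peripheral") and an optional boolean `event`
    flag set when a change has been classified as an event.
    """
    if not obj.get("event"):
        # no event: basic format in the main field, adapted format otherwise
        return "first" if obj["fov"] == "main" else "second"
    if obj["fov"] == "main":
        return "first-alt"              # alternative first format highlighting the event
    return "second-alt+overlay210"      # starker-contrast second format plus overlay 210
```

The overlay branch corresponds to the embodiments above where a further graphical representation 210 partially or completely overlays the graphical representation 220RB to increase the chance of visual perception.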
• Figure 2C shows a schematic view of an AR viewing arrangement 100 as in any of figures 1A, 1B, 1C, 2A and/or 2B wherein a movement M is detected in any, some or all real-life objects 220 which results in the object moving from the peripheral field of view to the main field of view (or vice-versa).
  • the AR viewing arrangement 100 is configured to provide the object for which the movement is detected accordingly.
  • the second object 220B moves from the peripheral field of view to the main field of view and the graphical representation 220RB is provided in the first format for the second object 220B, as opposed to in the second format as before the movement.
• correspondingly, if an object moves from the main field of view to the peripheral field of view, the graphical representation will be provided in the second format instead of the first format.
• Figures 2D and 2E each shows a schematic view of an AR viewing arrangement 100 as in any of figures 1A, 1B, 1C, 2A, 2B and/or 2C wherein a movement of the eye, and therefore a change in the gaze G, is detected.
• in figure 2D the user is looking straight ahead, whereas in figure 2E the user's eyes have moved and the gaze is now steered to the left.
  • the AR viewing arrangement 100 is configured to detect such a change in gaze and adapt or update the field of views accordingly.
  • the AR viewing arrangement 100 is thus configured to follow a user's gaze and to update the display accordingly.
  • the gaze has changed so that the first object 220A previously in the main field of view MFOV ends up in the peripheral field of view PFOVB and the second object 220B previously in the peripheral field of view PFOVA ends up in the main field of view MFOV.
• the AR viewing arrangement 100 is configured to update the graphical representations accordingly, whereby the graphical representation 220RA for the first object 220A is updated to the second format and the graphical representation 220RB for the second object 220B is updated to the first format.
• the AR viewing arrangement 100 is thus, in some embodiments, configured to detect that the gaze G has moved and in response thereto determine an updated main field of view and peripheral field of view; determine if any detected object 220B in the peripheral field of view falls within the updated main field of view, and in response thereto provide the graphical representation 220RB in the first format; and determine if any detected object 220A in the main field of view falls within the updated peripheral field of view, and in response thereto provide the graphical representation 220RA in the second format.
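The gaze-change update around figures 2D/2E — re-determining the fields of view after the gaze moves and swapping each object's format accordingly — can be sketched as follows. The `classify_fov(obj, gaze)` callback stands in for the angular field-of-view test; the dict fields and format labels are illustrative assumptions.

```python
def update_on_gaze_change(objects, new_gaze, classify_fov):
    """Re-classify detected objects after the gaze G has moved.

    Illustrative only: an object that falls within the updated main
    field-of-view gets the first format, and one that falls within the
    updated peripheral field-of-view gets the second format, as in the
    embodiment described above.
    """
    for obj in objects:
        fov = classify_fov(obj, new_gaze)  # "main" or "peripheral" under the new gaze
        obj["fov"] = fov
        obj["format"] = "first" if fov == "main" else "second"
    return objects
```

Running this on every detected gaze change keeps the display consistent with the user's current main and peripheral fields of view, covering both the gaze movement of figures 2D/2E and the object movement M of figure 2C.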
  • Figure 3 shows a flowchart of a general method according to an embodiment of the teachings herein.
  • the method utilizes an AR viewing arrangement 100 as taught herein.
  • the method comprises receiving 310 image data and determining 320 a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data.
  • the method further comprises detecting 330 an object 220A, 220B in the image data, and determining 340 if the object 220A, 220B is in the peripheral field-of-view (PFOV), and if so providing 350 a graphical representation 220RB for the object 220B in black and white, being an example of a second format.
  • MFOV main field-of-view
  • PFOV peripheral field-of-view
• the peripheral and main fields of view are determined based on a detected gaze, and, in some embodiments, the method further comprises detecting 360 a change in gaze and adapting or updating the peripheral and main fields of view accordingly, as well as any graphical representations if needed.
  • Figure 4 shows a schematic view of a computer-readable medium 120 carrying computer instructions 121 that when loaded into and executed by a controller of an AR viewing arrangement 100 enables the AR viewing arrangement 100 to implement the present invention.
  • the computer-readable medium 120 may be tangible such as a hard drive or a flash memory, for example a USB memory stick or a cloud server.
• the computer-readable medium 120 may be intangible such as a signal carrying the computer instructions enabling the computer instructions to be downloaded through a network connection, such as an internet connection.
  • a computer-readable medium 120 is shown as being a computer disc 120 carrying computer-readable computer instructions 121, being inserted in a computer disc reader 122.
  • the computer disc reader 122 may be part of a cloud server 123 - or other server - or the computer disc reader may be connected to a cloud server 123 - or other server.
  • the cloud server 123 may be part of the internet or at least connected to the internet.
  • the cloud server 123 may alternatively be connected through a proprietary or dedicated connection.
• the computer instructions may be stored at a remote server 123 and downloaded to the memory 102 of the AR viewing arrangement 100 for being executed by the controller 101.
  • the computer disc reader 122 may also or alternatively be connected to (or possibly inserted into) an AR viewing arrangement 100 for transferring the computer-readable computer instructions 121 to a controller of the AR viewing arrangement (presumably via a memory of the AR viewing arrangement 100).
• Figure 4 shows both the situation when an AR viewing arrangement 100 receives the computer-readable computer instructions 121 via a wireless server connection (non-tangible) and the situation when another AR viewing arrangement 100 receives the computer-readable computer instructions 121 through a wired interface (tangible). This enables the computer-readable computer instructions 121 to be downloaded into an AR viewing arrangement 100, thereby enabling the AR viewing arrangement 100 to operate according to and implement the invention as disclosed herein.
  • Figure 5 shows a component view for a software component (or module) arrangement 500 according to an embodiment of the teachings herein.
  • the software component arrangement 500 is adapted to be used in an AR viewing arrangement 100 as taught herein.
  • the software component arrangement 500 comprises a software component for receiving 510 image data and a software component for determining 520 a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data.
  • the software component arrangement 500 further comprises a software component for detecting 530 an object 220A, 220B in the image data, and a software component for determining 540 if the object 220A, 220B is in the peripheral field-of-view (PFOV), and a software component for providing 550 a graphical representation 220RB for the object 220B in black and white (being an example of a second format) if so.
  • MFOV main field-of-view
  • PFOV peripheral field-of-view
• the peripheral and main fields of view are determined based on a detected gaze, and, in some embodiments, the software component arrangement 500 further comprises a software component for detecting 560 a change in gaze and adapting or updating the peripheral and main fields of view accordingly, as well as any graphical representations if needed.
  • the software component arrangement 500 further comprises software component(s) for performing any of the functionalities discussed herein.
  • Figure 6 shows a component view for an arrangement comprising circuitry 600 according to an embodiment of the teachings herein.
  • the arrangement comprising circuitry 600 is adapted to be used in an AR viewing arrangement 100 as taught herein.
  • the arrangement comprising circuitry 600 of figure 6 comprises circuitry for receiving 610 image data and circuitry for determining 620 a main field-of-view (MFOV) and a peripheral field-of-view (PFOV) in the image data.
  • the arrangement 600 further comprises circuitry for detecting 630 an object 220A, 220B in the image data, and circuitry for determining 640 if the object 220A, 220B is in the peripheral field-of-view (PFOV), and circuitry for providing 650 a graphical representation 220RB for the object 220B in black and white (being an example of a second format) if so.
  • MFOV main field-of-view
  • PFOV peripheral field-of-view
• in some embodiments, the arrangement 600 further comprises circuitry for providing a graphical representation 220RA for the object 220A in a first format, wherein the first format is different from the second format, if the object 220A, 220B is in the main field of view.
• the peripheral and main fields of view are determined based on a detected gaze, and, in some embodiments, the arrangement 600 further comprises circuitry for detecting 660 a change in gaze and adapting or updating the peripheral and main fields of view accordingly, as well as any graphical representations if needed.
  • the arrangement 600 further comprises circuitry(ies) for performing any of the functionalities discussed herein.

Abstract

An AR viewing arrangement (100) is disclosed for detecting a user's gaze and for adapting displayed objects based on whether they are in the main field-of-view or the peripheral field-of-view.
PCT/EP2022/078090 2022-10-10 2022-10-10 Computer software module arrangement, circuitry arrangement, arrangement and method for improved human perception in XR systems WO2024078686A1 (en)

Publications (1)

Publication Number Publication Date
WO2024078686A1 2024-04-18


