WO2019239161A1 - System and method for modifying image enhancement parameters for a portable display - Google Patents


Info

Publication number
WO2019239161A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
computing device
portable computing
wearable device
display
Prior art date
Application number
PCT/GB2019/051687
Other languages
French (fr)
Inventor
Stephen Lloyd Frederick HICKS
Noah Aaron RUSSELL
Original Assignee
Oxsight Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB1809904.4A external-priority patent/GB201809904D0/en
Priority claimed from GBGB1809905.1A external-priority patent/GB201809905D0/en
Priority claimed from GBGB1819145.2A external-priority patent/GB201819145D0/en
Priority claimed from GBGB1819144.5A external-priority patent/GB201819144D0/en
Application filed by Oxsight Ltd filed Critical Oxsight Ltd
Priority to CN201980046956.6A priority Critical patent/CN112424729A/en
Priority to GB2020074.7A priority patent/GB2589255A/en
Publication of WO2019239161A1 publication Critical patent/WO2019239161A1/en

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39Control of the bit-mapped memory
    • G09G5/391Resolution modifying circuits, e.g. variable screen formats
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0112Head-up displays characterised by optical features comprising device for generating colour display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0606Manual adjustment
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/066Adjustment of display parameters for control of contrast
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0666Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0673Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/08Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Definitions

  • the present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.
  • The disclosure relates to image enhancement techniques for improving vision in sight-impaired individuals.
  • Such techniques include, but are not limited to: video passthrough of a colour/RGB image; edge detection and presentation of these edges as white on a black background; application of white edges on top of a colour or grayscale image; presentation of a black and white high-contrast image with a global threshold that applies to the entire screen; presentation of a black and white high-contrast image with multiple regional thresholds to compensate for lighting changes across a screen; and an algorithm, for instance, to detect large regions of similar hues (regardless of brightness) and then re-draw these regions as high-brightness swatches of the same colour, to aid low vision.
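The disclosure names these enhancement modes but gives no implementation. A minimal numpy-only sketch of three of them (edges-as-white-on-black, a global threshold, and block-wise regional thresholds); the gradient-magnitude edge detector, block size, and threshold fractions are illustrative assumptions, not the system's actual algorithms:

```python
import numpy as np

def edge_white_on_black(img, thresh=0.2):
    """Approximate edge detection: gradient magnitude, drawn as white edges on black."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return (mag > thresh * mag.max()).astype(np.uint8) * 255

def global_threshold(img):
    """Black-and-white high-contrast image with one threshold for the entire screen."""
    t = img.mean()
    return np.where(img > t, 255, 0).astype(np.uint8)

def regional_threshold(img, block=8):
    """Multiple regional thresholds to compensate for lighting changes across the screen."""
    out = np.zeros_like(img, dtype=np.uint8)
    h, w = img.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = img[y:y + block, x:x + block]
            out[y:y + block, x:x + block] = np.where(patch > patch.mean(), 255, 0)
    return out

# synthetic scene: bright square on a background whose lighting varies left to right
img = np.tile(np.linspace(50, 200, 64), (64, 1))
img[20:40, 20:40] = 255
print(edge_white_on_black(img).max())  # → 255 (white edge pixels found)
```

The regional variant keeps detail in both the dark and bright halves of the gradient background, which is exactly the lighting-compensation benefit the passage describes.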
  • the present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.
  • ‘Smart glass’ or ‘smart glasses’ refers to wearable computer glasses (or ‘spectacles’) that provide visual information in a user’s field of view in addition to that which the user is able to view substantially directly (either with or without an intermediate optical element such as a lens, which may be substantially transparent or partially transparent).
  • the provision of visual information in addition to that which the user views substantially directly may be by superimposing information onto the user’s field of view, for example on a display element in the user’s field of view (which display element may be substantially transparent, at least partially transparent or opaque).
  • the display may be a substantially opaque LED or LCD display of the type used in mobile telephones, or may be partially transparent.
  • an image may be projected onto the display from a light source in the form of a projector device, for example of the type used in head-up displays (HUDs) or augmented reality (AR) overlays, with the reflected image viewed by the user.
  • a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition, comprising:
  • a portable computing device comprising a motion sensor
  • a smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user;
  • the portable computing device is operatively coupled with the smart glass based wearable device
  • system being configured to display on the display portion an image corresponding to at least a portion of an image captured by the image capture device
  • system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein one or more control parameters of the image that is displayed on the wearable device are modified based on the rotational movement of the portable computing device.
  • the user may endeavour to optimise the scene as viewed using the smart glass based wearable device.
  • the system may be configured wherein the user may adjust the parameters in substantially real time.
  • the image capture device may comprise a video image capture device.
  • the image capture device may comprise at least one CMOS image capture device and/or at least one CCD image capture device. Other image capture devices may be useful.
  • the wearable device may generate a substantially real time stream of images captured by the image capture device.
  • the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.
  • reference to the “field of view of a person wearing the wearable device” is to be understood to be with respect to a person wearing the wearable device such that the display portion is in their field of view, optionally their field of view when looking substantially directly ahead, optionally their field of view with their eyes directed in a prescribed direction, the direction being any prescribed direction from directly upwards (a ‘12 o’clock’ direction), through directly downwards (a ‘6 o’clock’ direction), or any prescribed direction from 12 o’clock clockwise around to 12 o’clock.
  • said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
  • said device comprises a selection interface that allows the user of said wearable device to select, from said one or more control parameters, a set of control parameters that need to be modified for said image.
  • the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.
  • the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.
  • the system is configured to transmit information indicative of the linear acceleration of the portable computing device relative to gravity to the wearable device.
  • the motion sensor comprises a magnetometer that determines information indicative of instantaneous orientation of the portable computing device relative to Earth’s magnetic field.
  • the system is configured to transmit the information indicative of the instantaneous orientation of the portable computing device relative to Earth’s magnetic field to the wearable device.
  • the system is configured wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.
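The disclosure does not specify how the gyroscope, accelerometer, and magnetometer outputs are fused. A common lightweight approach is a complementary filter; a one-axis (roll) sketch in which the blend factor `alpha`, sample period, and gravity value are illustrative assumptions:

```python
import math

def accel_tilt(ay, az):
    """Roll angle implied by the gravity vector, in radians (noisy but drift-free)."""
    return math.atan2(ay, az)

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (fast, but drifting) with the
    accelerometer's absolute tilt (noisy, but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# device held still and level: gyro reports no rotation, accelerometer sees
# gravity on z; a stale (drifted) estimate is pulled back toward level
angle = 0.5  # radians of accumulated drift
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=accel_tilt(0.0, 9.81),
                                 dt=0.01)
print(round(angle, 3))  # → 0.009 (estimate has decayed toward 0.0)
```

A magnetometer would be fused the same way for the yaw axis, where the accelerometer gives no information; full 3D orientation fusion typically uses a quaternion filter (e.g. Madgwick or Mahony) built on the same principle.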
  • the portable computing device is actuated by pressing a button present in/on said computing device, wherein the one or more control parameters of the image are modified only while the button is kept pressed.
  • the extent of changes in said one or more control parameters is proportional to the extent of hand rotation.
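This proportional (relative) mapping can be sketched as follows; the 90-degree full-range sweep and the 0..1 parameter scale are illustrative assumptions, not values from the disclosure:

```python
def apply_relative_rotation(param, rotation_deg, degrees_full_range=90.0,
                            lo=0.0, hi=1.0):
    """Change a control parameter in proportion to hand rotation.

    With the assumed range, a 90-degree wrist turn sweeps the full
    parameter range; the result is clamped to [lo, hi].
    """
    return min(hi, max(lo, param + rotation_deg / degrees_full_range))

contrast = 0.5
contrast = apply_relative_rotation(contrast, 45)  # quarter turn clockwise
print(contrast)  # → 1.0
```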
  • the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor.
  • when the control parameter modification operation is paused to generate a modified image, the user of said wearable device is enabled to view said modified image, pan around said modified image in the X and Y axes, and/or scroll through said modified image.
  • an absolute position of said portable computing device is configured to be indicative of the level of a control parameter.
  • the system determines the absolute position of said portable computing device and sets the level of the control parameter in dependence on the absolute position.
  • This feature has the advantage that a user may substantially instantly set the level of the control parameter by first setting their hand in the orientation corresponding to the desired value of the control parameter and then pressing the button.
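The absolute mode described above maps device orientation directly to a parameter level rather than accumulating deltas. A sketch, assuming an illustrative ±90-degree hand-orientation range mapped to a 0..1 level:

```python
def level_from_absolute_orientation(angle_deg, min_deg=-90.0, max_deg=90.0):
    """Map the device's absolute orientation directly to a parameter level.

    Holding the hand at min_deg gives level 0.0 and at max_deg gives 1.0,
    so pressing the button with the hand pre-positioned sets the level
    instantly, with no incremental turning required.
    """
    span = max_deg - min_deg
    return min(1.0, max(0.0, (angle_deg - min_deg) / span))

print(level_from_absolute_orientation(0.0))  # → 0.5 at the neutral position
```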
  • the motion sensor comprises an inertial measurement unit (IMU).
  • the motion sensor is an inertial measurement unit (IMU).
  • the wearable device comprises the image capture device.
  • the image capture device may be an integral part of the wearable device.
  • the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.
  • the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.
  • the method comprising modifying one or more control parameters of the image displayed on the display portion based on the rotational movement of the portable computing device.
  • the method comprises capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.
  • the person wearing the wearable device will see the image captured by the image capture device within their field of view.
  • the image displayed by the display portion occupies a portion and not the whole of the field of view of the user wherein the image displayed is substantially continuous with a remainder of a field of view of the user such that the image displayed appears to be superimposed upon the scene.
  • the display portion may be at least partially transparent, allowing the user to see objects in the portion of the field of view occupied by the display portion through the display portion as well as information displayed on the display portion by the system.
  • the method further comprises the step of:
  • the one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
  • a portable computing device for use with a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition
  • the portable computing device comprising a motion sensor; the portable computing device being arranged to be operatively coupled with the smart glass based wearable device, the smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user, the smart glass system further comprising an image capture device, the system being configured to display on the display portion of the wearable device an image corresponding to at least a portion of an image captured by the image capture device, wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein one or more control parameters of the image that is displayed on the wearable device are modified based on the rotational movement of the portable computing device.
  • a smart glass based wearable device arranged to be operatively coupled with the portable computing device of the preceding aspect, the system being arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition.
  • the present disclosure relates to a portable computing device that is operatively coupled with a smart glass based wearable device, wherein the portable computing device can include an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of the portable computing device, orientation of the portable computing device can be determined, and upon hand rotation of the portable computing device, one or more control parameters of an image that is displayed on the wearable device can be modified based on rotational velocity computed using the gyroscope.
  • the one or more control parameters can be selected from any or a combination of passthrough of the image, colour or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, colour enhancement of the image, line thickness in the image, enhancement of text that forms part of the image, lighting that forms part of the image, and white:black ratio in the image.
  • the device can include a selection interface that allows the user of the wearable device to select, from said one or more control parameters, a set of control parameters that need to be modified for the image.
  • the IMU can further include an accelerometer that determines and transmits the magnitude of linear acceleration of the portable computing device relative to gravity.
  • the IMU can further include a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field.
  • respective outputs from the gyroscope, the accelerometer, and the magnetometer can be fused to yield the orientation and motion of the portable computing device in any direction.
  • the portable computing device can be actuated by pressing a button present in/on said computing device, wherein the one or more control parameters of the image are modified only while the button is kept pressed.
  • the extent of hand rotation is proportional to the extent of changes in said one or more control parameters.
  • the orientation of the portable computing device can be determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.
  • when the control parameter modification operation is paused to generate a modified image, the user of said wearable device is enabled to view said modified image, pan around said modified image in the X and Y axes, and/or scroll through said modified image.
  • an absolute position of said portable computing device is configured to be indicative of the level of a control parameter.
  • the present disclosure relates to a method of modifying, by a portable computing device, one or more control parameters of an image that is displayed in a smart glass based wearable device, said method comprising the steps of: receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of the extent of hand rotation of said portable computing device, said hand rotation being mapped to one or more control parameters; determining, at the portable computing device, using an accelerometer configured in the portable computing device, the orientation of the portable computing device; and generating, from the portable computing device, an image modification signal to said smart glass based wearable device based on the change in gyroscope value and the determined orientation, wherein said image is modified with respect to said one or more control parameters based on said image modification signal.
  • the method can further include the steps of: receiving new gyroscope values as part of the change in gyroscope value; smoothing, using a sliding window filter, the received new gyroscope values; normalizing said smoothed gyroscope values; and accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.
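The smooth/normalize/accumulate pipeline above can be sketched as follows, with the sliding window implemented as a moving average; the window size, gyroscope full-scale value, and sample period are illustrative assumptions, not values from the disclosure:

```python
from collections import deque

class RotationAccumulator:
    """Smooth raw gyroscope readings with a sliding-window (moving-average)
    filter, normalize them against an assumed full-scale rate, and accumulate
    the result as the total extent of hand rotation."""

    def __init__(self, window=5, full_scale=250.0):
        self.window = deque(maxlen=window)  # sliding window of recent readings
        self.full_scale = full_scale        # assumed gyro full-scale, deg/s
        self.rotation = 0.0                 # accumulated extent of rotation

    def update(self, gyro_value, dt=0.01):
        self.window.append(gyro_value)
        smoothed = sum(self.window) / len(self.window)  # sliding-window filter
        normalized = smoothed / self.full_scale         # roughly into [-1, 1]
        self.rotation += normalized * dt                # accumulate over time
        return self.rotation

acc = RotationAccumulator()
for v in [100.0] * 50:          # steady 100 deg/s wrist turn for 0.5 s
    total = acc.update(v)
print(round(total, 3))  # → 0.2
```

The moving average rejects the jitter of a hand-held sensor before integration, so the accumulated value tracks deliberate rotation rather than tremor.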
  • FIGs. 1 and 2 illustrate exemplary representations of the proposed device in accordance with an embodiment of the present disclosure.
  • FIGs. 3A-3E illustrate exemplary representations showing how the proposed device can be used for controlling at least one parameter of imaging/image enhancement techniques.
  • FIGs. 4A-4E illustrate exemplary flow diagrams to enable tuning of different attributes of captured videos/images in accordance with an embodiment of the present disclosure.
  • Embodiments of the present disclosure include various steps, which will be described below.
  • the steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps.
  • steps may be performed by a combination of hardware, software, firmware and/or by human operators.
  • Embodiments of the present disclosure may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process.
  • the machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories such as read-only memories (ROMs), random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
  • Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present disclosure with appropriate standard computer hardware to execute the code contained therein.
  • An apparatus for practicing various embodiments of the present disclosure may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the disclosure could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
  • the present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.
  • the present disclosure pertains to a real-time image processing system that is designed to improve vision for people who are severely sight impaired.
  • the proposed system can include a video input mechanism that can be a wired or a wireless camera, or can include an externally streamed video or a video that is a file on the device, wherein the video input can be presented to the user via a head-mounted screen such as an augmented or virtual reality transparent display of, for instance, a smart glass.
  • a smart glass based wearable device 160 has a display screen 162 and an image capture device in the form of a video camera 163.
  • FIG. 1(b) shows a corresponding embodiment 10A in which the image capture device is not provided integral to the wearable device 160.
  • the image capture device may be coupled to the device 160 via a wireless connection.
  • the image capture device may be coupled by means of a wired connection in addition or instead.
  • the display screen 162 is a transparent waveguide with diffractive optics arranged to direct an image from an organic light emitting diode (OLED) micro display, into the user’s eye.
  • Other arrangements may be useful such as a transparent waveguide with a beamsplitter instead of diffractive optics.
  • Other displays may be useful such as liquid crystal on silicon (LCOS) displays or liquid crystal displays (LCDs).
  • an opaque display including a high resolution OLED panel and one or more optical elements such as a biconvex or Fresnel lens arrangement may be employed to direct the image into the user’s eye.
  • the video camera 163 is a CMOS (complementary metal oxide semiconductor) camera but other cameras may be useful in some embodiments.
  • the present disclosure relates to a physical device that can be given to a user to modify primary control parameter(s) of each of the existing methods/techniques mentioned above.
  • the proposed device can be configured to receive one parameter from each of the above-mentioned techniques (A-F), and make the received respective parameter as adjustable.
  • the proposed device can be configured as a portable gesture device and can enable provision of an intuitive control that mimics other well-known control mechanisms such as a volume knob on an audio device.
  • the proposed device can rapidly and intuitively change parameters for one or more existing image enhancement techniques over a wide range, due to the relationship between movement of the device and the rate of parameter change.
  • the proposed device can be activated by a button press, wherein at the first press, the orientation of the device can be calculated.
  • orientation of the device can be determined based on fusion of positional data from components of an inertial measurement device (IMU) that comprises an accelerometer configured to determine and transmit real-time values of the device's position relative to gravity.
  • the accelerometer can also be configured to transmit magnitude of any linear acceleration in three dimensions.
  • IMU of the present disclosure can further include a gyroscope that indicates instantaneous rotational velocity in three dimensions.
  • An optional magnetometer can also be configured in the IMU and configured to give instantaneous orientation of the proposed device/handset relative to Earth’s magnetic field (i.e. a compass).
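The fusion of gyroscope and accelerometer data into a device orientation can be illustrated with a simple complementary filter: the gyroscope integrates rotational velocity (smooth but drifting), while the accelerometer gives an absolute roll relative to gravity (noisy but drift-free). This is a sketch of one common fusion approach, not the specific method of the disclosure; all function and parameter names are illustrative.

```python
import math

def complementary_roll(prev_roll_rad, gyro_x_rad_s, accel_y_g, accel_z_g,
                       dt_s, alpha=0.98):
    """Blend a gyroscope-integrated roll with an accelerometer-derived
    roll to estimate the handset's orientation about its long axis."""
    gyro_roll = prev_roll_rad + gyro_x_rad_s * dt_s   # integrate rotational velocity
    accel_roll = math.atan2(accel_y_g, accel_z_g)     # roll relative to gravity
    return alpha * gyro_roll + (1.0 - alpha) * accel_roll
```

Calling this once per sensor sample keeps the roll estimate anchored to gravity while remaining responsive to fast wrist rotations.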
  • the initial orientation of the handset is set as zero. Any rotation about a defined axis of the handset can be interpreted as an increase or a decrease of the primary control parameter.
  • the axis of rotation can be defined to be along the length of the device, which is the same axis as the wrist.
  • a clockwise roll can increase the white:black ratio on a high-contrast display.
  • An anticlockwise roll can decrease the white:black ratio on a high-contrast display.
  • an anticlockwise roll can increase the white:black ratio on a high-contrast display and a clockwise roll can decrease the white:black ratio on a high-contrast display.
  • the user can be given the ability to watch the parameter change in real-time as they rotate the proposed device/handset, which creates an intuitive feedback system, allowing the user to be very specific with their modifications to the image.
  • the video can continue to be passed in real-time to the display.
  • the modified imaging parameter upon release of the button, can be set.
  • the modified imaging parameter can be set as the new state of the system, becoming the state in which the system continues to operate until the state is changed.
  • the state of contrast may revert to a default state upon release of the button.
  • the state of contrast may revert to a default state after a predetermined time period has elapsed. Other arrangements may be useful.
  • a user may select how the system behaves when the button is released, for example whether the changed parameter, such as the instant black:white contrast setting, is maintained, or whether the system reverts to a default value of the parameter.
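The press-and-rotate interaction described above (orientation set to zero on the first press, the parameter adjusted while the button is held, and the value either kept as the new state or reverted on release) can be sketched as a small state machine. All names, gains, and defaults here are assumptions for illustration, not taken from the disclosure.

```python
class TuningControl:
    """Sketch of the button-gated roll-to-parameter interaction."""
    def __init__(self, default=0.5, gain=0.01, keep_on_release=True):
        self.value = default
        self.default = default
        self.gain = gain                  # parameter change per degree of roll
        self.keep_on_release = keep_on_release
        self._zero = None                 # roll reference; None = button up

    def press(self, roll_deg):
        self._zero = roll_deg             # first press: set-zero orientation

    def rotate(self, roll_deg):
        if self._zero is None:
            return self.value             # rotation ignored unless button held
        delta = roll_deg - self._zero     # clockwise roll -> increase
        self.value = min(1.0, max(0.0, self.default + delta * self.gain))
        return self.value

    def release(self):
        if self.keep_on_release:
            self.default = self.value     # becomes the new operating state
        else:
            self.value = self.default     # revert to the default state
        self._zero = None
        return self.value
```

The `keep_on_release` flag models the user-selectable release behaviour: maintain the changed parameter, or revert to the default.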
  • each imaging/image enhancement technique can have at least one parameter that can be modified by the proposed tuning device.
  • the parameter can include the tuning that increases or decreases general image brightness.
  • the parameter can include the tuning that modifies the threshold for edge detection. Decreasing this threshold increases the number of edges displayed. Increasing this threshold decreases the number of edges displayed.
  • the same parameter as for white edges on black technique can be used except that the edges are displayed on a live video (colour or grayscale).
  • the parameter can include tuning that moves the black:white threshold towards the white or towards the black.
  • the parameter can include tuning that modifies the erode and dilate parameters, thereby increasing line thickness (a process called "erode") or decreasing line thickness (a process called "dilate").
  • the parameter can include tuning that rotates the detection window through the colour spectrum, allowing for specific colours to be saturated.
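Two of the tunable parameters listed above, a global brightness offset and a black:white threshold, can be sketched in a few lines. The code operates on a flat list of 8-bit pixel intensities for simplicity; the function name and defaults are illustrative, not from the disclosure.

```python
def enhance(pixels, brightness=0, bw_threshold=None):
    """Apply a brightness offset and, optionally, a global black:white
    threshold to a flat list of 8-bit pixel intensities (0-255)."""
    # brightness tuning: shift every pixel, clamped to the valid range
    out = [min(255, max(0, p + brightness)) for p in pixels]
    if bw_threshold is not None:
        # moving the threshold towards black increases the amount of white
        out = [255 if p >= bw_threshold else 0 for p in out]
    return out
```

Rotating the handset would map onto changing `brightness` or `bw_threshold` frame by frame while the live video continues to pass to the display.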
  • Contrast is an important parameter in assessing vision clinically.
  • Clinical visual acuity measurements typically use high contrast images such as black letters on a white background. In reality, contrast between objects and their surroundings varies. The relationship between visual acuity and contrast allows a more detailed understanding of visual perception.
  • the resolving power of the eye may be measured by means of sinusoidal grating patterns having adjustable spacing (spatial periodicity).
  • the contrast of the grating is the differential intensity threshold of the grating, which is defined as the ratio:
  • C = (Lmax - Lmin) / (Lmax + Lmin)
  • where L is the luminance of the grating pattern as a function of spatial distance in a direction normal to the orientation of the parallel elements of the grating, and C may be referred to as the modulation, Rayleigh, or Michelson contrast.
  • C can have a value between 0.0 and 1.0. Further details may be found in "Visual Acuity" by Michael Kalloniatis and Charles Luu, available at https://webvision.med.utah.edu/book/part-viii-psychophysics-of-vision/visual-acuity/, portions of which are discussed below.
  • the sensitivity of a person's eyes to contrast can be measured by determining the minimum grating spacing that each eye can resolve as a function of image contrast. This may be done, for example, by lowering the contrast for a given spatial frequency until the person can no longer detect the grating - this value is the 'contrast threshold' for that grating size (spatial frequency). The reciprocal of this contrast threshold is called the 'contrast sensitivity'.
  • contrast sensitivity = 1/C, where C is the threshold value of modulation contrast (described above).
  • a plot of (contrast) sensitivity versus spatial frequency is called the spatial contrast sensitivity function (referred to as the SCSF or simply CSF).
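The contrast definitions above can be captured directly in code. This is a sketch of the Michelson contrast formula and of contrast sensitivity as the reciprocal of the contrast threshold; the function names are illustrative.

```python
def michelson_contrast(l_max, l_min):
    """C = (Lmax - Lmin) / (Lmax + Lmin); ranges from 0.0 to 1.0."""
    return (l_max - l_min) / (l_max + l_min)

def contrast_sensitivity(threshold_contrast):
    """Reciprocal of the contrast threshold at a given spatial frequency."""
    return 1.0 / threshold_contrast
```

For example, a grating that a viewer can only just detect at a modulation contrast of 0.01 corresponds to a contrast sensitivity of 100; plotting such values against spatial frequency yields the CSF discussed above.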
  • FIG. 1(b) illustrates schematically the manner in which the contrast sensitivity function (CSF) of an individual may be affected depending upon their medical condition.
  • the plot shows log (contrast sensitivity) as a function of log (spatial frequency) (c/deg).
  • Trace N represents the expected CSF of a healthy individual.
  • Trace A represents that of an individual with contrast losses in the mid to low region of (log spatial frequency), characteristic of individuals having multiple sclerosis;
  • trace B represents the CSF of individuals with an overall reduction in CSF across the range of spatial frequencies, characteristic of cataract patients, whilst trace C represents the CSF of individuals with mild refractive error or mild amblyopia (trace B being characteristic of individuals with more severe cases of either).
  • the proposed device can be coupled to any portable display such as a smart glass that is operatively coupled with a camera that receives a range of different video sources.
  • portable display devices to which the proposed device can be applied can include, but are not limited to, a head-mounted camera, an external wireless camera, video streaming from a broadcast source (e.g. TV), closed-loop video such as for a theatre, concert or live sport event, and an on-device video source, e.g. a movie file, internet-streamed video, etc.
  • the proposed device can apply any of the image enhancement algorithms previously listed, and each of these enhancements can be modified in real-time by the "Tuning" device outlined in this disclosure.
  • FIGs. 1(a) and 2 illustrate exemplary representations of the proposed system in accordance with an embodiment of the present disclosure, wherein a portable, handheld computing device 100 having a motion sensor 102 can be either physically connected to the smart glass based wearable device 160, wirelessly coupled through Bluetooth, or mounted onto the frame of the smart glass/wearable device, or any other configuration, all of which are well within the scope of the present disclosure.
  • an electronic system/product 100 can include an inertial measurement device (IMU) 102 having a gyroscope 104 and an accelerometer 106 (it may also alternatively or additionally include a magnetometer 108), wherein during implementation/operation, a user can press and hold a button 150 on the proposed device 100, and then rotate his/her hand as if controlling the dial on a volume control.
  • the initial orientation of the device 100 is set as zero. Any rotation about a defined axis of the device/handset 100 can be interpreted as an increase or a decrease of the primary control parameter.
  • the axis of rotation (refer to FIG. 2) can be defined to be along the length of the device, which is the same axis as the wrist.
  • a clockwise roll can increase the white:black ratio on a high-contrast display, say transparent display 162 of a smart glass 160.
  • An anticlockwise roll on the other hand, can decrease the white:black ratio on a high-contrast display 162.
  • FIGs. 3A-3E illustrate exemplary representation showing how the proposed device can be used for controlling at least one parameter of imaging/image enhancement techniques.
  • the parameter can include the tuning that increases or decreases general image brightness.
  • for the white edges on black technique as shown in the corresponding figure, the parameter can include the tuning that modifies the threshold for edge detection. Decreasing this threshold increases the number of edges displayed; increasing it decreases the number of edges displayed.
  • the same parameter as for white edges on black technique can be used except that the edges are displayed on a live video (colour or grayscale).
  • the parameter can include tuning that moves the black:white threshold towards the white or towards the black. This either increases the amount of white on the screen, or increases the amount of black.
  • for the multiple regional thresholds technique as shown in the corresponding figure, the parameter can include tuning that modifies the erode and dilate parameters, thereby increasing line thickness (a process called "erode") or decreasing line thickness (a process called "dilate").
  • the parameter can include tuning that rotates the detection window through the colour spectrum, allowing for specific colours to be saturated.
  • the ratio of rotation to parameter control can be varied so that either: small rotations lead to large changes for highly dynamic environments, or large rotations lead to small changes for fine tuning of an image parameter.
  • the proposed device can also be configured such that other axes of rotation of the device modify different parameters of the image.
  • the roll axis of the proposed device can change the primary control parameter; the pitch axis can change the ratio of rotation to primary-control-parameter change, which can allow a person to firstly make a large change to the general image, and then increase the sensitivity in order to fine-tune the adjustment to suit the environment and the user's level of vision.
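The two-axis scheme above, roll driving the primary parameter while pitch adjusts the rotation-to-parameter ratio, might be sketched as follows. The gain dynamics, constants, and names are assumptions for illustration only.

```python
def update(value, gain, roll_rate, pitch_rate, dt,
           gain_rate=0.5, lo=0.0, hi=1.0):
    """One control tick: roll rate changes the primary parameter,
    pitch rate scales the roll-to-parameter gain so the user can
    switch between coarse adjustment and fine tuning."""
    # pitch adjusts sensitivity; keep the gain strictly positive
    gain = max(1e-3, gain * (1.0 + pitch_rate * gain_rate * dt))
    # roll adjusts the parameter itself, clamped to its valid range
    value = min(hi, max(lo, value + roll_rate * gain * dt))
    return value, gain
```

A user would first roll with a large gain for a coarse change, then pitch to shrink the gain and roll again for fine tuning.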
  • FIGs. 4A-4E illustrate exemplary flow diagrams to enable tuning of different attributes of captured videos/images in accordance with an embodiment of the present disclosure.
  • the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which at step 404, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 406. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention.
  • the output can be normalized by, for instance, dividing by a value proportional to effect range, followed by, at step 410, accumulating the gyro values.
  • the tuning-enabling portable computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 412, decrease brightness variable, and at step 414, increase brightness variable.
  • a key up instruction is received by the tuning-enabling computing device, based on which at step 418, current effect values can be set as default.
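The flow of FIG. 4A (listen to the gyroscope, smooth the y-axis stream with a sliding window filter, normalise by a value proportional to the effect range, and accumulate into the brightness variable) can be sketched as follows. Window size, effect range, and the clamping bounds are assumptions; the step numbers in the comments refer to the figure.

```python
from collections import deque

class GyroTuner:
    """Sketch of the FIG. 4A tuning pipeline for the brightness effect."""
    def __init__(self, window=5, effect_range=100.0, brightness=0.5):
        self.samples = deque(maxlen=window)   # sliding window of gyro values
        self.effect_range = effect_range
        self.brightness = brightness

    def on_gyro_y(self, gyro_y):
        self.samples.append(gyro_y)                        # steps 402-404
        smoothed = sum(self.samples) / len(self.samples)   # step 406: window filter
        normalised = smoothed / self.effect_range          # step 408: normalise
        # steps 410-414: accumulate; sign of rotation raises or lowers brightness
        self.brightness = min(1.0, max(0.0, self.brightness + normalised))
        return self.brightness
```

The flows of FIGs. 4B-4E follow the same pipeline with a different accumulated variable (line-detection threshold, white threshold, colour range, or erode/dilate strength) in place of brightness.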
  • the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which at step 424, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 426. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention.
  • the output can be normalized by, for instance, dividing by a value proportional to effect range, followed by, at step 430, accumulating the gyro values.
  • the tuning-enabling portable computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 432, increase the threshold for line detection, and at step 434, decrease the threshold for line detection.
  • a key up instruction is received by the tuning-enabling computing device, based on which at step 438, current effect values can be set as default.
  • the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which at step 444, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 446. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention.
  • the output can be normalized by, for instance, dividing by a value proportional to effect range, followed by, at step 450, accumulating the gyro values.
  • the tuning-enabling portable computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 452, increase the threshold for white (increase % black), and at step 454, decrease the threshold for white (increase % white).
  • a key up instruction is received by the tuning-enabling computing device, based on which at step 458, current effect values can be set as default.
  • the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which at step 464, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 466. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention.
  • the output can be normalized by, for instance, dividing by a value proportional to effect range, followed by, at step 470, accumulating the gyro values.
  • the tuning-enabling portable computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 472, increase the colour display in blue-green range (as an example), and at step 474, display colours within the yellow-red range (for instance).
  • a key up instruction is received by the tuning-enabling computing device, based on which at step 478, current effect values can be set as default.
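The rotating colour-detection window described above might be sketched as a hue test: rotating the handset shifts the window centre through the spectrum (e.g. towards the blue-green or the yellow-red range), and pixels whose hue falls inside the window are retained for saturation. The window width and the function name are assumptions for illustration.

```python
def in_hue_window(hue_deg, centre_deg, width_deg=60.0):
    """Return True if a hue (in degrees on the colour wheel) falls
    inside a detection window centred on centre_deg."""
    # wrap the difference into [-180, 180) so the window works across 0/360
    diff = (hue_deg - centre_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= width_deg / 2.0
```

As the accumulated gyroscope value moves `centre_deg`, different colour families (blue-green around ~180 degrees, yellow-red around ~30 degrees) pass through the window and are saturated on the display.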
  • the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device can listen to the sensor that is configured in the tuning-enabling computing device, based on which at step 482, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 483. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention.
  • the output can be normalized by, for instance, dividing by a value proportional to effect range, followed by, at step 485, accumulating the gyro values.
  • the tuning-enabling portable computing device can be rotated in a defined direction (clockwise or anti-clockwise) so as to, at step 486, thicken text by increasing erode variables, and at step 487, thin text by increasing dilate variables.
  • a key up instruction is received by the tuning-enabling computing device, based on which at step 489, current effect values can be set as default.
  • a portable computing device operatively coupled with a smart glass based wearable device, said portable computing device comprising: an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.
  • IMU inertial measurement unit
  • control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
  • said IMU further comprises an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity.
  • a method of modifying, by a portable computing device, one or more control parameters on an image that is displayed in a smart glass based wearable device comprising the step of:
  • control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
  • the term "coupled to" is intended to include both direct coupling, in which two elements that are coupled to each other contact each other, and indirect coupling, in which at least one additional element is located between the two elements. Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used euphemistically to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices. [0093] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A smart glass system (10) arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system (10) according to their eye condition, comprising: a portable computing device (100) comprising a motion sensor; a smart glass based wearable device (160) comprising a display portion (162), the display portion being provided in a field of view of the user; and an image capture device (163), wherein the portable computing device (100) is operatively coupled with the smart glass based wearable device (160), the system (10) being configured to display on the display portion (162) an image corresponding to at least a portion of an image captured by the image capture device (163), wherein the system (10) is configured to detect rotational movement of the portable computing device (100) in the hand of the user by means of the motion sensor (102), wherein one or more control parameters of the image that is displayed on the wearable device (160) are modified based on the rotational movement of the portable computing device (100).

Description

SYSTEM AND METHOD FOR MODIFYING IMAGE ENHANCEMENT
PARAMETERS FOR A PORTABLE DISPLAY
FIELD OF THE INVENTION
[0001] The present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.
BACKGROUND
[0002] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] There exist a number of image enhancement techniques for improving vision in sight impaired individuals. Such techniques include but are not limited to video passthrough of a colour/RGB image, edge detection and presentation of these edges as white on a black background, application of white edges on top of a colour or grayscale image, presentation of a black and white high-contrast image with a global threshold that applies to the entire screen, presentation of a black and white high-contrast image with multiple regional thresholds to compensate for lighting changes across a screen, and an algorithm, for instance, to detect large regions of similar hues (regardless of brightness) and then re-drawing these regions as high brightness swatches of the same colour, to aid low vision.
[0004] These image processing methods are good at improving visibility of objects in a real-world scenario, particularly for people with poor vision. These methods/techniques have parameters that can be set to find an optimal setting for many visual scenes. However, the visible world is highly dynamic and hence the preset parameters of each method/technique may not be suitable at all times. Examples of the dynamic nature of the visual world include ambient lighting changing by many orders of magnitude as people move between environments or as they turn their heads, contrast of surface details on objects varying dramatically (and hence detection parameters set for edge detection algorithms are not appropriate in all situations), and specific objects such as faces and text having highly different contrast spectrums (and hence automatic thresholding algorithms will not optimally enhance the visibility of key features on different objects).
[0005] There is therefore a need in the art for a system and method for modifying image enhancement parameters for a portable display in real-world scenario.
[0006] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
SUMMARY
[0007] The present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass. By smart glass or 'smart glasses' is meant wearable computer glasses (or 'spectacles') that provide visual information in a user's field of view in addition to that which the user is able to view substantially directly (either with or without an intermediate optical element such as a lens, which may be substantially transparent or partially transparent). The provision of visual information in addition to that which the user views substantially directly may be by superimposing information onto the user's field of view, for example on a display element in the user's field of view (which display element may be substantially transparent, at least partially transparent or opaque). For example, the display may be a substantially opaque LED or LCD display of the type used in mobile telephones, or partially transparent. In some embodiments an image may be projected onto the display from a light source in the form of a projector device, for example of the type used in head-up display (HUD) displays or augmented reality (AR) overlays, and the reflected image viewed by the user.
[0008] A smart glass system arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition, comprising:
a portable computing device comprising a motion sensor;
a smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user; and
an image capture device,
wherein the portable computing device is operatively coupled with the smart glass based wearable device,
the system being configured to display on the display portion an image corresponding to at least a portion of an image captured by the image capture device,
wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein one or more control parameters of the image that is displayed on the wearable device are modified based on the rotational movement of the portable computing device.
[0009] This has the advantage that a user may adjust the image provided to them by the system in order to enhance their ability to view a scene. The user may endeavour to optimise the scene as viewed using the smart glass based wearable device. Optionally, the system may be configured wherein the user may adjust the parameters in substantially real time.
[0010] The image capture device may comprise a video image capture device. The image capture device may comprise at least one CMOS image capture device and/or at least one CCD image capture device. Other image capture devices may be useful. The wearable device may generate a substantially real time stream of images captured by the image capture device.
[0011] Optionally, the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.
[0012] It is to be understood that reference to the "field of view of a person wearing the wearable device" is to be understood to be with respect to a person wearing the wearable device such that the display portion is in their field of view, optionally their field of view when looking substantially directly ahead, optionally their field of view with their eyes directed in a prescribed direction, the direction being any prescribed direction from directly upwards (a '12 o'clock' direction), directly downwards (a '6 o'clock' direction) or any prescribed direction from 12 o'clock clockwise around to 12 o'clock.
[0013] Optionally, said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
[0014] Optionally, said device comprises a selection interface that allows a user of said wearable device to select a set of control parameters from said one or more control parameters that need to be modified for said image.
[0015] Optionally, the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.
[0016] Optionally, the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.
[0017] Optionally, the system is configured to transmit information indicative of the linear acceleration of the portable computing device relative to gravity to the wearable device.
[0018] Optionally, the motion sensor comprises a magnetometer that determines information indicative of instantaneous orientation of the portable computing device relative to Earth’s magnetic field.
[0019] Optionally, the system is configured to transmit the information indicative of the instantaneous orientation of the portable computing device relative to Earth’s magnetic field to the wearable device.
[0020] Optionally, the system is configured wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.
[0021] Optionally, the portable computing device is actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.
[0022] Optionally, the extent of changes in said one or more control parameters is proportional to the extent of hand rotation.
[0023] Optionally, the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor.
[0024] Optionally, when the control parameter modification operation is paused to generate a modified image, the user of said wearable device is enabled to view said modified image and/or pan around said modified image in the X and Y axes and/or scroll around said modified image.
[0025] Optionally, an absolute position of said portable computing device is configured to be indicative of level of control parameter.
[0026] Optionally, when the button is pressed, the system determines the absolute position of said portable computing device and sets the level of the control parameter in dependence on the absolute position.
[0027] This feature has the advantage that a user may substantially instantly set the level of the control parameter by first setting their hand in the orientation corresponding to the desired value of the control parameter and then pressing the button.
[0028] Optionally, the motion sensor comprises an inertial measurement unit (IMU).
[0029] Optionally, the motion sensor is an inertial measurement unit (IMU).
[0030] Optionally, the wearable device comprises the image capture device.
[0031] The image capture device may be an integral part of the wearable device.
[0032] Optionally, the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.
[0033] Optionally, the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.
[0034] In an aspect of the invention there is provided a method of controlling an image displayed on a display of a smart glass based wearable device of a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of the image according to their eye condition, the display being provided in the field of view of a user, comprising: detecting rotational movement of a portable computing device of the system in the hand of a user by means of a motion sensor comprised by the device, the portable computing device being operatively coupled to the wearable device;
capturing by means of an image capture device of the system an image of a scene;
displaying on the display portion an image corresponding to at least a portion of the image captured by the image capture device,
the method comprising modifying one or more control parameters of the image displayed on the display portion based on the rotational movement of the portable computing device.
[0035] Optionally, the method comprises capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.
[0036] Thus, the person wearing the wearable device will see the image captured by the image capture device within their field of view.
[0037] Optionally, the image displayed by the display portion occupies a portion and not the whole of the field of view of the user wherein the image displayed is substantially continuous with a remainder of a field of view of the user such that the image displayed appears to be superimposed upon the scene. It is to be understood that the display portion may be at least partially transparent, allowing the user to see objects in the portion of the field of view occupied by the display portion through the display portion as well as information displayed on the display portion by the system.
[0038] Optionally, the method further comprises the step of:
receiving from the motion sensor new gyroscope values as part of a change in gyroscope value due to movement of the portable computing device;
smoothing, using a sliding window filter, the received new gyroscope values;
normalizing said smoothed gyroscope values; and
accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.
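By way of illustration only, the smoothing, normalisation and accumulation steps above may be sketched as follows; the window size, effect range and class name are illustrative assumptions, not values prescribed by this disclosure:

```python
from collections import deque

class GyroAccumulator:
    """Accumulates smoothed, normalised gyroscope readings into a running
    estimate of hand rotation (an illustrative sketch, not the claimed method)."""

    def __init__(self, window=5, effect_range=180.0):
        self.samples = deque(maxlen=window)  # sliding window of raw gyro values
        self.effect_range = effect_range     # divisor proportional to the effect range
        self.accumulated = 0.0               # accumulated normalised rotation

    def update(self, gyro_value):
        # Smooth the new reading with a sliding-window (moving-average) filter
        self.samples.append(gyro_value)
        smoothed = sum(self.samples) / len(self.samples)
        # Normalise by a value proportional to the effect range
        normalised = smoothed / self.effect_range
        # Accumulate to indicate the extent of hand rotation
        self.accumulated += normalised
        return self.accumulated
```

In such a sketch, a larger window gives steadier but laggier tuning, while a larger effect range requires more rotation for a full parameter sweep.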
[0039] Optionally, the one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
[0040] In an aspect of the invention there is provided a portable computing device for use with a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition, the portable computing device comprising a motion sensor; the portable computing device being arranged to be operatively coupled with the smart glass based wearable device, the smart glass based wearable device comprising a display portion, the display portion being provided in a field of view of the user, the smart glass system further comprising an image capture device, the system being configured to display on the display portion of the wearable device an image corresponding to at least a portion of an image captured by the image capture device, wherein the system is configured to detect rotational movement of the portable computing device in the hand of the user by means of the motion sensor, wherein one or more control parameters of the image that is displayed on the wearable device are modified based on the rotational movement of the portable computing device.
[0041] In an aspect of the invention there is provided a smart glass based wearable device arranged to be operatively coupled with the portable computing device of the preceding aspect, the system being arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system according to their eye condition.
[0042] In an aspect, the present disclosure relates to a portable computing device that is operatively coupled with a smart glass based wearable device, wherein the portable computing device can include an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of the portable computing device, orientation of the portable computing device can be determined, and upon hand rotation of the portable computing device, one or more control parameters of an image that is displayed on the wearable device can be modified based on rotational velocity computed using the gyroscope.
[0043] In an aspect, the one or more control parameters can be selected from any or a combination of passthrough of the image, colour or grayscale display of the image, brightness of
the image, edge detection/enhancement in the image, contrast of the image, colour enhancement of the image, line thickness in the image, enhancement of text that forms part of the image, lighting that forms part of the image, and white:black ratio in the image.
[0044] In an aspect, the device can include a selection interface that allows a user of the wearable device to select a set of control parameters from said one or more control parameters that need to be modified for the image.
[0045] In an aspect, the IMU can further include an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity. In another aspect, the IMU can further include a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth’s magnetic field. In yet another aspect, respective outputs from the gyroscope, the accelerometer, and the magnetometer can be fused to yield the orientation and motion of the portable computing device in any direction.
[0046] In an aspect, the portable computing device can be actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.
[0047] In another aspect, the extent of hand rotation is proportional to the extent of changes in said one or more control parameters.
[0048] In yet another aspect, the orientation of the portable computing device can be determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.
[0049] In yet another aspect, when the control parameter modification operation is paused to generate a modified image, the user of said wearable device is enabled to view said modified image and/or pan around said modified image in the X and Y axes and/or scroll around said modified image.
[0050] In an aspect, an absolute position of said portable computing device is configured to be indicative of level of control parameter.
[0051] In another aspect, the present disclosure relates to a method of modifying, by a portable computing device, one or more control parameters on an image that is displayed in a smart glass based wearable device, said method comprising the step of: receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of extent of hand rotation of said portable computing device, said hand rotation being mapped to one or more control parameters; determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and generating, from the portable computing device, an image modification signal to said glass based wearable device based on the change in gyroscope value and the determined orientation, wherein said image is modified with respect to said one or more control parameters based on said image modification signal.
[0052] In an aspect, the method can further include the steps of: receiving new gyroscope values as part of the change in gyroscope value; smoothing, using a sliding window filter, the received new gyroscope values; normalizing said smoothed gyroscope values; and accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.
[0053] In an aspect of the invention there is provided a portable computing device that is operatively coupled with a smart glass based wearable device. In an aspect, the portable computing device can include an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of the portable computing device, orientation of the portable computing device can be determined, and upon hand rotation of the portable computing device, one or more control parameters of an image that is displayed on the wearable device can be modified based on rotational velocity computed using said gyroscope.
BRIEF DESCRIPTION OF DRAWINGS
[0054] FIGs. 1 and 2 illustrate exemplary representations of the proposed device in accordance with an embodiment of the present disclosure.
[0055] FIGs. 3A-3E illustrate exemplary representations showing how the proposed device can be used for controlling at least one parameter of imaging/image enhancement techniques.
[0056] FIGs. 4A-4E illustrate exemplary flow diagrams to enable tuning of different attributes of captured videos/images in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF DRAWINGS
[0057] Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine- executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, firmware and/or by human operators.
[0058] Embodiments of the present disclosure may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
[0059] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present disclosure with appropriate standard computer hardware to execute the code contained therein. An apparatus for
practicing various embodiments of the present disclosure may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the disclosure could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0060] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0061] Arrangements and embodiments may now be described more fully with reference to the accompanying drawings, in which exemplary embodiments may be shown. Embodiments may, however, be embodied in many different forms and should not be construed as being limited to embodiments set forth herein; rather, embodiments may be provided so that this disclosure will be thorough and complete, and will fully convey the concept to those skilled in the art.
[0062] The suffixes ‘module’, ‘unit’ and ‘part’ may be used for elements in order to facilitate the disclosure. Significant meanings or roles may not be given to the suffixes themselves and it is understood that ‘module’, ‘unit’ and ‘part’ may be used together or interchangeably.
[0063] The present disclosure pertains to a system and method thereof for modifying image enhancement parameters for a portable display, such as one configured in a smart glass.
[0064] The present disclosure pertains to a real-time image processing system that is designed to improve vision for people who are severely sight impaired. The proposed system can include a video input mechanism that can be a wired or a wireless camera, or can include an externally streamed video or a video that is a file on the device, wherein the video input mechanism can be presented to the user via a head-mounted screen such as an augmented or virtual reality transparent display of, for instance, a smart glass. In the embodiment 10 of FIG.
1(a) a smart glass based wearable device 160 has a display screen 162 and an image capture device in the form of a video camera 163. FIG. 1(b) shows a corresponding embodiment 10A in which the image capture device is not provided integral to the wearable device 160. Rather, it may be coupled to the device 160 via a wireless connection. In some embodiments the image capture device may be coupled by means of a wired connection in addition or instead. In the embodiment of FIG. 1 the display screen 162 is a transparent waveguide with diffractive optics arranged to direct an image from an organic light emitting diode (OLED) micro display into the user’s eye. Other arrangements may be useful, such as a transparent waveguide with a beamsplitter instead of diffractive optics. Other displays may be useful, such as liquid crystal on silicon (LCOS) displays or liquid crystal displays (LCDs). In some embodiments, an opaque display including a high resolution OLED panel and one or more optical elements such as a biconvex or Fresnel lens arrangement may be employed to direct the image into the user’s eye. In the embodiment of FIG. 1(a) the video camera 163 is a CMOS (complementary metal oxide semiconductor) camera but other cameras may be useful in some embodiments.
[0065] In an aspect, the present disclosure relates to a physical device that can be given to a user to modify primary control parameter(s) of each of the existing methods/techniques mentioned above: A. video passthrough of a colour/RGB image; B. edge detection and presentation of these edges as white on a black background; C. application of white edges on top of a colour or grayscale image; D. presentation of a black and white high-contrast image with a global threshold that applies to the entire screen; E. presentation of a black and white high-contrast image with multiple regional thresholds to compensate for lighting changes across a screen; and F. an algorithm, for instance, to detect large regions of similar hues (regardless of brightness) and then re-draw these regions as high-brightness swatches of the same colour, to aid low vision.
[0066] In an aspect, the proposed device can be configured to receive one parameter from each of the above-mentioned techniques (A-F), and make the received respective parameter
adjustable. In an aspect, the proposed device can be configured as a portable gesture device and can enable provision of an intuitive control that mimics other well-known control mechanisms such as a volume knob on an audio device. The proposed device can rapidly and intuitively change parameters for one or more existing image enhancement techniques over a wide range, due to the relationship between movement of the device and the rate of parameter change.
[0067] In an aspect, the proposed device can be activated by a button press, wherein at the first press, orientation of the device can be calculated. In an exemplary implementation, orientation of the device can be determined based on fusion of positional data from components of an inertial measurement unit (IMU) that comprises an accelerometer configured to determine and transmit real-time values of the device’s position relative to gravity. The accelerometer can also be configured to transmit the magnitude of any linear acceleration in three dimensions. The IMU of the present disclosure can further include a gyroscope that indicates instantaneous rotational velocity in three dimensions. An optional magnetometer can also be configured in the IMU to give instantaneous orientation of the proposed device/handset relative to Earth’s magnetic field (i.e. a compass). These three sources of data can be combined, or “fused”, to give orientation and motion of the device/handset in any direction. This data fusion can be derived through any number of well-known algorithms, such as a Kalman filter.
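As one hedged illustration of such fusion, a complementary filter (a lightweight alternative to a full Kalman filter) can blend the gyroscope’s short-term rotational velocity with the accelerometer’s long-term gravity reference for a single roll axis; the axis convention and the weighting alpha below are assumptions made for the sketch:

```python
import math

def fuse_roll(prev_roll, gyro_rate_dps, accel_y, accel_z, dt, alpha=0.98):
    """Complementary-filter fusion of gyroscope and accelerometer for the roll
    axis (illustrative sketch; a Kalman filter would serve the same role)."""
    # Integrate the gyroscope's rotational velocity (accurate short-term, drifts long-term)
    gyro_roll = prev_roll + gyro_rate_dps * dt
    # Estimate roll from the gravity vector via the accelerometer (stable long-term, noisy short-term)
    accel_roll = math.degrees(math.atan2(accel_y, accel_z))
    # Blend: mostly trust the gyro, correct slowly toward the accelerometer
    return alpha * gyro_roll + (1 - alpha) * accel_roll
```

Called once per sensor sample, the filter suppresses gyro drift without the latency that a pure accelerometer estimate would add.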
[0068] In an aspect, once the button on the proposed device is pressed, the initial orientation of the handset is set as the zero reference. Any rotation about a defined axis of the handset can be interpreted as an increase or a decrease of the primary control parameter. In an exemplary embodiment, the axis of rotation can be defined to be along the length of the device, which is the same axis as the wrist. For instance, a clockwise roll can increase the white:black ratio on a high-contrast display. An anticlockwise roll, on the other hand, can decrease the white:black ratio on a high-contrast display. Alternatively, an anticlockwise roll can increase the white:black ratio on a high-contrast display and a clockwise roll can decrease the white:black ratio on a high-contrast display.
[0069] In an aspect, the user can be given the ability to watch the parameter change in real-time as they rotate the proposed device/handset, which can create an intuitive feedback system, allowing the user to be very specific with their modifications to the image. During the “tuning” phase, the video can continue to be passed in real-time to the display.
[0070] In an exemplary implementation, upon release of the button, the modified imaging parameter can be set. Thus, the modified imaging parameter can be set as the new state of the system, becoming the state in which the system continues to operate until the state is changed. In some embodiments the state of contrast may revert to a default state upon release of the button. In some embodiments the state of contrast may revert to a default state after a predetermined time period has elapsed. Other arrangements may be useful. In some embodiments a user may select how the system behaves when the button is released, for example whether the changed parameter, such as the instant black:white contrast setting, is maintained, or whether the system reverts to a default value of the parameter.
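The press-tune-release behaviour described above might be modelled as follows; the parameter range, default value and the optional revert-on-release flag are illustrative assumptions rather than features prescribed by the disclosure:

```python
class TuningSession:
    """Models the press-tune-release cycle: rotation only changes the
    parameter while the button is held (illustrative sketch)."""

    def __init__(self, default=0.5, revert_on_release=False):
        self.default = default                  # assumed default parameter level
        self.value = default                    # current parameter level, clamped to [0, 1]
        self.revert_on_release = revert_on_release
        self.pressed = False

    def press(self):
        # First press activates tuning (and, in the full system, zeroes orientation)
        self.pressed = True

    def rotate(self, delta):
        # Rotation modifies the parameter only during the time the button is kept pressed
        if self.pressed:
            self.value = min(1.0, max(0.0, self.value + delta))

    def release(self):
        # On release the tuned value is either kept as the new state or reverted
        self.pressed = False
        if self.revert_on_release:
            self.value = self.default
```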
[0071] As mentioned above, each imaging/image enhancement technique can have at least one parameter that can be modified by the proposed tuning device. For instance, for the video pass-through (colour or grayscale display) technique, the parameter can include the tuning that increases or decreases general image brightness. Similarly, for the white edges on black technique, the parameter can include the tuning that modifies the threshold for edge detection. Decreasing this threshold increases the number of edges displayed. Increasing this threshold decreases the number of edges displayed. For the video pass-through plus white edges technique, the same parameter as for the white edges on black technique can be used except that the edges are displayed on a live video (colour or grayscale). On the other hand, for the high contrast, global threshold technique, the parameter can include tuning that moves the black:white threshold towards the white or towards the black. This either increases the amount of white on the screen, or increases the amount of black. For the high contrast, multiple regional thresholds technique, the parameter can include tuning that modifies the erode and dilate parameters, subsequently increasing line
thickness (a process called “erode”) or decreasing line thickness (a process called “dilate”). Finally, for the colour detection and saturation image enhancement technique, the parameter can include tuning that rotates the detection window through the colour spectrum, allowing for specific colours to be saturated.
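For illustration only, a toy binary erosion over a 0/1 image grid is sketched below; a production system would typically use an optimised library routine, and the 3×3 cross-shaped neighbourhood is an assumption of the sketch:

```python
def erode(img):
    """Binary erosion with a 3x3 cross: a pixel survives only if it and all
    four of its in-bounds neighbours are set (illustrative toy version)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Keep the pixel only when every 4-neighbour exists and is set
            if img[y][x] and all(
                0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
            ):
                out[y][x] = 1
    return out
```

Eroding shrinks white regions (so dark lines appear thicker); the complementary dilation would grow them, which is the pair of effects the tuning parameter trades off.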
[0072] It is to be understood that contrast is an important parameter in assessing vision clinically. Clinical visual acuity measurements typically use high contrast images such as black letters on a white background. In reality, contrast between objects and their surroundings varies. The relationship between visual acuity and contrast allows a more detailed understanding of visual perception.
[0073] The resolving power of the eye may be measured by means of sinusoidal grating patterns having adjustable spacing (spatial periodicity). The contrast of the grating is the differential intensity threshold of a grating, which is defined as the ratio:
C = (Lmax − Lmin) / (Lmax + Lmin), where L is the luminance of the grating pattern as a function of spatial distance in a direction normal to the orientation of the parallel elements of the grating, and C may be referred to as the modulation or Rayleigh or Michelson contrast. C can have a value between 0.0 and 1.0. Further details may be found in “Visual Acuity” by Michael Kalloniatis and Charles Luu, available at https://webvision.med.utah.edu/book/part-viii-psychophysics-of-vision/visual-acuity/, portions of which are discussed below.
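The ratio above can be computed directly; for example, a grating with luminances Lmax = 75 and Lmin = 25 has a Michelson contrast of 0.5:

```python
def michelson_contrast(l_max, l_min):
    """Michelson (modulation) contrast C = (Lmax - Lmin) / (Lmax + Lmin)."""
    return (l_max - l_min) / (l_max + l_min)
```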
[0074] As the spatial frequency of a set of black/white lines increases, i.e. the thickness of the lines decreases, they become harder to resolve and begin to look like a homogenous grey area. The sensitivity of a person’s eyes to contrast can be measured by determining the minimum grating spacing that each eye can resolve as a function of image contrast. This may be done, for example, by lowering the contrast for a given spatial frequency until the person can no longer
detect the grating; this value is the ‘contrast threshold’ for that grating size (spatial frequency). The reciprocal of this contrast threshold is called the ‘contrast sensitivity’. The contrast threshold can be expressed as a sensitivity on a decibel (dB) scale: contrast sensitivity in dB = −20 log10 C, where C is the threshold value of modulation contrast (described above). A plot of (contrast) sensitivity versus spatial frequency is called the spatial contrast sensitivity function (referred to as the SCSF or simply CSF).
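The decibel conversion above is a one-liner; for instance, a threshold contrast of C = 0.01 corresponds to a sensitivity of 40 dB:

```python
import math

def contrast_sensitivity_db(threshold_contrast):
    """Contrast sensitivity in dB = -20 * log10(C), for threshold contrast C in (0, 1]."""
    return -20.0 * math.log10(threshold_contrast)
```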
[0075] FIG. 1(b) illustrates schematically the manner in which the contrast sensitivity function (CSF) of an individual may be affected depending upon their medical condition. The plot shows log (contrast sensitivity) as a function of log (spatial frequency) (c/deg). Trace N represents the expected CSF of a healthy individual. Trace A represents that of an individual with contrast losses in the mid to low region of (log spatial frequency), characteristic of individuals having multiple sclerosis; trace B represents the CSF of individuals with an overall reduction in CSF across the range of spatial frequencies, characteristic of cataract patients, whilst trace C represents the CSF of individuals with mild refractive error or mild amblyopia (trace B being characteristic of individuals with more severe cases of either).
[0076] Further information may also be found at: https://www.semanticscholar.org/paper/Comparing-the-Shape-of-Contrast-Sensitivity-for-and-Chung-Legge/92c9647ee47507ce50e2792eb9504106734d37ea
[0077] In an aspect, the proposed device can be coupled to any portable display such as a smart glass that is operatively coupled with a camera that receives a range of different video sources. Other exemplary portable display devices to which the proposed device can be applied can include but are not limited to a head mounted camera, an external wireless camera, video streaming from a broadcast source e.g. TV, closed-loop video, such as a theatre, concert or live sport event, and an on-device video source, e.g. a movie file, internet-streamed video etc. In each of these cases, the proposed device can apply any of the image enhancement algorithms previously
listed, and each of these enhancements can be modified in real-time by the “Tuning” device outlined in this disclosure.
[0078] FIGs. 1(a) and 2 illustrate exemplary representations of the proposed system in accordance with an embodiment of the present disclosure, wherein a portable, handheld computing device 100 having a motion sensor 102 can be either physically connected to the smart glass based wearable device 160, wirelessly coupled through Bluetooth, or mounted onto the frame of the smart glass/wearable device, or any other configuration, all of which are well within the scope of the present disclosure.
[0079] As mentioned above, the present disclosure provides an electronic system/product 100 that can include an inertial measurement unit (IMU) 102 having a gyroscope 104 and an accelerometer 106 (it may also alternatively or additionally include a magnetometer 108), wherein during implementation/operation, a user can press and hold a button 150 on the proposed device 100, and then rotate his/her hand as if controlling the dial on a volume control.
[0080] In an aspect, once the button 150 on the proposed device 100 is pressed, the initial orientation of the device 100 is set as the zero reference. Any rotation about a defined axis of the device/handset 100 can be interpreted as an increase or a decrease of the primary control parameter. In an exemplary embodiment, the axis of rotation (refer to FIG. 2) can be defined to be along the length of the device, which is the same axis as the wrist. For instance, a clockwise roll can increase the white:black ratio on a high-contrast display, say the transparent display 162 of a smart glass 160. An anticlockwise roll, on the other hand, can decrease the white:black ratio on a high-contrast display 162.
[0081] In an aspect, the user can be given the ability to watch the parameter change in real-time as they rotate the proposed device/handset, which can create an intuitive feedback system, allowing the user to be very specific with their modifications to the image. During the “tuning” phase, the video can continue to be passed in real-time to the display 162. In an exemplary implementation, upon release of the button, the modified imaging parameter can be set.
[0082] FIGs. 3A-3E illustrate exemplary representations showing how the proposed device can be used for controlling at least one parameter of imaging/image enhancement techniques. For instance, for the video pass-through (colour or grayscale display) technique, as shown in FIG. 3A, the parameter can include the tuning that increases or decreases general image brightness. Similarly, for the white edges on black technique, as shown in FIG. 3B, the parameter can include the tuning that modifies the threshold for edge detection. Decreasing this threshold increases the number of edges displayed. Increasing this threshold decreases the number of edges displayed. For the video pass-through plus white edges technique, as shown in FIG. 3C, the same parameter as for the white edges on black technique can be used except that the edges are displayed on a live video (colour or grayscale). On the other hand, for the high contrast global threshold technique, as shown in FIG. 3D, the parameter can include tuning that moves the black:white threshold towards the white or towards the black. This either increases the amount of white on the screen, or increases the amount of black. For the high contrast, multiple regional thresholds technique, as shown in FIG. 3E, the parameter can include tuning that modifies the erode and dilate parameters, subsequently increasing line thickness (a process called “erode”) or decreasing line thickness (a process called “dilate”). Finally, for the colour detection and saturation image enhancement technique, the parameter can include tuning that rotates the detection window through the colour spectrum, allowing for specific colours to be saturated.
[0083] As would be appreciated, using the present invention, a user simply rotates a handheld device, making image enhancement easy to perform and intuitive to describe. In addition, the ratio of rotation to parameter control can be varied so that either small rotations lead to large changes (for highly dynamic environments) or large rotations lead to small changes (for fine tuning of an image parameter).
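The variable rotation-to-parameter ratio described above can be sketched as a simple gain term (a hypothetical illustration; the function name and values are not taken from the specification):

```python
def parameter_delta(rotation_deg, gain):
    # Map a hand rotation (in degrees) to a change in a control parameter.
    # A high gain suits highly dynamic environments (small rotations give
    # large changes); a low gain suits fine tuning (large rotations give
    # small changes).
    return rotation_deg * gain

print(parameter_delta(5, 2.0))   # dynamic mode: small rotation, big change
print(parameter_delta(45, 0.1))  # fine-tuning mode: large rotation, small change
```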
[0084] In an aspect, the proposed device can also be configured to modify different parameters of the image for other axes of rotation of the device. For instance, the roll axis of the proposed device can change the primary control parameter, while the pitch axis can change the ratio of rotation to the primary control parameter, which can allow a person to first make a large change to the general image and then increase the sensitivity in order to fine-tune the adjustment to suit the environment and the user's level of vision.
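One way to read paragraph [0084] is as a routing of rotation axes to different adjustments, with roll driving the primary parameter and pitch driving the gain. The sketch below illustrates this under stated assumptions (the function name, scaling constant, and gain floor are all inventions for the example, not from the specification):

```python
def handle_rotation(state, axis, delta_deg):
    # Hypothetical axis routing: roll adjusts the primary control parameter
    # using the current gain; pitch adjusts the rotation-to-parameter gain,
    # floored at 0.05 so the control never becomes completely inert.
    if axis == "roll":
        state["value"] += delta_deg * state["gain"]
    elif axis == "pitch":
        state["gain"] = max(0.05, state["gain"] + delta_deg * 0.01)
    return state

state = {"value": 50.0, "gain": 1.0}
handle_rotation(state, "roll", 20)    # coarse change: value 50 -> 70
handle_rotation(state, "pitch", -50)  # lower gain for fine tuning: 1.0 -> 0.5
handle_rotation(state, "roll", 20)    # same rotation, smaller change: 70 -> 80
print(state)
```

This matches the described workflow: first a large change to the general image, then increased sensitivity to fine-tune the adjustment.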
[0085] FIGs. 4A-4E illustrate exemplary flow diagrams to enable tuning of different attributes of captured videos/images in accordance with an embodiment of the present disclosure.
[0086] With reference to FIG. 4A, which illustrates a brightness based tuning operation 400, at step 402 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device listens to the sensor that is configured in the tuning-enabling computing device, based on which, at step 404, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 406. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 408, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 410, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anti-clockwise) so as to either decrease the brightness variable (step 412) or increase the brightness variable (step 414). At step 416, a key-up instruction is received by the tuning-enabling computing device, based on which, at step 418, the current effect values can be set as default.
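The pipeline of steps 404-418 (receive gyro change, sliding-window smoothing, normalisation by a value proportional to the effect range, accumulation, clamping) can be sketched as follows; the class name, window size, and effect range are assumptions for illustration, not values taken from the specification:

```python
from collections import deque

class GyroBrightnessTuner:
    # Sketch of FIG. 4A: smooth raw y-axis gyro readings with a sliding
    # window (step 406), normalise by a value proportional to the effect
    # range (step 408), and accumulate into the brightness variable
    # (step 410) while the button is held.
    def __init__(self, window=5, effect_range=100.0, start=50.0):
        self.readings = deque(maxlen=window)
        self.effect_range = effect_range
        self.brightness = start

    def on_gyro(self, raw_y):
        self.readings.append(raw_y)
        smoothed = sum(self.readings) / len(self.readings)  # sliding-window filter
        self.brightness += smoothed / self.effect_range     # normalise + accumulate
        self.brightness = min(100.0, max(0.0, self.brightness))
        return self.brightness

tuner = GyroBrightnessTuner()
for reading in [10.0, 12.0, 8.0, -5.0]:  # clockwise, then a slight reversal
    level = tuner.on_gyro(reading)
print(level)
```

On key-up (steps 416-418), the accumulated value would simply be stored as the new default.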
[0087] With reference to FIG. 4B, which illustrates an edge enhancement based tuning operation 420, at step 422 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device listens to the sensor that is configured in the tuning-enabling computing device, based on which, at step 424, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 426. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 428, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 430, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anti-clockwise) so as to either increase the threshold for line detection (step 432) or decrease the threshold for line detection (step 434). At step 436, a key-up instruction is received by the tuning-enabling computing device, based on which, at step 438, the current effect values can be set as default.
[0088] With reference to FIG. 4C, which illustrates a contrast based tuning operation 440, at step 442 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device listens to the sensor that is configured in the tuning-enabling computing device, based on which, at step 444, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 446. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 448, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 450, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anti-clockwise) so as to either increase the threshold for white, increasing the % black (step 452), or decrease the threshold for white, increasing the % white (step 454). At step 456, a key-up instruction is received by the tuning-enabling computing device, based on which, at step 458, the current effect values can be set as default.
[0089] With reference to FIG. 4D, which illustrates a colour based tuning operation 460, at step 462 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device listens to the sensor that is configured in the tuning-enabling computing device, based on which, at step 464, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 466. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 468, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 470, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anti-clockwise) so as to either increase the colour display in the blue-green range, as an example (step 472), or display colours within the yellow-red range, for instance (step 474). At step 476, a key-up instruction is received by the tuning-enabling computing device, based on which, at step 478, the current effect values can be set as default.
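The rotating colour-detection window of FIG. 4D can be pictured as a window on the hue circle: rotating the handset moves the window centre through the spectrum, and only hues inside the window stay saturated. A minimal sketch (the function name and window width are arbitrary assumptions):

```python
def in_hue_window(hue_deg, centre_deg, width_deg=30):
    # Circular distance between a pixel's hue (0-359 degrees) and the
    # detection-window centre; the pixel keeps its saturation only when it
    # falls within `width_deg` of the centre.
    diff = abs((hue_deg - centre_deg + 180) % 360 - 180)
    return diff <= width_deg

print(in_hue_window(200, 210))  # blue hue inside a blue-green window: True
print(in_hue_window(30, 210))   # orange hue outside the window: False
```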
[0090] With reference to FIG. 4E, which illustrates an enhanced text based tuning operation 480, at step 481 the smart glass/wearable device that is operatively coupled with the proposed tuning-enabling computing device listens to the sensor that is configured in the tuning-enabling computing device, based on which, at step 482, the tuning-enabling computing device receives a change in y-axis gyroscope value, the output of which is smoothed with a sliding window filter at step 483. It would be appreciated that a part of these steps can be performed in the smart glass/wearable device as well, or in any desired combination of the smart glass/wearable device and the tuning-enabling portable computing device, all of which possible combinations are therefore well within the scope of the present invention. At step 484, the output can be normalized by, for instance, dividing by a value proportional to the effect range, followed by, at step 485, accumulating the gyro values. The tuning-enabling portable computing device can then be rotated in a defined direction (clockwise or anti-clockwise) so as to either thicken text by increasing the erode variables (step 486) or thin text by increasing the dilate variables (step 487). At step 488, a key-up instruction is received by the tuning-enabling computing device, based on which, at step 489, the current effect values can be set as default.
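The text-thickness tuning of FIG. 4E relies on morphological erode/dilate operations. A minimal one-dimensional dilation (an illustration only; real implementations operate on 2-D images, and the specification's "erode"/"dilate" naming depends on whether ink or background is treated as foreground) shows how growing the foreground thickens a stroke:

```python
import numpy as np

def dilate_1d(row, k=3):
    # 1-D morphological dilation: a pixel becomes foreground (1) if any
    # pixel in the k-wide window around it is foreground.  Applied to a
    # binary image where 1 encodes ink, this thickens strokes; erosion
    # (taking the window minimum instead of the maximum) thins them.
    pad = k // 2
    padded = np.pad(row, pad)
    return np.array([padded[i:i + k].max() for i in range(len(row))])

stroke = np.array([0, 0, 1, 0, 0])  # a one-pixel-wide stroke
print(dilate_1d(stroke).tolist())   # prints: [0, 1, 1, 1, 0]
```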
[0091] Some aspects of the present invention may be understood by reference to the following numbered clauses:
1. A portable computing device operatively coupled with a smart glass based wearable device, said portable computing device comprising: an inertial measurement unit (IMU) having a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.
2. The portable computing device of clause 1, wherein said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
3. The portable computing device of clause 1, wherein said device comprises a selection interface that allows the user of said wearable device to select a set of control parameters from said one or more control parameters that need to be modified for said image.
4. The portable computing device as described in clause 1, wherein said IMU further comprises an accelerometer to transmit magnitude of linear acceleration of the portable computing device relative to gravity.
5. The portable computing device as described in clause 4, wherein said IMU further comprises a magnetometer that determines and transmits instantaneous orientation of the portable computing device relative to Earth's magnetic field.
6. The portable computing device as described in clause 5, wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.
7. The portable computing device as described in clause 1, wherein said portable computing device is actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.
8. The portable computing device as described in clause 1, wherein the extent of hand rotation is proportional to the extent of changes in said one or more control parameters.
9. The portable computing device as described in clause 1, wherein the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the IMU, said one or more components comprising at least an accelerometer.
10. The portable computing device as described in clause 1, wherein when the control parameter modification operation is paused to generate a modified image, the user of said wearable device is enabled to view said modified image and/or pan around said modified image in the X and Y axes and/or scroll around said modified image.
11. The portable computing device as described in clause 1, wherein an absolute position of said portable computing device is configured to be indicative of level of control parameter.
12. A method of modifying, by a portable computing device, one or more control parameters on an image that is displayed in a smart glass based wearable device, said method comprising the steps of:
receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in gyroscope value indicative of extent of hand rotation of said portable computing device, said hand rotation being mapped to one or more control parameters;
determining, at the portable computing device, using an accelerometer configured in the portable computing device, orientation of the portable computing device; and
generating, from the portable computing device, an image modification signal to said glass based wearable device based on the change in gyroscope value and the determined orientation, wherein said image is modified with respect to said one or more control parameters based on said image modification signal.
13. The method of clause 12, wherein said method further comprises the steps of:
receiving new gyroscope values as part of the change in gyroscope value;
smoothing, using a sliding window filter, the received new gyroscope values;
normalizing said smoothed gyroscope values; and
accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.
14. The method of clause 12, wherein said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white: black ratio in said image.
[0092] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling, in which two elements that are coupled to each other contact each other, and indirect coupling, in which at least one additional element is located between the two elements. Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used euphemistically to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.
[0093] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.
[0094] While various embodiments of the present disclosure have been illustrated and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.

Claims

1. A smart glass system (10) arranged to permit a visually impaired user to modify one or more control parameters of an image provided by the system (10) according to their eye condition, comprising:
a portable computing device (100) comprising a motion sensor;
a smart glass based wearable device (160) comprising a display portion (162), the display portion being provided in a field of view of the user; and
an image capture device (163),
wherein the portable computing device (100) is operatively coupled with the smart glass based wearable device (160),
the system (10) being configured to display on the display portion (162) an image corresponding to at least a portion of an image captured by the image capture device (163), wherein the system (10) is configured to detect rotational movement of the portable computing device (100) in the hand of the user by means of the motion sensor (102), wherein one or more control parameters of the image that is displayed on the wearable device (160) are modified based on the rotational movement of the portable computing device (100).
2. The system of claim 1 wherein the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.
3. The system of claim 1 or claim 2, wherein said one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
4. The system of any preceding claim, wherein said device comprises a selection interface that allows the user of said wearable device to select a set of control parameters from said one or more control parameters that need to be modified for said image.
5. The system of any preceding claim, wherein the motion sensor comprises a gyroscope that is positioned in a manner such that upon actuation of said portable computing device, orientation of the portable computing device is determined, and upon hand rotation of said portable computing device, one or more control parameters of an image that is displayed on the wearable device are modified based on rotational velocity computed using said gyroscope.
6. The system of any preceding claim, wherein the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.
7. The system of claim 6 configured to transmit information indicative of the linear acceleration of the portable computing device relative to gravity to the wearable device.
8. The system of any preceding claim, wherein the motion sensor comprises a magnetometer that determines information indicative of instantaneous orientation of the portable computing device relative to Earth’s magnetic field.
9. The system of claim 8 configured to transmit the information indicative of the instantaneous orientation of the portable computing device relative to Earth’s magnetic field to the wearable device.
10. The system of claim 8 or 9 as depending through claim 6 as dependent on claim 5, wherein respective outputs from said gyroscope, said accelerometer, and said magnetometer are fused to yield the orientation and motion of the portable computing device in any direction.
11. The system of any preceding claim, wherein the portable computing device is actuated by pressing of a button present in/on said computing device, wherein the one or more control parameters of the images are modified only during the time the button is kept pressed.
12. The system of any preceding claim, wherein the extent of changes in said one or more control parameters is proportional to the extent of hand rotation.
13. The system of claim 6 or any one of claims 7 to 12 depending through claim 6, wherein the orientation of the portable computing device is determined based on fusion of positional data from one or more components of the motion sensor.
14. The system of any preceding claim, wherein when the control parameter modification operation is paused to generate a modified image, user of said wearable device is enabled to view said modified image and/or pan around said modified image in X and Y axis and/or scroll around said modified image.
15. The system of any preceding claim, wherein an absolute position of said portable computing device is configured to be indicative of level of control parameter.
16. The system of claim 15 as depending through claim 11, wherein when the button is pressed, the system determines the absolute position of said portable computing device and sets the level of the control parameter in dependence on the absolute position.
17. The system of any preceding claim wherein the motion sensor comprises an inertial measurement unit (IMU).
18. The system of any preceding claim wherein the motion sensor is an inertial measurement unit (IMU).
19. A system according to any preceding claim wherein the wearable device comprises the image capture device.
20. A system according to any one of claims 1 to 18 wherein the image capture device is provided external to the wearable device, the image capture device being operatively coupled to the wearable device.
21. A system according to claim 20 wherein the image capture device is operatively coupled to the wearable device by means of a wireless connection or a wired connection.
22. A method of controlling an image displayed on a display of a smart glass based wearable device of a smart glass system arranged to permit a visually impaired user to modify one or more control parameters of the image according to their eye condition, the display being provided in the field of view of a user, comprising:
detecting rotational movement of a portable computing device of the system in the hand of a user by means of a motion sensor comprised by the device, the portable computing device being operatively coupled to the wearable device;
capturing by means of an image capture device of the system an image of a scene;
displaying on the display portion an image corresponding to at least a portion of the image captured by the image capture device,
the method comprising modifying one or more control parameters of the image displayed on the display portion based on the rotational movement of the portable computing device.
23. The method of claim 22 comprising capturing by means of the image capture device a scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising displaying on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to the location of the display in the field of view of the person wearing the wearable device.
24. The method of claim 22 or 23, whereby the method further comprises the steps of:
receiving from the motion sensor new gyroscope values as part of a change in gyroscope value due to movement of the portable computing device;
smoothing, using a sliding window filter, the received new gyroscope values;
normalizing said smoothed gyroscope values; and
accumulating the normalized gyroscope values to indicate the extent of hand rotation of said portable computing device.
25. The method of any one of claims 22 to 24, whereby the one or more control parameters are selected from any or a combination of passthrough of said image, colour or grayscale display of said image, brightness of said image, edge detection/enhancement in said image, contrast of said image, colour enhancement of said image, line thickness in said image, enhancement of text that forms part of said image, lighting that forms part of said image, and white:black ratio in said image.
PCT/GB2019/051687 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters for a portable display WO2019239161A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980046956.6A CN112424729A (en) 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters of a portable display
GB2020074.7A GB2589255A (en) 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters for a portable display

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
GB1809905.1 2018-06-16
GBGB1809904.4A GB201809904D0 (en) 2018-06-16 2018-06-16 System and method for modifying image enhancement parameters for a portable display
GB1809904.4 2018-06-16
GBGB1809905.1A GB201809905D0 (en) 2018-06-16 2018-06-16 Hand held device for controlling digital magnification on a portable display
GBGB1819145.2A GB201819145D0 (en) 2018-11-24 2018-11-24 Hand held device for controlling digital magnification on a portable display
GB1819145.2 2018-11-24
GBGB1819144.5A GB201819144D0 (en) 2018-11-24 2018-11-24 System and method for modifying image enhancement parameters for a portable display
GB1819144.5 2018-11-24

Publications (1)

Publication Number Publication Date
WO2019239161A1 true WO2019239161A1 (en) 2019-12-19

Family

ID=66998444

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/GB2019/051687 WO2019239161A1 (en) 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters for a portable display
PCT/GB2019/051688 WO2019239162A1 (en) 2018-06-16 2019-06-17 Hand held device for controlling digital magnification on a portable display

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/GB2019/051688 WO2019239162A1 (en) 2018-06-16 2019-06-17 Hand held device for controlling digital magnification on a portable display

Country Status (3)

Country Link
CN (2) CN112313731A (en)
GB (2) GB2588055A (en)
WO (2) WO2019239161A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112733624B (en) * 2020-12-26 2023-02-03 电子科技大学 People stream density detection method, system storage medium and terminal for indoor dense scene

Citations (2)

Publication number Priority date Publication date Assignee Title
US20110221656A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Displayed content vision correction with electrically adjustable lens
US20170336882A1 (en) * 2016-05-17 2017-11-23 Google Inc. Virtual/augmented reality input device

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US20090262074A1 (en) * 2007-01-05 2009-10-22 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
JP2009140107A (en) * 2007-12-04 2009-06-25 Sony Corp Input device and control system
JP4725818B2 (en) * 2009-02-20 2011-07-13 ソニー株式会社 INPUT DEVICE AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM
US9564075B2 (en) * 2009-12-30 2017-02-07 Cyweemotion Hk Limited Electronic control apparatus and method for responsively controlling media content displayed on portable electronic device
GB201310364D0 (en) * 2013-06-11 2013-07-24 Sony Comp Entertainment Europe Head-mountable apparatus and systems
KR102083596B1 (en) * 2013-09-05 2020-03-02 엘지전자 주식회사 Display device and operation method thereof
US10459254B2 (en) * 2014-02-19 2019-10-29 Evergaze, Inc. Apparatus and method for improving, augmenting or enhancing vision
WO2016036412A1 (en) * 2014-09-02 2016-03-10 Apple Inc. Remote camera user interface
JP6761228B2 (en) * 2015-06-09 2020-09-23 キヤノン株式会社 Display devices and their control methods, programs, and storage media
US20170214856A1 (en) * 2016-01-22 2017-07-27 Mediatek Inc. Method for controlling motions and actions of an apparatus including an image capture device having a moving device connected thereto using a controlling device
WO2018065986A1 (en) * 2016-10-06 2018-04-12 Remoria Vr S.R.L. Orientation and motion tracking controller
US20180144554A1 (en) * 2016-11-18 2018-05-24 Eyedaptic, LLC Systems for augmented reality visual aids and tools
CN107764262B (en) * 2017-11-09 2019-10-25 深圳创维新世界科技有限公司 Virtual reality shows equipment, system and pose calibrating method
CN108108015A (en) * 2017-11-20 2018-06-01 电子科技大学 A kind of action gesture recognition methods based on mobile phone gyroscope and dynamic time warping


Non-Patent Citations (1)

Title
MICHAEL KALLONIATIS; CHARLES LUU, VISUAL ACUITY, Retrieved from the Internet <URL:https://webvision.med.utah.edu/book/part-viii-psychophysics-of-vision/visual-acuity>

Also Published As

Publication number Publication date
GB202020074D0 (en) 2021-02-03
CN112424729A (en) 2021-02-26
WO2019239162A1 (en) 2019-12-19
CN112313731A (en) 2021-02-02
GB2588055A (en) 2021-04-14
GB202020076D0 (en) 2021-02-03
GB2589255A (en) 2021-05-26

Similar Documents

Publication Publication Date Title
US11298288B2 (en) Providing enhanced images for navigation
EP3330771B1 (en) Display apparatus and method of displaying using focus and context displays
US9711114B1 (en) Display apparatus and method of displaying using projectors
US9311718B2 (en) Automated content scrolling
US20170208312A1 Apparatus and method for a dynamic "region of interest" in a display system
WO2016185563A1 (en) Head-mounted display, head-up display, and picture display method
KR20180096434A (en) Method for displaying virtual image, storage medium and electronic device therefor
JP2006267604A (en) Composite information display device
US20230336865A1 (en) Device, methods, and graphical user interfaces for capturing and displaying media
CN111886564A (en) Information processing apparatus, information processing method, and program
US11774764B2 (en) Digital glasses having display vision enhancement
JP6576639B2 (en) Electronic glasses and control method of electronic glasses
WO2019239161A1 (en) System and method for modifying image enhancement parameters for a portable display
KR20180045644A (en) Head mounted display apparatus and method for controlling thereof
EP3441847B1 (en) Controller for use in a display device
KR20080033681A (en) Method for the vision assistance in head mount display unit and head mount display unit therefor
JP2017091190A (en) Image processor, image processing method, and program
US20240233288A1 (en) Methods for controlling and interacting with a three-dimensional environment
US20240103685A1 (en) Methods for controlling and interacting with a three-dimensional environment
US20240104843A1 (en) Methods for depth conflict mitigation in a three-dimensional environment
US20240005630A1 (en) Real Time Visual Mitigation of a Live Camera Feed
US20240103678A1 (en) Devices, methods, and graphical user interfaces for interacting with extended reality experiences
EP4242736A1 (en) System comprising an optical device and a controller
CN113660477A (en) VR glasses and image presentation method thereof
CN117170602A (en) Electronic device for displaying virtual object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19732110

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 202020074

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20190617

122 Ep: pct application non-entry in european phase

Ref document number: 19732110

Country of ref document: EP

Kind code of ref document: A1