CN112424729A - System and method for modifying image enhancement parameters of a portable display - Google Patents


Info

Publication number
CN112424729A
Authority
CN
China
Prior art keywords
image
computing device
portable computing
wearable device
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980046956.6A
Other languages
Chinese (zh)
Inventor
S. L. F. Hicks
N. A. Russell
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Renming Hangzhou Intelligent Technology Co ltd
Original Assignee
Renming Hangzhou Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB1809905.1A
Priority claimed from GBGB1809904.4A
Priority claimed from GBGB1819145.2A
Priority claimed from GBGB1819144.5A
Application filed by Renming Hangzhou Intelligent Technology Co ltd
Publication of CN112424729A

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39Control of the bit-mapped memory
    • G09G5/391Resolution modifying circuits, e.g. variable screen formats
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0112Head-up displays characterised by optical features comprising device for genereting colour display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0606Manual adjustment
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/066Adjustment of display parameters for control of contrast
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0666Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0673Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/08Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A smart eyewear system (10) arranged to allow a sight-impaired user to modify one or more control parameters of an image provided by the system (10) in accordance with his eye condition, comprising: a portable computing device (100), comprising: a motion sensor; a smart-glasses-based wearable device (160) comprising a display portion (162) disposed within a field of view of a user; and an image capture device (163), wherein the portable computing device (100) is operatively coupled with the smart-glasses-based wearable device (160), the system (10) configured to display, on the display portion (162), an image corresponding to at least a portion of the image captured by the image capture device (163), wherein the system (10) is configured to detect a rotational motion of the portable computing device (100) in the hand of the user through the motion sensor (102), wherein one or more control parameters of the image displayed on the wearable device (160) are modified based on the rotational motion of the portable computing device (100).

Description

System and method for modifying image enhancement parameters of a portable display
Technical Field
The present disclosure relates to a system for modifying image enhancement parameters of a portable display, such as a portable display configured in smart glasses, and a method thereof.
Background
The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
There are many image enhancement techniques for improving the vision of a visually impaired person. Such techniques include, but are not limited to: video pass-through of color/RGB images; edge detection, with the detected edges rendered as white on a black background; overlaying white edges on a color or grayscale image; rendering a black-and-white high-contrast image using a single global threshold applied to the entire screen; rendering a black-and-white high-contrast image using multiple regional thresholds to compensate for lighting variation across the screen; and, for example, algorithms that detect large regions of similar hue (independent of brightness) and redraw those regions as bright, uniformly colored panels of the same hue to aid low-vision users.
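As an illustration of two of these techniques, the following sketch (not taken from the disclosure; the simple gradient operator and threshold values are assumptions) renders detected edges as white on a black background and applies a single global threshold to the whole frame:

```python
import numpy as np

def edge_map(gray, threshold=0.25):
    """Render detected edges as white on a black background.

    gray: 2-D float array in [0, 1]. The central-difference gradient and
    the threshold value are illustrative stand-ins; the disclosure does
    not specify a particular edge detector.
    """
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = (gray[:, 2:] - gray[:, :-2]) / 2.0  # horizontal gradient
    gy[1:-1, :] = (gray[2:, :] - gray[:-2, :]) / 2.0  # vertical gradient
    magnitude = np.hypot(gx, gy)
    return np.where(magnitude > threshold, 1.0, 0.0)

def global_threshold(gray, level=0.5):
    """Black-and-white high-contrast image with one global threshold."""
    return np.where(gray > level, 1.0, 0.0)

# A synthetic frame: dark left half, bright right half.
frame = np.zeros((8, 8))
frame[:, 4:] = 1.0
edges = edge_map(frame)       # white only along the vertical boundary
bw = global_threshold(frame)  # white wherever the frame is bright
```

The regional-threshold variant mentioned above would simply apply `global_threshold` per tile of the frame with a per-tile `level`.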
These image processing methods are good at improving the visibility of real-world objects, especially for people with impaired eyesight. The parameters of each method can be tuned to find the best settings for many visual scenarios. However, the visual world is highly dynamic, so preset parameters will not always be appropriate. For example, as a person moves between environments or turns around, ambient lighting varies by many orders of magnitude; the contrast of surface detail varies greatly between objects (so an edge detection algorithm with fixed detection parameters is not suitable in all cases); and the contrast spectrum of certain objects, such as faces and text, varies widely (so automatic thresholding algorithms do not optimally enhance the key visible features of every object).
There is therefore a need in the art for systems and methods for modifying image enhancement parameters for portable displays in real-world scenes.
All publications herein are incorporated by reference to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
Disclosure of Invention
The present disclosure relates to a system and method for modifying image enhancement parameters of a portable display, such as a portable display configured in smart glasses. Smart glasses refer to wearable computer glasses (or 'glasses') that provide visual information in a user's field of view, in addition to the scene the user is able to see substantially directly (with or without intermediate optical elements, such as lenses, which may be substantially or partially transparent). Visual information may be provided by superimposing it over the user's field of view, for example in a display element within the field of view (which display element may be substantially transparent, at least partially transparent, or opaque). The display may be a substantially opaque LED or LCD display of the type used in mobile phones, for example, or may be partially transparent. In some embodiments, the image may be projected from a light source onto the display by a projector device, in the manner of a head-up display (HUD) or augmented reality (AR) overlay, with the user seeing the reflected image.
A smart eyewear system arranged to allow a sight-impaired user to modify one or more control parameters of images provided by the system in accordance with their eye condition, comprising:
a portable computing device including a motion sensor;
a smart-glasses-based wearable device comprising a display portion disposed within a field of view of a user; and
an image capture device,
wherein the portable computing device is operatively coupled with a smart-glasses based wearable device,
the system is configured to display an image corresponding to at least a portion of an image captured by the image capturing device on the display portion,
wherein the system is configured to detect a rotational movement of the portable computing device in the user's hand by means of the motion sensor, wherein one or more control parameters of the image displayed on the wearable device are modified based on the rotational movement of the portable computing device.
This has the advantage that the user can adjust the images provided to them by the system to improve their ability to view the scene, striving to optimize the image for the scene as they view it through the smart-glasses-based wearable device. The system may be configured so that the user can adjust the parameters in substantially real time.
The image capture device may comprise a video image capture device. The image capturing device may comprise at least one CMOS image capturing device and/or at least one CCD image capturing device. Other image capture devices may be useful. The wearable device may generate a substantially real-time stream of images captured by the image capture device.
Optionally, the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system being configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to a position of the display in the field of view of the person wearing the wearable device.
It will be understood that reference to "the field of view of the person wearing the wearable device" means the field of view of that person while wearing the wearable device such that the display portion lies within it, optionally while looking substantially straight ahead, and optionally with the display portion in a prescribed direction within the field of view, said direction being any of: directly up (the 12 o'clock direction), directly down (the 6 o'clock direction), or any specified direction from 12 o'clock clockwise round to 12 o'clock.
Optionally, the one or more control parameters are selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width (line thickness) in the image, enhancement of text forming part of the image, lighting forming part of the image, and white:black ratio in the image.
Optionally, the device comprises a selection interface allowing a user of the wearable device to select a set of control parameters from the one or more control parameters that require modification for the image.
Optionally, the motion sensor comprises a gyroscope positioned such that, upon activation of the portable computing device, the orientation of the portable computing device is determined, and, when the portable computing device is rotated in the user's hand, one or more control parameters of the image displayed on the wearable device are modified based on the rotational speed calculated using the gyroscope.
Optionally, the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.
Optionally, the system is configured to transmit information indicative of a linear acceleration of the portable computing device relative to gravity to the wearable device.
Optionally, the motion sensor comprises a magnetometer that determines information indicative of the instantaneous orientation of the portable computing device relative to the earth's magnetic field.
Optionally, the system is configured to transmit information indicative of the instantaneous orientation of the portable computing device relative to the earth's magnetic field to the wearable device.
Optionally, the system is configured such that the respective outputs from the gyroscope, the accelerometer and the magnetometer are fused to determine the orientation and motion of the portable computing device in any direction.
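The disclosure does not fix a particular fusion algorithm. A minimal single-axis complementary filter, one common way of fusing gyroscope and accelerometer outputs, might look like the following sketch (function and parameter names are illustrative assumptions):

```python
import math

def complementary_filter(angle, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse one axis of gyroscope and accelerometer data into an angle.

    The integrated gyroscope rate is smooth but drifts over time; the
    accelerometer's tilt estimate (from gravity) is noisy but drift-free.
    alpha weights the two sources. All names and the 0.98 weight are
    illustrative; the disclosure only states that the outputs are fused.
    """
    gyro_angle = angle + gyro_rate * dt          # integrate angular rate
    accel_angle = math.atan2(accel_y, accel_z)   # tilt implied by gravity
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Device held still and level: gyro reads ~0, gravity along the z axis.
angle = 0.5  # an initially wrong estimate, in radians
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_y=0.0, accel_z=9.81, dt=0.01)
# angle decays toward the accelerometer's drift-free 0-radian estimate
```

A magnetometer can be folded in the same way to stabilize heading, which gravity alone cannot observe.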
Optionally, the portable computing device is activated by pressing a button present in/on the computing device, wherein one or more control parameters of the image are modified only during the time the button is kept pressed.
Optionally, the degree of variation of the one or more control parameters is proportional to the degree of hand rotation.
Optionally, the orientation of the portable computing device is determined based on a fusion of position data from one or more components of the motion sensor.
Optionally, when control parameter modification operations are paused to generate modified images, a user of the wearable device is allowed to view the modified images and/or translate the modified images in the X and Y axes and/or scroll the modified images.
Optionally, the absolute position of the portable computing device is configured to indicate a level of a control parameter.
Alternatively, when a button is pressed, the system determines the absolute position of the portable computing device and sets the level of the control parameter according to the absolute position.
An advantage of this feature is that the user can set the level of the control parameter substantially immediately by first placing their hand in an orientation corresponding to the desired value of the control parameter and then pressing the button.
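A sketch of this absolute-position mode, assuming the device's pitch angle at the moment of the button press is mapped linearly onto a 0..1 parameter level (the ±45° span and the function name are invented examples, not taken from the disclosure):

```python
def level_from_orientation(pitch_deg, lo=-45.0, hi=45.0):
    """Map an absolute device orientation to a parameter level in [0, 1].

    pitch_deg is the device's pitch when the button is pressed; the
    lo..hi span of -45..+45 degrees is an illustrative choice.
    """
    t = (pitch_deg - lo) / (hi - lo)
    return min(1.0, max(0.0, t))   # clamp orientations outside the span

# Hand level -> mid-level; tilted past the span -> clamped to maximum.
assert level_from_orientation(0.0) == 0.5
assert level_from_orientation(60.0) == 1.0
```

This contrasts with the relative mode above, where rotation changes the parameter incrementally rather than setting it directly.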
Optionally, the motion sensor comprises an Inertial Measurement Unit (IMU).
Optionally, the motion sensor is an Inertial Measurement Unit (IMU).
Optionally, the wearable device comprises an image capture device.
The image capture device may be an integral part of the wearable device.
Optionally, the image capturing device is provided external to the wearable device, the image capturing device being operatively coupled to the wearable device.
Optionally, the image capture device is operatively coupled to the wearable device by a wireless connection or a wired connection.
In one aspect of the invention, there is provided a method of controlling an image displayed on a display of a smart-glasses-based wearable device of a smart-glasses system, the smart-glasses system being arranged to allow a sight-impaired user to modify one or more control parameters of the image in accordance with their eye condition, the display being disposed in a field of view of the user, the method comprising:
detecting, by a motion sensor included with the device, rotational motion of a portable computing device of the system in a hand of a user, the portable computing device operatively coupled to the wearable device;
capturing an image of a scene by means of an image capturing device of the system;
displaying, on the display portion, an image corresponding to at least a portion of the image captured by the image capture device,
the method includes modifying one or more control parameters of an image displayed on the display portion based on rotational motion of the portable computing device.
Optionally, the method comprises: capturing a scene with the aid of the image capture device, the scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising: displaying, on a display portion of the wearable device, at least a portion of a scene captured by the image capture device corresponding to a position of the display in a field of view of a person wearing the wearable device.
Thus, the person wearing the wearable device will see the image captured by the image capture device within their field of view.
Optionally, the image displayed by the display portion occupies a portion, but not all, of the user's field of view, wherein the displayed image is substantially continuous with the remainder of the user's field of view such that the displayed image appears to be superimposed on the scene. It should be appreciated that the display portion may be at least partially transparent, thereby allowing a user to see through the display portion objects in the portion of the field of view occupied by the display portion and information displayed on the display portion by the system.
Optionally, the method further comprises the steps of:
receiving a new gyroscope value from the motion sensor as part of a change in gyroscope value due to motion of the portable computing device;
smoothing the received new gyroscope value by using a sliding window filter;
normalizing the smoothed gyroscope values; and
accumulating the normalized gyroscope values to indicate a degree of hand rotation of the portable computing device.
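The four steps above might be sketched as follows; the window length, full-scale rate, and class name are illustrative assumptions not fixed by the disclosure:

```python
from collections import deque

class RotationAccumulator:
    """Smooth, normalize and accumulate gyroscope readings (illustrative).

    window: length of the sliding-window (moving-average) filter.
    full_scale: rotation rate that normalizes to 1.0.
    Both values are assumptions; the disclosure does not specify them.
    """
    def __init__(self, window=5, full_scale=5.0):
        self.samples = deque(maxlen=window)
        self.full_scale = full_scale
        self.total = 0.0   # accumulated, normalized rotation

    def update(self, gyro_rate):
        self.samples.append(gyro_rate)                    # new gyroscope value
        smoothed = sum(self.samples) / len(self.samples)  # sliding-window filter
        normalized = smoothed / self.full_scale           # normalize
        self.total += normalized                          # accumulate
        return self.total

acc = RotationAccumulator()
for rate in [0.0, 5.0, 5.0, 5.0, 0.0]:
    degree = acc.update(rate)  # grows with sustained hand rotation
```

The accumulated total, rather than any single reading, indicates the degree of hand rotation, so brief sensor noise is damped while deliberate sustained rotation produces a proportional change.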
Optionally, the one or more control parameters are selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width in the image, enhancement of text forming part of the image, lighting forming part of the image, and white:black ratio in the image.
In an aspect of the invention, there is provided a portable computing device for use with a smart eyewear system arranged to allow a sight-impaired user to modify one or more control parameters of an image provided by the system in accordance with their eye condition, the portable computing device comprising a motion sensor; the portable computing device is arranged to be operatively coupled with a smart-glasses-based wearable device, the smart-glasses-based wearable device including a display portion disposed within a field of view of a user, the smart-glasses system further comprising: an image capture device, the system configured to display an image corresponding to at least a portion of the image captured by the image capture device on a display portion of the wearable device, wherein the system is configured to detect a rotational motion of the portable computing device in a hand of a user by a motion sensor, wherein one or more control parameters of the image displayed on the wearable device are modified based on the rotational motion of the portable computing device.
In an aspect of the invention, there is provided a smart-glasses-based wearable device arranged to be operatively coupled with the portable computing device of the preceding aspect, the system being arranged to allow a vision-impaired user to modify one or more control parameters of images provided by the system in accordance with their eye condition.
In one aspect, the present disclosure is directed to a portable computing device operably coupled with a smart-glasses-based wearable device, wherein the portable computing device may include an Inertial Measurement Unit (IMU) having a gyroscope positioned such that, when the portable computing device is started, the orientation of the portable computing device may be determined, and, when the portable computing device is rotated in the user's hand, one or more control parameters of an image displayed on the wearable device may be modified based on the rotational speed calculated using the gyroscope.
In one aspect, the one or more control parameters may be selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width in the image, enhancement of text forming part of the image, lighting forming part of the image, and white:black ratio in the image.
In an aspect, the apparatus may include a selection interface that allows a user of the wearable apparatus to select a set of control parameters from the one or more control parameters that need to be modified for the image.
In one aspect, the IMU may further include an accelerometer to transmit a magnitude of linear acceleration of the portable computing device relative to gravity. In another aspect, the IMU may also include a magnetometer that determines and transmits the instantaneous orientation of the portable computing device relative to the earth's magnetic field. In yet another aspect, the various outputs from the gyroscope, accelerometer, and magnetometer can be fused to produce orientation and motion of the portable computing device in any direction.
In one aspect, a portable computing device may be activated by pressing a button present in/on the computing device, wherein one or more control parameters of an image are modified only during the time that the button is held down.
In another aspect, the degree of hand rotation is proportional to the degree of change in the one or more control parameters.
In another aspect, the orientation of the portable computing device may be determined based on a fusion of position data from one or more components of the IMU, the one or more components including at least an accelerometer.
In another aspect, when a control parameter modification operation is paused to generate a modified image, a user of the wearable device is allowed to view the modified image and/or translate the modified image in the X and Y axes and/or scroll the modified image.
In an aspect, the absolute position of the portable computing device is configured to indicate a level of a control parameter.
In another aspect, the present disclosure is directed to a method of modifying, by a portable computing device, one or more control parameters on an image displayed in a smart-glasses-based wearable device, the method comprising the steps of: receiving, at a portable computing device, a change in a gyroscope value indicative of a degree of hand rotation of the portable computing device from a gyroscope sensor configured in the portable computing device, the hand rotation being mapped to one or more control parameters; determining, at the portable computing device, an orientation of the portable computing device using an accelerometer configured in the portable computing device; and generating an image modification signal from the portable computing device to the eyewear-based wearable device as a function of the change in the gyroscope value and the determined orientation, wherein the image is modified with respect to the one or more control parameters based on the image modification signal.
In one aspect, the method may further comprise the steps of: receiving a new gyroscope value as part of the change in gyroscope value; smoothing the received new gyroscope value by using a sliding window filter; normalizing the smoothed gyroscope values; accumulating the normalized gyroscope values to indicate a degree of hand rotation of the portable computing device.
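The smoothing, normalization, and accumulation steps above can be sketched as follows (a minimal illustration; the window size and normalization constant are assumptions, not values specified by the disclosure):

```python
from collections import deque

class GyroAccumulator:
    """Smooths incoming gyroscope readings with a sliding-window
    (moving-average) filter, normalizes them, and accumulates the
    result as an indication of the degree of hand rotation."""

    def __init__(self, window_size=5, effect_range=200.0):
        # window_size and effect_range are illustrative assumptions
        self.window = deque(maxlen=window_size)
        self.effect_range = effect_range
        self.accumulated = 0.0

    def update(self, gyro_value):
        self.window.append(gyro_value)
        smoothed = sum(self.window) / len(self.window)   # sliding-window filter
        normalized = smoothed / self.effect_range        # normalize to effect range
        self.accumulated += normalized                   # degree of hand rotation
        return self.accumulated

acc = GyroAccumulator()
for reading in [10.0, 12.0, 11.0, 9.0]:
    level = acc.update(reading)
```

The accumulated value, rather than any single raw reading, is what would be mapped onto a control parameter, so momentary sensor noise is damped twice: once by the window average and once by the integration.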
In one aspect of the invention, a portable computing device operatively coupled with a smart-glasses-based wearable device is provided. In one aspect, the portable computing device may include an Inertial Measurement Unit (IMU) having a gyroscope positioned in such a way that, upon activation of the portable computing device, an orientation of the portable computing device may be determined, and, upon hand rotation of the portable computing device, one or more control parameters of an image displayed on the wearable device may be modified based on the rotational speed calculated using the gyroscope.
Drawings
Fig. 1 and 2 show exemplary representations of the proposed apparatus according to embodiments of the present disclosure.
Fig. 3A-3E show exemplary representations showing how the proposed apparatus can be used to control at least one parameter of an imaging/image enhancement technique.
Fig. 4A-4E illustrate exemplary flow diagrams enabling adjustment of different properties of a captured video/image according to embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose processor or special-purpose processor programmed with the instructions to perform the steps. Alternatively, the steps may be performed by a combination of hardware, software, firmware, and/or by an operator.
Embodiments of the present disclosure may be provided as a computer program product that may include a machine-readable storage medium tangibly embodying instructions thereon, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, a fixed (hard) drive, magnetic tape, floppy disk, optical disk, compact disc read only memory (CD-ROM), and magneto-optical disk, semiconductor memory such as ROM, PROM, Random Access Memory (RAM), Programmable Read Only Memory (PROM), erasable PROM (eprom), electrically erasable PROM (eeprom), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer program code, such as software or firmware).
The various methods described herein may be implemented by combining one or more machine-readable storage media containing code in accordance with the present disclosure with appropriate standard computer hardware to execute the code contained therein. Apparatus for practicing various embodiments of the disclosure may include one or more computers (or one or more processors within a single computer) and storage systems containing, or having network access to, computer programs encoded according to the various methods described herein, and the method steps of the disclosure may be implemented by modules, routines, subroutines, or sub-parts of a computer program product.
If the specification states a component or feature "may", "can", "could", or "might" be included or have a particular characteristic, that particular component or feature need not be included or have that characteristic.
Arrangements and embodiments may now be described more fully with reference to the accompanying drawings, in which exemplary embodiments are shown. Embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, embodiments may be provided so that this disclosure will be thorough and complete, and will fully convey the concept to those skilled in the art.
The suffixes "module", "unit" and "part" may be used for elements to facilitate disclosure. The suffix itself may not impart any significant meaning or effect, and it is to be understood that "module", "unit" and "part" may be used together or interchangeably.
The present disclosure relates to a system for modifying image enhancement parameters of a portable display, such as configured in smart glasses, and a method thereof.
The present disclosure relates to a real-time image processing system designed to improve the vision of severely vision impaired people. The proposed system may include a video input mechanism, which may be a wired or wireless camera, an external streaming video source, or a video file stored on the device; the video input may be presented to the user through a head mounted screen, such as an augmented or virtual reality transparent display, e.g., the display of smart glasses. In the embodiment 10 of fig. 1(a), the smart-glasses based wearable device 160 has a display screen 162 and an image capture device in the form of a camera 163. Fig. 1(b) shows a corresponding embodiment 10A, in which the image capture device is not provided integrally with the wearable device 160. Instead, it may be coupled to the device 160 via a wireless connection. In some embodiments, the image capture device may additionally or alternatively be coupled by a wired connection. In the embodiment of fig. 1, the display screen 162 is a transparent waveguide with diffractive optics arranged to direct images from an Organic Light Emitting Diode (OLED) microdisplay into the eye of a user. Other arrangements may be useful, such as a transparent waveguide with a beam splitter rather than diffractive optics. Other displays may be useful, such as Liquid Crystal On Silicon (LCOS) displays or Liquid Crystal Displays (LCDs). In some embodiments, an opaque display, including a high resolution OLED panel and one or more optical elements (e.g., lenticular or Fresnel lens devices), may be used to direct the image into the user's eye. In the embodiment of fig. 1(a), camera 163 is a CMOS (complementary metal oxide semiconductor) camera, but other cameras may be useful in some embodiments.
In one aspect, the present invention relates to a physical device that allows a user to modify the main control parameter of each of the following existing methods/techniques: A. video pass-through of color/RGB images; B. edge detection, rendering the detected edges as white on a black background; C. applying white edges over a color or grayscale image; D. rendering black-and-white high-contrast images with a global threshold applied across the entire screen; E. rendering black-and-white high-contrast images with multiple regional thresholds to compensate for illumination variations across the screen; F. an algorithm that, for example, detects large regions of similar hue (independent of brightness) and then redraws these regions as high-brightness color panels of the same color to aid low vision.
In one aspect, the proposed apparatus may be configured to receive one parameter from each of the above-mentioned techniques (a-F) and make the respective received parameter adjustable. In an aspect, the proposed device may be configured as a portable gesture device and may enable the provision of intuitive controls that mimic other well-known control mechanisms, such as volume knobs on audio devices. Due to the relationship between the motion of the device and the rate of change of the parameters, the proposed device can quickly and intuitively change the parameters for one or more existing image enhancement techniques over a wide range.
In one aspect, the proposed device may be activated by a button press, wherein on the first press the orientation of the device may be calculated. In an exemplary embodiment, the orientation of the device may be determined based on a fusion of position data from components of an Inertial Measurement Unit (IMU) including an accelerometer configured to determine and transmit real-time values of the device position relative to gravity. The accelerometer may also be configured to transmit the magnitude of any linear acceleration in three dimensions. The IMU of the present disclosure may further include a gyroscope that indicates instantaneous rotational speed in three dimensions. An optional magnetometer may also be configured in the IMU to give the instantaneous orientation of the proposed apparatus/handheld device with respect to the earth's magnetic field (i.e., a compass). These three data sources may be combined or "fused" to give the orientation and motion of the apparatus/handheld device in any direction. The data fusion can be derived by many well-known algorithms, such as Kalman filters.
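As a hedged illustration of fusing gyroscope and accelerometer data into an orientation estimate: the disclosure mentions Kalman filters, but the simpler complementary filter below stands in for the same idea, for a single (roll) axis. The blend factor and axis convention are assumptions for illustration:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One fusion step for a single (roll) axis.

    angle     : previous orientation estimate (degrees)
    gyro_rate : gyroscope rotational speed (degrees/second)
    accel_x/z : accelerometer components used to infer tilt from gravity
    alpha     : blend factor (illustrative value)
    """
    # Accelerometer gives an absolute (but noisy) tilt relative to gravity.
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    # Gyroscope gives a smooth (but drifting) integrated rotation.
    gyro_angle = angle + gyro_rate * dt
    # Fuse: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

# Device at rest, flat: both sensors agree the angle is zero.
a = complementary_filter(0.0, 0.0, accel_x=0.0, accel_z=1.0, dt=0.01)
```

A Kalman filter would additionally model sensor noise covariances, but the structure — absolute-but-noisy reference correcting an integrated-but-drifting rate — is the same.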
In one aspect, the initial orientation of the handheld device is set to zero upon pressing a button on the proposed apparatus. Any rotation around a defined axis of the handheld device may be interpreted as an increase or decrease of the main control parameter. In an exemplary embodiment, the axis of rotation may be defined along the length of the device, which is the same axis as the wrist. For example, rotating clockwise may increase the white:black ratio on a high contrast display, while rotating counterclockwise may reduce the white:black ratio. Alternatively, rotating counterclockwise may increase the white:black ratio while rotating clockwise may reduce it.
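The mapping from rotation about the defined axis to the white:black ratio might be sketched as follows (the gain and clamping range are illustrative assumptions):

```python
def adjust_white_black_ratio(ratio, rotation_deg, gain=0.005):
    """Clockwise rotation (positive degrees) raises the white:black
    ratio; counterclockwise (negative degrees) lowers it. The result
    is clamped to [0.0, 1.0], where 1.0 means an all-white display."""
    return min(1.0, max(0.0, ratio + gain * rotation_deg))

r = adjust_white_black_ratio(0.5, 40.0)    # clockwise rotation raises the ratio
r = adjust_white_black_ratio(r, -100.0)    # counterclockwise rotation lowers it
```

Swapping the sign of `gain` gives the alternative convention mentioned above, where counterclockwise rotation increases the ratio instead.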
In one aspect, the user may be given the ability to view parameter changes in real time as they rotate the proposed apparatus/handheld device, which may create an intuitive feedback system allowing the user to make very precise modifications to the image. During the "adjustment" phase, the video may continue to be transmitted to the display in real time.
In an exemplary embodiment, upon release of the button, the modified imaging parameters may be set. Thus, the modified imaging parameters can be set to a new state of the system, a state in which the system continues to operate until the state changes. In some embodiments, upon release of the button, the contrast state may revert to the default state. In some embodiments, the contrast state may revert to the default state after a predetermined period of time has elapsed. Other arrangements may be useful. In some embodiments, the user may select the behavior of the system when the button is released, such as whether to maintain a changed parameter, such as an instant black to white contrast setting, or whether the system reverts to a default value for that parameter.
As mentioned above, each imaging/image enhancement technique may have at least one parameter, which may be modified by the proposed adjustment means. For example, for the video pass-through (color or grayscale display) technique, the parameter may include an adjustment to increase or decrease the overall image brightness. Similarly, for the black-on-white edge technique, the parameter may include an adjustment that modifies the threshold for edge detection: decreasing the threshold increases the number of edges displayed, while increasing the threshold reduces the number of edges displayed. For the video pass-through plus white edge technique, the same parameter as the black-on-white edge technique can be used, except that the edges are displayed on real-time video (color or grayscale). For the high contrast global threshold technique, the parameter may include an adjustment that shifts the black:white threshold toward white or toward black, increasing either the amount of white or the amount of black on the screen. For the high contrast, multi-region threshold technique, the parameters may include modification of the erosion and dilation parameters, increasing the line width (a process called "erosion") or decreasing the line width (a process called "dilation"). Finally, for the color detection and saturation image enhancement technique, the parameter may include an adjustment that rotates the detection window through the color spectrum, allowing saturation of a particular color.
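As an illustration of the high contrast global threshold technique (technique D) and its single adjustable parameter, a pure-Python sketch (the pixel values and threshold scale are assumptions for illustration):

```python
def global_threshold(gray_pixels, threshold):
    """Render a black-and-white high-contrast image: grayscale pixels
    at or above the threshold become white (255), the rest black (0).
    Lowering the threshold shifts the balance toward white; raising
    it shifts the balance toward black."""
    return [[255 if p >= threshold else 0 for p in row]
            for row in gray_pixels]

image = [[10, 120, 200],
         [90, 130, 250]]
high_contrast = global_threshold(image, 128)
# Rotating the handheld device would raise or lower `threshold`,
# shifting the black:white balance of the displayed image.
```

The multi-region variant (technique E) would apply the same operation per region with independently computed thresholds, compensating for uneven illumination across the scene.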
It is understood that contrast is an important parameter for assessing vision clinically. Clinical visual acuity measurements typically use high contrast images, such as black letters on a white background. In fact, the contrast between the object and its surroundings varies. The relationship between visual acuity and contrast allows a more detailed understanding of visual perception.
The resolving power of the eye can be measured by means of a sinusoidal grating pattern with adjustable spacing (spatial periodicity). The contrast of a grating is the differential intensity threshold of the grating, which is defined as the ratio:
C=(Lmax-Lmin)/(Lmax+Lmin)
where L is the luminance of the grating pattern as a function of spatial distance in the direction perpendicular to the orientation of the parallel elements of the grating. C may be referred to as the modulation, Rayleigh, or Michelson contrast. The value of C may be between 0.0 and 1.0. Further details can be found in "Visual Acuity" by Michael Kallonitis and Charles Luu, available at https://webvision.med.utah.edu/book/part-viii-psychophysics-of-vision/visual-acuity/, part of which is discussed below.
As the spatial frequency of a set of black/white lines increases, i.e., the line width decreases, they become more difficult to resolve and begin to look like a uniform gray area. The sensitivity of the human eye to contrast can be measured by determining the minimum grating separation each eye can resolve as a function of image contrast. This can be done, for example, by reducing the contrast for a given spatial frequency until a person no longer detects the grating — this value is the "contrast threshold" for that grating size (spatial frequency). The inverse of this contrast threshold is called the "contrast sensitivity". The contrast threshold can be expressed as sensitivity on the decibel (dB) scale: the contrast sensitivity in dB is-20 log10C, where C is the threshold for modulating contrast (as described above). The plot of (contrast) sensitivity versus spatial frequency is referred to as the spatial contrast sensitivity function (referred to as SCSF or simply CSF).
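The Michelson contrast and decibel-scale sensitivity defined above can be computed directly:

```python
import math

def michelson_contrast(l_max, l_min):
    """C = (Lmax - Lmin) / (Lmax + Lmin); the value lies in [0.0, 1.0]."""
    return (l_max - l_min) / (l_max + l_min)

def contrast_sensitivity_db(c_threshold):
    """Sensitivity in dB = -20 * log10(C), where C is the threshold
    modulation contrast at which the grating is just detectable."""
    return -20.0 * math.log10(c_threshold)

c = michelson_contrast(100.0, 50.0)    # luminances in arbitrary units
db = contrast_sensitivity_db(0.01)     # a 1% contrast threshold
```

Note the inverse relationship: the smaller the contrast threshold a person can still detect, the larger their contrast sensitivity in dB.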
Fig. 1(b) schematically illustrates the manner in which the Contrast Sensitivity Function (CSF) of an individual may be influenced by the individual's medical condition. The figure shows log(contrast sensitivity) as a function of log(spatial frequency) (c/deg). Line N represents the expected CSF of a healthy individual. Line A represents the expected CSF of individuals with loss of contrast in the mid-to-low region of log spatial frequency, a characteristic of individuals with multiple sclerosis; line B represents the CSF of individuals with an overall reduction of CSF over the entire spatial frequency range, characteristic of cataract patients; and line C represents the CSF of individuals with mild refractive error or mild amblyopia (line B being characteristic of individuals with more severe cases).
Further information can also be found at the following web site: https://www.semanticscholar.org/paper/composing-the-Shape-of-Contrast-Sensitivity-for-and-Chung-Legge/92c9647ee47507ce50e2792eb9504106734d37ea.
In one aspect, the proposed apparatus may be coupled to any portable display, such as smart glasses, that is operatively coupled to a camera or receives any of a range of different video sources. Other exemplary video sources to which the proposed apparatus may be applied may include, but are not limited to, head mounted cameras, external wireless cameras, video streams from broadcast sources (e.g., television), closed-circuit video (e.g., of a theater, concert, or live sporting event), and video sources on the apparatus (e.g., movie files, internet streaming video, etc.). In each of these cases, the proposed apparatus may apply any of the image enhancement algorithms listed previously, and each of these enhancements may be modified in real time by the "adjustment" apparatus outlined in the present disclosure.
Fig. 1(a) and 2 show exemplary representations of the proposed system according to embodiments of the present disclosure, in which a portable handheld computing device 100 with motion sensors 102 may be physically connected to a smart-glasses-based wearable device 160, wirelessly coupled via Bluetooth, mounted on the frame of the smart glasses/wearable device, or arranged in any other configuration, all of which are within the scope of the present disclosure.
As described above, the present disclosure provides an electronic system/product 100 that may include an Inertial Measurement Unit (IMU) 102 having a gyroscope 104 and an accelerometer 106 (and which may also, alternatively or additionally, include a magnetometer 108), wherein during implementation/operation, a user may press and hold a button 150 on the proposed device 100 and then rotate his/her hand, much as one would turn the volume knob of an audio device.
In one aspect, the initial orientation of the device 100 is set to zero upon pressing the button 150 on the proposed device 100. Any rotation about the defined axis of the apparatus/handheld device 100 may be interpreted as an increase or decrease of the primary control parameter. In an exemplary embodiment, the axis of rotation (see fig. 2) may be defined along the length of the device, which is the same axis as the wrist. For example, rotating clockwise may increase the white:black ratio on a high contrast display (e.g., the transparent display 162 of the smart glasses 160), while rotating counterclockwise may reduce the white:black ratio on the high contrast display 162.
In one aspect, the user may be given the ability to view parameter changes in real time as the user rotates the proposed apparatus/handheld device, which may create an intuitive feedback system, allowing the user to modify the image very accurately. During the "adjust" phase, video may continue to be delivered to display 162 in real-time. In an exemplary embodiment, upon release of the button, the modified imaging parameters may be set.
Fig. 3A-3E show exemplary representations showing how the proposed apparatus can be used to control at least one parameter of an imaging/image enhancement technique. For example, for the video pass-through (color or grayscale display) technique, as shown in FIG. 3A, the parameter may include an adjustment to increase or decrease the overall image brightness. Similarly, for the black-on-white edge technique, as shown in FIG. 3B, the parameter may include an adjustment that modifies the threshold for edge detection: decreasing the threshold increases the number of edges displayed, while increasing the threshold reduces the number of edges displayed. For the video pass-through plus white edge technique, as shown in FIG. 3C, the same parameter as for black-on-white edges can be used, except that the edges are displayed on real-time video (color or grayscale). For the high contrast global threshold technique, as shown in fig. 3D, the parameter may include an adjustment that shifts the black:white threshold toward white or toward black, increasing either the amount of white or the amount of black on the screen. For the high contrast, multi-region threshold technique, as shown in fig. 3E, the parameters may include modification of the erosion and dilation parameters, increasing the line width (a process called "erosion") or decreasing the line width (a process called "dilation"). Finally, for the color detection and saturation image enhancement technique, the parameter may include an adjustment that rotates the detection window through the color spectrum, allowing saturation of a particular color.
As will be appreciated, with the present invention the user simply rotates the handheld device, making image enhancement easy to perform and intuitive to describe. Further, the ratio of rotation to parameter control may be varied such that, for highly dynamic environments, a smaller rotation produces a larger change, or, for fine tuning of image parameters, a larger rotation produces a smaller change.
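The variable rotation-to-parameter ratio described above might be sketched as follows (the mode names and gain values are illustrative assumptions, not specified by the disclosure):

```python
GAINS = {
    "dynamic": 2.0,   # small rotation -> large change, for fast-moving scenes
    "fine":    0.1,   # large rotation -> small change, for precise tuning
}

def parameter_delta(rotation_deg, mode="dynamic"):
    """Scale a hand rotation (in degrees) into a control-parameter change,
    using a gain selected for the current adjustment mode."""
    return GAINS[mode] * rotation_deg

d_coarse = parameter_delta(5.0, "dynamic")  # coarse: big effect per degree
d_fine = parameter_delta(5.0, "fine")       # fine: small effect per degree
```

In the arrangement of the following paragraph, the pitch axis of the device would select or scale the gain while the roll axis supplies `rotation_deg`.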
In an aspect, the proposed apparatus may also be configured to modify different parameters of the image for other rotational axes of the apparatus. For example, the roll axis of the proposed device may change the main control parameter, while the pitch axis may change the ratio of rotation to the primary control parameter, which may allow a person to first make large changes to the overall image and then increase sensitivity to fine-tune to the environment and the user's vision level.
Fig. 4A-4E illustrate exemplary flow diagrams enabling adjustment of different properties of a captured video/image according to embodiments of the present disclosure.
Referring to fig. 4A, which illustrates a brightness-based adjustment operation 400, at step 402 a smart glasses/wearable device operatively coupled with the proposed adjustment-enabled computing device may listen to sensors configured in the adjustment-enabled computing device, whereupon the adjustment-enabled computing device receives changes in the y-axis gyroscope values at step 404 and smoothes its output using a sliding window filter at step 406. It is to be understood that a portion of these steps may also be performed in the smart glasses/wearable device, or in any desired combination of the smart glasses/wearable device and the adjustment-enabled portable computing device, and all such possible combinations are therefore within the scope of the present invention. At step 408, the output may be normalized, for example, by dividing by a value proportional to the range of the effect, followed by accumulating the gyroscope values at step 410. The adjustment-enabled portable computing device may then be rotated in one defined direction (clockwise or counterclockwise) to decrease the brightness variable at step 412, or in the opposite direction to increase the brightness variable at step 414. At step 416, a key-up (button release) instruction is received by the adjustment-enabled computing device, based on which, at step 418, the current effect value may be set to a default value.
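The flow of Fig. 4A could be sketched as an event loop (the sensor event interface, default brightness, and scaling constants are assumptions, not part of the disclosure):

```python
from collections import deque

DEFAULT_BRIGHTNESS = 0.5  # assumed default effect value

def brightness_adjustment(events, effect_range=200.0, window_size=5):
    """Follows Fig. 4A: smooth incoming y-axis gyroscope values with a
    sliding-window filter, normalize by a value proportional to the
    effect range, accumulate, and apply the result to a brightness
    variable. A 'key_up' event (button release) resets the effect to
    its default value."""
    brightness = DEFAULT_BRIGHTNESS
    window = deque(maxlen=window_size)
    for kind, value in events:
        if kind == "gyro_y":
            window.append(value)
            smoothed = sum(window) / len(window)    # sliding-window filter
            brightness += smoothed / effect_range   # positive raises, negative lowers
            brightness = min(1.0, max(0.0, brightness))
        elif kind == "key_up":
            brightness = DEFAULT_BRIGHTNESS         # revert to default value
    return brightness
```

The edge, contrast, color, and text flows of Figs. 4B-4E share this skeleton; only the variable driven by the accumulated rotation and the direction semantics differ.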
Referring to fig. 4B, which illustrates an edge-enhancement-based adjustment operation 420, at step 422 a smart glasses/wearable device operatively coupled with the proposed adjustment-enabled computing device may listen to sensors configured in the adjustment-enabled computing device, whereupon the adjustment-enabled computing device receives changes in the y-axis gyroscope values at step 424 and smoothes its output with a sliding window filter at step 426. It is to be understood that a portion of these steps may also be performed in the smart glasses/wearable device, or in any desired combination of the smart glasses/wearable device and the adjustment-enabled portable computing device, and all such possible combinations are therefore within the scope of the present invention. At step 428, the output may be normalized, for example, by dividing by a value proportional to the range of the effect, followed by accumulating the gyroscope values at step 430. The adjustment-enabled portable computing device may then be rotated in one defined direction (clockwise or counterclockwise) to increase the threshold for line detection at step 432, or in the opposite direction to decrease the threshold for line detection at step 434. At step 436, a key-up (button release) instruction is received by the adjustment-enabled computing device, based on which, at step 438, the current effect value may be set to a default value.
Referring to fig. 4C, which illustrates a contrast-based adjustment operation 440, at step 442 a smart glasses/wearable device operatively coupled with the proposed adjustment-enabled computing device may listen for sensors configured in the adjustment-enabled computing device, whereupon the adjustment-enabled computing device receives changes in the y-axis gyroscope values at step 444 and smoothes its output with a sliding window filter at step 446. It is to be understood that a portion of these steps may also be performed in the smart glasses/wearable device, or in any desired combination of the smart glasses/wearable device and the adjustment-enabled portable computing device, and all such possible combinations are therefore within the scope of the present invention. At step 448, the output may be normalized, for example, by dividing by a value proportional to the range of the effect, followed by accumulating the gyroscope values at step 450. The adjustment-enabled portable computing device may then be rotated in one defined direction (clockwise or counterclockwise) to increase the threshold for white (increasing the percentage of black) at step 452, or in the opposite direction to decrease the threshold for white (increasing the percentage of white) at step 454. At step 456, a key-up (button release) instruction is received by the adjustment-enabled computing device, based on which, at step 458, the current effect value may be set to a default value.
Referring to fig. 4D, which illustrates a color-based adjustment operation 460, at step 462 a smart glasses/wearable device operatively coupled with the proposed adjustment-enabled computing device may listen for sensors configured in the adjustment-enabled computing device, whereupon the adjustment-enabled computing device receives changes in the y-axis gyroscope values at step 464 and smoothes its output with a sliding window filter at step 466. It is to be understood that a portion of these steps may also be performed in the smart glasses/wearable device, or in any desired combination of the smart glasses/wearable device and the adjustment-enabled portable computing device, and all such possible combinations are therefore within the scope of the present invention. At step 468, the output may be normalized, for example, by dividing by a value proportional to the range of the effect, followed by accumulating the gyroscope values at step 470. The adjustment-enabled portable computing device may then be rotated in one defined direction (clockwise or counterclockwise) to increase the display of colors in the blue-green range at step 472 (as an example), or in the opposite direction to display colors in the yellow-red range at step 474 (for example). At step 476, a key-up (button release) instruction is received by the adjustment-enabled computing device, based on which, at step 478, the current effect value may be set to a default value.
Referring to fig. 4E, which illustrates an enhanced-text-based adjustment operation 480, at step 481 a smart glasses/wearable device operatively coupled with the proposed adjustment-enabled computing device may listen to sensors configured in the adjustment-enabled computing device, whereupon the adjustment-enabled computing device receives changes in the y-axis gyroscope values at step 482 and smoothes its output with a sliding window filter at step 483. It is to be understood that a portion of these steps may also be performed in the smart glasses/wearable device, or in any desired combination of the smart glasses/wearable device and the adjustment-enabled portable computing device, and all such possible combinations are therefore within the scope of the present invention. The output may be normalized at step 484, for example, by dividing by a value proportional to the range of the effect, followed by accumulating the gyroscope values at step 485. The adjustment-enabled portable computing device may then be rotated in one defined direction (clockwise or counterclockwise) to thicken the text by increasing the erosion variable at step 486, or in the opposite direction to refine the text by increasing the dilation variable at step 487. At step 488, a key-up (button release) instruction is received by the adjustment-enabled computing device, based on which, at step 489, the current effect value may be set to a default value.
Some aspects of the invention may be understood by reference to the following numbered clauses:
1. A portable computing device operatively coupled with a smart-glasses-based wearable device, the portable computing device comprising: an Inertial Measurement Unit (IMU) having a gyroscope positioned such that, upon activation of the portable computing device, an orientation of the portable computing device is determined, and, upon hand rotation of the portable computing device, one or more control parameters of an image displayed on the wearable device are modified based on a rotational speed calculated using the gyroscope.
2. The portable computing device of clause 1, wherein the one or more control parameters are selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width in the image, enhancement of text forming part of the image, lighting forming part of the image, and the white:black ratio of the image.
3. The portable computing device of clause 1, wherein the device includes a selection interface that allows a user of the wearable device to select, from the one or more control parameters, a set of control parameters to be modified for the image.
4. The portable computing device of clause 1, wherein the IMU further comprises an accelerometer to transmit a magnitude of linear acceleration of the portable computing device relative to gravity.
5. The portable computing device of clause 4, wherein the IMU further comprises a magnetometer that determines and transmits the instantaneous orientation of the portable computing device relative to the earth's magnetic field.
6. The portable computing device of clause 5, wherein the respective outputs from the gyroscope, the accelerometer, and the magnetometer are fused to produce orientation and motion of the portable computing device in any direction.
7. The portable computing device of clause 1, wherein the portable computing device is activated by pressing a button present in/on the computing device, wherein the one or more control parameters of the image are modified only during the time that the button is held pressed.
8. The portable computing device of clause 1, wherein the degree of hand rotation is proportional to the degree of change in the one or more control parameters.
9. The portable computing device of clause 1, wherein the orientation of the portable computing device is determined based on a fusion of position data from one or more components of the IMU, the one or more components including at least an accelerometer.
10. The portable computing device of clause 1, wherein, when control parameter modification operations are suspended to generate a modified image, a user of the wearable device is allowed to view the modified image and/or translate the modified image along the X and Y axes and/or scroll the modified image.
11. The portable computing device of clause 1, wherein the absolute position of the portable computing device is configured to indicate a level of a control parameter.
12. A method of modifying, by a portable computing device, one or more control parameters on an image displayed in a smart-glasses-based wearable device, the method comprising the steps of:
receiving, at the portable computing device, from a gyroscope sensor configured in the portable computing device, a change in a gyroscope value indicative of a degree of hand rotation of the portable computing device, the hand rotation being mapped to the one or more control parameters;
determining, at the portable computing device, an orientation of the portable computing device using an accelerometer configured in the portable computing device; and
generating, at the portable computing device, an image modification signal to the smart-glasses-based wearable device based on the change in the gyroscope values and the determined orientation, wherein the image is modified with respect to the one or more control parameters based on the image modification signal.
13. The method of clause 12, wherein the method further comprises the steps of:
receiving a new gyroscope value as part of the gyroscope value change;
smoothing the received new gyroscope value by using a sliding window filter;
normalizing the smoothed gyroscope values; and
accumulating normalized gyroscope values to indicate a degree of hand rotation of the portable computing device.
14. The method of clause 12, wherein the one or more control parameters are selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width in the image, enhancement of text forming part of the image, lighting forming part of the image, and the white:black ratio in the image.
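The filter chain of clause 13 (smooth, normalize, accumulate) can be sketched in a few lines. The patent names no window size or normalization constant, so the values below are illustrative assumptions, as is the class name `RotationAccumulator`:

```python
from collections import deque

class RotationAccumulator:
    """Sketch of the clause-13 pipeline: smooth incoming y-axis gyroscope
    readings with a sliding-window (moving-average) filter, normalize by a
    value proportional to the range of the effect, and accumulate the result
    to track the degree of hand rotation."""

    def __init__(self, window=5, effect_range=180.0):
        self.samples = deque(maxlen=window)   # sliding window of raw readings
        self.effect_range = effect_range      # scale chosen per control parameter
        self.accumulated = 0.0

    def update(self, gyro_y):
        self.samples.append(gyro_y)
        smoothed = sum(self.samples) / len(self.samples)  # sliding-window mean
        normalized = smoothed / self.effect_range         # scale to effect range
        self.accumulated += normalized                    # degree of hand rotation
        return self.accumulated
```

The accumulated value would then be mapped to the selected control parameter (brightness, contrast, text weight, and so on) on the wearable device's display.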
As used herein, and unless the context indicates otherwise, the term "coupled to" is intended to include both direct coupling, in which two elements coupled to each other are in contact with each other, and indirect coupling, in which at least one additional element is located between the two elements. Thus, the terms "coupled to" and "coupled with" are used synonymously. In the context of this document, the terms "coupled to" and "coupled with" are also used restrictively to mean "communicatively coupled with" over a network, wherein two or more devices are capable of exchanging data with each other over the network, possibly through one or more intermediate devices.
It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one member selected from the group consisting of A, B, C, ..., and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept; therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.
While various embodiments of the present disclosure have been shown and described herein, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions and equivalents will be apparent to those skilled in the art without departing from the spirit and scope of the present disclosure as described in the claims.

Claims (25)

1. A smart eyewear system (10) arranged to allow a sight-impaired user to modify one or more control parameters of an image provided by the system (10) in accordance with their eye condition, the system comprising:
a portable computing device (100) including a motion sensor;
a smart-glasses-based wearable device (160) comprising a display portion (162) disposed within a field of view of the user; and
an image capturing device (163),
wherein the portable computing device (100) is operatively coupled with the smart-glasses-based wearable device (160),
the system (10) is configured to display an image corresponding to at least a portion of the image captured by the image capture device (163) on the display portion (162),
wherein the system (10) is configured to detect, by the motion sensor, a rotational motion (102) of the portable computing device (100) in the user's hand, wherein one or more control parameters of an image displayed on the wearable device (160) are modified based on the rotational motion of the portable computing device (100).
2. The system of claim 1, wherein the image capture device is configured to capture a scene having at least a portion in a field of view of a person wearing the wearable device, the system configured to display on the display portion of the wearable device at least a portion of the scene captured by the image capture device corresponding to a position of the display in the field of view of the person wearing the wearable device.
3. The system of claim 1 or claim 2, wherein the one or more control parameters are selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width in the image, enhancement of text forming part of the image, lighting forming part of the image, and the white:black ratio in the image.
4. The system of any one of the preceding claims, wherein the device includes a selection interface that allows a user of the wearable device to select a set of control parameters from the one or more control parameters that need to be modified for the image.
5. The system of any preceding claim, wherein the motion sensor comprises a gyroscope positioned in such a way that, upon activation of the portable computing device, an orientation of the portable computing device is determined, and, upon hand rotation of the portable computing device, one or more control parameters of an image displayed on the wearable device are modified based on a rotational speed calculated using the gyroscope.
6. The system of any preceding claim, wherein the motion sensor comprises an accelerometer arranged to determine information indicative of linear acceleration of the portable computing device relative to gravity.
7. The system of claim 6, configured to send information indicative of a linear acceleration of the portable computing device relative to gravity to the wearable device.
8. The system of any preceding claim, wherein the motion sensor comprises a magnetometer that determines information indicative of an instantaneous orientation of the portable computing device relative to the earth's magnetic field.
9. The system of claim 8, configured to send information to the wearable device indicative of an instantaneous orientation of the portable computing device relative to the earth's magnetic field.
10. The system of claim 8 or claim 9, when dependent on claim 6 as dependent on claim 5, wherein the respective outputs from the gyroscope, the accelerometer, and the magnetometer are fused to produce the orientation and motion of the portable computing device in any direction.
11. The system of any one of the preceding claims, wherein the portable computing device is activated by pressing a button present in/on the computing device, wherein one or more control parameters of the image are modified only during the time the button is held pressed.
12. The system of any preceding claim, wherein the degree of variation of the one or more control parameters is proportional to the degree of hand rotation.
13. The system of claim 6 or any one of claims 7 to 12 when dependent on claim 6, wherein the orientation of the portable computing device is determined based on a fusion of position data from one or more components of the motion sensor.
14. The system of any one of the preceding claims, wherein when a control parameter modification operation is paused to generate a modified image, a user of the wearable device is allowed to view the modified image and/or translate the modified image in the X and Y axes and/or scroll the modified image.
15. The system of any preceding claim, wherein the absolute position of the portable computing device is configured to indicate a level of a control parameter.
16. The system of claim 15 when dependent on claim 11, wherein when the button is pressed, the system determines an absolute position of the portable computing device and sets the level of the control parameter in dependence on the absolute position.
17. The system of any of the preceding claims, wherein the motion sensor comprises an Inertial Measurement Unit (IMU).
18. The system of any one of the preceding claims, wherein the motion sensor is an Inertial Measurement Unit (IMU).
19. The system of any of the preceding claims, wherein the wearable device comprises the image capture device.
20. The system of any one of claims 1 to 18, wherein the image capture device is disposed external to the wearable device, the image capture device being operatively coupled to the wearable device.
21. The system of claim 20, wherein the image capture device is operatively coupled to the wearable device by a wireless connection or a wired connection.
22. A method of controlling an image displayed on a display of a smart-glasses-based wearable device of a smart-glasses system, the smart-glasses system arranged to allow a vision-impaired user to modify one or more control parameters of the image in accordance with their eye condition, the display disposed in a field of view of the user, the method comprising:
detecting, by a motion sensor included in the portable computing device, rotational motion of a portable computing device of the system in a hand of a user, the portable computing device being operatively coupled to the wearable device;
capturing an image of a scene by means of an image capturing device of the system;
displaying, on the display portion, an image corresponding to at least a part of an image captured by the image capture device,
the method includes modifying one or more control parameters of an image displayed on the display portion based on rotational motion of the portable computing device.
23. The method of claim 22, comprising: capturing a scene with the aid of the image capture device, the scene having at least a portion in a field of view of a person wearing the wearable device, the method comprising: displaying, on a display portion of the wearable device, at least a portion of a scene captured by the image capture device corresponding to a location of the display in a field of view of a person wearing the wearable device.
24. The method of claim 22 or claim 23, wherein the method further comprises the steps of:
receiving, from the motion sensor, a new gyroscope value as part of a change in a gyroscope value due to motion of the portable computing device;
smoothing the received new gyroscope values by using a sliding window filter;
normalizing the smoothed gyroscope values; and
accumulating normalized gyroscope values to indicate a degree of hand rotation of the portable computing device.
25. The method of any one of claims 22 to 24, wherein the one or more control parameters are selected from any one or a combination of: pass-through of the image, color or grayscale display of the image, brightness of the image, edge detection/enhancement in the image, contrast of the image, color enhancement of the image, line width in the image, enhancement of text forming part of the image, lighting forming part of the image, and the white:black ratio in the image.
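Claims 5 to 10 describe fusing the gyroscope, accelerometer, and magnetometer outputs to recover the device's orientation and motion. The patent names no specific fusion algorithm; a common lightweight choice is a complementary filter, sketched below for a single tilt axis using only the gyroscope and accelerometer (the function name and the blending constant `alpha` are illustrative assumptions):

```python
import math

def fuse_tilt(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Complementary-filter sketch of the sensor fusion described in the
    claims: integrate the gyroscope rate (deg/s) for fast response, and
    correct its slow drift with the tilt implied by the accelerometer's
    gravity vector. Returns the fused tilt angle in degrees."""
    gyro_angle = angle_prev + gyro_rate * dt                  # fast but drifts
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))  # noisy but drift-free
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

A magnetometer would be blended in the same way to fix the heading about the gravity axis, yielding orientation in any direction as claim 10 requires; full three-axis fusion is typically done with a quaternion filter instead.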
CN201980046956.6A 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters of a portable display Pending CN112424729A (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
GB1809904.4 2018-06-16
GBGB1809905.1A GB201809905D0 (en) 2018-06-16 2018-06-16 Hand held device for controlling digital magnification on a portable display
GBGB1809904.4A GB201809904D0 (en) 2018-06-16 2018-06-16 System and method for modifying image enhancement parameters for a portable display
GB1809905.1 2018-06-16
GB1819144.5 2018-11-24
GBGB1819145.2A GB201819145D0 (en) 2018-11-24 2018-11-24 Hand held device for controlling digital magnification on a portable display
GBGB1819144.5A GB201819144D0 (en) 2018-11-24 2018-11-24 System and method for modifying image enhancement parameters for a portable display
GB1819145.2 2018-11-24
PCT/GB2019/051687 WO2019239161A1 (en) 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters for a portable display

Publications (1)

Publication Number Publication Date
CN112424729A true CN112424729A (en) 2021-02-26

Family

ID=66998444

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201980046956.6A Pending CN112424729A (en) 2018-06-16 2019-06-17 System and method for modifying image enhancement parameters of a portable display
CN201980043476.4A Pending CN112313731A (en) 2018-06-16 2019-06-17 Hand-held device for controlling digital magnification on portable display

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201980043476.4A Pending CN112313731A (en) 2018-06-16 2019-06-17 Hand-held device for controlling digital magnification on portable display

Country Status (3)

Country Link
CN (2) CN112424729A (en)
GB (2) GB2588055A (en)
WO (2) WO2019239162A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112733624B (en) * 2020-12-26 2023-02-03 电子科技大学 People stream density detection method, system storage medium and terminal for indoor dense scene

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009140107A (en) * 2007-12-04 2009-06-25 Sony Corp Input device and control system
US20090262074A1 (en) * 2007-01-05 2009-10-22 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
CN101815182A (en) * 2009-02-20 2010-08-25 索尼公司 Input equipment and method, information processing system and program
US20110157231A1 (en) * 2009-12-30 2011-06-30 Cywee Group Limited Electronic control apparatus and method for responsively controlling media content displayed on portable electronic device
CN102906623A (en) * 2010-02-28 2013-01-30 奥斯特豪特集团有限公司 Local advertising content on an interactive head-mounted eyepiece
CN106233328A (en) * 2014-02-19 2016-12-14 埃弗加泽公司 For improving, improve or strengthen equipment and the method for vision
US20160365039A1 (en) * 2015-06-09 2016-12-15 Canon Kabushiki Kaisha Display apparatus and control method of the same
US20170336882A1 (en) * 2016-05-17 2017-11-23 Google Inc. Virtual/augmented reality input device
CN107764262A (en) * 2017-11-09 2018-03-06 深圳创维新世界科技有限公司 Virtual reality display device, system and pose calibrating method
WO2018065986A1 (en) * 2016-10-06 2018-04-12 Remoria Vr S.R.L. Orientation and motion tracking controller
US20180144554A1 (en) * 2016-11-18 2018-05-24 Eyedaptic, LLC Systems for augmented reality visual aids and tools
CN108108015A (en) * 2017-11-20 2018-06-01 电子科技大学 A kind of action gesture recognition methods based on mobile phone gyroscope and dynamic time warping

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201310364D0 (en) * 2013-06-11 2013-07-24 Sony Comp Entertainment Europe Head-mountable apparatus and systems
KR102083596B1 (en) * 2013-09-05 2020-03-02 엘지전자 주식회사 Display device and operation method thereof
TWI600322B (en) * 2014-09-02 2017-09-21 蘋果公司 Method for operating an electronic device with an integratd camera and related electronic device and non-transitory computer readable storage medium
US20170214856A1 (en) * 2016-01-22 2017-07-27 Mediatek Inc. Method for controlling motions and actions of an apparatus including an image capture device having a moving device connected thereto using a controlling device


Also Published As

Publication number Publication date
WO2019239162A1 (en) 2019-12-19
GB202020076D0 (en) 2021-02-03
CN112313731A (en) 2021-02-02
GB2588055A (en) 2021-04-14
GB2589255A (en) 2021-05-26
GB202020074D0 (en) 2021-02-03
WO2019239161A1 (en) 2019-12-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination