GB2554121A - Image processing - Google Patents

Image processing

Info

Publication number
GB2554121A
GB2554121A
Authority
GB
United Kingdom
Prior art keywords
image
lesion
product
locating feature
instructions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1710049.6A
Other versions
GB201710049D0 (en)
Inventor
Charles Montague Fuller
Bruce Lawrence John Murray
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Moletest Ltd
Original Assignee
Moletest Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Moletest Ltd filed Critical Moletest Ltd
Publication of GB201710049D0
Publication of GB2554121A
Legal status: Withdrawn

Classifications

    • G06T 7/12 - Image analysis; Segmentation; Edge-based segmentation
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/04845 - GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04883 - Touch-screen or digitiser interaction for inputting data by handwriting, e.g. gesture or text
    • G06T 3/40 - Geometric image transformation; Scaling the whole image or part thereof
    • G06T 7/0012 - Image analysis; Biomedical image inspection
    • G06F 2203/04806 - Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G06T 11/60 - 2D image generation; Editing figures and text; Combining figures or text
    • G06T 2207/20132 - Image segmentation details; Image cropping
    • G06T 2207/30088 - Subject of image: Skin; Dermal
    • G06T 2207/30096 - Subject of image: Tumor; Lesion

Abstract

The invention relates to an image processing product, comprising machine-executable instructions, for processing images of skin lesions 5. An image 2 of a mole is displayed by way of a graphic user interface 1 on a visual display. Instructions then display a graphically represented locating feature 4, of substantially circular outline, which is overlain or superimposed on the displayed image. User input changes the relative position and size of the lesion image with respect to the locating feature, allowing the user to position the lesion within the target area and crop a sub-image comprising the lesion 5 and at least a portion of the surrounding skin 8. The product may be computer software such as a graphic user interface (GUI). The locating feature may comprise indicia, such as cross-hairs (3, fig. 3). The cropped picture may be stored or output.

Description

(71) Applicant(s):
Moletest Limited, PO Box 25, Regency Court, Glategny Esplanade, St Peter Port, GY1 3AP, Guernsey

(72) Inventor(s):
Charles Montague Fuller; Bruce Lawrence John Murray

(56) Documents Cited:
WO 2013/009828 A1; WO 2011/007264 A1; US 20130235076 A1; US 20110185297 A1; US 20100149211 A1; US 20050162445 A1; US 20010048447 A1; JP 2005-010903 A

(58) Field of Search:
INT CL G06T; Other: EPODOC, WPI

(74) Agent and/or Address for Service:
Barker Brettell LLP, Medina Chambers, Town Quay, Southampton, Hampshire, SO14 2AQ, United Kingdom

(54) Title of the Invention: Image processing
Abstract Title: Image cropping software for skin lesion detection
(Representative drawing: Figure 4; drawing sheets 1/4 to 4/4 show Figures 1 to 4)
IMAGE PROCESSING
Technical Field
The present invention relates generally to image processing.
Background
We have devised an image processing tool for use in analysing images of pigmented lesions, a class of which is commonly called moles. Skin lesions may be either malignant or benign, and the tool seeks to assist in determining whether a skin lesion is more likely to be malignant or benign. The tool can advantageously assist in reducing the number of unnecessary referrals to specialist dermatologists, by accurately identifying the likelihood of a skin lesion being benign.
Summary
According to a first aspect of the invention there is provided an image processing product for use in processing an image of a skin lesion which may comprise one, some or all of the following machine-executable instructions:
instructions to display an image of a skin lesion by way of a graphic user interface on a visual display,
instructions to display a locating feature which is graphically represented and which is overlain or superimposed on the displayed image, wherein the feature may be of substantially circular outline and may include centering indicia,
instructions responsive to a user input to change the relative displayed position of the image to the locating feature,
instructions responsive to a user input to adjust the size of the image relative to the locating feature,
and wherein said instructions are such as to allow a user to position the displayed skin lesion within the locating feature,
and further wherein the product comprises instructions which are such that once a user has caused the displayed lesion to be located in the locating feature, a cropped image is selected from the overall image which includes both the lesion and at least a portion of the skin surrounding the lesion.
The product may comprise a software product, or tool. The product may comprise an application program.
The instructions may be such as to allow a user to substantially wholly locate the displayed skin lesion within the boundary or extent of the locating feature.
The product preferably enables the user input to bring about a very close or tight positioning of the edge or boundary region of the skin lesion to the locating feature.
The locating feature may be arranged to remain of a substantially fixed size and/or position, at least during the user input procedure. The locating feature may be considered as a substantially static feature relative to the displayed image.
The locating feature may include centering indicia. The centering indicia may be configured to allow a user to identify a central part, or region, of the locating feature. The centering indicia may include at least two lines in a substantially orthogonal relationship. The centering indicia may include cross hairs. Centering is of particular benefit with irregularly shaped skin lesions.
The locating feature may allow all, or substantially all, of the underlying image to be visible.
The locating feature may comprise a transparent or translucent region, bounded by a substantially circular border, margin or outline, of solid or broken form (such as a solid line or a broken line, or a combination of both). The circular border of the locating feature may comprise one or more smooth curved portions, or may comprise a plurality of straight portions which are collectively of substantially circular outline.
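By way of illustration only (the patent specifies no implementation language or graphics library), the following minimal sketch uses Python with the Pillow library to render such a locating feature - a clear circular window with a solid outline and orthogonal cross-hairs, set in a semi-transparent surround of the kind described in the feature summary below - over a captured frame; all names are illustrative assumptions.

    from PIL import Image, ImageDraw

    def draw_locating_feature(frame, cx, cy, r):
        """Overlay a locating feature (clear circle, outline, cross-hairs) on a frame."""
        # Semi-transparent surround; drawing on RGBA replaces pixels outright,
        # so the ellipse below punches a fully clear circular window.
        overlay = Image.new("RGBA", frame.size, (0, 0, 0, 96))
        draw = ImageDraw.Draw(overlay)
        draw.ellipse((cx - r, cy - r, cx + r, cy + r),
                     fill=(0, 0, 0, 0), outline=(255, 255, 255, 255), width=2)
        # Cross-hairs: two orthogonal lines through the circle centre.
        draw.line((cx - r, cy, cx + r, cy), fill=(255, 255, 255, 255), width=1)
        draw.line((cx, cy - r, cx, cy + r), fill=(255, 255, 255, 255), width=1)
        return Image.alpha_composite(frame.convert("RGBA"), overlay)

Because pixels inside the circle keep their original values, the underlying image remains fully visible there, as the preceding paragraph requires.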
A boundary region of the lesion may be described as a transition region between the lesion and surrounding skin which is substantially devoid of the lesion.
The image of the skin lesion may comprise an image of a skin lesion as well as skin surrounding the lesion.
The software product may be a software application suitable for use and loading on to a portable or handheld computing device and/or a mobile telephone and/or a communications device arranged to interface with a communications network. The software product may comprise an App. The device or telephone may comprise a touch screen which allows a user to provide an input.
The product may comprise instructions to cause the locating feature, and preferably the GUI, to be displayed whilst the user takes a photograph of a skin lesion. The instructions may effect control of, or use of, a host device's camera functionality.
The product may comprise instructions to identify a skin lesion and specify its boundary region and the product may comprise instructions to analyse the boundary region and the interior of the skin lesion.
In overview, suspect skin lesions, particularly malignant examples, can be very irregular in shape, not simply circular like a normal or benign mole. Therefore, directions/training may be given to the user to require the user to centre the skin lesion using the centering indicia (e.g. cross-hairs) and then expand or reduce it until it just touches or is contiguous with, or is wholly within or adjacent to, the margin of the locating feature indicia (e.g. the circle 4). The product may then, during an automated procedure, add a very small margin of (additional) (clear) skin around the lesion, such as between 2% and 5%, or between and including 3% and 4%, to help identify the boundary of the skin lesion and to enable, amongst other features measured, the edge irregularity to be accurately assessed to provide a reading. In general terms, the greater the degree of irregularity, the greater the likelihood of the skin lesion being malignant.
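The specification does not state how the quoted percentage maps onto the crop geometry; on the assumption that the added clear-skin margin is an annulus whose area is the stated fraction of the area bounded by the locating circle, the enlarged crop radius follows directly, as in this illustrative Python sketch:

    import math

    def enlarged_crop_radius(circle_radius: float, margin_fraction: float = 0.03) -> float:
        """Radius of the cropping circle after adding a clear-skin margin.

        Assumption: the margin is an annulus whose area is `margin_fraction`
        of the area bounded by the locating circle, so the enclosed area
        scales by (1 + margin_fraction) and the radius by its square root.
        """
        return circle_radius * math.sqrt(1.0 + margin_fraction)

    # Example: a 3% area margin enlarges a 200 px locating circle to ~203 px.
    print(enlarged_crop_radius(200.0))  # 202.98...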
The product may comprise instructions which cause an additional portion of the image, beyond that which the user has arranged as being within the locating feature, to be part of the selected cropped image. The additional portion preferably relates to skin surrounding the lesion. The additional portion may be of substantially circular outline. It will be appreciated that the user may have arranged the lesion and a periphery of surrounding skin to be located in the locating feature, and so the step of including the additional portion of surrounding skin image means that the selected cropped image which is to be the subject of subsequent image analysis includes a larger extent of lesion-surrounding skin as compared to that which the user had initially included.
The user input which enables the image to be enlarged or reduced may be referred to as image zoom (functionality).
The selected cropped image is preferably of substantially circular outline.
The skin lesions under investigation may include, but are not limited to, lesions referred to as moles.
In view of the prior art, an aspect of the invention may be viewed as comprising circular cropping on a touch screen in general.
By 'selected cropped area' of the image, or equivalent terminology, we include a sub-region of the overall image displayed, which sub-region includes at least that which is contained in the locating feature. The cropped area can be thought of as a selected sub-area or sub-region of the overall image.
The product may comprise instructions to store (locally and/or remotely) and/or output the selected cropped area.
The product may comprise instructions to effect the incorporation, in achieving the selected cropped image, of an image portion representative of skin surrounding the lesion by (effectively) enlarging the size of the circular locating feature. The area of the surrounding skin which is incorporated may be a (predetermined) proportion of the area bounded by the locating feature.
Some of the features of the product which may be included may be summarised as follows:
• (Visually/graphically displayed) cross-hairs for centering a suspect skin lesion.
• A transparent circular area with a semi-transparent surrounding rectangular area to help identify the whole or entirety of the suspect skin lesion - this has particular utility when the image includes several skin lesions, like a cluster of skin lesions - so that a suspicious one can be clearly identified.
• Automatic cropping of the circular locating feature area once the mole has been centered and its boundary just touches or is contiguous with, or is wholly within, the (inner) margin of the circle. The cropping may be instigated by a user input to signify that the lesion image is centered.
• The area in the locating feature may then automatically be increased or augmented by a small or predetermined percentage (and thus ensure that the boundary can be fully identified and measured during subsequent analysis).
• A preferred embodiment is that the cropped area may have approximately 3% (as a percentage of the area of the cropped region in the locating feature) added around it.
• Advantageously no rotational adjustments are necessary by the user when positioning and sizing the displayed lesion in the locating feature. This is because the subsequent analysis is not affected by rotation so no particular rotational orientation adjustment is required.
According to a second aspect of the invention there is provided an image processing device, which comprises, executes or implements the product of the first aspect of the invention.
The invention may comprise one or more features as described in the description and/or as shown in the drawings, either individually or in combination.
Brief Description of the Drawings
Various embodiments of the invention are now described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic view of a touch screen tablet or mobile telephone which enables use of an image processing product; and
Figures 2 to 4 are exemplary screen shots of a graphic user interface which illustrate various progressive stages of use of the image processing product.
Detailed Description
Reference is made to Figure 1 which shows a portable computing device 100, which may be a mobile telephone or a tablet computer, which incorporates a camera which is able to capture and store digital images. A memory of the device 100 comprises an image processing application program, the functionality of which is described below. The device comprises a touch screen 101 and a manual input button 102. The device further comprises a data processor, which executes instructions stored in the memory. The device further comprises an interface to allow data to be input and output externally of the device. This may include an air interface, or a port to which a connector, such as a plug, can be connected. This may for example allow the device to connect to a communications network, such as the Internet.
With reference to Figures 2, 3 and 4 the use of the image processing tool is now described.
STEP 1
Using the camera of the device a user obtains the best close-up image of the skin lesion. This involves the user using the display screen of the device and directing the camera towards the lesion so that it is displayed and visible on the screen. The captured image is then displayed by way of a GUI, as shown in Figure 2.
STEP 2
The GUI comprises a template with a clear locating circle 4 and cross-hairs 3 which are automatically overlaid on the image, in particular in that part of the GUI which displays the captured image. The cross-hairs 3 comprise four lines arranged orthogonally, which intersect the circle 4 and which, in essence, are directed to the geometric centre or central region of the circle 4. As can be seen, in an initial presentation to a user the skin lesion 5 is offset from the locating circle 4. The GUI also comprises displayed user input buttons 6 and 7, which can be selected by way of a touch gesture. The displayed image of the lesion (and its surrounding skin) is therefore a sub-region of the displayed GUI, and the lesion occupies a sub-region of the image, which also includes a portion of surrounding skin.
STEP 3
The user uses his fingers, by way of touch gestures, to centre the skin lesion on the cross-hairs, touching the screen with one finger and moving the image. This is shown in Figure 3. The instructions of the GUI are responsive to the touch and movement of the user's finger on the display screen of the device to both position and size the image relative to the centering circle 4.
STEP 4
The user then reduces the size of the skin lesion image by a touch gesture on the display screen, bringing two fingers together, or increases the size of the skin lesion image by moving apart two fingers which are in contact with the region of the screen, so that the skin lesion is just within the boundary of the clear circle 4. In the example shown in Figure 4, the user has applied a touch gesture to reduce the size of the image in the GUI, so that the lesion is wholly within the circle 4. In the example shown, the user has reduced the size of the image such that the lesion is just within the circle 4, and so both the lesion 5 and a proportion of surrounding skin 8 are included within the locating circle 4. Once the user has achieved this, he then provides an input, such as by way of touching one of the displayed buttons, or touching (twice, say) the locating feature containing the lesion, indicative of the fact that the lesion is correctly located in the locating circle 4. This may also be achieved by the user selecting the displayed button 7, marked 'CONFIRM'.
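As a non-authoritative sketch of the pan and pinch-to-zoom behaviour of Steps 3 and 4 (the class and its methods are illustrative and not taken from the patent), the displayed image's state relative to the fixed locating circle can be tracked as an offset and a scale, with pinch scaling performed about the gesture centroid:

    from dataclasses import dataclass

    @dataclass
    class ViewTransform:
        """Pan/zoom state of the displayed image relative to the static locating circle."""
        offset_x: float = 0.0
        offset_y: float = 0.0
        scale: float = 1.0

        def pan(self, dx: float, dy: float) -> None:
            # One-finger drag (Step 3): translate the image under the fixed circle.
            self.offset_x += dx
            self.offset_y += dy

        def pinch(self, factor: float, cx: float, cy: float) -> None:
            # Two-finger pinch (Step 4): scale about the gesture centroid (cx, cy),
            # so the point between the fingers stays put while the image resizes.
            self.scale *= factor
            self.offset_x = cx + (self.offset_x - cx) * factor
            self.offset_y = cy + (self.offset_y - cy) * factor

Cropping coordinates can then be recovered by mapping the fixed circle's centre and radius back through this transform (subtract the offsets and divide by the scale).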
The above steps are indicative of what may be termed a first phase of operation/use of the GUI, which requires user input. Once the user has provided the input that the lesion has been located as required in the centering circle 4, the underlying application program is operative to perform a series of automated steps, to determine the cropped image, which is ultimately selected, saved and output.
The area of the locating circle 4 is known by the application program. The application program also stores an indication or measure relating to a predetermined percentage of that area which is to be incorporated with the image of the lesion. In this part of the processing, a portion of the skin which surrounds the lesion, which is included in the overall image displayed by the GUI, and whose area corresponds to that percentage, is identified and included. In order to achieve this, the application program, in essence, enlarges the diameter of the locating circle 4 to include that proportion of surrounding skin. The proportion equates to around 3% of the area bounded by the locating circle 4. Once this processing has been effected, the application program is operative to cause the image comprising the lesion and the small proportion of surrounding skin to be stored in memory and/or output for remote storage, so as to be available for subsequent analysis. This may also be achieved by rotating a radius of predetermined size (larger than the radius of the circle) about the centre of the locating circle 4. This then defines the outer boundary and extent of the image to be cropped and selected for storage, output, etc.
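One way of realising this automated cropping step is sketched below, again in Python with Pillow as an assumed implementation rather than the patent's own: the bounding square of the enlarged circle is cropped out, and pixels outside the circle are made transparent, yielding the substantially circular cropped image referred to above.

    from PIL import Image, ImageDraw

    def circular_crop(image: Image.Image, cx: int, cy: int,
                      circle_radius: float, margin_fraction: float = 0.03) -> Image.Image:
        """Select a circular sub-image centred on the locating circle, enlarged so
        that roughly `margin_fraction` of additional surrounding-skin area is
        included (see the radius calculation sketched earlier)."""
        r = int(round(circle_radius * (1.0 + margin_fraction) ** 0.5))
        square = image.crop((cx - r, cy - r, cx + r, cy + r)).convert("RGBA")
        w, h = square.size
        mask = Image.new("L", square.size, 0)            # start fully transparent
        ImageDraw.Draw(mask).ellipse((0, 0, w - 1, h - 1), fill=255)
        square.putalpha(mask)                            # discard pixels outside the circle
        return square                                    # ready to store and/or upload

With margin_fraction = 0.03 this reproduces the approximately 3% additional area described in the preceding paragraph.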
In an alternative embodiment, the application program may be operative to shrink the image of the lesion within the locating circle 4, such that, and to the extent that, the predetermined additional proportion of surrounding skin from the overall image is included relative to the (reduced) area of the lesion contained within the circle 4.
The processor of the device 100 then analyses the circular cropped image and uploads the analysis and image to a secure server for later retrieval by medical professionals. As described above, the application program automatically adds a small boundary of surrounding clear skin 8 to the selected image in order to optimise subsequent identification of the edge of the skin lesion during a detailed analysis phase. This surrounding boundary may have an area equivalent to approximately 3% of the area contained in the circular locating feature.
The above combination of user input steps and cropping is not only novel but advantageously allows high-performance analysis of suspect skin lesions. Unlike known cropping procedures, functionalities and features, where a user is required to 'pull' a 'handle' to move each of the four edges of a square or rectangular image inwards to remove unwanted parts of the image, the image processing tool advantageously employs a substantially circular cropping feature.
The product described above advantageously addresses the problem of specifying the part of the image captured by the user device that is to be analysed. The product further advantageously takes full advantage of the advanced gesture-based capabilities of touch screen devices. The improvement in specifying the part of the image to be analysed will ensure consistency and accuracy, and enable tighter thresholds or ranges to be used to differentiate benign skin lesions from potentially suspect ones.
It is to be appreciated that the GUI is displayed whilst the user takes the photograph, and so at that time the user can use the locating feature to position the lesion within the framed composition. This may result in the user requiring only minimal repositioning of the lesion, or none at all, subsequent to taking the picture in order to locate the lesion within the locating feature.

Claims (16)

1. An image processing product for use in processing an image of a skin lesion comprising the following machine-executable instructions:
instructions to display an image of a skin lesion by way of a graphic user interface on a visual display,
instructions to display a locating feature which is graphically represented and which is overlain or superimposed on the displayed image, wherein the feature is of substantially circular outline,
instructions responsive to a user input to change the relative displayed position of the image to the locating feature,
instructions responsive to a user input to adjust the size of the image relative to the locating feature,
and wherein the instructions are such as to allow a user to position the displayed lesion within the locating feature,
and further wherein the product comprises instructions which are such that once a user has caused the displayed lesion to be located in the locating feature, a cropped image is selected from the overall image which includes both the lesion and at least a portion of the skin surrounding the lesion.
2. A product as claimed in claim 1 which comprises a software product.
3. A product as claimed in claim 1 or claim 2 in which the instructions are such as to allow a user to substantially wholly locate the displayed lesion within the boundary or extent of the locating feature.
4. A product as claimed in any preceding claim which enables a very close or tight crop to the edge or boundary region of the skin lesion.
5. A product as claimed in any preceding claim in which the locating feature is arranged to remain of a substantially fixed size and/or position, at least during the user positioning and sizing the lesion relative to the locating feature.
6. A product as claimed in claim 5 in which the locating feature includes centering indicia configured to allow a user to identify a central part, or region, of the locating feature.
7. A product as claimed in any preceding claim in which the centering indicia includes at least two lines in a substantially orthogonal relationship.
8. A product as claimed in any preceding claim in which the centering indicia includes cross hairs.
9. A product as claimed in any preceding claim in which the locating feature allows all, or substantially all, of the underlying image to be visible.
10. A product as claimed in any preceding claim in which the locating feature comprises a transparent or translucent region, bounded by a substantially circular border, margin or outline.
11. A product as claimed in any preceding claim in which the selected cropped area comprises a sub-region or sub-area of the displayed image, which comprises at least that which is contained in the locating feature.
12. A product as claimed in any preceding claim which comprises instructions to store and/or output the selected cropped image.
13. A product as claimed in any preceding claim which comprises instructions to select at least the image contained in the locating feature for further processing.
14. A product as claimed in claim 13 which includes instructions, responsive to a user input to cause the further processing of the selected image, wherein the instructions are indicative of the user having located the lesion within the locating feature.
15. A product as claimed in any preceding claim which comprises instructions to identify a skin lesion and specify its boundary region and the product comprises instructions to analyse the boundary region and the interior of the skin lesion.
16. A product as claimed in any preceding claim which comprises instructions, subsequent to a user having located the lesion within the locating feature, arranged to include an additional portion of the displayed image relating to skin surrounding the lesion, to the lesion image in the locating feature, and to cause the lesion image and the additional skin portion to comprise a selected cropped image which is available for subsequent processing.
Intellectual Property Office
Application No: GB1710049.6
Claims searched: 1-16
GB1710049.6A 2016-06-27 2017-06-23 Image processing Withdrawn GB2554121A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1611152.8A GB201611152D0 (en) 2016-06-27 2016-06-27 Image processing

Publications (2)

Publication Number Publication Date
GB201710049D0 GB201710049D0 (en) 2017-08-09
GB2554121A true GB2554121A (en) 2018-03-28

Family

ID=56891660

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1611152.8A Ceased GB201611152D0 (en) 2016-06-27 2016-06-27 Image processing
GB1710049.6A Withdrawn GB2554121A (en) 2016-06-27 2017-06-23 Image processing

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1611152.8A Ceased GB201611152D0 (en) 2016-06-27 2016-06-27 Image processing

Country Status (1)

Country Link
GB (2) GB201611152D0 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023227908A1 (en) * 2022-05-26 2023-11-30 Moletest Limited Image processing

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010048447A1 (en) * 2000-06-05 2001-12-06 Fuji Photo Film Co., Ltd. Image croppin and synthesizing method, and imaging apparatus
JP2005010903A (en) * 2003-06-17 2005-01-13 Sharp Corp Electronic device
US20050162445A1 (en) * 2004-01-22 2005-07-28 Lumapix Method and system for interactive cropping of a graphical object within a containing region
US20100149211A1 (en) * 2008-12-15 2010-06-17 Christopher Tossing System and method for cropping and annotating images on a touch sensitive display device
WO2011007264A1 (en) * 2009-07-17 2011-01-20 Sony Ericsson Mobile Communications Ab Using a touch sensitive display to control magnification and capture of digital images by an electronic device
US20110185297A1 (en) * 2010-01-26 2011-07-28 Apple Inc. Image mask interface
WO2013009828A1 (en) * 2011-07-12 2013-01-17 Apple Inc. Multifunctional environment for image cropping
US20130235076A1 (en) * 2012-03-06 2013-09-12 Apple Inc. User interface tools for cropping and straightening image


Also Published As

Publication number Publication date
GB201710049D0 (en) 2017-08-09
GB201611152D0 (en) 2016-08-10

Similar Documents

Publication Publication Date Title
US11096668B2 (en) Method and ultrasound apparatus for displaying an object
US20150293600A1 (en) Depth-based analysis of physical workspaces
CN105243676B (en) Method for displaying ultrasonic image and ultrasonic equipment used by same
US9554030B2 (en) Mobile device image acquisition using objects of interest recognition
EP2974665B1 (en) Apparatus and method for supporting computer aided diagnosis (cad) based on probe speed
US10524591B2 (en) Mirror display apparatus and the operation method thereof
CN102138827B (en) Image display device
US20160313883A1 (en) Screen Capture Method, Apparatus, and Terminal Device
US9836130B2 (en) Operation input device, operation input method, and program
JP2005122696A (en) Interactive display system and interactive display method
JP2014509535A (en) Gaze point mapping method and apparatus
TWI485600B (en) Pattern swapping method and multi-touch device thereof
JP2010067062A (en) Input system and method
US9582169B2 (en) Display device, display method, and program
US20130251268A1 (en) Image retrieval apparatus, image retrieval method, and storage medium
US20160171158A1 (en) Medical imaging apparatus and method using comparison image
JP2008510247A (en) Display system for mammography evaluation
JP5822545B2 (en) Image processing apparatus, image processing apparatus control method, and program
JP6158690B2 (en) Image display device
US20150010214A1 (en) Information processing device, communication counterpart decision method and storage medium
US10070049B2 (en) Method and system for capturing an image for wound assessment
GB2554121A (en) Image processing
US10324582B2 (en) Medical image display apparatus, method for controlling the same
JP6229554B2 (en) Detection apparatus and detection method
US8923483B2 (en) Rotation of an x-ray image on a display

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)