WO2017137747A1 - Designing door and window frames by recognising and beautifying hand-drawn strokes on a touch screen - Google Patents

Designing door and window frames by recognising and beautifying hand-drawn strokes on a touch screen

Info

Publication number
WO2017137747A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
input
frame
constraints
path
Prior art date
Application number
PCT/GB2017/050321
Other languages
English (en)
Inventor
Robert Franks
Christopher BRUNSDON
Original Assignee
Tommytrinder.Com Limited
Priority date
Filing date
Publication date
Application filed by Tommytrinder.Com Limited filed Critical Tommytrinder.Com Limited
Publication of WO2017137747A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/13Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/40Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/12Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40Document-oriented image-based pattern recognition
    • G06V30/42Document-oriented image-based pattern recognition based on the type of document
    • G06V30/422Technical drawings; Geographical maps
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00Details relating to CAD techniques
    • G06F2111/04Constraint-based CAD

Definitions

  • the present invention is in the field of user interfaces. More particularly, but not exclusively, the present invention relates to augmented user interfaces for defining building frames.
  • customisations can relate to building frames, such as window- and door-frames. At present, and to facilitate the customisation of frames, one or more designs
  • CAD: Computer-Aided Design
  • a computer- implemented method of defining building frames within a user interface including:
  • the input interface may be a touch-screen interface.
  • the frame may be displayed to the user on the touch-screen interface.
  • the input may be processed based upon context.
  • the context may relate to a phase within a process; a touched location within a displayed frame; and/or a user-selectable mode.
  • the constraints may be selected from one of a plurality of constraints.
  • the constraints may include aesthetics constraints and/or manufacturing constraints.
  • the input interface may be at a user device, such as a tablet.
  • the input may be processed at a server.
  • the constraints may be retrieved by the server from a database.
  • the input may be received from the user at a plurality of phases. Each phase is associated with specific constraints.
  • one or more geometric objects may be detected within the input.
  • the classes of geometric objects may include shapes and lines. At least some of the phases may be associated with detecting a single class of geometric object.
  • a geometric object may be detected when a beginning and/or ending of the input is within a specific threshold of possible start- and/or end-points for the geometric object. If the geometric object is a shape, the start- and end-points may be defined as the same location; and if the geometric object is a line, the start- and end-points may be defined as being on an existing shape outline or line.
  • a user device for defining building frames within a user interface including:
  • an input apparatus configured to receive input from a user to define a frame;
  • a processor configured to process the input in relation to constraints to generate a frame;
  • an output configured to display the frame to the user.
  • Figure 1 shows a block diagram illustrating a device in accordance with an embodiment of the invention
  • Figure 2 shows a block diagram illustrating an architecture of a device in accordance with an embodiment of the invention
  • Figure 3 shows a flow diagram illustrating a method in accordance with an embodiment of the invention
  • Figure 4 shows a flow diagram illustrating a method for an application in accordance with an embodiment of the invention
  • Figure 5 shows a screenshot illustrating paths drawn on a user interface in accordance with an embodiment of the invention
  • Figures 6 show screenshots illustrating the processing of a "rectangular" path into a generated building frame within a user interface in accordance with an embodiment of the invention;
  • Figures 7a to 7f show screenshots illustrating the processing of "linear" paths into mullions for a building frame within a user interface in accordance with an embodiment of the invention;
  • Figures 8 show screenshots illustrating the processing of "linear" paths into transoms and mullions for a building frame within a user interface in accordance with an embodiment of the invention;
  • Figures 9 show screenshots illustrating different generated building frames within a user interface in accordance with an embodiment of the invention;
  • Figures 10a to 10d show screenshots illustrating the processing of input within a sashes mode into sashes for a building frame within a user interface in accordance with an embodiment of the invention;
  • Figures 11a to 11f show screenshots illustrating the processing of "linear" input within a glazing mode into glazing bars for a building frame within a user interface in accordance with an embodiment of the invention;
  • Figure 12 shows a screenshot illustrating removal of a sash opening within a sashes mode within a user interface in accordance with an embodiment of the invention.
  • the present invention provides a method and device to facilitate the definition of building frames within a user interface.
  • the inventor has discovered that construction constraints can be utilised within a user interface to process user input to generate viable building frames.
  • Such a user interface could be used by less skilled individuals such as salespeople or customers to both generate and visualise accurate and realistic building frames.
  • the device 100 includes a processor 101 , a display 102, an input apparatus 103, and a memory 104.
  • the device 100 may be a portable computing apparatus such as a smart-phone, tablet or smart-watch, or a laptop or desktop computer.
  • the display 102 and input apparatus 103 may be unified in a combined input/display apparatus 105 such as a near-touch/touch-screen.
  • the display 102 and input apparatus 103 may be separate, for example, where the input apparatus is a pointer device such as a mouse, or a near-touch/touch pad.
  • the input apparatus 105 functions with a digital stylus.
  • the memory 104 may be configured to store software applications 106, libraries 107, an operating system 108, and device drivers 109.
  • the processor 101 is configured to execute the software applications 106, libraries 107, operating system 108, and device drivers 109.
  • the software applications 106 may include an HTML5-enabled web browser, such as Chrome, Internet Explorer, Safari, Firefox, or Opera.
  • the web browser supports the Canvas element and Javascript.
  • the device is configured to perform the method described in relation to Figure 3.
  • the device may be configured to execute a computer program to perform the method.
  • the computer program may be stored in the memory 104.
  • the computer program is executed within a web browser executing on the device.
  • the computer program may be configured to interoperate with one or more servers to perform at least part of the method.
  • Application software 201 (e.g. 106) is provided at a top layer. Below this layer are user interface APIs 202 which provide access for the application software 201 to user interface libraries. Below this layer are operating system APIs 203 which provide access for the application software 201 and user interface libraries to the core operating system 204. Below the core operating system 204 are the device drivers 205 which provide access to the input and display hardware.
  • step 301 input is received from the user at a device (e.g. 100) to define the building frame.
  • the input may be received via a touch-pad or touch-screen, or via a pointer mechanism.
  • the input may consist of one or more discrete input paths from the input apparatus at the device 100.
  • the input may represent movement by a user within a 2D space, such as movement of a finger across a touch-screen.
  • step 302 the input is processed in accordance with constraints to generate a building frame.
  • a user may select or define one of a plurality of building frame types before the processing step.
  • a set of constraints may be associated with each of the types and retrieved either from local storage at the device 100 or from a server via a communication system for processing.
  • steps 301 to 303 may occur without the need for communication between the device 100 and the server.
  • the input is transmitted to a server via a communications system and the processing occurs at the server.
  • the generated building frame may be transmitted from the server back to the device 100 for subsequent display to the user.
  • the input is processed to detect geometric objects (such as shapes and lines).
  • the geometric objects may be defined approximately within the input, for example, the input may include an approximately circular path from which a circle can be detected, the input may include an approximately rectangular path from which a rectangle can be detected, and/or the input may include an approximately linear path from which a line can be detected.
  • the linear paths may be detected to be horizontal or vertical. The approximations of the shapes or lines may be detected within predefined thresholds.
  • the geometric objects may be detected when a beginning and ending of the path are within a specific threshold of possible start- and/or end-points for the geometric object.
  • where the geometric object is a shape, the end-point may be predefined as the beginning of the path, and where the geometric object is a line, the start-points and end-points may relate to geometric objects already detected (i.e. the line must start and end near to a rectangular outline, or the line must start near a shape outline and end near another detected line).
  • Detected geometric objects may be utilised to generate the building frame within the constraints.
  • a detected approximate rectangle may be used to generate a rectangular frame if this is possible within the constraints and a detected approximate vertical line may be used to generate a mullion within the rectangular frame if this is possible within the constraints.
  • the widths of the frame and mullions may also be defined by the constraints.
  • the location of the mullion within the generated building frame may be symmetric within the frame.
  • a detected approximate horizontal line may be used to generate a transom within the rectangular frame if this is possible within the constraints.
  • the user interface mechanism may transition through different phases or operate in different modes. Different constraints may apply to different phases or modes. For example, in a first phase, only a single geometric shape may be detected in the input to contribute to generation of the building frame (i.e. the outside frame of the building frame), in a second phase, only lines may be detected in the input to contribute to generation of the building frame (i.e. mullions or transoms), and in one mode, only lines may be detected in the input to contribute to generation of the building frame (i.e. glazing bars).
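  • the per-phase behaviour described above can be sketched as follows. This is an illustrative sketch only: the phase names, object-class names and the dispatch shape are assumptions, not the application's actual identifiers.

```javascript
// Sketch of per-phase input constraints: each phase or mode accepts only one
// class of detected geometric object. Names here are illustrative assumptions.
var allowedClassesByPhase = {
  outline: ['rectangle', 'circle'], // first phase: the outside frame
  sections: ['line'],               // second phase: mullions and transoms
  glazing: ['line']                 // glazing mode: glazing bars
};

// returns true if a detected object class may contribute in the given phase
function acceptDetectedObject(phase, objectClass) {
  var allowed = allowedClassesByPhase[phase] || [];
  return allowed.indexOf(objectClass) !== -1;
}
```

  • in this sketch, a rectangle traced during the sections phase would simply be ignored, matching the behaviour where a disallowed trace is discarded rather than rendered.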
  • each building frame type may be associated with one or more options.
  • the options may be associated with the building frame type, with the different phases or modes, or with one or more components of the building frame.
  • the user interface mechanism may then permit the selection of any of the one or more options in accordance with the building frame type, phase/mode, or component selected.
  • constraints associated with a building frame type, a phase/mode, or a component may restrict the options otherwise applicable for selection.
  • step 303 the generated building frame is displayed on the device 100.
  • the user input may be displayed to the user on the device 100 as input is provided.
  • a rough line may be displayed corresponding to user's input before steps 302 and 303 take place.
  • the generated building frame may displace the rough line when displayed in step 303.
  • the user interface mechanism may provide for alternative views of the generated building frame to be displayed.
  • Alternative views may be displayed upon selection of an action by the user.
  • an alternative view may be the display of the generated building frame from the outside rather than inside. This may, for example, assist in the visualisation of the generated building frame by the user.
  • Embodiments of the present invention will now be described with reference to Figures 4 to 12. These embodiments provide a method and system to enable a user to specify and configure a range of windows and doors in a customisable way from within a browser of their user device.
  • the web application executing within the browser is configured to recognise/detect how a user has traced a path on the screen of their device.
  • the application then generates a photorealistic graphic based on the size, position and shape that has been drawn and in accordance with constraints which may be defined for a specific product.
  • the only technology required to run the application is a machine running an HTML5-enabled web browser. This includes, for example, typical smart-phones, tablets, PCs, Macs and hybrids.
  • the browser could be any of the common types: Chrome, Internet Explorer, Safari, Firefox, Opera. More specifically, in this embodiment, the browser is to support the Canvas element (http://www.w3.org/TR/2009/WD-html5-20090825/the-canvas-element.html). In addition, in this embodiment, the browser is to have JavaScript enabled.
  • the application also utilises the Fabric JavaScript graphics library which is a collection of functions used to aid in the creation of graphics for the HTML5 canvas element and SVG shapes.
  • on a touch device, the user can then trace a path which is rendered on the screen as it is traced (a mouse can be used on a non-touch device).
  • the user stops tracing and, if the path is recognised, it is automatically replaced with a realistic graphical representation of the basic window shape which has been drawn, e.g. usually a rectangular frame.
  • the user then continues to trace more paths on top of the image to add further frame sections such as mullions and transoms.
  • Each path drawn, if recognised, is positioned intelligently and symmetrically in the assumed correct place within the basic frame and rendered realistically. In this way the user can specify the configuration of the window/door by replicating the traditional method of using pen and paper. More advantageously the item is now captured digitally, rendered accurately and can be shown to others.
  • the item is then centred on the screen, with default dimensions shown, along with a set of tabs which display the 'configuration modes' possible.
  • the user is in 'Sashes mode' by default and is invited to add any sash openers by touching (or clicking) any of the panes bounded by any of the mullions, transoms or frame.
  • a realistic graphical representation of the opener is then rendered in the correct position.
  • the user can then touch any existing sash to change its orientation or to remove it.
  • the user can choose either of the major dimensions shown (height and width) and change the overall size of the item.
  • the image is then resized and rescaled appropriately according to the canvas size currently available to the user. The rescaling is applied precisely to all components of the item (e.g. thickness of frame, size of handle, etc) so that it retains its overall accurate proportional graphical representation.
  • the user continues to specify the full feature set and options for the item by selecting any of the configuration modes, for example: Sashes, Frame, Colour, Glass, Glazing, Hardware.
  • the application is intuitive in that it responds to where the user touches the item and, depending which mode is selected, presents a set of features and options to choose and change. As the user interacts the graphical representation and pricing is updated according to the choices made and any pre-configured values and rules.
  • Modes can be entered and exited as often as is required to complete the full specification of the required item and to support any temporary changes of options.
  • the route of the corresponding URL is used to send a request to the server.
  • the server responds with a JSON payload which consists of all the data necessary to configure the product.
  • an exemplary JSON payload is shown below:
  • the application assigns this data to a model for the product.
  • the scaling is used to represent actual mm in equivalent screen pixels (px). Firstly, the viewport width and height available for the current device (measured in px) are detected by the application.
  • a scaling value is initialised, dependent on the viewport width, for example, as follows:
  • the maximum scale is then limited to 1 and the minimum limited to 0.3: a typical viewport width of 1,068px gives a scaling of 0.356.
  • 0.356px represents 1 mm
  • 1,000px represents ~2,809mm.
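  • the scaling initialisation above can be sketched as follows. The divisor of 3000 is an assumption, chosen only because it reproduces the quoted example (a 1,068px viewport giving a scaling of 0.356); the clamp limits of 1 and 0.3 are taken from the text.

```javascript
// Sketch of the viewport scaling: screen px per actual mm.
function initScale(viewportWidthPx) {
  var scale = viewportWidthPx / 3000; // assumed formula (reproduces the example)
  return Math.min(1, Math.max(0.3, scale)); // max scale 1, min scale 0.3
}

var s = initScale(1068);    // 0.356, i.e. 0.356px represents 1mm
var mmPer1000px = 1000 / s; // so 1,000px represents ~2,809mm
```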
  • the application defines a header and footer, each 60px high, positioned at the top and bottom of the screen in the browser displayed on the user's device.
  • the header may be used for a retailer's logo at top left, and some navigational links on the right.
  • the footer is used for various control objects, as well as buttons for saving and cancelling.
  • the rest of the space is initialized by the application as a canvas area.
  • a background image is loaded to tile the entire canvas, for example: a blue graph paper.
  • Each major square on the graph paper image measures 50px x 50px.
  • Each minor square on the graph paper image measures 10px x 10px.
  • the application uses FabricJS, an open source library, to provide an interactive object model for the HTML5 ⁇ canvas> element.
  • the application can set some appropriate options for the cursor style, brush colour and brush width:
  • the input device can be either a mouse, finger or digital pen.
  • W3C standards define an Open Web Platform for application development to enable developers to build rich interactive experiences.
  • the path information is stored as a sequence of coordinates and commands.
  • M 100,200 L 200,400 L 300,200 z corresponds to a triangle with vertices (100, 200), (200, 400) and (300, 200).
  • the M indicates a moveto
  • the L indicates a lineto
  • the z indicates a closepath.
  • the user is drawing freehand which can be defined as a sequence of quadratic Bezier segments, eg:
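  • a minimal parser for the absolute M/L/z path data shown in the triangle example above can be sketched as follows; this is an illustration of how such path strings decompose into vertices, not the application's actual code (which handles quadratic Bezier segments as well).

```javascript
// Extracts the vertices from simple absolute "M x,y L x,y ... z" path data.
function pathVertices(d) {
  var vertices = [];
  var re = /([MLz])\s*(?:(-?\d+(?:\.\d+)?),(-?\d+(?:\.\d+)?))?/g;
  var m;
  while ((m = re.exec(d)) !== null) {
    if (m[1] === 'M' || m[1] === 'L') {
      // moveto and lineto both contribute a vertex
      vertices.push([parseFloat(m[2]), parseFloat(m[3])]);
    }
    // 'z' (closepath) adds no vertex; it closes back to the start
  }
  return vertices;
}
```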
  • the application uses the following event listener to store information about each path traced :
  • 100px is an optimal value that ignores any mistaken path traces but supports users who may be using a small device such as a phone. It will be appreciated that different threshold values could be used, for example, to support different device sizes or input modalities.
  • the start and end points of the path must be within a threshold of 100px of each other. This ensures that the user is intending to finish where they started (i.e. to complete the drawing of a frame).
  • the nominal value of 100px allows for a natural margin of error whilst still enforcing completeness. It will be appreciated that the threshold value is exemplary and similar threshold values may be used, or alternative threshold values to support other devices/inputs may be used.
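  • the closure test above can be sketched as follows. Whether the application measures straight-line distance or per-axis offsets is not stated, so Euclidean distance is assumed here, and the helper name is illustrative.

```javascript
// A trace only counts as a completed frame outline if it ends within the
// nominal 100px threshold of where it started.
var CLOSE_THRESHOLD = 100; // px

function isPathClosed(points, threshold) {
  threshold = threshold || CLOSE_THRESHOLD;
  var first = points[0];
  var last = points[points.length - 1];
  var dx = last[0] - first[0];
  var dy = last[1] - first[1];
  return Math.sqrt(dx * dx + dy * dy) <= threshold; // assumed Euclidean check
}
```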
  • the first pair of coordinates signifies the control point of the curve and the second pair defines the end point the path passes through.
  • the application considers the change in x (dx) and the change in y (dy) of each end point compared to the previous point.
  • the decision breakpoint should be set at 50%.
  • the application can also analyse the magnitude and polarity * of the dx and dy pairs, as follows:
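  • the dx/dy analysis can be sketched as follows. The 50% decision breakpoint is interpreted here as "a segment counts as horizontal when |dy| is less than half of |dx|" (and vice versa); this interpretation, and the function name, are assumptions.

```javascript
// Classifies a segment by comparing the magnitude (and implicitly polarity)
// of its dx and dy, using the 50% decision breakpoint described above.
function classifySegment(prev, point) {
  var dx = point[0] - prev[0];
  var dy = point[1] - prev[1];
  if (Math.abs(dy) < 0.5 * Math.abs(dx)) return 'horizontal';
  if (Math.abs(dx) < 0.5 * Math.abs(dy)) return 'vertical';
  return 'diagonal'; // neither axis dominates
}
```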
  • the application uses the default properties set in the manufacturer's product data model which was loaded when page was initialized.
  • p.y2 = p.transY + p.fh - p.bfrnotbd;
  • lockMovementY: true, selectable: true
  • the user may want to add one or more vertical lines, corresponding to mullions.
  • the application captures each complete path trace for analysis and, if certain conditions are met (outlined below), the canvas is reloaded with the new graphic corresponding to a photorealistic graphical interpretation.
  • the first condition is that any new trace must start and end within a nominal tolerance of 50px of the basic frame. It will be appreciated that alternative tolerances may be used for different devices or input methods.
  • the application may use the isPathBoxed() function, shown below, which checks that the path is within the bounding box of the basic frame.
  • the application may use the function isPathVerticalLine() (described below) which uses 18° as the maximum angle the path can stray from the vertical and 50px as the minimum vertical length of the path.
  • var minHeight = 50; // -- smallest possible height --
    // -- has a height and is within 18 degrees (arctan 1/3) of vertical --
    if (pbb.height > minHeight && pbb.height > 3 * pbb.width) { return true; }
  • the application also tests for the path reaching both the top and bottom of frame, (i.e. fully spanning the basic frame).
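  • the mullion-candidate tests above can be sketched together as follows, assuming a path bounding box and frame box of the shape {top, width, height} and a 50px tolerance for the spanning test; the helper names and box shape are assumptions, not the application's actual API.

```javascript
var TOLERANCE = 50; // px, nominal tolerance

// Vertical-line test from the text: minimum height 50px and within
// ~18 degrees (arctan 1/3) of vertical.
function isPathVerticalLine(pbb) {
  var minHeight = 50;
  return pbb.height > minHeight && pbb.height > 3 * pbb.width;
}

// Spanning test: the path must reach both the top and bottom of the frame.
function spansFrameVertically(pbb, frame) {
  return Math.abs(pbb.top - frame.top) <= TOLERANCE &&
         Math.abs((pbb.top + pbb.height) - (frame.top + frame.height)) <= TOLERANCE;
}
```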
  • the application can now replace the successful path with a mullion
  • the width of the mullion section itself and its various other properties are read from the manufacturer's product data model.
  • an example of a mullion, positioned from the path shown in Figure 7b so that it divides the glass area in half, is shown in Figure 7c.
  • the application can simply add another mullion so that the glass is always divided in equal widths no matter where the vertical line was drawn. For example, if the path was drawn to the right of the first mullion (as shown in Figure 7d), or drawn to the left of the first mullion (as shown in Figure 7e), it will be replaced with equidistant mullions as shown in Figure 7f. Note that in each redraw the complete item is reloaded from scratch so that any pre-existing sections' positions can be adjusted.
  • the user can continue drawing as many full mullions as desired and the application will always position them equally within the basic frame.
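  • the equidistant placement above can be sketched as follows: however many mullions are drawn, and wherever the vertical lines were traced, n mullions divide the glass width into n + 1 equal panes. The centre-line representation is an illustrative assumption.

```javascript
// Returns the centre-line x positions for n equidistant mullions
// across a glass area starting at frameLeft with width glassWidth.
function mullionPositions(frameLeft, glassWidth, mullionCount) {
  var positions = [];
  for (var i = 1; i <= mullionCount; i++) {
    positions.push(frameLeft + glassWidth * i / (mullionCount + 1));
  }
  return positions;
}
```

  • on each redraw the complete item is reloaded, so recomputing all positions from scratch (as here) matches the behaviour where pre-existing sections are adjusted.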
  • the transom(s) can be traced before or after the mullion(s).
  • constraints include two transoms positioned either 1/3 from the top or 1/3 from the bottom as a default. These constraints are driven from the manufacturer's product data model and can of course differ. For example, cottage casement windows never have transoms, so the path trace would simply be removed by the application if the user tried to draw one.
  • a user-drawn path for a transom after a first mullion is shown in Figure 8a.
  • the application can support a range of window design layouts; the layouts shown and described herein are exemplary. The possible designs are read from the manufacturer's product data, and the application will not allow the user to draw layouts that cannot be manufactured.
  • the product data defines constraints that apply when processing the user input to generate the building frames for visualisation. This is facilitated by labelling each of the designs and then mapping these labels to the manufacturer's labelling system.
  • the application can continue to analyse path traces over what has been already represented graphically.
  • the user can submit the design within the application as being completed.
  • the user can now touch or click on various sections of the product and the response will depend on which mode is currently selected, as explained in more detail below.
  • the sizes of the sash sections are determined from the product data model.
  • the default direction of the opener is determined and is dependent on which pane was clicked on.
  • the sash is set to open from the top as the default.
  • the user can touch or click on the sash again to reconfigure its opener direction, or remove the opener completely.
  • an initial design is shown in "Sashes Mode" in Figure 10a.
  • a top hung opener has been added to this design, opening to the top as a default.
  • the top hung opener can have its opening direction changed as shown in Figure 10c.
  • Figure 10d the top hung opener is now opening to the left.
  • in Glazing mode, the user is invited by the application to draw paths on the product to represent where the glazing bars (or astragals) should be located.
  • the application firstly uses the same criteria as above to decide if a line is mainly vertical or mainly horizontal (or neither):
  • the application then analyses the bounding box of the path to determine which pane(s) the glazing bars are being drawn across by:
  • (pbb.top > pane.top && pbb.top < pane.top + pane.height / 2) // -- path starts in top half of pane --
    || (pbb.bottom > pane.top + pane.height / 2 && pbb.bottom < pane.top + pane.height) // -- OR path ends in bottom half of pane --
    || (pbb.top < pane.top && pbb.bottom > pane.top + pane.height) // -- OR middle of path is in pane --
  • the application then adds the glazing bar to the pane, symmetrically placed as to any others that have already been added.
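  • the pane hit-test can be sketched as a self-contained function, reconstructed from the condition fragment above and assuming a path bounding box pbb = {top, bottom} and pane = {top, height}; the field and function names are assumptions.

```javascript
// A vertically drawn glazing-bar path counts as crossing a pane if it starts
// in the pane's top half, ends in its bottom half, or straddles the pane.
function pathCrossesPane(pbb, pane) {
  var halfway = pane.top + pane.height / 2;
  var paneBottom = pane.top + pane.height;
  var startsInTopHalf = pbb.top > pane.top && pbb.top < halfway;
  var endsInBottomHalf = pbb.bottom > halfway && pbb.bottom < paneBottom;
  var straddlesPane = pbb.top < pane.top && pbb.bottom > paneBottom;
  return startsInTopHalf || endsInBottomHalf || straddlesPane;
}
```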
  • the width of the glazing bar is determined from the product data model.
  • the first horizontally drawn path spans two panes as shown in Figure 11a. This results in the symmetric addition of glazing bars as shown in Figure 11b, taking account of any sashes already present.
  • another horizontally drawn path spans four panes as shown in Figure 11c, and the glazing bars are symmetrically added in the same way as shown in Figure 11d.
  • a vertically drawn path spans two panes as shown in Figure 11e, and the glazing bars are symmetrically added again as shown in Figure 11f.
  • the modes are fully independent so a user can freely switch between them.
  • Any changes made to the product configuration may be automatically saved to the server. This allows the user to return later and reload a product they are working on.
  • the user can also copy the current product to use as the starting point for a similar product they want to design and configure.
  • the price of each component may be delivered in the product data model as set by the manufacturer. Each component added and configured may have an impact on the pricing.
  • the component pricing can handle a combination of price per unit and price per length or area.
  • the total product price may be recalculated and presented to the user.
  • Pricing rules may be implemented where certain combinations of components infer a pricing uplift.
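  • the pricing behaviour above can be sketched as follows, combining price-per-unit with price-per-length and price-per-area contributions; the field names in the component records are assumptions, not the manufacturer's actual data model.

```javascript
// Price of one configured component: per-unit plus optional
// per-length (mm) and per-area (square metre) contributions.
function componentPrice(c) {
  var price = c.pricePerUnit || 0;
  if (c.pricePerMm && c.lengthMm) price += c.pricePerMm * c.lengthMm;
  if (c.pricePerSqM && c.areaSqM) price += c.pricePerSqM * c.areaSqM;
  return price;
}

// Total product price, recalculated as components are added and configured.
function totalPrice(components) {
  return components.reduce(function (sum, c) {
    return sum + componentPrice(c);
  }, 0);
}
```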
  • a potential advantage of some embodiments of the present invention is that, by processing input in accordance with stored constraints, building frames can be defined by individuals without specific construction or architectural knowledge.


Abstract

The present invention relates to a computer-implemented method of defining building frames within a user interface. The method includes the steps of: receiving input from a user at an input interface to define a frame; processing the input in accordance with constraints to generate a frame; and displaying the frame to the user. A device is also disclosed.
PCT/GB2017/050321 2016-02-08 2017-02-08 Designing door and window frames by recognising and beautifying hand-drawn strokes on a touch screen WO2017137747A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1602256.8 2016-02-08
GB1602256.8A GB2548323B (en) 2016-02-08 2016-02-08 A user interface mechanism

Publications (1)

Publication Number Publication Date
WO2017137747A1 true WO2017137747A1 (fr) 2017-08-17

Family

ID=56297178

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2017/050321 WO2017137747A1 (fr) 2016-02-08 2017-02-08 Designing door and window frames by recognising and beautifying hand-drawn strokes on a touch screen

Country Status (2)

Country Link
GB (1) GB2548323B (fr)
WO (1) WO2017137747A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111125806A (zh) * 2018-10-12 2020-05-08 阿里巴巴集团控股有限公司 House decoration information processing method, apparatus and system

Citations (4)

Publication number Priority date Publication date Assignee Title
US5870106A (en) * 1996-10-15 1999-02-09 Langelaan; J. Willem R. Computer graphics system for modelling objects which include segments that are bounded by linear radial and elliptical tangential elements
US20020103557A1 (en) * 2000-10-04 2002-08-01 Habersham Metal Products Company Design tool systems and methods, and user interface
EP1519300A2 (fr) * 2003-09-24 2005-03-30 Microsoft Corporation Reconnaissance de forme des objets tracés à la main
US20130188877A1 (en) * 2012-01-24 2013-07-25 Microsoft Corporation Sketch beautification and completion of partial structured-drawings


Non-Patent Citations (2)

Title
ANONYMOUS: "Modeling Windows and Doors", 1 January 2001 (2001-01-01), XP055369484, Retrieved from the Internet <URL:https://www.datacad.com/support/down/DataCAD_10_Manual/DCXL28.pdf> [retrieved on 20170504] *
JAMES ARVO ET AL: "Fluid sketches", PROCEEDINGS OF THE 2000 ACM SIGCPR CONFERENCE. CHICAGO. IL, APRIL 6 - 8, 2000; [ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY], NEW YORK, NY : ACM, US, 1 November 2000 (2000-11-01), pages 73 - 80, XP058180284, ISBN: 978-1-58113-212-0, DOI: 10.1145/354401.354413 *

Also Published As

Publication number Publication date
GB2548323B (en) 2021-11-17
GB2548323A (en) 2017-09-20
GB201602256D0 (en) 2016-06-22

Similar Documents

Publication Publication Date Title
US7441202B2 (en) Spatial multiplexing to mediate direct-touch input on large displays
US9836192B2 (en) Identifying and displaying overlay markers for voice command user interface
US8997025B2 (en) Method, system and computer readable medium for document visualization with interactive folding gesture technique on a multi-touch display
US9367199B2 (en) Dynamical and smart positioning of help overlay graphics in a formation of user interface elements
US9354707B2 (en) Combination color and pen palette for electronic drawings
CN102945557B (zh) Vector on-site map drawing method based on a mobile terminal
US20040257346A1 (en) Content selection and handling
US20120272144A1 (en) Compact control menu for touch-enabled command execution
US20110115814A1 (en) Gesture-controlled data visualization
EP3491506B1 (fr) Systems and methods for a touch-screen user interface for a collaborative editing tool
WO2016145832A1 (fr) Method of operating a terminal and device employing the same
US20120110483A1 (en) Multi-desktop management
AU2007241972A1 (en) Method and apparatus for controlling display output of multidimensional information
EP1363185A2 (fr) Method and apparatus for indicating a selection by highlighting
US10241651B2 (en) Grid-based rendering of nodes and relationships between nodes
US20190369935A1 (en) Electronic whiteboard, electronic whiteboard system and control method thereof
US11380028B2 (en) Electronic drawing with handwriting recognition
US8416237B1 (en) Perspective aware automatic guide generation
US20230342729A1 (en) Method and Apparatus for Vehicle Damage Mapping
WO2017137747A1 (fr) Designing door and window frames by recognising and beautifying hand-drawn strokes on a touch screen
JP5256755B2 (ja) Information processing method and information processing apparatus
CN104081333A (zh) Remote display area including input lenses each depicting a region of a graphical user interface
JP6127401B2 (ja) Information processing apparatus, program and information processing method
JP6945345B2 (ja) Display device, display method and program
JP4573817B2 (ja) Scroll synchronisation system and scroll synchronisation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17711723

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17711723

Country of ref document: EP

Kind code of ref document: A1