DK179255B1 - A computer implemented method for determining the actual size of a target object within a digital image/video frame - Google Patents

A computer implemented method for determining the actual size of a target object within a digital image/video frame Download PDF

Info

Publication number
DK179255B1
Authority
DK
Denmark
Prior art keywords
target object
video frame
digital image
pixel
computer
Prior art date
Application number
DKPA201600335A
Other languages
Danish (da)
Inventor
Kenneth Gjerulff Obbekjær Kring
Original Assignee
Blue Sky Tec Aps
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Blue Sky Tec Aps filed Critical Blue Sky Tec Aps
Priority to DKPA201600335A priority Critical patent/DK179255B1/en
Priority to PCT/EP2017/063492 priority patent/WO2017211726A1/en
Publication of DK201600335A1 publication Critical patent/DK201600335A1/en
Application granted granted Critical
Publication of DK179255B1 publication Critical patent/DK179255B1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a computer implemented method for determining the actual size of a target object within a digital image/video frame. The method comprises the steps of: i) obtaining a digital image/video frame containing a target object; ii) obtaining data about the digital image/video frame containing a target object, the data comprising information about pixel resolution and horizontal to vertical aspect ratio; iii) identifying a target object within the digital image/video frame; iv.a) determining boundaries of the target object from pixel data in the digital image/video frame; v) determining the pixel per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame, wherein the determination is made from a pre-calibrated curve or look-up table based on a digital image/video frame recorded of a reference object with a known size, and wherein the same pixel resolution and horizontal to vertical aspect ratio are used as for the digital image/video frame comprising the target object; and vi) calculating the actual size of a target object from the number of pixels per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame.

Description

(19) DANMARK    (10) DK 179255 B1
Figure DK179255B1_D0001
(12) PATENT SPECIFICATION (PATENTSKRIFT)
Patent- og Varemærkestyrelsen (Danish Patent and Trademark Office)
(51) Int. Cl.: G06T 7/60 (2006.01)
(21) Application number: PA 2016 00335
(22) Filing date: 2016-06-08
(24) Effective date: 2016-06-08
(41) Application made available to the public: 2017-12-09
(45) Publication of the grant of the patent: 2018-03-12
(73) Proprietor: Blue Sky Tec ApS, Nattergalevej 17, 8464 Galten, Denmark
(72) Inventor: Kenneth Gjerulff Obbekjær Kring, Nattergalevej 17, 8464 Galten, Denmark
(74) Agent: Larsen & Birkeholm A/S Skandinavisk Patentbureau, Banegårdspladsen 1, 1570 København V, Denmark
(54) Title: A computer implemented method for determining the actual size of a target object within a digital image/video frame
(56) Cited publications: US 2014/0314276 A1; US 9342900 B1; US 2014/0300722 A1; US 8885916 B1
(57) Abstract: (reproduced in the Abstract section above)
Figure DK179255B1_D0002
Fig. 2
Background of the invention
Building products or elements include products used in constructing/remodeling buildings, including non-residential commercial buildings, governmental buildings, and residential or home (single family and multi-family) buildings.
An improvement project may include a replacement of existing building products (e.g., windows, doors, siding, roof, and gutters), an addition to an existing structure, a new structure, a renovation, etc.
Selection of materials and options offered in manufactured building products is critical in the delivery of an aesthetically pleasing and satisfying improvement or project and often determines or affects the cost of the products and the project. The building products industry represents a long standing industry directed to manufacturing, delivering, and installing architectural products or elements, e.g., windows, fenestration appurtenances, doors, hardware, etc.
Building product or element manufacturers face many challenges in providing their dealers/contractors with effective sales tools and support. A further problem faced by dealers and contractors is providing the home/business owner with a representative visual representation (“visualization”) of the final configured product or completed improvement project.
US20140314276 discloses a system and method for measuring distances related to a target object depicted in an image and the construction and delivery of supplemental window materials for fenestration. A digital image is obtained that contains a target object dimension and a reference object dimension in the same plane. The digital image may contain a target object dimension identified by an ancillary object and a reference object dimension in different planes. Fiducial patterns on the reference and optional ancillary objects are used that are recognized by an image analysis algorithm. Information regarding a target object and its immediate surroundings is provided to an automated or semi-automated measurement process, design and manufacturing system such that customized parts are provided to end users. The digital image contains a reference object having a reference dimension and calculating a constraint dimension from the digital image based on a reference dimension. The custom part is then designed and manufactured based on a calculated constraint dimension.
A system is needed that allows for easy retrieval of target dimensions (such as a window frame for a window opening in a wall), for visualization of presented product offerings for a specific target, and for firm quotes generated during the sales process.
Summary of the invention
A first aspect relates to a computer implemented method for determining the actual size of a target object within a digital image/video frame comprising the steps of:
i) obtaining a digital image/video frame containing a target object;
ii) obtaining data about the digital image/video frame containing a target object, the data comprising information about pixel resolution, and horizontal to vertical aspect ratio;
iii) identifying a target object within the digital image/video frame;
iv.a) determining boundaries of the target object from pixel data in the digital image/video frame;
iv.b) correlating and/or correcting the object boundaries with a figure shape from a database;
v.a) calculating/determining the pixel per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame; and
vi) calculating/determining, based on the correlated and/or corrected figure shape, the actual size of a target object from the number of pixels per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame.
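For illustration only, the following Python sketch maps steps i) to vi) of the first aspect onto a single function. It is not the claimed implementation: the use of OpenCV edge detection, the choice of the largest contour as the target object, the axis-aligned bounding rectangle standing in for the figure shape of step iv.b), and the caller-supplied pixels-per-millimetre scale are all assumptions made to keep the example self-contained.

```python
# Illustrative sketch of steps i)-vi); all names (measure_target, px_per_mm, ...) are
# hypothetical and the concrete image operations are assumptions, not the claimed method.
import cv2  # OpenCV 4 (assumed available)


def measure_target(image_path: str, px_per_mm: float) -> tuple[float, float]:
    """Estimate the (width_mm, height_mm) of the dominant object in the frame."""
    img = cv2.imread(image_path)                        # i) obtain the digital image/video frame
    if img is None:
        raise FileNotFoundError(image_path)
    height_px, width_px = img.shape[:2]                 # ii) pixel resolution ...
    aspect_ratio = width_px / height_px                 #     ... and horizontal to vertical aspect ratio

    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)        # iii) + iv.a) identify the target object and
    edges = cv2.Canny(gray, 50, 150)                    # determine its boundaries from pixel data
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    target = max(contours, key=cv2.contourArea)         # assumption: the largest contour is the target

    x, y, w, h = cv2.boundingRect(target)               # iv.b) a plain rectangle stands in here for
                                                        # the "figure shape from a database"

    # v.a) the pixels-per-length value is supplied by the caller in this sketch; in the
    # second aspect it would instead come from a pre-calibrated curve or look-up table.
    # vi) actual size from the corrected shape and the scale
    return w / px_per_mm, h / px_per_mm


# Hypothetical usage: a frame in which 8.5 pixels correspond to one millimetre.
# width_mm, height_mm = measure_target("window.jpg", px_per_mm=8.5)
```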
Another aspect relates to a data processing apparatus comprising means for carrying out the method of the present invention.
Yet another aspect relates to a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of the present invention.
A second aspect relates to a computer implemented method for determining the actual size of a target object within a digital image/video frame comprising the steps of:
i) obtaining a digital image/video frame containing a target object;
ii) obtaining data about the digital image/video frame containing a target object, the data comprising information about pixel resolution, and horizontal to vertical aspect ratio;
iii) identifying a target object within the digital image/video frame;
iv.a) determining boundaries of the target object from pixel data in the digital image/video frame;
iv.b) correlating and/or correcting the object boundaries with a figure shape from a database;
v) determining the pixel per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame, wherein the determination is made from a pre-calibrated curve or look-up table based on a digital image/video frame recorded of a reference object with a known size, and wherein the same pixel resolution and horizontal to vertical aspect ratio are used as for the digital image/video frame comprising the target object; and
vi) calculating/determining, based on the correlated and/or corrected figure shape, the actual size of a target object from the number of pixels per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame.
In one or more embodiments, step vi) is calculated/determined from a pre-calibrated curve or look-up table based on a digital image/video frame recorded of a reference object with a known size, wherein the same pixel resolution and horizontal to vertical aspect ratio are used as for the digital image/video frame comprising the target object.
A “target object” of the present invention refers to an object in the digital image/video frame having a constraint dimension that is measured by one or more methods of the present invention. In describing the present invention, “constraint dimension” refers to a measured portion or a multiple of a measured portion of a target object to which a designed part is to conform and a “constraint pixel dimension” refers to the length of a constraint dimension measured in pixels. A target object may contain a “symmetry element” which in the present invention refers to an aspect of the target object that in standard practice resides at a position within the target object such that the symmetry element divides a constraint dimension in an integer number of equal parts.
As will be appreciated by one skilled in the art, one or more aspects or embodiments of the present invention may be embodied as a system, method, computer program product or any combination thereof. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
The invention or portions thereof may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Any combination of one or more computer usable or computer readable medium(s) may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CDROM), optical storage device or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, Swift, C++,
C# or the like and conventional procedural programming languages, such as the C programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
In one or more embodiments, image-processing algorithms are used to identify a target object within the digital image/video frame. Image-processing algorithms are used to decipher certain attributes of the captured image frame. The processor uses image-processing algorithms to detect one or more discernable objects in the image frame and attempts to identify them. For example, the image processing may use edge detection techniques to identify one or more objects in the captured image. For each detected object, identification algorithms are used to determine the likely identity of the object. Any number of techniques might be used for such a task. For example, the object might be normalized and compared to a database of possible objects using geometric and/or size analysis. If an object is viewed askew or at an angle, a normalization routine might rotate it and compensate for skew to result in a rectangular object. The features of the image object can then be compared to a database of known rectangular objects having similar dimensional relationships (e.g., ratio of length to width), and the identity of the object can be determined. Other techniques, such as morphological filters, look-up tables, trained artificial neural networks, thresholding, or an object repository of learned objects, may be used as well.
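The sketch below illustrates the edge-detection and database-comparison idea described above; it is not the patent's own code. The object "database", the area threshold, and the ratio tolerance are invented for illustration, and a minimum-area rectangle fit stands in for the normalization step that compensates for skew.

```python
# Sketch only: the object "database" and all thresholds are invented for illustration.
import cv2

# Hypothetical database of known rectangular objects and their width:height ratios.
KNOWN_OBJECTS = {
    "credit card": 85.6 / 53.98,
    "A4 sheet": 297.0 / 210.0,
    "EUR pallet": 1200.0 / 800.0,
}


def identify_objects(image_path: str, ratio_tolerance: float = 0.08) -> list[tuple[str, float]]:
    """Return (label, fitted_ratio) pairs for contours whose normalized shape matches the database."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    matches = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 500:                  # ignore tiny speckles (arbitrary threshold)
            continue
        # "Normalize" a possibly skewed object: the minimum-area rectangle compensates for rotation.
        (_, _), (w, h), _ = cv2.minAreaRect(cnt)
        if w == 0 or h == 0:
            continue
        ratio = max(w, h) / min(w, h)
        for label, known_ratio in KNOWN_OBJECTS.items():
            if abs(ratio - known_ratio) / known_ratio < ratio_tolerance:
                matches.append((label, ratio))
    return matches
```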
The content of the image frame may in some embodiments be deciphered by processing the frame for edge pattern detection. The processed edge pattern is classified by artificial neural networks that have been trained on a list of known objects, by a look-up table, or by a threshold. Once the pattern is classified, a descriptive sentence is constructed consisting of the object and certain of its attributes.
In other embodiments, a graphical user interface for interactively selecting a target object is used to identify a target object within the digital image/video frame.
The computer comprises a processor and a memory coupled to the processor. The memory comprises program instructions implementing a graphical user interface for interactively selecting a target object. The program instructions are executable by the processor for:
- displaying the digital image/video frame; and
- processing a user input routine that processes selection of a control accepting manual selection by a user of a target object. Preferably, the manual selection of the target object is made directly on the digital image/video frame at one or more particular points on said digital image/video frame.
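As a minimal stand-in for such a graphical user interface (assuming a Matplotlib GUI backend is available; the function name and the two-corner selection scheme are illustrative choices, not the patent's implementation):

```python
# Sketch: the user clicks two opposite corners of the target object on the displayed frame.
import cv2
import matplotlib.pyplot as plt


def select_target(image_path: str) -> tuple[int, int, int, int]:
    """Display the frame and return the manually selected box as (x, y, width, height) in pixels."""
    img = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2RGB)
    plt.imshow(img)
    plt.title("Click two opposite corners of the target object")
    (x0, y0), (x1, y1) = plt.ginput(2, timeout=0)       # blocks until two points are clicked
    plt.close()
    x, y = int(min(x0, x1)), int(min(y0, y1))
    return x, y, int(abs(x1 - x0)), int(abs(y1 - y0))
```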
The computer/computing device may comprise a central processing unit (CPU), a host/PCI/cache bridge, and a main memory.
The CPU may comprise one or more general purpose CPU cores and optionally one or more special purpose cores (e.g., DSP core, floating point, etc.). The one or more general purpose cores execute general purpose opcodes, while the special purpose cores execute functions specific to their purpose. The CPU is coupled through the CPU local bus to a host/PCI/cache bridge or chipset. A second level (i.e. L2) cache memory may be coupled to a cache controller in the chipset. For some processors, the external cache may comprise an L1 or first level cache. The bridge or chipset couples to main memory via a memory bus. The main memory comprises dynamic random access memory (DRAM) or extended data out (EDO) memory, or other types of memory such as ROM, static RAM, flash, and non-volatile static random access memory (NVSRAM), bubble memory, etc.
The computing device may also comprise various system components coupled to the CPU via a system bus (e.g., PCI). The host/PCI/cache bridge or chipset interfaces to the system bus, such as a peripheral component interconnect (PCI) bus. The system bus may comprise any of several types of well-known bus structures using any of a variety of bus architectures. Example architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus and Peripheral Component Interconnect (PCI), also known as Mezzanine bus.
Various components connected to the system bus include, but are not limited to, a non-volatile memory (e.g., disk based data storage), a video/graphics adapter connected to a display, a user input interface (I/F) controller connected to one or more input devices such as a mouse, tablet, microphone, keyboard and modem, network interface controller, and/or a peripheral interface controller connected to one or more external peripherals, such as printer and speakers. The network interface controller is coupled to one or more devices, such as data storage, remote computer running one or more remote applications, via a network, which may comprise the Internet cloud, a local area network (LAN), wide area network (WAN), storage area network (SAN), etc. A small computer systems interface (SCSI) adapter may also be coupled to the system bus. The SCSI adapter can couple to various SCSI devices such as a CD-ROM drive, tape drive, etc.
The non-volatile memory may include various removable/non-removable, volatile/nonvolatile computer storage media, such as hard disk drives that read from or write to non-removable, nonvolatile magnetic media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
A user may enter commands and information into the computer through input devices connected to the user input interface. Examples of input devices include a keyboard and pointing device, mouse, trackball or touch pad. Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, etc.
The computer may operate in a networked environment via connections to one or more remote computers, such as a remote computer. The remote computer may comprise a personal computer (PC), server, router, network PC, peer device or other common network node, and typically includes many or all of the elements described supra. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
When used in a LAN networking environment, the computer is connected to the LAN via a network interface. When used in a WAN networking environment, the computer includes a modem or other means for establishing communications over the WAN, such as the Internet. The modem, which may be internal or external, is connected to the system bus via a user input interface, or other appropriate mechanism.
In one embodiment, the software adapted to implement the system and methods of the present invention can also reside in the cloud. Cloud computing provides computation, software, data access and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Cloud computing encompasses any subscription-based or pay-per-use service and typically involves provisioning of dynamically scalable and often virtualized resources. Cloud computing providers deliver applications via the Internet, which can be accessed from a web browser, while the business software and data are stored on servers at a remote location.
In another embodiment, software adapted to implement the system and methods of the present invention is adapted to reside on a tangible, non-transitory computer readable medium. Computer readable media can be any available media that can be accessed by the computer and capable of storing for later reading by a computer a computer program implementing the method of this invention. Computer readable media includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Communication media typically embodies computer readable instructions, data structures, program modules or other data such as a magnetic disk within a disk drive unit. The software adapted to implement the system and methods of the present invention may also reside, in whole or in part, in the static or dynamic main memories or in firmware within the processor of the computer system (i.e. within microcontroller, microprocessor or microcomputer internal memory).
It is noted that computer programs implementing the system and methods of this invention will commonly be distributed to users via Internet download or on a distribution medium such as floppy disk, CDROM, DVD, flash memory, portable hard disk drive, etc. From there, they will often be copied to a hard disk or a similar intermediate storage medium. When the programs are to be run, they will be loaded either from their distribution medium or their intermediate storage medium into the execution memory of the computer, configuring the computer to act in accordance with the method of this invention. All these operations are well known to those skilled in the art of computer systems.
The computer implemented method comprises the step of iv.b) correlating and/or correcting the object boundaries with a figure shape from a database; and wherein step vi) is based on the correlated and/or corrected figure shape. This is to refine the calculation by removing possible pixel errors or indirectly removing objects covering parts of the target object.
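One possible reading of step iv.b) is sketched below: a small database of ideal figure shapes is kept as template contours, cv2.matchShapes decides which template the detected boundary resembles most, and the noisy boundary is then replaced by the fitted ideal shape. The shape database, the restriction to rectangles and circles, and the fitting choices are assumptions made for illustration.

```python
# Sketch: snap a noisy detected boundary to the best-matching ideal shape from a tiny "database".
import cv2
import numpy as np


def _template_contour(shape: str) -> np.ndarray:
    """Build an ideal contour for the named shape on a blank canvas."""
    canvas = np.zeros((200, 200), dtype=np.uint8)
    if shape == "rectangle":
        cv2.rectangle(canvas, (40, 60), (160, 140), 255, thickness=-1)
    elif shape == "circle":
        cv2.circle(canvas, (100, 100), 60, 255, thickness=-1)
    contours, _ = cv2.findContours(canvas, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return contours[0]


SHAPE_DB = {name: _template_contour(name) for name in ("rectangle", "circle")}


def correct_boundary(boundary: np.ndarray) -> np.ndarray:
    """Return the detected boundary snapped to the closest figure shape in SHAPE_DB."""
    scores = {name: cv2.matchShapes(boundary, tpl, cv2.CONTOURS_MATCH_I1, 0.0)
              for name, tpl in SHAPE_DB.items()}
    best = min(scores, key=scores.get)
    if best == "rectangle":
        box = cv2.boxPoints(cv2.minAreaRect(boundary))   # four corners of the fitted rectangle
        return box.astype(np.int32).reshape(-1, 1, 2)
    (cx, cy), r = cv2.minEnclosingCircle(boundary)       # fitted circle, sampled as a polygon
    angles = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    pts = np.stack([cx + r * np.cos(angles), cy + r * np.sin(angles)], axis=1)
    return pts.astype(np.int32).reshape(-1, 1, 2)
```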
In one or more embodiments, the computer implemented method further comprises the steps of: v.b) adjusting the size of the target object in the digital image/video frame to contain as many pixels as possible within the pixel resolution limits of the original digital image/video frame, while retaining its original horizontal to vertical aspect ratio; and v.c) adjusting the pixel per arbitrary length unit or arbitrary length unit per pixel of the actual target object, calculated/determined in step v.a), in the horizontal and/or vertical direction of the digital image/video frame to the resized target object. This adjustment makes it easier to calculate/determine the actual size of the target object.
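Under one plausible interpretation of steps v.b) and v.c), sketched below, the cropped target is upscaled by the largest factor that still fits within the original frame's pixel resolution; because both axes are scaled by the same factor, the aspect ratio is preserved, and the previously determined pixels-per-length value is simply multiplied by that factor. The function name and the interpolation choice are assumptions.

```python
# Sketch of v.b)/v.c): enlarge the target crop as far as the original resolution allows,
# keep the aspect ratio, and rescale the pixels-per-length value by the same factor.
import cv2
import numpy as np


def maximize_target(target_crop: np.ndarray, frame_w: int, frame_h: int,
                    px_per_mm: float) -> tuple[np.ndarray, float]:
    h, w = target_crop.shape[:2]
    scale = min(frame_w / w, frame_h / h)               # one factor for both axes keeps the aspect ratio
    resized = cv2.resize(target_crop, (int(w * scale), int(h * scale)),
                         interpolation=cv2.INTER_CUBIC)
    return resized, px_per_mm * scale                   # v.c): the resized object now has scale-times
                                                        # more pixels per millimetre than the original crop
```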
In one or more embodiments, step v.a) is calculated/determined by using machine learning algorithms.
A third aspect relates to an apparatus for determining the actual size of a target object within a digital image/video frame comprising:
- a processor; and
- a memory coupled to the processor, wherein the memory comprises program instructions, and wherein the program instructions are executable by the processor for:
i) obtaining a digital image/video frame containing a target object;
ii) obtaining data about the digital image/video frame containing a target object, the data comprising information about pixel resolution, and horizontal to vertical aspect ratio;
iii) identifying a target object within the digital image/video frame;
iv.a) determining boundaries of the target object from pixel data in the digital image/video frame;
iv.b) correlating and/or correcting the object boundaries with a figure shape from a database;
v.a) calculating/determining the pixel per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame; and
vi) calculating/determining, based on the correlated and/or corrected figure shape, the actual size of a target object from the number of pixels per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame.
A fourth aspect relates to a computer program product for determining the actual size of a target object within a digital image/video frame, the computer program product comprising a readable memory device having computer readable program code stored thereon, including program code which, when executed, causes one or more processors to perform the steps of:
i) obtaining a digital image/video frame containing a target object;
ii) obtaining data about the digital image/video frame containing a target object, the data comprising information about pixel resolution, and horizontal to vertical aspect ratio;
iii) identifying a target object within the digital image/video frame;
iv.a) determining boundaries of the target object from pixel data in the digital image/video frame;
iv.b) correlating and/or correcting the object boundaries with a figure shape from a database;
v.a) calculating/determining the pixel per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame; and
vi) calculating/determining, based on the correlated and/or corrected figure shape, the actual size of a target object from the number of pixels per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame.
It should be noted that embodiments and features described in the context of one of the aspects of the present invention also apply to the other aspects of the invention.
Brief description of the figures
Figure 1 shows a digital image containing a target object that has already been identified, and its boundaries determined;
Figure 2 shows that the minimum distance (Lmin) from which a target object can be photographed depends on its size;
Figure 3 shows exemplary part of a look-up table; and
Figure 4 shows the target object in Figure 1, adjusted to contain as many pixels as possible within the pixel resolution limits of the original digital image/video frame.
Detailed description of the invention
Figure 1 shows a digital image containing a target object that has already been identified, and its boundaries determined.
As shown in Figure 2, the minimum distance (Lmin) from which a target object can be photographed depends on its size. At the same time, the number of pixels used to define the objects is the same. This observation is crucial for the present invention, as objects of different sizes will be defined by a different number of pixels per arbitrary length unit (e.g. per mm) of the actual object. Hence, fewer pixels are available per unit area of the object for showing details in a larger object than in a smaller object. The lack or presence of details may be used to determine the number of pixels per arbitrary length unit (e.g. per mm) of the actual object by using machine learning methods.
When calculated/determined, the number of pixels per arbitrary length unit (e.g. per mm) of the actual object is used to calculate/determine the actual size of the target object. The calculation/determination is made from a pre-calibrated curve or look-up table based on a digital image/video frame recorded of a reference object with a known size, wherein the same pixel resolution and horizontal to vertical aspect ratio are used as for the digital image/video frame comprising the target object. An exemplary part of a look-up table is shown in Figure 3. In a preferred embodiment, the size of the target object is adjusted to contain as many pixels as possible within the pixel resolution limits of the original digital image/video frame (Figure 4), while retaining its original horizontal to vertical aspect ratio. The calculated/determined number of pixels per arbitrary length unit (e.g. per mm) of the actual object is then corrected accordingly. This adjustment makes it easier to calculate/determine the actual size of the target object.
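The description leaves the exact contents of the pre-calibrated curve or look-up table open, so the sketch below shows only one plausible construction: reference objects of known physical width are photographed so that they fill the frame at the same pixel resolution and aspect ratio, each calibration point pairs the resulting pixels-per-millimetre value with a crude detail score (a simple edge-density measure standing in for the machine-learning estimate mentioned above), and intermediate values are interpolated with numpy.interp. All names and choices here are illustrative assumptions.

```python
# Sketch: a toy pre-calibrated look-up "curve" mapping an edge-density score to pixels per mm.
import cv2
import numpy as np


def edge_density(gray: np.ndarray) -> float:
    """Crude stand-in for a learned detail measure: the fraction of pixels that are edge pixels."""
    edges = cv2.Canny(gray, 50, 150)
    return float(np.count_nonzero(edges)) / edges.size


def build_lut(reference_frames: list[np.ndarray],
              known_widths_mm: list[float]) -> tuple[np.ndarray, np.ndarray]:
    """Each reference object fills the frame width; store (detail score, pixels per mm) pairs."""
    scores, px_per_mm = [], []
    for frame, width_mm in zip(reference_frames, known_widths_mm):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores.append(edge_density(gray))
        px_per_mm.append(gray.shape[1] / width_mm)      # full-frame object: frame width / real width
    order = np.argsort(scores)                          # np.interp needs increasing x values
    return np.asarray(scores)[order], np.asarray(px_per_mm)[order]


def lookup_px_per_mm(target_gray: np.ndarray, lut_scores: np.ndarray,
                     lut_px_per_mm: np.ndarray) -> float:
    """Interpolate the pre-calibrated curve at the target's detail score."""
    return float(np.interp(edge_density(target_gray), lut_scores, lut_px_per_mm))
```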

Claims (7)

PATENT CLAIMS (translated from the Danish)

1. A computer-implemented method for determining the actual size of a target object in a digital image/video frame, comprising the steps of:
i) obtaining a digital image/video frame containing a target object;
ii) obtaining data about the digital image/video frame containing a target object, the data comprising information about pixel resolution and horizontal to vertical aspect ratio;
iii) identifying a target object in the digital image/video frame;
iv.a) determining the boundaries of the target object from pixel data in the digital image/video frame;
iv.b) correlating and/or correcting the object boundaries with a figure shape from a database;
v.a) calculating/determining the pixel per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame; and
vi) calculating/determining, based on the correlated and/or corrected figure shape, the actual size of a target object from the number of pixels per arbitrary length unit or the arbitrary length unit per pixel of the actual target object in the horizontal and/or vertical direction of the digital image/video frame.

2. The computer-implemented method according to claim 1, wherein step vi) is calculated/determined from a pre-calibrated curve or look-up table based on a digital image/video frame recorded of a reference object with a known size, the reference object being one or more distinguishable objects, and wherein the same pixel resolution and horizontal to vertical aspect ratio are used as for the digital image/video frame comprising the target object.

3. The computer-implemented method according to any one of claims 1-2, wherein step v.a) is calculated/determined using machine learning algorithms.

4. The computer-implemented method according to any one of claims 1-3, wherein image-processing algorithms are used to identify a target object in the digital image/video frame.

5. The computer-implemented method according to any one of claims 1-4, further comprising the steps of v.b) adjusting the size of the target object in the digital image/video frame to contain as many pixels as possible within the pixel resolution limits of the original digital image/video frame, while retaining its original horizontal to vertical aspect ratio; and v.c) adjusting the pixel per arbitrary length unit or arbitrary length unit per pixel of the actual target object, calculated/determined in step v.a), in the horizontal and/or vertical direction of the digital image/video frame to the resized target object.

6. A data processing apparatus comprising means for carrying out the method according to any one of claims 1-5.

7. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to any one of claims 1-5.

SEARCH REPORT - PATENT, Application No. PA 2016 00335
A. Classification of subject matter: G06T 7/60 (2006.01), according to the International Patent Classification (IPC) or to both national classification and IPC.
B. Fields searched: IPC & CPC classes G06T, G06K, G01C, G01B, E06B; DK, NO, SE and FI documentation in the same IPC classes; electronic databases consulted: EPODOC, WPI, full text in English.
C. Documents considered to be relevant (X: document of particular relevance; A: document defining the general state of the art):
- X: US 2014/0314276 A1 (WEXLER, R. M. et al.), 2014-10-23; see paragraphs [0005], [0011], [0121], [0141], [0157], Fig. 3 and Fig. 12; relevant for claims 1, 4, 7, 9.
- A: US 9342900 B1 (SHEKAR, B. et al.), 2016-05-17; see abstract and column 8, line 66 to column 9, line 22; relevant for claims 2, 8, 10.
- A: US 2014/0300722 A1 (GARCIA, M.), 2014-10-09; see abstract, paragraph [0188] and Fig. 4; relevant for claims 2, 8, 10.
- A: US 8885916 B1 (MAURER, J. D. et al.), 2014-11-11; see abstract; relevant for claims 1-10.
Danish Patent and Trademark Office, Helgeshøj Allé 81, DK-2630 Taastrup, Denmark. Date of completion of the search report: 12 December 2016. Authorized officer: Nikolai Bierglund Uldum.
DKPA201600335A 2016-06-08 2016-06-08 A computer implemented method for determining the actual size of a target object within a digital image/video frame DK179255B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DKPA201600335A DK179255B1 (en) 2016-06-08 2016-06-08 A computer implemented method for determining the actual size of a target object within a digital image/video frame
PCT/EP2017/063492 WO2017211726A1 (en) 2016-06-08 2017-06-02 A computer implemented method for determining the actual size of a target object within a digital image or video frame

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DKPA201600335A DK179255B1 (en) 2016-06-08 2016-06-08 A computer implemented method for determining the actual size of a target object within a digital image/video frame

Publications (2)

Publication Number Publication Date
DK201600335A1 DK201600335A1 (en) 2017-12-18
DK179255B1 true DK179255B1 (en) 2018-03-12

Family

ID=59014631

Family Applications (1)

Application Number Title Priority Date Filing Date
DKPA201600335A DK179255B1 (en) 2016-06-08 2016-06-08 A computer implemented method for determining the actual size of a target object within a digital image/video frame

Country Status (2)

Country Link
DK (1) DK179255B1 (en)
WO (1) WO2017211726A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11636235B2 (en) 2019-04-15 2023-04-25 Awi Licensing Llc Systems and methods of predicting architectural materials within a space
CN110147465A (en) * 2019-05-23 2019-08-20 上海闻泰电子科技有限公司 Image processing method, device, equipment and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140300722A1 (en) * 2011-10-19 2014-10-09 The Regents Of The University Of California Image-based measurement tools
US20140314276A1 (en) * 2013-01-07 2014-10-23 Wexenergy Innovations Llc System and method of measuring distances related to an object
US8885916B1 (en) * 2014-03-28 2014-11-11 State Farm Mutual Automobile Insurance Company System and method for automatically measuring the dimensions of and identifying the type of exterior siding
US9342900B1 (en) * 2014-12-23 2016-05-17 Ricoh Co., Ltd. Distinguishing between stock keeping units using marker based methodology

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101748180B1 (en) * 2010-12-31 2017-06-16 주식회사 케이티 Method and apparatus of measuring size of object in image


Also Published As

Publication number Publication date
WO2017211726A1 (en) 2017-12-14
DK201600335A1 (en) 2017-12-18

Similar Documents

Publication Publication Date Title
US9477886B2 (en) Smart document anchor
US10346560B2 (en) Electronic blueprint system and method
CN111009002B (en) Point cloud registration detection method and device, electronic equipment and storage medium
US11682225B2 (en) Image processing to detect a rectangular object
US20210406953A1 (en) System And Method For Identifying Hidden Content
US20120101783A1 (en) Computer system for automatically classifying roof elements
US9058501B2 (en) Method, apparatus, and computer program product for determining media item privacy settings
US10192058B1 (en) System and method for determining an aggregate threat score
US20090327229A1 (en) Automatic knowledge-based geographical organization of digital media
US11100357B2 (en) Real-time micro air-quality indexing
WO2019056503A1 (en) Store monitoring evaluation method, device and storage medium
DK179255B1 (en) A computer implemented method for determining the actual size of a target object within a digital image/video frame
US20180032643A1 (en) Design-model management
US10474768B2 (en) Sensor-based facility energy modeling
CN114511661A (en) Image rendering method and device, electronic equipment and storage medium
CN110555488A (en) Image sequence auditing method and system, electronic equipment and storage medium
CA3194599A1 (en) System and method for detecting objects in images
US20190050687A1 (en) Detecting artifacts based on digital signatures
CN112001300B (en) Building monitoring method and device based on cross entropy according to position and electronic equipment
JP2007316950A (en) Method, apparatus and program for processing image
US20220292549A1 (en) Systems and methods for computer-aided appraisal
US11196766B2 (en) Detecting denial of service attacks in serverless computing
JP6259864B2 (en) Multi-functional payment support apparatus, multi-functional payment support method, and program
US10062124B2 (en) System and method for construction estimating
CN115687673B (en) Picture archiving method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PBP Patent lapsed

Effective date: 20190608