WO2020210637A1 - Writing surface boundary markers for machine vision - Google Patents

Writing surface boundary markers for machine vision

Info

Publication number
WO2020210637A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
markers
boundary
writing surface
marker
Prior art date
Application number
PCT/US2020/027687
Other languages
English (en)
Inventor
Joseph Lemay
Jacob Epstein
Original Assignee
Rocket Innovations, Inc.
Priority date
Filing date
Publication date
Application filed by Rocket Innovations, Inc. filed Critical Rocket Innovations, Inc.
Priority to CN202080027842.XA (published as CN114026598A)
Priority to AU2020271104A (published as AU2020271104A1)
Priority to KR1020217036999A (published as KR20220002372A)
Priority to JP2021560650A (published as JP2022527413A)
Priority to EP20788455.2A (published as EP3953789A4)
Priority to CA3136438A (published as CA3136438A1)
Publication of WO2020210637A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40 Support for services or applications
    • H04L 65/401 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L 65/4015 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/18 Image warping, e.g. rearranging pixels individually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60 Network streaming of media packets
    • H04L 65/61 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L 65/611 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for multicast or broadcast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20112 Image segmentation details
    • G06T 2207/20132 Image cropping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30204 Marker

Definitions

  • Illustrative embodiments of the invention generally relate to markers placed on a writing surface to define a boundary and, more particularly, the illustrative embodiments of the invention relate to machine-vision identification of the position of the markers.
  • a picture of the writing on the whiteboard may be taken by a user to memorialize or share the notes from the meeting. Users may take pictures of the whiteboard to have a record of the notes on their phone, and to share with their colleagues.
  • a system for capturing, organizing, and storing handwritten notes includes a plurality of boundary markers.
  • the boundary markers are configured to be positioned on a writing surface. Furthermore, the plurality of boundary markers have a fluorescent color.
  • the system also includes a tangible non-transitory computer readable medium encoded with instructions which, when run on a camera- equipped computing device, causes the camera-equipped computing device to execute processes.
  • the processes include capturing an image of the writing surface with the fluorescent markers thereon.
  • the processes also include detecting the fluorescent colored boundary markers in the captured image.
  • the processes include identifying a virtual boundary in the captured image based on the positions of the fluorescent colored boundary markers. The processes then unwarp a portion of the captured image within the virtual boundary to produce an unwarped image.
  • the boundary markers are a fluorescent orange color.
  • the markers may have a generally triangular shape.
  • the markers may be formed from silicone.
  • the markers couple to the writing surface using an adhesive and/or microsuction.
  • the markers may be portable and easily graspable. To that end, the markers may have a thickness of between about 0.5 millimeters and about 3 millimeters.
  • the processes executed by the camera-equipped computing device further comprise broadcasting the unwarped image.
  • broadcast may be updated as new images are captured.
  • Other processes may include saving the unwarped image in an image store.
  • the processes may further include cropping the boundary markers out of the image. Some other processes may include removing the background from the captured image, cropping the captured image using the virtual boundary in the image, and/ or enhancing the image.
  • the processes performed by the computing device are performed in response to taking a picture of the writing surface.
  • the writing surface may include a whiteboard or a wall.
  • a method for capturing and storing handwritten notes includes placing a plurality of boundary markers on a writing surface.
  • the boundary markers define a virtual boundary encompassing the handwritten notes.
  • the method includes capturing a writing surface image by scanning the writing surface with an electronic device. Additionally, the method identifies the position of the markers in the writing surface image. The method also determines the boundary based on the positions of the markers in the writing surface image. The method may also unwarp a portion of the captured image within the virtual boundary to produce an unwarped image. The unwarped image may then be cropped based on the position of the detected boundary.
  • placing the plurality of boundary markers includes positioning the boundary markers at positions that approximately define corners of a rectangular boundary. Identifying the position of the markers may include identifying the fluorescent color in the image. The unwarped image may be stored and/ or broadcasted.
  • a second writing surface image may be captured by scanning the writing surface with the electronic device.
  • the method may then identify the position of the markers in the second writing surface image, and determine the boundary based on the position of the markers in the second writing surface image.
  • the method may then unwarp the second writing surface image, as a function of detecting the boundary in the captured second writing surface image, to produce a second unwarped image.
  • the method may also crop the second unwarped image based on the position of the detected boundary.
  • the broadcasting of the unwarped image may be updated to broadcast the second unwarped image.
  • a marker for detection by machine vision includes a first surface having a fluorescent color.
  • the first surface is configured to be viewed by machine vision.
  • the marker has a second surface with a surface coupling portion.
  • the surface coupling portion is configured to couple to a writing surface such that the marker remains coupled to the writing surface when the writing surface is in a vertical orientation.
  • the shape of the marker may correspond to at least a portion of a shape of an edge of the writing surface.
  • the writing surface may be a whiteboard.
  • the marker may couple to the writing surface using a microsuction layer.
  • the marker is formed from a material that does not retain visible folding patterns.
  • the marker, or a majority thereof, may be formed from silicone. Accordingly, the marker may be washable and reusable.
  • a system for capturing handwritten notes includes a computer device coupled with a camera.
  • the camera is configured to view a background having content.
  • the system also includes a plurality of boundary markers having a fluorescent color.
  • the boundary markers are configured to be positioned between the background and the camera so as to define a virtual boundary around a portion of the background.
  • the computer device is configured to: (1) detect the fluorescent color boundary markers, (2) determine the virtual boundary, and (3) deskew the portion of the background as a function of the shape of the virtual boundary to produce a deskewed image of the portion of the background.
  • the computer device is further configured to share the deskewed image of the portion of the background.
  • the boundary markers may be held together by a frame.
  • the frame may have an outer marker holding portion and an inner portion to be imaged (also referred to as an image portion).
  • the marker holding portion may be formed from plastic or metal.
  • the marker holding portion may be shaped to hold the markers in a predefined orientation that corresponds to the virtual boundary.
  • the image portion may include a preset background, or an aperture/ opening through which a background may be viewed.
  • the frame may have a boundary marker positioned at one or more vertexes of the marker holding portion, so that the positioned markers define a virtual boundary, such as a rectangle.
  • the frame may be positioned between the camera and the background.
  • the frame may have a transparent annotation surface over the image portion.
  • the annotation surface can be annotated and/ or marked using a writing utensil.
  • the image portion may include a preset background. The image portion may be deskewed and/ or shared with participants. The image portion may be shared as an image or video.
  • Some embodiments may include a kit having a plurality of boundary markers.
  • the boundary markers may have a top surface opposite a bottom surface.
  • the top surface may have a fluorescent color, and the bottom surface may be configured to adhere to a writing surface.
  • the writing surface may be a white board.
  • the boundary markers may be shaped as triangles.
  • the kit may include four boundary markers.
  • Illustrative embodiments of the invention are implemented as a computer program product having a computer usable medium with computer readable program code thereon.
  • the computer readable code may be read and utilized by a computer system in accordance with conventional processes.
  • Figure 1 schematically shows an example of a system for capturing, storing, and/ or sharing images from a writing surface in accordance with illustrative embodiments of the invention.
  • Figure 2 schematically shows a boundary marker in accordance with illustrative embodiments of the invention.
  • Figure 3 schematically shows a plurality of boundary markers forming a virtual boundary in accordance with illustrative embodiments of the invention.
  • Figure 4 schematically shows a user viewing the notes in Figure 3 through a camera of a computing device in accordance with illustrative embodiments of the invention.
  • Figure 5 schematically shows the system identifying the virtual boundary defined by the markers in Figure 4.
  • Figure 6 schematically shows an image of the notes of Figure 5 after processing in accordance with illustrative embodiments of the invention.
  • Figure 7 schematically shows an updated image of the notes from Figure 6 in accordance with illustrative embodiments of the invention.
  • Figure 8 shows a method of using the markers in accordance with illustrative embodiments of the invention.
  • Figure 9 schematically shows a frame configured to hold the markers in accordance with illustrative embodiments of the invention.
  • a set of boundary markers that define a virtual boundary are placed on a writing surface, such as a whiteboard. Inside of the virtual boundary may be notes, writing, printing, pictures, or other objects (e.g., a model) that the user may wish to capture in an image.
  • a camera-equipped computing device (e.g., a smartphone) captures one or more images of the writing surface and the markers.
  • machine vision identifies the markers and determines the virtual boundary based on the position of the markers.
  • an image of the writing surface is captured and processed (e.g., cropping out the markers and parts of the image outside of the virtual boundary defined by the markers, deskewing the image, and/ or enhancing the image).
  • the processed image may be stored in a database, and may also be shared with others. Details of illustrative embodiments are discussed below.
  • Figure 1 schematically shows an example of a system 100 for capturing, storing, and/ or sharing images from a writing surface 12 in accordance with illustrative embodiments of the invention.
  • the system 100 captures, stores, and shares images
  • the system 100 could be used to stream and share images from the writing surface 12 without storing the images.
  • the system 100 may be used to capture and store an image, without simultaneously sharing.
  • the system 100 includes a writing surface 12 having content 10 thereon.
  • the content 10 could be any kind of writing, drawing, scribbles, etc.
  • Illustrative embodiments include markers 18 that are placed on the writing surface 12.
  • the writing 10 is referred to as notes 10 throughout the application.
  • the terms "writing” and “notes” 10 are not intended to limit the type of writing, drawings, markings, or other content that can be present on a writing surface 12. Instead, the terms “writing” and “notes” are merely used to facilitate an easier understanding of how to make and use illustrative embodiments of the invention.
  • illustrative embodiments are not limited to capturing of content 10 that includes alphanumerical writing. Indeed, as discussed further below, the content 10 can include a variety of notes, writing, printing, pictures, and/ or other objects (e.g., a model), including objects that are not on the writing surface 12.
  • the notes 10 may be created during a collaborative working session.
  • Figure 1 schematically shows a plurality of participants 15 with whom the notes 10 are shared. Some of the participants 15 may access the collaborative working session remotely (e.g., via dial-in, Internet, or various messaging systems), and may benefit from seeing the notes 10 on an electronic device. For example, as shown, some of the participants 15 may wish to view the notes 10 on a television, a computer, and/ or a smartphone device. Illustrative embodiments capture, deskew and enhance images of the notes 10. The notes 10 may be saved locally on the device 14 or on cloud storage, forwarded to an application 16, and/ or broadcast to others. Furthermore, the broadcast may be updated in real time as notes 10 are updated and/ or changed. Accordingly, illustrative embodiments provide for easy sharing of the notes 10 among the various participants.
  • the system 100 includes a camera-equipped computing device 14 that captures one or more images including the markers 18 on the writing surface 12.
  • the markers 18 may define the virtual boundary (shown in broken lines as virtual boundary 22 in Figure 3 below) that encompasses the notes 10 or portion thereof that the user wishes to save and/ or share.
  • the device 14 is optionally coupled over the internet to a system cloud service, and third-party cloud services 16.
  • the cloud service 16 is within a local area network, a wide area network, or a virtual network such as a VPN (virtual private network). Additionally, or alternatively, some of the services may be used locally on the device 14.
  • the camera-equipped computing device 14 may include any computing device coupled to a camera, including but not limited to a camera-equipped smartphone, a camera-equipped tablet computer, a desktop computer with a USB-connected camera, and a laptop computer coupled to a camera.
  • illustrative embodiments may further include machine vision machines, and/ or camera-equipped headsets (e.g., helmets).
  • FIG. 2 schematically shows the boundary marker 18 in accordance with illustrative embodiments of the invention.
  • one or more of the boundary markers 18 may be placed on the writing surface 12 to define the desired boundary 22 on the writing surface 12.
  • illustrative embodiments may have a shape configured to correspond to the writing surface 12.
  • the marker 18 shown in the figure has a triangular shape with a right angle that corresponds to a corner of a traditionally rectangular writing surface 12.
  • Shape correspondence between the marker(s) 18 and the writing surface 12 provides easy positioning of the marker(s) 18 along the edges of the writing surface 12, although it should be noted that the marker(s) 18 are not required to be positioned along the edges of the writing surface 12.
  • illustrative embodiments may include a marker 18 of any shape, and are not limited to triangular shapes or shapes that correspond to the writing surface 12.
  • the markers 18 may be round, square, or other shape.
  • the notes 10 may be on various types of writing surfaces 12, including, for example, a wall, a whiteboard, a projector, a chalkboard, a glass panel, or paper. Furthermore, the notes 10 may contain a variety of content, such as words, pictures, and/ or drawings.
  • the boundary markers 18 are positioned on the writing surface in such a way that they form a non- continuous perimeter/ boundary (e.g., the four corners of a virtual boundary) around the notes 10 that the user wants to capture in an image. It should be understood that the markers 18 can be positioned to form the virtual boundary around the entirety of the writing surface 12, a portion of the writing surface 12 (e.g., just the notes 10 on the writing surface 12), or just a portion of the notes 10.
  • users may choose the portion of the writing surface 12 to capture by forming the virtual boundary 22 using the markers 18.
  • the user can reposition the markers 18 in order to focus on different portions of the notes 10, even if, for example, the camera captures the entire writing surface 12 in a number of successive images.
  • the marker 18 may be formed from a variety of materials, including one or more of rubber, silicone, and/ or polypropylene. Additionally, at least one side of the marker 18 may have a writing surface coupling portion, such as an adhesive (e.g., including permanent adhesives, static cling adhesives and other semi-permanent adhesives), electrostatic adhesion layer, and/ or microsuction adhesion layer, to provide reliable attachment to the writing surface 12.
  • the coupling portion provides sufficient coupling such that the markers 18 do not fall from the writing surface 12 because of their weight (e.g., such as when the writing surface 12 is in a vertical orientation).
  • illustrative embodiments may form the marker 18 from the previously mentioned materials, or other materials, such that the markers 18 are reusable and washable.
  • the markers 18 may be about 0.5 mm to about 3 mm (e.g., 1/32 of an inch) thick, providing for easy grasping and removal from the writing surface 12 while remaining portable.
  • the markers 18 may be provided in a kit (e.g., in a pack of four) to facilitate boundary 22 detection (e.g., a rectangular boundary), as shown in Figure 3.
  • the fluorescent colors are more easily detectable in poor lighting conditions because they absorb light in the non-visible (e.g., ultraviolet) spectrum and re-emit it as visible light.
  • Some illustrative embodiments use markers 18 having colors that are not commonly found in office environments (e.g., orange fluorescent markers 18 that do not "compete” with other colors on a common whiteboard environment).
  • FIG 3 schematically shows a plurality of boundary markers 18 forming a virtual boundary 22 in accordance with illustrative embodiments of the invention.
  • the markers 18 are offset from the corners of the whiteboard 12, but in other embodiments, some or all of the markers 18 could be placed up against the corners of the whiteboard 12.
  • the placement of the markers 18 define the virtual boundary 22.
  • the virtual boundary 22 shown in the figure is not physically present on the writing surface. Instead, the virtual boundary 22 is created by the system 100 as a result of identifying the position of the markers 18.
  • machine vision detects the markers 18 and determines the virtual boundary 22 formed by the markers 18.
  • determining the virtual boundary 22 may include correlating the position of the markers 18 with an expected image shape.
  • machine vision may detect the markers 18, and some separate logic may determine the virtual boundary 22 (e.g., using a cloud based server).
  • Illustrative embodiments may use a variety of different portions of the markers 18 to determine the virtual boundary 22.
  • the virtual boundary 22 may be defined by the outside edges of the markers 18.
  • the inner edges of the markers 18 may be used to identify the virtual boundary 22.
  • the midpoint of the hypotenuse of each of the triangular markers 18 may be used to define the virtual boundary 22.
  • the midpoint of each marker 18 may be used to define the virtual boundary 22.
  • these are merely exemplary, and there are a number of ways to use the markers 18 to define the virtual boundary 22.
  • the markers 18 may not align perfectly into the desired shape, e.g., a rectangular shape.
  • illustrative embodiments may compensate for the offset by using various portions of the markers 18 to correspond to the expected image shape, e.g., by defining the virtual boundary 22 having a "best fit” with respect to the placement of the markers.
  • Some embodiments may create a "best fit” that does not pass through all or any of the markers 18.
  • the "best fit" virtual boundary 22 may pass through 3 of 4 markers 18.
  • the boundary 22 may be defined as being some distance inward of the markers 18.
  • One of skill in the art can use a variety of methods for defining the boundary 22 using the markers 18 while being within the scope of illustrative embodiments of the invention.
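As one concrete instance of the "variety of methods" mentioned above, the four detected marker centroids can first be sorted into a consistent corner order before a boundary quadrangle is formed. The sum/difference heuristic below is a common machine-vision idiom offered purely as an illustrative sketch, not the patent's stated method:

```python
import numpy as np

def order_corners(points):
    """Order four (x, y) marker centroids as [top-left, top-right,
    bottom-right, bottom-left].

    Uses the classic trick: x + y is smallest at the top-left and largest
    at the bottom-right; y - x is smallest at the top-right and largest
    at the bottom-left.
    """
    pts = np.asarray(points, dtype=float)
    s = pts.sum(axis=1)              # x + y for each point
    d = np.diff(pts, axis=1)[:, 0]   # y - x for each point
    tl = pts[np.argmin(s)]
    br = pts[np.argmax(s)]
    tr = pts[np.argmin(d)]
    bl = pts[np.argmax(d)]
    return np.array([tl, tr, br, bl])
```

Once ordered, the corners directly define the virtual boundary quadrangle, regardless of the order in which the markers were detected in the image.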
  • FIG 4 schematically shows a user viewing the notes 10 in Figure 3 through the camera of the computing device 14 in accordance with illustrative embodiments of the invention.
  • the markers 18 are generally positioned on the writing surface 12 so as to correspond to corners of a rectangle.
  • the markers 18 appear to be positioned at the corners of a quadrangle. This depends on the relative angle of the camera 14 to the writing surface 12.
  • the system 100 identifies the markers 18, creates the virtual boundary 22 based on the position of the markers 18, and deskews the image to the appropriate shape (e.g., based on the size and proportions of the markers 18, which are known).
  • Figure 5 schematically shows the system 100 identifying the virtual boundary 22 defined by the markers 18 in Figure 4.
  • the system 100 uses the outside edges 28 of the markers 18 to define the virtual boundary 22.
  • the system 100 applies computer vision transformation to unwarp each part of the image within the boundary (e.g., quadrangle) into its appropriate shape (e.g., a rectangle) and to remove the background of the image so that it is cropped to, or approximately to, the virtual boundary 22.
  • This identification and deskewing process is similar to the identification and deskewing process for page borders described in US Patent No.
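The unwarping transformation described above can be sketched as a four-point planar homography solved with the direct linear transform (DLT). This is a standard computer-vision technique and only illustrates the kind of transformation involved; the function names are illustrative, and a production system would typically call a vision library rather than solve the system by hand:

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping src[i] -> dst[i] from four
    point correspondences, via the direct linear transform: each pair
    contributes two rows to an 8x9 system whose null vector is H."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    # The solution is the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def apply_h(H, point):
    """Map one (x, y) point through H using homogeneous coordinates."""
    u, v, w = H @ np.array([point[0], point[1], 1.0])
    return (u / w, v / w)
```

Mapping the four marker corners of the imaged quadrangle onto the corners of a target rectangle with such a transform is what turns the skewed capture into a head-on view.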
  • Figure 6 schematically shows an image of the notes 10 of Figure 5 after processing in accordance with illustrative embodiments of the invention.
  • the image in Figure 6 has been deskewed and enhanced. As can be seen, although the image is taken at an angle, the deskewed image appears as if it is taken from directly in front of the notes 10.
  • the system 100 applies computer vision transformation to unwarp each boundary into a predefined shape. For example, the quadrangle formed by the four markers 18 may be unwarped into a rectangle.
  • the system 100 may remove the background of the image and enhance the image.
  • the system 100 crops out the markers 18 themselves from the image, and everything outside of the boundary 22, so that the image is cropped to the virtual boundary 22.
  • the image may also be enhanced using conventional image enhancement and filtering techniques, such as, for example, noise filtering and sharpness adjustment.
  • the notes 10 may be stored locally on the device 14, and/ or may be broadcast to others (e.g., participants) and to various applications and programs 16.
  • the system 100 allows users to image and/ or video stream the writing surface 12, or the portion of the writing surface within the defined virtual boundary 22, in real-time.
  • the system generates and shares a unique URL with other users (e.g., via text message link, email, etc.).
  • the private, real-time page may be updated every time the writing surface 12 and/ or the virtual boundary 22 is scanned.
  • illustrative embodiments may have an auto-scan mode, wherein the camera faces the writing surface and/ or the virtual boundary 22, and automatically scans at a predetermined time (e.g., every 5 seconds, every minute, every 5 minutes, etc.). The automatic scanning time may be adjusted by the user.
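The auto-scan mode described above amounts to an interval timer that decides when the next capture should fire. A minimal sketch follows; the class and method names are illustrative assumptions, not taken from the patent:

```python
import time

class AutoScanner:
    """Decides when an automatic scan is due, given a user-adjustable
    interval (e.g., every 5 seconds, every minute, every 5 minutes)."""

    def __init__(self, interval_seconds, clock=time.monotonic):
        self.interval = interval_seconds
        self._clock = clock       # injectable clock, useful for testing
        self._last = None         # time of the last triggered scan

    def should_scan(self):
        """Return True (and reset the timer) when the interval has elapsed.
        The very first call always triggers a scan."""
        now = self._clock()
        if self._last is None or now - self._last >= self.interval:
            self._last = now
            return True
        return False
```

A capture loop would poll `should_scan()` and, whenever it returns True, run the detect/deskew/broadcast pipeline on a fresh frame.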
  • FIG 7 schematically shows an updated image 10A of the notes 10 from Figure 6 in accordance with illustrative embodiments of the invention.
  • new notes 30 were added to the notes 10 from Figure 6.
  • the user may have drawn these new notes 30 on the writing surface 12.
  • the camera of the device 14 views the writing surface 12 and/ or the markers 18 again, and a second image is produced using the processes described previously.
  • This second image may be broadcast to the participants 15 in real-time.
  • illustrative embodiments may provide broadcast updates of the writing surface 12. This process may be repeated many times.
  • these notes 10A may be erased, and an entirely new set of notes may be created and broadcast using the methods described herein.
  • Illustrative embodiments may save the various images, and allow users to maintain a record of the various images scanned by the system 100 for review.
  • Figure 8 shows a method of using the markers 18 in accordance with illustrative embodiments of the invention.
  • the method begins at step 802, which positions the boundary markers 18.
  • the boundary markers 18 may be positioned in a number of ways to define various kinds of boundaries 22.
  • the markers 18 may be positioned at four corners of a rectangle, at three points forming a triangle, and so on.
  • the system 100 has logic that determines the shape formed, or the shape most closely approximated, by the markers 18, and deskews based on the determined shape.
  • Because conventional writing surfaces 12 are rectangular, it is expected that many use scenarios will be based on a rectangular shape.
  • some embodiments may disregard (i.e., not detect a virtual boundary 22) configurations of markers 18 which are not positioned in or near edges of a defined shape (e.g., scattered randomly). Accordingly, illustrative embodiments may provide four markers 18 in a kit to easily define a rectangular virtual boundary 22.
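One way such logic could disregard scattered markers is to test whether the four detected positions approximately form a rectangle (equal-length diagonals that share a midpoint). The patent does not prescribe a particular test; the function name and tolerance below are illustrative assumptions:

```python
import math

def is_near_rectangle(points, tol=0.05):
    # Heuristic rectangle test: order the four points around their
    # centroid so the diagonals pair opposite corners, then require the
    # diagonals to be (nearly) equal and to (nearly) bisect each other.
    if len(points) != 4:
        return False
    cx = sum(p[0] for p in points) / 4.0
    cy = sum(p[1] for p in points) / 4.0
    a, b, c, d = sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    diag1 = math.dist(a, c)
    diag2 = math.dist(b, d)
    mid1 = ((a[0] + c[0]) / 2, (a[1] + c[1]) / 2)
    mid2 = ((b[0] + d[0]) / 2, (b[1] + d[1]) / 2)
    scale = max(diag1, diag2)
    if scale == 0:
        return False
    return (abs(diag1 - diag2) / scale < tol
            and math.dist(mid1, mid2) / scale < tol)
```

Randomly scattered markers fail this test and would yield no virtual boundary, matching the disregard behavior described above.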
  • the process proceeds to step 804, which identifies the boundary markers 18 with computer vision.
  • the computer vision can be on any of the devices 14 described previously.
  • the computer vision searches for the four bright orange triangles in the image on the screen.
  • the system 100 may employ some color thresholds around the target color to find the markers 18 on the screen. For example, the system may look for an RGB color that is within some value of hue/saturation to ensure it detects the markers in a wide variety of environments (e.g., sunny vs. dark).
  • a dynamic value may also be employed: the system may look for the shapes of the markers 18 (e.g., finding four triangular markers 18 of the same color).
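A hue/saturation threshold of this kind might look like the following pure-Python sketch, using the standard-library `colorsys` module. The target color and tolerance values are illustrative assumptions, not values from the patent:

```python
import colorsys

def matches_marker_color(rgb, target_rgb=(255, 96, 0),
                         hue_tol=0.05, sat_min=0.5):
    # Thresholding on hue with a saturation floor tolerates lighting
    # changes (sunny vs. dark) better than an exact RGB match.
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    th, ts, tv = colorsys.rgb_to_hsv(*(c / 255.0 for c in target_rgb))
    # Hue wraps around at 1.0, so compare distance on the circle.
    dh = min(abs(h - th), 1.0 - abs(h - th))
    return dh <= hue_tol and s >= sat_min
```

In practice this predicate would be applied per pixel (or per connected region) to find candidate marker blobs before any shape check.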
  • the process then proceeds to step 806, which determines the boundary 22 based on the position of the markers 18.
  • the markers are fluorescent so that they are easily distinguished from the writing surface 12 and from the background in the image. As described previously, it is suspected that the glow of fluorescent colors helps the markers stand out to computer vision, and makes them less likely to be confused with a shadow, a person, or a drawn shape (e.g., a drawn triangle). However, other embodiments may use non-fluorescent colors.
  • step 808 processes the image.
  • the system 100 may take a snapshot and deskew the image (e.g., by using the known shape and proportions of the markers 18).
  • the system 100 may also crop the image to the boundary 22 defined by the position of the markers 18. It should be understood that the system 100 may also deskew the image from a steep angle, as it can determine the angle (e.g., by detecting that the more distant markers 18 are smaller and warped relative to the closer markers 18). For example, when markers 18 are positioned at corners of a rectangle in the physical world, the virtual boundary 22 may appear like a trapezoid on the screen.
  • the system can use the known marker 18 shape, and the known writing surface 12 shape, to stretch the image back to a rectangle.
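Stretching a trapezoid back to a rectangle is conventionally done with a perspective (homography) transform fitted to the four marker positions. The patent does not name a specific algorithm; the following is a standard direct linear solution (with the bottom-right matrix entry fixed to 1), shown as an illustrative sketch:

```python
def _solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    # Solve for the 3x3 perspective transform mapping four source corners
    # (e.g., the trapezoid seen by the camera) onto four destination
    # corners (the known rectangular writing surface).
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = _solve(A, b)
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def apply_homography(H, pt):
    # Map one point through the transform, with perspective divide.
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Applying `apply_homography` to every output pixel (or using an image library's warp routine) yields the deskewed image described above.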
  • 3D data may be used to enhance the deskewing algorithm.
  • smartphones may detect 3D shape (e.g., similar to the facial recognition used to unlock the Apple® iPhone 10).
  • the true 3D data may be used to more precisely determine the position of the markers 18 and to produce a more precisely deskewed image. Indeed, some embodiments may account for any position of the markers 18 even if they are not arranged in a predetermined shape (e.g., randomly scattered).
  • the deskewed image may take on the shape defined by the position of the markers 18.
  • the deskewed images may optionally be cropped into a preferred shape (such as a rectangle).
  • a final layer of image processing may be applied.
  • Background and foreground detecting techniques can be used to enhance the image by using the known color of the markers 18 to correct for color distortion. For instance, if the color of the markers 18 in the image is dimmer than the expected color value, the image could be brightened. Additionally or alternatively, if the color of the markers is off (e.g., more yellow than orange), the image could be shifted away from yellow.
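One simple form of this correction is to compute per-channel gains from the markers' observed versus known color and apply them to the whole image. The expected marker color below is an illustrative stand-in, not a value from the patent:

```python
def correction_gains(observed_rgb, expected_rgb=(255, 140, 40)):
    # Per-channel gains that map the markers' observed color back to the
    # known marker color; applying the same gains image-wide corrects
    # overall brightness and color cast at once.
    return tuple(exp / obs if obs > 0 else 1.0
                 for obs, exp in zip(observed_rgb, expected_rgb))

def correct_pixel(rgb, gains):
    # Apply the gains to one pixel, clamping to the 8-bit range.
    return tuple(min(255, round(c * g)) for c, g in zip(rgb, gains))
```

For example, markers observed at (200, 110, 30), dimmer than expected, yield gains above 1 that brighten the whole image back toward its true colors.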
  • Although illustrative embodiments describe capturing an image, it should be understood that this process includes scanning or viewing the markers without saving the image. Thus, illustrative embodiments may initiate the processing described herein merely by viewing and identifying the markers 18, without requiring the user to actively capture the image (e.g., by pushing a button and/or saving the image). However, some other embodiments may require that the user actively capture and/or save the image.
  • step 810 stores and/or shares the images.
  • the images may be stored locally, or on a cloud-based drive.
  • the images may be broadcast in real time, and on a continuous basis as described previously.
  • the process then moves to step 812, which asks if there are more images to take. If there are more images to take, the process returns to step 802. This may be the case for example, if revisions or changes have been made to the notes 10, or if the user wishes to update the broadcast. If there are no more images to take, then the process comes to an end.
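The overall flow of steps 802 through 812 amounts to a capture loop. In this sketch, `scan_once` and `more_images` are hypothetical callables standing in for the detection, processing, storage, and user-prompt steps described above:

```python
def capture_session(scan_once, more_images):
    # Mirrors Figure 8: scan and process an image (steps 802-810), then
    # ask whether more images are wanted (step 812); repeat until not.
    images = []
    while True:
        images.append(scan_once())
        if not more_images():
            return images
```

Keeping every scanned image in the returned list also matches the record-keeping behavior described earlier, where users can review all images scanned by the system.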
  • Some embodiments may operate without any markers 18.
  • the system 100 may identify a boundary, such as the edge of a chalkboard, whiteboard, and/or projector screen, and use that in place of the virtual boundary 22 defined by the markers 18 as discussed above. Accordingly, users can save images of notes 10 on a chalkboard, whiteboard, and/or projector screen in real time. Notes captured without markers may also be broadcast, deskewed, and saved in accordance with the methods described herein.
  • Although illustrative embodiments refer to using the markers 18 with writing surfaces 12, it should be understood that illustrative embodiments may be used generally with machine vision.
  • the inventors' surprising discovery that fluorescent colors are more easily identifiable by machine vision may be applied more generally in any field requiring machine vision identification. Accordingly, not all illustrative embodiments are intended to be limited in application to writing surfaces 12.
  • illustrative embodiments may include a variety of surfaces 12 in place of the previously described writing surface(s) 12.
  • the writing surface 12 may be a non-traditional writing surface 12, such as a road (e.g., where children draw with colored chalk), and the markers 18 may be placed on the road.
  • some embodiments may capture, store, and/ or share a background (i.e., instead of a writing surface 12).
  • the markers 18 may be held up against the background of the sky (from the perspective of the camera 14).
  • Figure 9 schematically shows a frame 32 (e.g., formed of metal or plastic) configured to hold the markers 18 in accordance with illustrative embodiments of the invention.
  • the frame 32 may have a predetermined shape (e.g., rectangular) with markers 18 coupled thereto in a predetermined orientation and position.
  • the markers 18 may be placed at the vertices of the frame 32 (e.g., at the four corners of a rectangular frame).
  • the frame 32 provides an easy and convenient way to predefine the shape of the background image to be shared within the frame 32.
  • Illustrative embodiments may otherwise process or operate on a background in a similar manner to the writing surface 12 (e.g., by identifying the markers 18, detecting the boundary 22, deskewing, enhancing, storing, and/ or sharing the background image).
  • the frame 32 may include a predefined background (e.g., as opposed to the open frame described above which allows the user to view the notes 10 on the whiteboard 12), such as a background of a location (e.g., a famous landmark such as the Eiffel tower in Paris, or the Colosseum in Rome).
  • the predefined background may include a variety of backgrounds, such as various sports formations (e.g., football or basketball formations from a playbook). Accordingly, a coach could broadcast plays as he draws on the background.
  • Some embodiments may have a frame 32 with a transparent annotation surface 34 configured to overlay the background and/or writing surface 12.
  • the transparent surface 34 may be annotated 36 by a user (e.g., using a pen or other writing instrument). Accordingly, illustrative embodiments enable the system 100 to operate as a telestrator over some background or writing surface 12. Thus, the user may draw/annotate 36 over a moving video or still image.
  • the system 100 may include a receiving headset (e.g., a helmet modified to include a video or image display screen). The system 100 may further broadcast the annotated image to the receiving headset.
  • this process can be a simplified version of a more complex process of using the markers 18. As such, the process may have additional steps that are not discussed. In addition, some steps may be optional, performed in a different order, or in parallel with each other. For example, step 812 may take place before any of steps 808 or 810. Accordingly, discussion of this process is illustrative and not intended to limit various embodiments of the invention. It should be noted that this symbolic representation is one view of the logical flow of the system 100. Logical flow variants would not change the underlying enablement of the system using the algorithmic methods outlined above. Additionally, it should be understood that the process described above, although referring to images, could also apply to video.
  • It should be understood that logic blocks (e.g., programs, modules, functions, or subroutines) and logic elements may be added, modified, omitted, performed in a different order, or implemented using different logic constructs (e.g., logic gates, looping primitives, conditional logic) without changing the overall results.
  • the present invention may be embodied in many different forms, including, but in no way limited to, computer program logic for use with a processor (e.g., a microprocessor, microcontroller, digital signal processor, or general purpose computer), programmable logic for use with a programmable logic device (e.g., a Field Programmable Gate Array (FPGA) or other PLD), discrete components, integrated circuitry (e.g., an Application Specific Integrated Circuit (ASIC)), or any other means including any combination thereof.
  • Computer program logic implementing some or all of the described functionality is typically implemented as a set of computer program instructions that is converted into a computer executable form, stored as such in a computer readable medium, and executed by a microprocessor under the control of an operating system.
  • Hardware-based logic implementing some or all of the described functionality may be implemented using one or more appropriately configured FPGAs.
  • Source code may include a series of computer program instructions implemented in any of various programming languages (e.g., an object code, an assembly language, or a high-level language such as Fortran, C, C++, JAVA, or HTML) for use with various operating systems or operating environments.
  • the source code may define and use various data structures and communication messages.
  • the source code may be in a computer executable form (e.g., via an interpreter), or the source code may be converted (e.g., via a translator, assembler, or compiler) into a computer executable form.
  • Computer program logic implementing all or part of the functionality previously described herein may be executed at different times on a single processor (e.g., concurrently) or may be executed at the same or different times on multiple processors, and may run under a single operating system process/thread or under different operating system processes/threads.
  • computer process refers generally to the execution of a set of computer program instructions regardless of whether different computer processes are executed on the same or different processors and regardless of whether different computer processes run under the same operating system process/ thread or different operating system processes/ threads.
  • the computer program may be fixed in any form (e.g., source code form, computer executable form, or an intermediate form) either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), a PC card (e.g., PCMCIA card), or other memory device.
  • the computer program may be fixed in any form in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies (e.g., Bluetooth), networking technologies, and internetworking technologies.
  • the computer program may be distributed in any form as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web).
  • Hardware logic (including programmable logic for use with a programmable logic device) implementing all or part of the functionality previously described herein may be designed using traditional manual methods, or may be designed, captured, simulated, or documented electronically using various tools, such as Computer Aided Design (CAD), a hardware description language (e.g., VHDL or AHDL), or a PLD programming language (e.g., PALASM, ABEL, or CUPL).
  • Programmable logic may be fixed either permanently or transitorily in a tangible storage medium, such as a semiconductor memory device (e.g., a RAM, ROM, PROM, EEPROM, or Flash-Programmable RAM), a magnetic memory device (e.g., a diskette or fixed disk), an optical memory device (e.g., a CD-ROM), or other memory device.
  • the programmable logic may be fixed in a signal that is transmittable to a computer using any of various communication technologies, including, but in no way limited to, analog technologies, digital technologies, optical technologies, wireless technologies (e.g., Bluetooth), networking technologies, and internetworking technologies.
  • the programmable logic may be distributed as a removable storage medium with accompanying printed or electronic documentation (e.g., shrink wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the communication system (e.g., the Internet or World Wide Web).
  • some embodiments of the invention may be implemented as a combination of both software (e.g., a computer program product) and hardware. Still other embodiments of the invention are implemented as entirely hardware, or entirely software.
  • embodiments of the present invention may employ conventional components such as conventional computers (e.g., off-the-shelf PCs, mainframes, microprocessors), conventional programmable logic devices (e.g., off-the shelf FPGAs or PLDs), or conventional hardware components (e.g., off-the-shelf ASICs or discrete hardware components) which, when programmed or configured to perform the non-conventional methods described herein, produce non-conventional devices or systems.

Abstract

A system for capturing, organizing, and storing handwritten notes includes a plurality of boundary markers. The boundary markers are configured to be disposed on a writing surface. In addition, the plurality of boundary markers have a fluorescent color. The system also includes a tangible, non-transitory computer-readable medium encoded with instructions that, when executed on a camera-equipped computing device, cause the camera-equipped computing device to run processes. The processes include capturing an image of the writing surface with the fluorescent markers thereon. The processes also include detecting the fluorescent-colored boundary markers in the captured image. In addition, the processes include identifying a virtual boundary in the captured image based on the positions of the fluorescent-colored boundary markers. The processes then crop a portion of the captured image within the virtual boundary to produce an undistorted image.
PCT/US2020/027687 2019-04-12 2020-04-10 Writing surface boundary markers for computer vision WO2020210637A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN202080027842.XA CN114026598A (zh) 2019-04-12 2020-04-10 用于计算机视觉的书写表面边界标记
AU2020271104A AU2020271104A1 (en) 2019-04-12 2020-04-10 Writing surface boundary markers for computer vision
KR1020217036999A KR20220002372A (ko) 2019-04-12 2020-04-10 컴퓨터 비전을 위한 기록 표면 경계 마커
JP2021560650A JP2022527413A (ja) 2019-04-12 2020-04-10 コンピュータビジョン用の書き込み面境界マーカー
EP20788455.2A EP3953789A4 (fr) 2019-04-12 2020-04-10 Marqueurs de limite de surface d'écriture pour vision artificielle
CA3136438A CA3136438A1 (fr) 2019-04-12 2020-04-10 Marqueurs de limite de surface d'ecriture pour vision artificielle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962833321P 2019-04-12 2019-04-12
US62/833,321 2019-04-12

Publications (1)

Publication Number Publication Date
WO2020210637A1 true WO2020210637A1 (fr) 2020-10-15

Family

ID=72750882

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/027687 WO2020210637A1 (fr) 2019-04-12 2020-04-10 Marqueurs de limite de surface d'écriture pour vision artificielle

Country Status (7)

Country Link
EP (1) EP3953789A4 (fr)
JP (1) JP2022527413A (fr)
KR (1) KR20220002372A (fr)
CN (1) CN114026598A (fr)
AU (1) AU2020271104A1 (fr)
CA (1) CA3136438A1 (fr)
WO (1) WO2020210637A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040131232A1 (en) * 1998-04-08 2004-07-08 Jeffrey Meisner Augmented reality technology
US20070077419A1 (en) * 2001-12-27 2007-04-05 Seed Company Limited Mark transfer tool, mark transfer tape, and manufacturing method of mark transfer tape
US20140297646A1 (en) 2013-04-02 2014-10-02 3M Innovative Properties Company Systems and methods for managing notes
US20150125846A1 (en) * 2013-11-05 2015-05-07 Michael Langford Rollable and Transportable Dry Erase Board
US20160339337A1 (en) * 2015-05-21 2016-11-24 Castar, Inc. Retroreflective surface with integrated fiducial markers for an augmented reality system
US20180134068A1 (en) * 2016-11-13 2018-05-17 Rocket Innovations, Inc. Moisture-erasable note taking system
US20180238802A1 (en) * 2011-10-13 2018-08-23 Affymetrix, Inc. Methods, systems and apparatuses for testing and calibrating fluorescent scanners
US10127468B1 (en) 2015-07-17 2018-11-13 Rocket Innovations, Inc. System and method for capturing, organizing, and storing handwritten notes
US20190037171A1 (en) * 2017-07-26 2019-01-31 Blue Jeans Network, Inc. System and methods for physical whiteboard collaboration in a video conference
US20190072693A1 (en) * 2016-02-29 2019-03-07 Enplas Corporation Marker
US20190084341A1 (en) * 2017-09-21 2019-03-21 Comsero, Inc. Micro-suction reusable and repositionable writing surfaces

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3953789A4

Also Published As

Publication number Publication date
EP3953789A1 (fr) 2022-02-16
AU2020271104A1 (en) 2021-12-02
JP2022527413A (ja) 2022-06-01
CN114026598A (zh) 2022-02-08
CA3136438A1 (fr) 2020-10-15
KR20220002372A (ko) 2022-01-06
EP3953789A4 (fr) 2023-01-18

Similar Documents

Publication Publication Date Title
US11115565B2 (en) User feedback for real-time checking and improving quality of scanned image
US10841551B2 (en) User feedback for real-time checking and improving quality of scanned image
US11908101B2 (en) Writing surface boundary markers for computer vision
WO2018214365A1 (fr) Procédé, appareil, dispositif et système de correction d'image, dispositif de prise de vues et dispositif d'affichage
US10175845B2 (en) Organizing digital notes on a user interface
KR101711233B1 (ko) 카메라 기반의 스캐닝
US9516214B2 (en) Information processing device and information processing method
US9292186B2 (en) Note capture and recognition with manual assist
US20160330374A1 (en) Adaptive camera control for reducing motion blur during real-time image capture
US10474922B1 (en) System and method for capturing, organizing, and storing handwritten notes
US20170372449A1 (en) Smart capturing of whiteboard contents for remote conferencing
US9779323B2 (en) Paper sheet or presentation board such as white board with markers for assisting processing by digital cameras
US20150220800A1 (en) Note capture, recognition, and management with hints on a user interface
CN110490200A (zh) 一种证件扫描方法、装置及设备
AU2020271104A1 (en) Writing surface boundary markers for computer vision
US9774791B2 (en) Method and related camera device for generating pictures with object moving trace
JP6914369B2 (ja) ベクトル形式小画像生成
JP2006099504A (ja) ペンライト付電子ペンのペンライト自動制御システムおよびペンライト制御方法
JP6025031B2 (ja) 画像処理装置及びプログラム

Legal Events

Date Code Title Description
ENP Entry into the national phase (Ref document number: 3136438; Country of ref document: CA)
ENP Entry into the national phase (Ref document number: 2021560650; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20217036999; Country of ref document: KR; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2020788455; Country of ref document: EP; Effective date: 20211112)
ENP Entry into the national phase (Ref document number: 2020271104; Country of ref document: AU; Date of ref document: 20200410; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20788455; Country of ref document: EP; Kind code of ref document: A1)