WO2020027813A1 - Cropping portions of images - Google Patents
- Publication number
- WO2020027813A1 (PCT/US2018/044668)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- sensitive surface
- touch sensitive
- touch
- point
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3215—Monitoring of peripheral devices
- G06F1/3218—Monitoring of peripheral devices of display devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3265—Power saving in display device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3278—Power saving in modem or I/O interface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Definitions
- Computing systems such as desktop computers, laptops, and smartphones may have touch sensitive surfaces which may allow interaction of users with the computing systems.
- a touch sensitive surface may facilitate the provision of a touch-based input by a user to a computing system.
- an image capturing system such as a camera may also be associated with the computing system to capture a video or an image of the user’s input on the touch sensitive device.
- the computing system may receive the captured video or image and may process the captured video or image for various purposes, such as displaying the user's input on a display of the computing system or projecting the user’s input on a surface.
- Figure 1 illustrates a computing system, in accordance with an example implementation of the present subject matter.
- Figure 2 illustrates a computing system, in accordance with another example implementation of the present subject matter.
- Figure 3 illustrates a method for cropping a portion of an image, according to an example implementation of the present subject matter.
- Figure 4 illustrates a method for cropping a portion of an image, according to another example implementation of the present subject matter.
- Figure 5 illustrates a computing environment for cropping a portion of an image, according to an example implementation of the present subject matter.
- Computing systems such as desktop computers and laptops may be coupled to touch sensitive surfaces, such as touch screens, touchpads, and touchmats, to allow users to interact with the computing systems.
- the interaction of a user with a computing system through a touch sensitive surface may be facilitated by an image capturing device coupled to the computing system.
- the user may provide an input, such as a touch input, or may draw or write on the touch sensitive surface using a fingertip or a stylus.
- the image capturing device may be positioned to capture the input of the user by recording movement of the fingertip or stylus over the touch sensitive surface in a series of images or a video.
- the computing system may thereafter process the images or video captured by the image capturing device to interpret the input and perform an action corresponding to the input, such as displaying the images or video on a display device associated with the computing system or projecting the images or video on a surface.
- an entire surface area of the touch sensitive surface is in the field of view of the image capturing device, such that an input provided by the user at any portion of the touch sensitive surface is captured and processed for further action.
- the image capturing device captures multiple images or videos of the entire surface area of the touch sensitive surface in a short time frame.
- the computing system performs a frame-by-frame processing of the images or videos corresponding to the entire surface area of the touch sensitive surface to interpret the input. Such a processing is often computationally expensive for the computing system.
- an indication of actuation of a touch sensitive surface is received.
- the actuation of the touch sensitive surface is based on a touch at a point on the touch sensitive surface.
- an image of the touch sensitive surface is obtained from an image capturing device. Thereafter, a portion of the image, corresponding to the point on the touch sensitive surface, is determined and the same is cropped.
- the portion of the image may be an area around the point in the image.
- the cropping of the portion of the image is based on the point of actuation of the touch sensitive surface.
- the cropping provides for processing of a portion of the image, wherein the portion of the image corresponds to a region of the touch sensitive surface on which the input is received.
- a host processor, such as a processor of a computing system associated with the touch sensitive surface, processes the cropped portion of the image alone. Processing the cropped portion of the image to the exclusion of the rest of the image reduces the processing burden on the host processor and enhances the response time of the host processor.
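The reduction in processing burden can be illustrated with a rough pixel-count comparison; the frame resolution and crop dimension below are assumed example values chosen for illustration, not values specified by the present subject matter.

```python
# Illustrative comparison of the data processed with and without cropping.
# The resolution and crop size are assumed example values.
full_frame_pixels = 1920 * 1080   # full image of the touch sensitive surface
cropped_pixels = 200 * 300        # portion around the point of actuation
reduction_factor = full_frame_pixels / cropped_pixels
print(reduction_factor)           # → 34.56
```

Under these assumptions the host processor handles roughly 3% of the pixels it would otherwise process for each captured frame.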
- Figure 1 illustrates a computing system 100, in accordance with an example implementation of the present subject matter.
- the computing system 100 comprises a processor 102.
- processor(s) may be provided through the use of dedicated hardware as well as hardware capable of executing software.
- the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
- explicit use of the term "processor" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage.
- the computing system 100 further comprises an image capturing device 104 and a touch sensitive surface 106 coupled to the processor 102.
- Examples of the computing system 100 may comprise computing devices, such as laptops, smartphones, and tablets, that incorporate an integrated image capturing device and touch sensitive surface, as well as computing devices, such as desktop computers, to which an external image capturing device and a touch sensitive surface may be coupled.
- the touch sensitive surface 106 may enable touch- based interactions between a user and the computing system 100.
- the touch sensitive surface 106 may be a surface that may determine an actuation provided to it by way of a touch.
- a user may use, for example, a stylus or his finger to touch the touch sensitive surface 106 to provide the actuation.
- Examples of a touch sensitive surface include, but are not restricted to, a touchmat, a touch panel, and a touchscreen.
- the touch sensitive surface 106 may be a capacitive touch sensitive surface, a resistive touch sensitive surface, a strain gauge touch sensitive surface, an optical touch sensitive surface or a combination of these touch sensitive surfaces.
- the touch sensitive surface 106 may be a touch screen which may comprise a touchmat or a touch panel integrated with a screen. The screen may display content while the touchmat or touch panel may be coupled to the screen to sense a touch on the screen.
- the touch-based interactions between the user and the computing system 100 may involve the user touching or actuating the touch sensitive surface 106 to provide inputs to the computing system 100 for performing various actions.
- touch-based interactions include user activities, such as writing or drawing on the touch sensitive surface 106, that may be captured by the computing system 100, for example, for storing in a memory of the computing system 100.
- touch-based interactions may also include touch-based commands, such as a one-finger right swipe, a two-finger left swipe, or a touch on a predefined location, which may cause the computing system 100 to perform corresponding actions.
- an image, such as that of a webpage may be projected by the computing system 100 on the touch sensitive surface.
- the user may touch a top right corner of the webpage projected on the touch sensitive surface 106 to minimize the webpage.
- the user may touch the touch sensitive surface 106 at a location corresponding to a hyperlink included in the webpage to open the hyperlink.
- an image, such as a map, may be projected on the touch sensitive surface 106 and the user may zoom in or zoom out of the map by making two-finger movements in the same or opposite direction on the touch sensitive surface 106.
- the image capturing device 104 may be employed to facilitate the touch-based interactions of the user with the computing system 100.
- the image capturing device 104 may be a camera that is inbuilt or integrated into the computing system 100, such as a webcam.
- a webcam may be a complementary metal-oxide semiconductor (CMOS) camera in an example.
- the image capturing device may be an external camera coupled to the computing system 100, such as an external webcam coupled to the computing system 100 through a universal serial bus (USB) port.
- Examples of the image capturing device 104 include an infrared camera, a dual infrared camera, a digital single lens reflex camera, a depth camera, and a mirrorless camera.
- the touch sensitive surface 106 and the image capturing device 104 are positioned such that the touch sensitive surface 106 lies in a field of view of the image capturing device 104.
- the touch sensitive surface 106 may be positioned on a flat horizontal surface, such as a surface of a desk, and the image capturing device 104 may be attached to a supporting structure which holds the image capturing device 104 in a downward facing orientation, such that the image capturing device 104 faces the touch sensitive surface 106.
- the image capturing device 104 may capture an image or video of a user's touch-based interactions with the touch sensitive surface.
- the image capturing device 104 may record and provide an indication of an actuation to the computing system 100, for the computing system 100 to perform the corresponding action.
- the computing system 100 comprises a touch detection module 108 coupled to the processor 102.
- the touch detection module 108 receives an indication of actuation of the touch sensitive surface 106.
- the touch detection module 108 may cause the image capturing device 104 to capture an image of the touch sensitive surface 106.
- a location detection module 110 of the computing system 100 is coupled to the processor 102 to determine a portion of the image which corresponds to the point of actuation of the touch sensitive surface 106.
- the portion of the image may be a predefined area, for example, a circular, rectangular, or pentagonal area surrounding the point of actuation.
- the location detection module 110 crops the portion of the image that corresponds to the point of actuation of the touch sensitive surface 106.
- the cropped portion of the image alone may be processed while the remaining portion may be disregarded. This reduces the processing overhead, as the processor processes the cropped portion, which contains less data, rather than the entire image, which contains comparatively more data.
- While the computing system 100 depicted in the example implementation illustrated in Figure 1 shows the image capturing device 104 and the touch sensitive surface 106 integrated in the computing system 100, it should be understood that the image capturing device 104 and the touch sensitive surface 106 may be implemented as separate devices as well. Accordingly, the techniques for processing images of a touch sensitive surface using significantly lower processing resources, described herein, also extend to computing systems comprising computing devices that may be coupled to external image capturing devices and external touch sensitive surfaces.
- Figure 2 illustrates a computing system 200, in accordance with another example implementation of the present subject matter.
- the computing system comprises a computing device 202 coupled to an image capturing device 204 and a touch sensitive surface 206.
- the image capturing device 204 and touch sensitive surface 206 are similar to the above-explained image capturing device 104 and touch sensitive surface 106.
- the computing system 200 includes processor(s) 208, similar to the processor 102, a memory 210, and interface(s) 212 coupled to the processor(s) 208.
- the system memory 210 may include any computer-readable medium including, for example, volatile memory (e.g., RAM), and/or non-volatile memory (e.g., EPROM, flash memory, etc.).
- the interface(s) 212 may include a variety of software and hardware interfaces that enable the computing device 202 to communicate with the image capturing device 204 and the touch sensitive surface 206.
- Modules 214 and data 216 may reside in the memory 210.
- the modules 214 include routines, programs, objects, components, data structures, and the like, which perform particular tasks or implement particular abstract data types.
- the modules 214 include a calibration module 218 and an image processing module 220 in addition to the aforementioned touch detection module 108 and location detection module 110 as described with reference to Figure 1.
- the modules 214 may also comprise other modules 222 that supplement functions of the computing system 200.
- the data 216 serves, amongst other things, as a repository for storing data that may be fetched, processed, received, or generated by the modules 214.
- the data 216 comprises other data 224 corresponding to the other modules 222.
- the data 216 of the computing device 202 also comprises calibration data 226, image data, and dimension data 228.
- the image capturing device 204 and the touch sensitive surface 206 may be calibrated for a given position of the image capturing device 204 and the touch sensitive surface 206.
- Calibration may be understood as a process of mapping locations on the touch sensitive surface to their corresponding locations in a view of the camera or on an image captured by the image capturing device 204.
- a calibration mode may be initiated on the image capturing device 204 for the calibration.
- respective locations of a plurality of reference points on a surface of the touch sensitive surface 206 facing the image capturing device 204 are made known to the computing device 202.
- a mapping of the respective locations of the reference points on the touch sensitive surface and their corresponding location in the image is provided to the calibration module 218 coupled to the processor 208 of the computing device 202.
- the calibration module 218 may map locations of various points on the touch sensitive surface to their corresponding locations in an image captured by the image capturing device 204.
- coordinates of the location of the reference points on the touch sensitive surface 206 may be provided to the calibration module 218.
- a two-dimensional coordinate system may be used to define physical location of various points on the touch sensitive surface 206.
- the touch coordinates corresponding to the physical location of the reference points on the touch sensitive surface 206 may be provided to the calibration module 218 as user input.
- four reference points, say A, B, C, and D, may be used for the calibration and accordingly respective locations of the four reference points may be provided to the calibration module 218.
- the touch coordinates (x1, y1), (x2, y2), (x3, y3), and (x4, y4) of the reference points A, B, C, and D, respectively, may be provided to the calibration module 218.
- a user input may be provided to indicate the respective locations of the reference points in the image.
- the user may mark respective locations of the reference points in the image.
- the user may use a graphical user interface provided by the interface(s) of the computing device 202 to provide the touch coordinates and to indicate respective locations of the touch coordinates of the reference points in the image to the calibration module 218.
- the reference points may be determined by the calibration module 218 and may not be defined by the user.
- the reference points A, B, C, and D may be corners of the touch sensitive surface 206.
- the calibration module 218 may implement corner detection algorithms to detect the corners of the touch sensitive surface.
- the calibration module may detect the corners of the touch sensitive surface 206 and may accordingly map the respective locations of each of the corners to their corresponding locations in the image, for example, based on the relative distance of the corners from each other.
- the calibration module 218 may determine image coordinates of each of the reference points in the image.
- the image coordinates indicate the location of the reference points in the image.
- the processor 208 may determine image coordinates of each of the various points on the touch sensitive surface 206.
- the processor 208 uses the touch coordinates and the image coordinates of the reference points to generate a transformation matrix.
- the transformation matrix may be used to determine the touch coordinates corresponding to the image coordinates. Accordingly, once calibrated, the image coordinates corresponding to a point on the touch sensitive surface 206 may be determined.
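One way to realize such a transformation matrix is a planar homography estimated from the four reference point pairs. The sketch below, assuming NumPy as the only dependency and hypothetical function names, solves the standard eight-equation linear system; it is an illustration of this calibration step under those assumptions, not the claimed implementation.

```python
import numpy as np

def compute_transformation_matrix(touch_pts, image_pts):
    """Estimate the 3x3 matrix mapping touch coordinates on the
    surface to image coordinates, from four reference point pairs
    (e.g., the points A, B, C, and D)."""
    A, b = [], []
    for (x, y), (u, v) in zip(touch_pts, image_pts):
        # Two rows of the standard direct linear transform system.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def to_image_coords(matrix, touch_point):
    """Convert touch coordinates (M, N) to image coordinates (M', N')."""
    x, y, w = matrix @ np.array([touch_point[0], touch_point[1], 1.0])
    return x / w, y / w
```

For instance, with reference points forming a 100 by 100 square on the surface mapped to a 40 by 40 region in the image, `to_image_coords` returns the corresponding image location for any touched point.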
- the user input such as the location of the reference points received during calibration and the corresponding transformation matrix generated may be stored in the calibration data 226 in the memory of the computing device 202.
- the calibration data 226 may be applicable for a relative position of the image capturing device 204 and the touch sensitive surface 206 for which the calibration process was performed and may be used as long as the relative position between the image capturing device and the touch sensitive surface does not change. In case the relative position between the image capturing device and the touch sensitive surface changes, a re-calibration may be performed by the user. Accordingly, the calibration data corresponding to the changed position may be stored in the memory.
- the touch detection module 108 coupled to the processor 208 receives an indication of the actuation of the touch sensitive surface 206 from the touch sensitive surface 206. In response to the indication, the touch detection module 108 causes the image capturing device 204 to capture an image of the touch sensitive surface 206.
- the image of the touch sensitive surface 206 may be stored in the image data 228 in the memory of the computing device 202.
- the image capturing device 204 may be in a passive or inactive mode. Based on the received indication, the image capturing device 204 may be activated to capture an image of the touch sensitive surface 206. In an implementation, once activated, the image capturing device 204 may remain active for a predefined period of time and may thereafter be deactivated or may enter a passive mode. As the image capturing device 204 may be maintained in a passive mode when not activated, the power of the computing device 202 is conserved.
- the indication of the actuation of the touch sensitive surface 206, received by the touch detection module 108 from the touch sensitive surface 206, may include touch coordinates of the point on the touch sensitive surface 206.
- the touch coordinates of the point of actuation of the touch sensitive surface 206 may be indicative of the physical location of the point on the touch sensitive surface 206.
- the calibration module 218 may receive touch coordinates corresponding to the point of actuation of the touch sensitive surface 206.
- the touch coordinates of the point of actuation are, say, (M, N).
- the calibration module 218 may convert the touch coordinates (M, N) into image coordinates (M', N').
- the conversion of the touch coordinates into image coordinates may be done using the transformation matrix stored in the calibration data 226.
- the calibration module 218 may select a transformation matrix from amongst a plurality of transformation matrices stored in the calibration data 226, depending on a current relative position of the touch sensitive surface 206 and the image capturing device 204.
- the location detection module 110 may identify a portion of the image corresponding to the point of actuation, i.e., the portion of the image corresponding to the image coordinates (M', N').
- the portion of the image corresponding to the image coordinates may be an area of the image comprising the image coordinates.
- the location detection module 110 may then crop the portion of image corresponding to the point or the image coordinates.
- the cropped portion of image may be stored in the image data 228 in the memory 210 of the computing device 202.
- the portion of the image may have a predefined dimension.
- a plurality of dimensions may be predefined for various types of actuation of the touch sensitive surface 206, such as an actuation of the touch sensitive surface by a single-point touch or a multi-point touch.
- the user may touch a single point on the touch sensitive surface or the actuation may be localized to a few points on the touch sensitive surface.
- the portion of the image may be predefined to have a first dimension.
- the user may touch a series of points on the touch sensitive surface 206, such that a distance between a first point of touch and a last point of touch may be beyond a threshold.
- the portion may have a second dimension in case of the multi-point touch.
- a multi-point touch may also include a plurality of simultaneous single-point touches, such as a two-finger swipe.
- a third dimension for the portion of the image may be defined for such multi-point touches.
- the plurality of predefined dimensions may be stored in the dimension data 228.
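The selection among the predefined dimensions can be sketched as a lookup keyed by the actuation type; the dimension values, distance threshold, and function names below are assumptions introduced for illustration only.

```python
import math

# Hypothetical dimension table (width, height) keyed by actuation type;
# the concrete values are assumed, not specified by the present subject matter.
DIMENSION_DATA = {
    "single_point": (200, 300),             # first dimension
    "multi_point_series": (400, 600),       # second dimension
    "multi_point_simultaneous": (500, 500), # third dimension
}

def select_dimension(touch_points, simultaneous, threshold=50.0):
    """Pick a predefined crop dimension based on the actuation type:
    a localized single-point touch, a series of points whose first and
    last points are farther apart than a threshold, or a plurality of
    simultaneous touches such as a two-finger swipe."""
    if simultaneous and len(touch_points) > 1:
        return DIMENSION_DATA["multi_point_simultaneous"]
    (x0, y0), (x1, y1) = touch_points[0], touch_points[-1]
    if math.hypot(x1 - x0, y1 - y0) > threshold:
        return DIMENSION_DATA["multi_point_series"]
    return DIMENSION_DATA["single_point"]
```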
- the location detection module 110 retrieves the corresponding predefined dimension from the dimension data 228 and crops the portion of the image. In an example where the predefined dimension is 20 centimeters by 30 centimeters, the location detection module 110 may crop a rectangular portion having a dimension of 20 centimeters by 30 centimeters around the point of actuation or the image coordinates thereof. In an example, the point of actuation may be at a center of the rectangular portion, where the two diagonals of the rectangular portion intersect.
- the portion of the image may be predefined as an area having a shape of a circle, with the image coordinates of the point of actuation as a center of the circle.
- Various radii may be defined for the circle, for example, based on a distance between the image capturing device and the touch sensitive surface or a dimension of the touch sensitive surface.
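A minimal sketch of the cropping step itself, assuming a NumPy image array and pixel dimensions, may shift the window so that a point of actuation near an edge still yields a full-size portion:

```python
import numpy as np

def crop_around_point(image, cx, cy, width, height):
    """Crop a width x height portion of `image` centered at the image
    coordinates (cx, cy) of the point of actuation, shifting the window
    as needed to keep it inside the image boundaries."""
    img_h, img_w = image.shape[:2]
    left = int(round(cx - width / 2))
    top = int(round(cy - height / 2))
    left = max(0, min(left, img_w - width))   # clamp horizontally
    top = max(0, min(top, img_h - height))    # clamp vertically
    return image[top:top + height, left:left + width]
```

A circular portion, as described above, could be obtained similarly by masking this rectangular crop with the defined radius around the point of actuation.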
- the image processing module 220 receives the cropped portion of the image and processes the cropped portion.
- the image processing module 220 may process the cropped portion of the image and may display the processed portion of the image on a display (not shown) of the computing device 202.
- the image processing module 220 may transmit the processed data or may project the processed portion of the image on a surface.
- cropping is explained herein with reference to an image; however, it should be understood that videos, which are collections of images captured in a short time frame, may be processed in the same way as explained in the context of an image.
- the computing device 202 is shown directly coupled to touch sensitive surface and the image capturing device.
- the computing device 202 may be at a location different from the touch sensitive surface 206 and the image capturing device 204.
- the computing device 202 may be coupled to the touch sensitive surface 206 and the image capturing device 204 via a network.
- the I/O interface may allow the computing device 202 to interface with the touch sensitive surface 206 and image capturing device 204.
- the indication of actuation of the touch sensitive surface 206 and the image captured by the image capturing device 204 may be transmitted to the computing device via the network.
- the network may be a single network or a combination of multiple networks and may use a variety of different communication protocols.
- the network may be a wireless or a wired network, or a combination thereof. Examples of such individual networks include, but are not limited to, a Global System for Mobile Communication (GSM) network, a Universal Mobile Telecommunications System (UMTS) network, a Personal Communications Service (PCS) network, a Time Division Multiple Access (TDMA) network, a Code Division Multiple Access (CDMA) network, a Next Generation Network (NGN), and a Public Switched Telephone Network (PSTN).
- the image capturing device may include a location detection module and a processor which may be responsible for cropping and processing the portion of the image.
- Figure 3 illustrates a method 300 for cropping a portion of an image in accordance with an example implementation of the present subject matter.
- the method 300 may be implemented in a variety of computing systems, but for the ease of explanation, the present description of the example method 300 to crop the portion of the image is provided with reference to the above-described computing system 100.
- the order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method 300, or an alternative method.
- blocks of the method 300 may be performed by the computing system 100.
- the blocks of the method 300 may be executed based on instructions stored in a non-transitory computer-readable medium, as will be readily understood.
- the non-transitory computer-readable medium may include, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
- an indication of actuation of a touch sensitive surface is received.
- the touch sensitive surface may be actuated by a user touching a point on the touch sensitive surface with a finger or a stylus.
- the indication may be received by a processor of a computing device, such as the processor 102 of the computing system 100.
- an image of the touch sensitive surface is obtained in response to the receiving of the indication.
- the image may be captured by an image capturing device, such as the image capturing device 104. Further, in an example, the image may be obtained by the computing system 100. The method, thereafter, proceeds to block 306.
- a portion of the image corresponding to the point on the touch sensitive surface is determined and at block 308, the determined portion is cropped.
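By way of illustration, the blocks of the method 300 may be sketched as follows; the names used here (`TouchIndication`, `crop_portion`) and the square-shaped portion are hypothetical conveniences, not part of the described method:

```python
from dataclasses import dataclass

@dataclass
class TouchIndication:
    # Hypothetical structure for the indication of actuation:
    # the touched point, in touch-surface coordinates.
    x: float
    y: float

def crop_portion(indication, image, portion_size):
    """Sketch of blocks 302-308: receive the indication, obtain the
    image, determine the portion around the touched point, crop it."""
    # Block 306: determine the portion of the image corresponding to
    # the point (here, a square window centred on the touched point).
    half = portion_size // 2
    left, top = int(indication.x) - half, int(indication.y) - half
    # Block 308: crop the determined portion (image as list of rows).
    return [row[max(left, 0):left + portion_size]
            for row in image[max(top, 0):top + portion_size]]

# Usage: a 6x6 "image" of pixel labels, touch at (3, 3), 2x2 portion.
image = [[(r, c) for c in range(6)] for r in range(6)]
portion = crop_portion(TouchIndication(3, 3), image, 2)
```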
- Figure 4 illustrates a method 400 for cropping a portion of an image in accordance with another example implementation of the present subject matter.
- the method 400 may be implemented in a variety of computing systems, but for the ease of explanation, the present description of the example method 400 to crop the portion of the image is provided in reference to the above-described computing system 100.
- the order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method 400, or an alternative method.
- blocks of the method 400 may be performed by the computing system 100.
- the blocks of the method 400 may be executed based on instructions stored in a non-transitory computer-readable medium, as will be readily understood.
- the non-transitory computer-readable medium may include, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
- an indication of actuation of a touch sensitive surface is received.
- the actuation of the touch sensitive surface is based on a touch at a point on the touch sensitive surface.
- the method 400, thereafter, proceeds to block 404.
- an image capturing device, such as the image capturing device 104, is activated, in response to receiving the indication, to capture an image of the touch sensitive surface.
- the image capturing device may be in a passive or inactive mode by default and may be activated when the indication of actuation is received.
- the method then proceeds to block 406 where the captured image is obtained by the computing system 100.
- touch coordinates corresponding to the point on the touch sensitive surface are obtained.
- the touch coordinates indicate the physical location of the point on the touch sensitive surface.
- the indication of actuation may comprise the touch coordinates.
- the touch coordinates are converted into image coordinates.
- a transformation matrix may be retrieved from a memory of the computing system and may be used to convert the touch coordinates into image coordinates.
- the image coordinates indicate a location of the point in the image.
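As an illustration, assuming the retrieved transformation matrix is a 3x3 matrix obtained from calibration and applied in homogeneous coordinates, the conversion may be sketched as:

```python
def to_image_coords(matrix, tx, ty):
    """Map touch coordinates (tx, ty) to image coordinates using a
    3x3 transformation matrix in homogeneous coordinates."""
    u = matrix[0][0] * tx + matrix[0][1] * ty + matrix[0][2]
    v = matrix[1][0] * tx + matrix[1][1] * ty + matrix[1][2]
    w = matrix[2][0] * tx + matrix[2][1] * ty + matrix[2][2]
    return u / w, v / w  # divide out the homogeneous scale

# Usage: a pure-scaling calibration (touch units -> image pixels).
scale = [[0.5, 0.0, 0.0],
         [0.0, 0.5, 0.0],
         [0.0, 0.0, 1.0]]
ix, iy = to_image_coords(scale, 200.0, 100.0)
```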
- a predefined dimension for a portion of the image is retrieved from a memory of the computing system.
- the predefined dimension may be based on the touch at the point on the touch sensitive surface being a single-point touch or a multi-point touch. For example, in case of the single-point touch, where a user may touch the touch sensitive surface using one finger, the predefined dimension may correspond to 10% of the total area of the image. Further, in case of the multi-point touch, for example, when a user uses more than one finger to provide touch input, the predefined dimension may correspond to 25% of the total area of the image.
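The selection described above may be sketched as a simple lookup; the 10% and 25% figures follow the example in the description, and the function name is illustrative:

```python
def portion_area_fraction(num_touch_points):
    """Return the fraction of the total image area for the portion,
    based on whether the touch was single-point or multi-point."""
    # Single-point touch -> 10% of the image; multi-point -> 25%.
    return 0.10 if num_touch_points == 1 else 0.25
```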
- a portion of the image corresponding to the image coordinates is determined.
- the portion of the image may have a shape, such as a circle or a rectangle with dimensions in accordance with the predefined dimension.
- the computing system may determine the portion to be a circle with the point of actuation as the center of the circle and having a radius such that the area of the circle corresponds to 10% of the total area of the image.
- the portion is cropped. The portion may be processed, and further actions may be performed on the cropped portion.
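For example, such a circular portion may be determined and its bounding box cropped as sketched below; the function names and the list-of-rows image representation are illustrative assumptions:

```python
import math

def circle_radius(image_width, image_height, area_fraction):
    """Radius of a circle whose area is the given fraction of the
    total image area: pi * r^2 = fraction * W * H."""
    return math.sqrt(area_fraction * image_width * image_height / math.pi)

def crop_circle_bbox(image, cx, cy, radius):
    """Crop the axis-aligned bounding box of the circle centred on
    the point of actuation, clipped to the image bounds."""
    h, w = len(image), len(image[0])
    left, right = max(int(cx - radius), 0), min(int(cx + radius) + 1, w)
    top, bottom = max(int(cy - radius), 0), min(int(cy + radius) + 1, h)
    return [row[left:right] for row in image[top:bottom]]

# Usage: 100x100 image, portion area = 10% of the image (r ~= 17.8).
r = circle_radius(100, 100, 0.10)
image = [[0] * 100 for _ in range(100)]
portion = crop_circle_bbox(image, 50, 50, r)
```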
- Figure 5 illustrates a computing environment implementing a non-transitory computer-readable medium for cropping a portion of an image, according to an example.
- the computing environment 500 may comprise a computing system, such as the computing system 100.
- the computing environment 500 includes a processing resource 502 communicatively coupled to the non-transitory computer-readable medium 504 through a communication link 506.
- the processing resource 502 may be a processor of the computing system 100 that fetches and executes computer-readable instructions from the non-transitory computer-readable medium 504.
- the non-transitory computer-readable medium 504 can be, for example, an internal memory device or an external memory device.
- the communication link 506 may be a direct communication link, such as any memory read/write interface.
- the communication link 506 may be an indirect communication link, such as a network interface.
- the processing resource 502 can access the non-transitory computer-readable medium 504 through a network 508.
- the network 508 may be a single network or a combination of multiple networks and may use a variety of different communication protocols.
- the processing resource 502 and the non-transitory computer-readable medium 504 may also be communicatively coupled to data sources 510.
- the data source(s) 510 may also be used to store data, such as calibration data and image data. Further, the data source(s) 510 may be a database.
- the non-transitory computer-readable medium 504 comprises executable instructions 512 for cropping a portion of an image.
- the non-transitory computer-readable medium 504 may comprise instructions executable to implement the previously described location detection module.
- the instructions 512 may be executable by the processing resource 502 to receive an indication of actuation of a touch sensitive surface.
- the actuation of the touch sensitive surface may be based on a touch at a point on the touch sensitive surface.
- an image capturing device is invoked to capture the image of the touch sensitive surface.
- the processing resource 502 may receive an image of the touch sensitive surface from the image capturing device.
- the instructions 512 may further be executable by the processing resource 502 to determine a location of the point on the touch sensitive surface. After the location of the point on the touch sensitive surface is known, the processing resource 502 may identify a portion of the image that corresponds to the location of the point of actuation of the touch sensitive surface. The portion of the image corresponding to the point of actuation may be understood as an area of the image comprising the point of actuation.
- the processing resource 502 may, thereafter, crop the portion of the image.
Abstract
Example techniques for cropping portions of images are described. In an example, an indication of actuation of a touch sensitive surface, based on a touch at a point on the touch sensitive surface, is received. An image of the touch sensitive surface is received in response to the receiving of the indication. A portion of the image corresponding to the point is determined and cropped.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2018/044668 WO2020027813A1 (fr) | 2018-07-31 | 2018-07-31 | Recadrage de parties des images |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020027813A1 true WO2020027813A1 (fr) | 2020-02-06 |
Family
ID=69232617
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2018/044668 WO2020027813A1 (fr) | 2018-07-31 | 2018-07-31 | Recadrage de parties des images |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020027813A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100194781A1 (en) * | 2008-12-15 | 2010-08-05 | Christopher Tossing | System and method for cropping and annotating images on a touch sensitive display device |
US20130321313A1 (en) * | 2012-05-31 | 2013-12-05 | Htc Corporation | Method, apparatus and computer program product for cropping screen frame |
US20140055398A1 (en) * | 2012-08-27 | 2014-02-27 | Samsung Electronics Co., Ltd | Touch sensitive device and method of touch-based manipulation for contents |
US20150054854A1 (en) * | 2013-08-22 | 2015-02-26 | Htc Corporation | Image Cropping Manipulation Method and Portable Electronic Device |
US20170109023A1 (en) * | 2012-03-06 | 2017-04-20 | Apple Inc. | User Interface Tools for Cropping and Straightening Image |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112738625A (zh) * | 2020-12-24 | 2021-04-30 | 广东九联科技股份有限公司 | 基于机顶盒的视频图像增强方法及装置 |
CN112738625B (zh) * | 2020-12-24 | 2023-03-31 | 广东九联科技股份有限公司 | 基于机顶盒的视频图像增强方法及装置 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10564806B1 (en) | Gesture actions for interface elements | |
KR101184460B1 (ko) | 마우스 포인터 제어 장치 및 방법 | |
KR101608423B1 (ko) | 모바일 디바이스상의 풀 3d 상호작용 | |
US9727135B2 (en) | Gaze calibration | |
US9201500B2 (en) | Multi-modal touch screen emulator | |
US20120174029A1 (en) | Dynamically magnifying logical segments of a view | |
US8269842B2 (en) | Camera gestures for user interface control | |
US9348466B2 (en) | Touch discrimination using fisheye lens | |
WO2020027818A1 (fr) | Détermination de l'emplacement d'un contact sur des surfaces tactiles | |
CA2909182C (fr) | Ecran tactile virtuel | |
JP2014211858A (ja) | ジェスチャに基づくユーザ・インターフェイスを提供するシステム、方法及びプログラム | |
WO2011146070A1 (fr) | Système et procédé de rapport de données dans un système de vision par ordinateur | |
US9525906B2 (en) | Display device and method of controlling the display device | |
WO2017028491A1 (fr) | Dispositif d'affichage à commande tactile et procédé d'affichage à commande tactile | |
KR101488662B1 (ko) | Nui 장치를 통하여 사용자와 상호작용하는 인터페이스 제공방법 및 제공장치 | |
US10025420B2 (en) | Method for controlling display of touchscreen, and mobile device | |
WO2018076720A1 (fr) | Procédé d'utilisation à une seule main et système de commande | |
WO2020027813A1 (fr) | Recadrage de parties des images | |
US9235338B1 (en) | Pan and zoom gesture detection in a multiple touch display | |
WO2023273071A1 (fr) | Procédé et appareil de traitement d'images et dispositif électronique | |
US20160124602A1 (en) | Electronic device and mouse simulation method | |
CN113867562A (zh) | 触摸屏报点的校正方法、装置和电子设备 | |
WO2019100547A1 (fr) | Procédé de commande de projection, appareil, système d'interaction de projection, et support d'informations | |
WO2021118560A1 (fr) | Mode de verrouillage de scène pour capturer des images de caméra | |
US20130265283A1 (en) | Optical operation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18928902 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 18928902 Country of ref document: EP Kind code of ref document: A1 |