WO2022037247A1 - Device, method and system for operating device - Google Patents

Device, method and system for operating device

Info

Publication number: WO2022037247A1
Authority: WIPO (PCT)
Prior art keywords: processing unit, present disclosure, imaging unit, pattern, time period
Application number: PCT/CN2021/101830
Other languages: French (fr)
Inventors: Kushal Kamleshbhai GANDHI, Varun B P, Roshan V NAYAK
Original assignee: Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date: the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Publication of WO2022037247A1 (en)

Classifications

    • G06F1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1686: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0485: Scrolling or panning
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H04M1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G06F2200/1636: Sensing arrangement for detection of a tap gesture on the housing
    • H04M2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • H04M2250/52: Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the present disclosure relates generally to a device, method and system for operating the device, and more particularly to a device having an imaging unit for operating the device.
  • A mobile device, such as a mobile phone, is commonly used for calling purposes.
  • When a user intends to view the content displayed on the mobile device during a call, the user tends to change the holding position of the hand while operating the mobile device, releasing the thumb to use it for other purposes like scrolling the screen.
  • As a result, the phone is loosely held in the user's hand and may fall and get damaged.
  • In a first aspect of the present disclosure, a method for operating a device may comprise detecting, by a processing unit, a tapping input of an object over an imaging unit; determining, by the processing unit, a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps; comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation; and operating, by the processing unit, the device by performing the operation allocated to the pattern.
  • the method may further comprise receiving, by the processing unit, one or more images captured by the imaging unit; determining, by the processing unit, a color value of each of the received images; comparing, by the processing unit, the color value to a threshold color value; and determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  • the method may further comprise determining, by the processing unit, a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
  • the method may further comprise determining, by the processing unit, a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
  • the operation of one or more patterns varies based on a state of the device.
  • the operation of one or more patterns varies based on an application running on the device.
  • the operation includes a scrolling operation to navigate a movement of a content, displayed on a display, in any direction on the display.
  • the one or more predefined patterns are defined by a user.
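  • As a concrete illustration of the first aspect, the following is a minimal Python sketch of the flow from frame classification to pattern extraction; the threshold values, the grayscale frame format, and all names are assumptions made for illustration, not details prescribed by the present disclosure.

```python
# Minimal sketch of the first aspect: classify frames into touch/release
# events, assemble taps, and derive the pattern characteristics.
# Thresholds and the frame format are illustrative assumptions.

TOUCH_THRESHOLD = 0.8    # first threshold color value: darker frames indicate a touch
RELEASE_THRESHOLD = 0.3  # second threshold color value: lighter frames indicate a release

def color_value(pixels):
    """Normalized darkness: 0.0 for a complete white image, 1.0 for a dark one."""
    # pixels is assumed to be a flat list of 8-bit grayscale values.
    return 1.0 - sum(pixels) / (255.0 * len(pixels))

def detect_taps(frames):
    """frames: list of (timestamp_seconds, pixels). Returns (touch, release) pairs."""
    taps, touch_time = [], None
    for ts, pixels in frames:
        value = color_value(pixels)
        if touch_time is None and value > TOUCH_THRESHOLD:
            touch_time = ts                    # touch event: lens covered
        elif touch_time is not None and value < RELEASE_THRESHOLD:
            taps.append((touch_time, ts))      # release event completes one tap
            touch_time = None
    return taps

def pattern_of(taps):
    """Number of taps, duration of each tap, and time periods between subsequent taps."""
    durations = [release - touch for touch, release in taps]
    gaps = [taps[i + 1][0] - taps[i][1] for i in range(len(taps) - 1)]
    return len(taps), durations, gaps

# Two quick taps: dark frames while the lens is covered, bright frames otherwise.
dark, bright = [10] * 4, [240] * 4
frames = [(0.0, bright), (0.25, dark), (0.5, bright), (1.0, dark), (1.25, bright)]
print(pattern_of(detect_taps(frames)))  # (2, [0.25, 0.25], [0.5])
```

The extracted pattern would then be compared against the stored predefined patterns and the allocated operation performed, as the later figures elaborate.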
  • In a second aspect of the present disclosure, a system for operating a device may comprise an imaging unit; and a processing unit configured to detect a tapping input of an object over the imaging unit, wherein the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps, wherein the processing unit compares the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation, and wherein the processing unit operates the device by performing the operation allocated to the pattern.
  • the processing unit receives one or more images captured by the imaging unit and determines a color value of each of the received images, and wherein the processing unit compares the color value to a threshold color value, and determines for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  • the processing unit determines a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
  • the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
  • the operation of one or more patterns varies based on a state of the device.
  • the operation of one or more patterns varies based on an application running on the device.
  • the operation includes a scrolling operation to navigate a movement of a content, displayed on a display screen, in any direction on the display screen.
  • the one or more predefined patterns are defined by a user.
  • In a third aspect of the present disclosure, a device may comprise an imaging unit; a processing unit configured to detect a tapping input of an object over the imaging unit, wherein the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps; and a memory for storing the tapping input of the object, wherein the processing unit compares the pattern with one or more predefined patterns stored in the memory, wherein each of the one or more predefined patterns is allocated for performing an operation, and wherein the processing unit operates the device by performing the operation allocated to the pattern.
  • the processing unit receives one or more images captured by the imaging unit and determines a color value of each of the received images, and wherein the processing unit compares the color value to a threshold color value, and determines for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  • the processing unit determines a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
  • the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
  • the operation of one or more patterns varies based on a state of the device.
  • the operation of one or more patterns varies based on an application running on the device.
  • the operation includes a scrolling operation to navigate a movement of a content, displayed on a display screen, in any direction on the display screen.
  • the one or more predefined patterns are defined by a user.
  • Figure 1 is an illustration of an example device 100, according to an embodiment of the disclosure.
  • Figure 2 is a schematic illustration of an example device 200 depicting hardware components, according to an embodiment of the present disclosure.
  • Figure 3 is an illustration of an exemplary scenario 300, according to an embodiment of the present disclosure, depicting a touch by a user on an imaging unit 360 of a device 340.
  • Figure 4 is an illustration of an exemplary scenario 400, according to an embodiment of the present disclosure, depicting a finger of a user not in contact with an imaging unit 460 of a device 440.
  • Figure 5 is an illustration of an exemplary scenario 500, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 560 of a device 540.
  • Figure 6 is an illustration of another exemplary scenario 600, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 660 of a device 640.
  • Figure 7 is a timing diagram of an example pattern 700, according to an embodiment of the present disclosure.
  • Figure 8 is a timing diagram of another example pattern 800, according to an embodiment of the present disclosure.
  • Figure 9 is a timing diagram of another example pattern 900, according to an embodiment of the present disclosure.
  • Figure 10 is a flow diagram that illustrates an example method 1000 for defining a pattern by a user, according to an embodiment of the present disclosure.
  • Figure 11 is an illustrative block diagram of an example system 1100 for performing a scrolling operation on a device 1120, according to an embodiment of the present disclosure.
  • Figure 12 is a flow diagram that illustrates an example method 1200 disclosing a scrolling operation, according to an embodiment of the present disclosure.
  • Figure 13 is a flow diagram that illustrates an example of a method 1300 for operating a device, according to an embodiment of the present disclosure.
  • Figure 14 is a flow diagram that illustrates another example of a method 1400 for operating a device, according to an embodiment of the present disclosure.
  • Figure 15 is a flow diagram that illustrates another example of a method 1500 for operating a device, according to an embodiment of the present disclosure.
  • Figure 16 is a flow diagram that illustrates another example of a method 1600 for operating a device, according to an embodiment of the present disclosure.
  • Imaging unit: 160, 260, 360, 460, 560, 660, 1160
  • Embodiments of the present disclosure relate to a method and system enabling a user to operate a device in a handheld position efficiently without compromising the safety of the device.
  • a device may be a portable device or a non-portable device having imaging capabilities.
  • the device may be capable of performing wireless communication.
  • the device may be a handheld device like a smartphone or non-smartphone, a wireless device, mobile phone, tablet, computer, laptop, gaming console, wearable smart device, notebook, netbook, a television like a smart television or non-smart television, a desktop and/or the like.
  • the device may also include a wireless organizer, pager, personal digital assistant, handheld wireless communication device, wirelessly enabled computer, portable gaming device or any other portable electronic device with processing and communication capabilities.
  • the device may include a device with peripheral devices such as display, printer, touchscreen, projector, digital watch, camera, digital scanner, and any other types of auxiliary device that may communicate with other devices.
  • the term device can be broadly defined to encompass any electronic, computing, and/or telecommunications device, or combination of devices having imaging capabilities.
  • a processing unit may include without limitation, microcontroller, microprocessor, embedded processor, media processor, graphics processing unit (GPU) , central processing unit (CPU) , application processor, digital signal processor (DSP) , reduced instruction set computing (RISC) processor, system on a chip (SoC) , baseband processor, field programmable gate array (FPGA) , programmable logic device (PLD) , state machine, gated logic, discrete hardware circuit, and other suitable hardware configured to facilitate the innovative features as disclosed in the present detailed description.
  • the processing unit for example may execute the instruction (s) for the functioning of the device.
  • the processing unit may include an integrated graphics processor that processes the rendered data into an image in pixels for display.
  • a memory may be any form of storage either within or outside the device.
  • the memory may store information regarding tap functions and tap patterns.
  • the memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns.
  • the memory may include a database.
  • the memory may also be a combination of one or more storage units available internally or externally.
  • the memory may include flash memory, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CD-ROM), electro-optical memory like compact disk or digital versatile disk (DVD), smart card, magneto-optical memory, erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), and/or the like.
  • the memory may store or carry the source code or instruction for executing required tasks.
  • a carrier wave may carry content or data including those used in transmitting and receiving electronic data such as electronic mail (e-mail) or in accessing a computer network such as the Internet or a local area network (LAN) .
  • the memory may be a cloud storage and/or the like that may be accessible via the internet.
  • a display may be a touch-sensitive or presence-sensitive display.
  • the display includes an input/output interface module (I/O interface module) .
  • the display may provide an output to the user, for example, display contents, including without limitation, an image or a video image and/or the like.
  • the display may include or be integrated with a touch screen or touch sensitive overlay for receiving touch input from the user.
  • the display may also be capable of receiving a user input from a stylus, fingertip, or other means of gesture input.
  • the display may be a computer monitor, for example, a personal computer, with an internal or external display operatively connected.
  • the display may be a display device, such as an LCD TV or projector and/or the like.
  • An Input/Output interface module refers to any means or set of commands or menus through which a user may communicate with the device.
  • the I/O interface module may be a virtual keyboard or any other means through which a user may input information to the device.
  • the I/O interface module may enable the device to communicate with the user for exchanging data or for establishing connection with the devices.
  • the I/O interface module may enable the device to connect with various I/O peripherals.
  • the peripherals for example may include keyboard, mouse, camera, touch screen (e.g., display) , a microphone, and may also include one or more output devices such as a display screen (e.g., display) and a speaker.
  • the I/O interface module may enable the user to navigate, view, edit and perform several other operations to notification banners, badges, application program interface (API) , files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like.
  • An imaging unit may refer to an image capturing device, including without limitation, a camera.
  • the imaging unit may include imaging sensors.
  • the imaging unit may be an inbuilt imaging unit or an external imaging unit.
  • the imaging unit may be a camera having one or more lenses for capturing images or video images.
  • the imaging unit may be positioned or located on the rear side of the device, or on the sides of the device like, top, bottom, left or right side.
  • the imaging unit may be positioned on the front side of the device.
  • the imaging unit may include various hardware electronic circuitries, optical components to aid image capturing and processing functions.
  • the imaging unit may be capable of recognizing motions and gestures.
  • Figure 1 is an illustration of an example device 100, with a display 110, according to an embodiment of the present disclosure.
  • Figure 1 illustrates the device 100 with the display 110 and an imaging unit 160 located on the rear face of the device 100.
  • the device 100 may be any type of electronics device having a user interface, including without limitation, personal computer, handheld device, cell phone, consumer electronic device, multimedia device and/or the like.
  • the display 110 may be a touch-sensitive or presence-sensitive display.
  • the display 110 includes an I/O interface module.
  • the I/O interface module refers to any means or set of commands or menus through which a user may communicate with the device.
  • the I/O interface module may be a virtual keyboard or any other means through which a user may input information to the device.
  • the display may provide output to the user, for example, display contents, including without limitation, an image or a video image and/or the like.
  • the display 110 may display contents like image, video image, audio file, files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like.
  • the display 110 may include or be integrated with a touch screen or touch sensitive overlay for receiving touch input from the user.
  • the I/O interface module may enable the user to navigate, view, edit and perform several other operations to notification banners, badges, application program interface (API) , files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like.
  • the imaging unit 160 may be a unit with imaging capabilities such as an image capturing device like a camera positioned on the rear face of the device 100.
  • the camera may be a digital camera.
  • the camera may be an inbuilt camera or an external camera.
  • the camera may have one or more lenses for capturing images or video images.
  • the camera may be positioned or located on the rear side of the device 100, or on the sides of the device like, top, bottom, left or right side. In some embodiments of the present disclosure, the camera may be positioned on the front side of the device.
  • the camera may include various hardware electronic circuitries, optical components to aid image capturing and processing functions.
  • the imaging unit may be capable of detecting motions and gestures.
  • the imaging unit may include an imaging sensor.
  • the device 100 may be connected to a network.
  • the network may be further connected to one or more devices.
  • the one or more devices may be connected to each other through the network.
  • the network may refer to a Public Land Mobile Network (PLMN) , a Device to Device (D2D) network, a Machine to Machine/Man (M2M) network, local area network (LAN) , wireless local area network (WLAN) , wide area network (WAN) , wireless wide area network (WWAN) , the internet, a mobile network or another network, or a combination thereof.
  • the network may be a core network of a cellular service provider, a telecommunication network such as a public switched telephone network (PSTN) .
  • Figure 2 is a schematic illustration of an example device 200 depicting hardware components, according to an embodiment of the present disclosure.
  • the device 200 may include a display 210, a processing unit 220, an I/O interface module (input/output interface module) 230, a communication system and network interface module 240, an imaging unit 260 and a memory 250.
  • the processing unit 220 may control the overall functioning and control of the device 200.
  • the processing unit 220 herein can be defined as the central processing unit, i.e. the CPU, that controls and operates the functioning of the device 200.
  • the processing unit 220 may include without limitation, an integrated graphics processor that processes the rendered data into an image in pixels for display.
  • the processing unit 220 may include without limitation, microcontroller, microprocessor, embedded processors, media processors, graphics processing units (GPUs) , central processing units (CPUs) , application processors, digital signal processors (DSPs) , reduced instruction set computing (RISC) processors, systems on a chip (SoC) , baseband processors, field programmable gate arrays (FPGAs) , programmable logic devices (PLDs) , state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to facilitate the innovative features as disclosed in the present detailed description.
  • the processing unit 220 for example may execute the instruction (s) for the functioning of the device 200.
  • the communication system and network interface module 240 enables the device 200 to connect to a wired or wireless network.
  • the communication system and network interface module 240 enables large-scale network communications through routable protocols, such as Internet Protocol (IP) .
  • the communication system and network interface module 240 may allow the device 200 to communicate with one or more devices. The communication of data, including image data, voice data or contents, is achieved through this communication system and network interface module 240.
  • the communication system and network interface module 240 may use different wireless communication techniques such as WiFi, Bluetooth, Wireless USB, capacitive coupling communications and/or the like.
  • the communication system and network interface module 240 may also allow a wireless data communication, which can be a shared communication like a wireless hotspot or a direct WiFi provided by any one device to the nearby devices.
  • the module 240 may be an external module that connects one device with other devices. The external module may be connected to a device by, for example, USB or another direct connection.
  • the communication system and network interface module 240 may be communicatively coupled to the processing unit 220.
  • the memory 250 may be any form of storage either within or outside the device 200.
  • the memory may store information regarding tap functions and tap patterns.
  • the memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns.
  • the memory may include a database.
  • the memory may also be a combination of one or more storage units available internally or externally.
  • the memory may store or carry the source code or instruction for executing required tasks.
  • the memory 250 may be communicatively coupled to the processing unit 220.
  • the display 210 may be a built-in or integrated display to present a content to a viewer or a user.
  • the I/O interface module 230 may enable the device 200 to communicate with the user for exchanging data or for establishing connection with the devices 200.
  • the I/O interface module 230 may enable the device 200 to connect with various I/O peripherals.
  • the peripherals for example may include keyboard, mouse, camera, touch screen (e.g., display 210) , a microphone, and may also include one or more output devices such as a display screen (e.g., display 210) and a speaker.
  • the display 210 may be communicatively coupled to the processing unit 220.
  • the device 200 may include the imaging unit 260.
  • the imaging unit may enable the imaging capabilities of the device.
  • the imaging unit may be capable of capturing images or video images.
  • the imaging unit may be capable of processing images or video images.
  • the imaging unit may capture image attributes such as pixel attributes, illumination, color information and/or the like.
  • the imaging unit may be any image capturing device like a camera.
  • the imaging unit may be a digital camera.
  • the imaging unit may be an inbuilt camera or an external camera.
  • the imaging unit may have one or more lenses for capturing images or video images.
  • the imaging unit may be positioned or located on the rear side of the device, or on the sides of the device like, top, bottom, left or right side.
  • the imaging unit may be positioned on the front side of the device.
  • the imaging unit may include various hardware electronic circuitries, optical components to aid image capturing and processing functions.
  • the imaging unit 260 may be communicatively coupled to the processing unit 220.
  • Figure 3 is an illustration of an exemplary scenario 300, according to an embodiment of the present disclosure, depicting a touch by a user on an imaging unit 360 of a device 340.
  • the user may hold the device 340 in his right or left hand. While generally operating the device or while performing a specific function, such as making a call, the user may tap the imaging unit 360 to operate the device without compromising the safety of the device 340.
  • in the scenario 300, the finger 320 of the user or any object covers the imaging unit 360.
  • the imaging unit 360 may be configured to capture a plurality of images in which the image may have a transition from a light color to a darker color.
  • the image captured by the imaging unit 360 may be darker or black in color.
  • the color of the images captured by the imaging unit 360 is defined by a color value.
  • when the imaging unit 360 is covered, the one or more captured images are dark or darker, and the color value of the one or more images may be higher.
  • the lowest color value may be assigned a value 0 (zero) for a complete white image, and the highest color value may be assigned a value 1 (one) for a completely dark image.
  • a first threshold color value and a second threshold color value may be predefined.
  • the processing unit of the device 340 may be provided with the first and the second threshold color value.
  • the first and the second threshold color values may be stored in a memory.
  • the color value of one or more images captured by the imaging unit 360 may be compared with the one or more threshold color values. A variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit 360.
  • if the color value of the subsequently captured images is observed to be higher than the first threshold color value, this may be considered to indicate a touch event; that is, the user has touched the imaging unit 360 with his finger 320.
  • Figure 4 is an illustration of an exemplary scenario 400, according to an embodiment of the present disclosure, depicting a finger 420 of a user not in contact with an imaging unit 460 of a device 440.
  • the scenario 400 as disclosed in Figure 4 illustrates the finger 420 of the user in a position wherein it is not connected or is not in contact with the imaging unit 460 of the device 440.
  • a tap input may be defined by a sequence of a touch event on the imaging unit 460 and a release event from the imaging unit 460.
  • the release event may include a scenario wherein the finger is not in contact with, or is disconnected from, the imaging unit.
  • an exemplary scenario may disclose an object held by the user not in contact with the imaging unit of the device.
  • the imaging unit 460 may capture one or more images.
  • the imaging unit 460 may capture a plurality of images to determine the relative color value of the plurality of images.
  • the images captured by the imaging unit 460 may be white or light or bright in color because of the presence of the surrounding light.
  • the color of the images captured by the imaging unit 460 is defined by a color value. In the scenario 400 as disclosed in Figure 4, one or more captured images may have a decreased color value.
  • the color value of the plurality of images captured by the imaging unit 460 may be compared with a second threshold color value to identify the disconnect between the imaging unit 460 and the finger 420, which may disclose a release event of the finger 420 of the user from the imaging unit 460; that is, the user has removed his finger 420 from the imaging unit 460.
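  • The two scenarios of Figures 3 and 4 amount to a per-frame classifier with two thresholds. A brief sketch under the same assumptions as above (the threshold values and the grayscale frame format are illustrative, not prescribed by the present disclosure):

```python
def classify_frame(pixels, first_threshold=0.8, second_threshold=0.3):
    """Classify one grayscale frame as 'touch', 'release', or 'indeterminate'.

    The color value is normalized darkness: 0.0 for a complete white image,
    1.0 for a completely dark image.
    """
    value = 1.0 - sum(pixels) / (255.0 * len(pixels))
    if value > first_threshold:
        return "touch"        # lens covered by the finger, as in Figure 3
    if value < second_threshold:
        return "release"      # lens open to surrounding light, as in Figure 4
    return "indeterminate"    # between the thresholds: keep the previous state

print(classify_frame([5, 8, 12, 6]))          # -> touch
print(classify_frame([240, 250, 235, 245]))   # -> release
```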
  • Figure 5 is an illustration of an exemplary scenario 500, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 560 of a device 540.
  • Figure 5 discloses a pattern with a plurality of events including one or more touch and release events.
  • when a touch event or a release event occurs on the imaging unit 560 of the device 540, one or more images are captured by the imaging unit 560.
  • a first threshold color value and a second threshold color value may be predefined.
  • a variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit 560.
  • the one or more images captured by the imaging unit 560 may be identified and differentiated based on the color of the one or more images; for example, one or more dark images may indicate a touch event and one or more lighter images may indicate a release event.
  • at time t0, the user touches the imaging unit 560 for a touch event; that is, the color value of the images captured during t0 may be greater than a first threshold color value.
  • at time t2, the user again touches the imaging unit 560 for a subsequent touch event; that is, the color value of the images captured during t2 may also be greater than the first threshold color value.
  • the occurrence of the touch event followed by the release event forms a tap input.
  • the time between the touch event and the release event may be defined as a first time interval, which further defines the duration of the tap input.
  • the time between the release event and the subsequent touch event may be defined as a second time interval, which further defines the time period between subsequent tap inputs.
  • the exemplary pattern discloses a tap input between the times t0 and t1, and a tap pattern between the times t0 and t2.
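  • A hypothetical sketch of deriving the first and second time intervals from a timestamped stream of touch and release events (the event encoding and the timestamps are illustrative assumptions):

```python
def intervals(events):
    """events: ordered list of (timestamp_seconds, 'touch' or 'release').

    Returns (durations, gaps): first time intervals (touch to release, i.e.
    tap durations) and second time intervals (release to the next touch,
    i.e. time periods between subsequent taps).
    """
    durations, gaps = [], []
    last_touch = last_release = None
    for ts, kind in events:
        if kind == "touch":
            if last_release is not None:
                gaps.append(ts - last_release)   # second time interval
            last_touch = ts
        elif kind == "release" and last_touch is not None:
            durations.append(ts - last_touch)    # first time interval
            last_release = ts
    return durations, gaps

# The Figure 5 sequence: touch at t0, release at t1, a second touch at t2.
print(intervals([(0.0, "touch"), (0.25, "release"), (0.75, "touch"), (1.0, "release")]))
# -> ([0.25, 0.25], [0.5])
```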
  • Figure 6 is an illustration of another exemplary scenario 600, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 660 of a device 640.
  • Figure 6 discloses, a pattern with a plurality of events including one or more touch and release events.
  • when a touch event or a release event occurs on the imaging unit 660 of the device 640, one or more images are captured by the imaging unit 660.
  • a first threshold color value and a second threshold color value may be predefined.
  • a variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit 660.
  • the one or more images captured by the imaging unit 660 may be identified and differentiated based on the color of the one or more images; for example, one or more dark images may indicate a touch event and one or more lighter images may indicate a release event.
  • at time t0, the finger 620 of the user is removed from the imaging unit 660 of the device for a release event; that is, the color value of the images captured during t0 may be lower than a second threshold color value.
  • at time t1, the user touches the imaging unit 660 for a touch event; that is, the color value of the images captured during t1 may be greater than the first threshold color value.
  • the exemplary pattern discloses a tap input between the times t1 and t2.
  • Figure 7 is a timing diagram of an example pattern 700, according to an embodiment of the present disclosure.
  • Figure 7 discloses the pattern 700 in a predefined time period tp.
  • the pattern 700 as disclosed in Figure 7 may have four taps, namely 701, 702, 703 and 704.
  • the first tap 701 may have a duration d0, the second tap 702 a duration d1, the third tap 703 a duration d2, and the fourth tap 704 a duration d3.
  • the time interval or time period between the first tap 701 and the subsequent second tap 702 is t0, between the second tap 702 and the subsequent third tap 703 is t1, and between the third tap 703 and the subsequent fourth tap 704 is t2.
  • the duration of each of the taps in the one or more taps may be the same.
  • the time period between the subsequent taps may be different.
  • the time period between the subsequent taps may be the same. The user may create a plethora of patterns using variable durations of the taps, variable time periods between subsequent taps in the pattern, and/or the like.
  • Figure 8 is a timing diagram of another example pattern 800, according to an embodiment of the present disclosure.
  • Figure 8 discloses the pattern 800 in a predefined time period tp.
  • the pattern 800 as disclosed in Figure 8 may have four taps, namely, 801, 802, 803 and 804.
  • the first tap 801 may have a duration d0, the second tap 802 a duration d1, the third tap 803 a duration d2, and the fourth tap 804 a duration d3.
  • the time interval or time period between the first tap 801 and the subsequent second tap 802 is t0, between the second tap 802 and the subsequent third tap 803 is t1, and between the third tap 803 and the subsequent fourth tap 804 is t2.
  • the duration of each of the taps, namely d0, d1, d2 and d3, may be the same.
  • the time period between the first tap 801 and the second tap 802, that is t0, may be greater than the time periods t1 and t2; or the time period t0 may be equal to the time period t2 and greater than the time period t1.
  • the patterns may have taps with equal duration and with unequal time gap between subsequent taps.
  • Figure 9 is a timing diagram of another example pattern 900, according to an embodiment of the present disclosure.
  • Figure 9 discloses the pattern 900 in a predefined time period tp.
  • the pattern 900 as disclosed in Figure 9 may have three taps, namely, 901, 902 and 903.
  • the first tap 901 may have a duration d0, the second tap 902 a duration d1, and the third tap 903 a duration d2.
  • the time interval or time period between the first tap 901 and the subsequent second tap 902 is t0, and the time gap between the second tap 902 and the subsequent third tap 903 is t1.
  • the duration of each of the taps may vary.
  • the duration d0 of the first tap 901 may be greater than the durations d1 and/or d2 of the second tap 902 and the third tap 903, respectively; or the duration d0 may be equal to the duration d1 but greater than the duration d2.
  • the time period t0 between the first tap 901 and the second tap 902 may be less than the time period t1 between the second tap 902 and the third tap 903.
  • the patterns may have taps with unequal duration and/or with unequal time gaps between subsequent taps.
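  • One plausible way to store and compare such patterns is as a list of tap durations plus inter-tap time periods, matched within a tolerance; the representation and the tolerance value below are assumptions, since the present disclosure does not prescribe them.

```python
from dataclasses import dataclass, field

@dataclass
class TapPattern:
    durations: list = field(default_factory=list)  # d0, d1, ... in seconds
    gaps: list = field(default_factory=list)       # t0, t1, ... in seconds

def matches(observed: TapPattern, stored: TapPattern, tolerance: float = 0.1) -> bool:
    """True if both patterns have the same number of taps and every duration
    and gap agrees within `tolerance` seconds."""
    if len(observed.durations) != len(stored.durations):
        return False
    pairs = zip(observed.durations + observed.gaps, stored.durations + stored.gaps)
    return all(abs(a - b) <= tolerance for a, b in pairs)

# A pattern like 900: three taps, d0 longer than d1 and d2, t0 shorter than t1.
pattern_900 = TapPattern(durations=[0.30, 0.15, 0.15], gaps=[0.10, 0.25])
observed = TapPattern(durations=[0.28, 0.17, 0.14], gaps=[0.12, 0.22])
print(matches(observed, pattern_900))  # True
```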
  • Figure 10 is a flow diagram that illustrates an example method 1000 for defining a pattern by a user, according to an embodiment of the present disclosure.
  • the method 1000 includes requesting, by a processing unit, a user to input a pattern.
  • the method 1000 includes receiving, by the processing unit, the pattern from the user.
  • the user may input or enter any one of a plethora of patterns by performing a touch event or a release event on an imaging unit of a device.
  • one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like.
  • the method 1000 includes determining, by the processing unit, a number of taps, a duration of each tap, and a time period between subsequent taps in the pattern.
  • the method 1000 includes prompting, by the processing unit, to re-enter the pattern for verification and confirmation.
  • the prompt may be in the form of a notification, pop-up notification, dialogue box, alert and/or the like. The user may re-enter the pattern using a finger or an object on the imaging unit of the device.
  • At block 1020, the processing unit assesses whether the pattern re-entered by the user matches the pattern received at block 1012, to verify and confirm the pattern. If at block 1020 the pattern is not verified and confirmed, the processing unit displays an error message at block 1022 and prompts the user to re-enter the pattern for verification and confirmation, as at block 1018. In some embodiments of the present disclosure, if the pattern provided by the user does not match the pattern received at block 1012 for a predefined number of attempts, the processing unit may re-initiate the method 1000 and request the user to input a pattern, as disclosed at block 1010.
  • if the processing unit assesses that the pattern re-entered by the user matches the pattern received at block 1012, the processing unit stores the pattern in a memory of the device, as disclosed at block 1024.
  • the method 1000 includes requesting, by the processing unit, a user to allocate the pattern to an operation.
  • the processing unit may communicate one or more predefined operations to the user to apply to the pattern.
  • the one or more predefined operations may be communicated to the user by way of a notification, pop-up notification, dialogue box, list, alert and/or the like.
  • the one or more predefined operations may be based on the user’s behavior or preferences.
  • the one or more predefined operations may be based on accessibility functions of the device.
  • the operation may be a general operation or an application-specific operation.
  • the user may input a text or voice command using an I/O interface module of the device and the processing unit may communicate a related operation to the user via the I/O interface module of the device.
  • the method 1000 includes receiving, by the processing unit, the allocated operation of the pattern from the user.
  • the method 1000 includes storing, by the processing unit, the allocated operation in connection with the stored pattern in the memory.
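  • Method 1000, rendered schematically in code; the helpers read_pattern and choose_operation, the retry limit, and the returned dict are hypothetical stand-ins for the imaging-unit capture, the I/O interface prompts, and the memory described above.

```python
def enroll_pattern(read_pattern, choose_operation, max_attempts=3):
    """Request a pattern, confirm it by re-entry, and allocate an operation.

    read_pattern() is assumed to return a comparable pattern object captured
    from taps on the imaging unit; choose_operation() is assumed to return
    the operation the user allocates to the pattern.
    """
    pattern = read_pattern()                  # blocks 1010 and 1012: request and receive
    for _ in range(max_attempts):             # block 1018: prompt re-entry
        if read_pattern() == pattern:         # block 1020: verify and confirm
            operation = choose_operation()    # user allocates the operation
            return {"pattern": pattern, "operation": operation}  # block 1024: store
        print("Error: patterns do not match, please re-enter.")  # block 1022
    return None  # too many mismatches: method 1000 restarts at block 1010

# Hypothetical usage with canned inputs standing in for the imaging unit and UI.
attempts = iter([(2, (0.1, 0.1)), (2, (0.1, 0.1))])
print(enroll_pattern(lambda: next(attempts), lambda: "scrolling"))
# -> {'pattern': (2, (0.1, 0.1)), 'operation': 'scrolling'}
```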
  • Figure 11 is an illustrative block diagram of an example system 1100 for performing a scrolling operation on a device 1120, according to an embodiment of the present disclosure.
  • the device 1120 may have a display 1110 for displaying content.
  • the content may be notification banners, badges, application program interface (API) , files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like.
  • JPEG Joint Photographic Experts Group
  • PNG Portable Network Graphics
  • GIF Graphics Interchange Format
  • SVG Scalable Vector Graphics
  • MP4 Moving Picture Experts Group
  • the user may tap a pattern on the imaging unit 1160, and based on the verification of the pattern with the predefined pattern, which is allocated for a scrolling operation, the processing unit of the device 1120 may enable performing an operation of scrolling the content on the display 1110.
  • the scrolling operation may provide a navigation movement of the content displayed on the display screen or display 1110.
  • the scrolling may be in any direction on the display 1110 including, without limitation, top-down, left-right and/or the like.
  • the user may again tap another pattern, which may be the same or different, to close the content displayed on the display 1110.
  • Figure 12 is a flow diagram that illustrates an example method 1200 disclosing a scrolling operation, according to an embodiment of the present disclosure.
  • the method 1200 includes detecting, by a processing unit, whether pattern detection is enabled in a device.
  • the pattern detection may be a module like a pattern detection module comprising instructions to enable pattern detection.
  • the instructions may be stored in a memory of the device.
  • the pattern detection module may not always be enabled, to avoid unnecessary activation of the operation.
  • the user may choose to enable the pattern detection module as per preferences or requirements.
  • If at block 1210 pattern detection is not enabled in the device, the processing unit may not perform any activity; that is, nothing is to be done. If at block 1210 pattern detection is enabled in the device, the processing unit detects a tapping pattern over an imaging unit, as disclosed at block 1214. At block 1216, the processing unit assesses whether the tapping pattern is detected. If at block 1216 the tapping pattern is not detected, the processing unit may continue to detect the tapping pattern, as disclosed at block 1214. If at block 1216 the tapping pattern is detected, the processing unit determines, at block 1218, characteristics of the pattern including, without limitation, the number of taps, the duration of each tap, the time period between subsequent taps and/or the like.
  • the characteristics of the pattern may also include the tap pattern along with inputs from I/O interface modules, including without limitation a mic, touch screen, keypad or keyboard, switches and/or the like. This may enable a user to perform multiple operations using the imaging unit.
  • the processing unit determines whether the total duration of the tapping pattern is within a predefined time period of a predefined pattern.
  • the predefined time period may be adjusted by the user and stored in the memory of the device. In some embodiments of the present disclosure, the predefined time period may be 1 second or 2 seconds, as per the user's needs, preferences and accessibility.
  • If at block 1220 the total duration of the tapping pattern is not within the predefined time period, the processing unit may ignore the pattern and continue to detect patterns, as disclosed at block 1214. If at block 1220 the total duration of the tapping pattern is within the predefined time period, the processing unit compares, at block 1222, the characteristics of the detected pattern with the predefined patterns stored in the memory of the device.
  • the predefined patterns may be adjusted by the user as needed, or may be stored in the memory during the manufacturing process of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory.
  • the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user’s behavior or preferences. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory. At block 1224, the processing unit checks whether the detected tapping pattern matches with at least one predefined pattern.
  • If at block 1224 the detected tapping pattern does not match any predefined pattern, the processing unit may not perform any activity; that is, nothing is to be done, and the processing unit ignores the pattern and continues to detect patterns, as disclosed at block 1214. If at block 1224 the detected tapping pattern matches with at least one predefined pattern, the processing unit enables the device to perform the operation of the scrolling function, as disclosed at block 1226.
  • the user may similarly perform operations on other applications like, without limitation, emails, the gallery, the contact list, messages or any other mobile application that the user may intend to operate.
  • the operation for one or more patterns may vary based on the application running on the device at the time of the input of the pattern by the user.
  • a tap pattern may correspond to an operation of a scrolling function if the application running at the time of the input of the pattern by the user is an internet browser, and the tap pattern may correspond to an operation of a zooming function if the application running at the time of the input of the pattern by the user is an image viewer.
  • the operation for one or more patterns may vary based on the state of the device at the time of the input of the pattern by the user. For example, a tap pattern may correspond to an operation of a calling function if the device is locked and the tap pattern may correspond to an operation of a scrolling function if the device is unlocked.
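  • The application- and state-dependent behavior could be modeled as a lookup keyed on the matched pattern, the device state, and the foreground application. The entries below mirror the examples in the two preceding paragraphs; the key names and table shape are assumptions.

```python
# Hypothetical operation table; entries mirror the examples given above.
OPERATIONS = {
    ("pattern_a", "unlocked", "internet_browser"): "scrolling",
    ("pattern_a", "unlocked", "image_viewer"): "zooming",
    ("pattern_a", "locked", None): "calling",
}

def dispatch(pattern_id, device_state, foreground_app):
    """Resolve the operation for a matched pattern from device state and app."""
    app = foreground_app if device_state == "unlocked" else None
    return OPERATIONS.get((pattern_id, device_state, app), "no operation")

print(dispatch("pattern_a", "unlocked", "internet_browser"))  # scrolling
print(dispatch("pattern_a", "locked", "internet_browser"))    # calling
```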
  • Figure 13 is a flow diagram that illustrates an example of a method 1300 for operating a device, according to an embodiment of the present disclosure.
  • the method 1300 includes detecting, by a processing unit, a tapping input of an object over an imaging unit.
  • a user may input or enter any one of a plethora of taps or tapping patterns by performing a touch event or a release event on an imaging unit of the device.
  • one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like.
  • the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps.
  • the method 1300 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation.
  • one or more predefined patterns and their corresponding operations may be stored in a memory.
  • the memory may store information regarding tap functions and tap patterns.
  • the memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns.
  • the memory may include a database.
  • one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user’s behavior or preferences. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory. At block 1340, the method 1300 includes operating, by the processing unit, the device by performing the operation allocated to the pattern.
  • the operation may be a general operation or an application-specific operation. In some embodiments of the present disclosure, the operation may pertain to a function or command performed over a network. In some embodiments of the present disclosure, the operation may be a scrolling operation and the processing unit of the device may scroll the content on a display of the device. In some embodiments of the present disclosure, the operation may initiate a prompt to the user to input a voice command. In some embodiments of the present disclosure, the operation may initiate a predefined voice command to the device to perform a function or carry out an action. In some embodiments of the present disclosure, the operation for one or more patterns may vary based on the application running on the device at the time of the input of the pattern by the user.
  • a tap pattern may correspond to an operation of a refresh function if the application running at the time of the input of the pattern by the user is an internet browser, and the tap pattern may correspond to an operation of a zooming function if the application running at the time of the input of the pattern by the user is an image viewer.
  • the operation for one or more patterns may vary based on the state of the device at the time of the input of the pattern by the user. For example, a tap pattern may correspond to an operation of a calling function if the device is locked and the tap pattern may correspond to an operation of closing the running application if the device is unlocked.
  • Figure 14 is a flow diagram that illustrates another example of a method 1400 for operating a device, according to an embodiment of the present disclosure.
  • the method 1400 includes detecting, by a processing unit, a tapping input of an object over an imaging unit.
  • the method 1400 includes receiving, by the processing unit, one or more images captured by the imaging unit.
  • the method 1400 includes determining, by the processing unit, a color value of each of the received images. In some embodiments of the present disclosure, the color value may be indicative of the relative lightness or darkness of the one or more images captured by the imaging unit.
  • the method 1400 includes comparing, by the processing unit, the color value to a threshold color value.
  • the one or more images captured by the imaging unit may have a color value either less than or greater than a predefined threshold color value.
  • a first threshold color value and a second threshold color value may be predefined.
  • the first threshold color value and the second threshold color value may be adjusted either manually by the user or automatically by the processing unit based on dynamic factors, such as surroundings.
  • a variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit.
  • the method 1400 includes determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  • the color value of one or more images captured by the imaging unit may be higher than the first threshold color value, in which case the processing unit determines a touch event (a minimal classification sketch in this spirit follows the method 1400 steps below).
  • the one or more captured images in this scenario may be dark or darker due to the lack or absence of surrounding light detected by the imaging unit.
  • the color value of one or more images captured by the imaging unit may be lower than the second threshold color value, in which case the processing unit determines a release event.
  • the one or more captured images in this scenario may be bright or brighter due to the presence or abundance of surrounding light detected by the imaging unit.
  • the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps.
  • the predefined time period may be, for example, one second.
  • the predefined time period may be adjusted (increased or decreased) based on the accessibility and comfort of the user.
  • the predefined time period may be adjusted either manually by the user or automatically by the processing unit.
  • one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like.
  • the method 1400 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more patterns is allocated for performing an operation.
  • one or more predefined patterns and their corresponding operations may be stored in a memory.
  • one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory.
  • the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user’s behavior, preferences or accessibility. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory.
  • the method 1400 includes operating, by the processing unit, the device by performing the operation allocated to the pattern. In some embodiments of the present disclosure, the operation may be a general operation or an application-specific operation.
  • the operation for one or more patterns may vary based on the application running on the device at the time of the input of the pattern by the user or the state of the device at the time of the input of the pattern by the user.
  • the operation may pertain to a function or command performed over a network.
  • the operation may be a scrolling operation and the processing unit of the device may scroll the content on a display of the device.
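As referenced above, the touch/release determination of method 1400 can be sketched in a few lines. This is a minimal illustration, assuming grayscale frames supplied as NumPy arrays and treating the color value as normalized darkness (0 for a complete white image, 1 for a complete black image, the convention used elsewhere in this description); both threshold values are placeholders, not values from the disclosure.

    import numpy as np

    FIRST_THRESHOLD = 0.85   # darker than this: lens covered, touch event
    SECOND_THRESHOLD = 0.35  # lighter than this: lens open, release event

    def color_value(frame):
        """Normalized darkness of an 8-bit grayscale frame, in [0, 1]."""
        return 1.0 - frame.mean() / 255.0

    def classify(frame):
        """Map one captured image to 'touch', 'release', or None if ambiguous."""
        v = color_value(frame)
        if v > FIRST_THRESHOLD:
            return "touch"
        if v < SECOND_THRESHOLD:
            return "release"
        return None  # neither clearly covered nor clearly open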
  • Figure 15 is a flow diagram that illustrates another example of a method 1500 for operating a device, according to an embodiment of the present disclosure.
  • the method 1500 includes detecting, by a processing unit, a tapping input of an object over an imaging unit.
  • the method 1500 includes receiving, by the processing unit, one or more images captured by the imaging unit.
  • the processing unit determines a color value of each of the received images.
  • the method 1500 includes comparing, by the processing unit, the color value to a threshold color value.
  • the method 1500 includes determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  • the method 1500 includes determining, by the processing unit, a first time interval between the touch event and the release event within a predefined time period, to determine the duration of the tap.
  • the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within the predefined time period, a duration of each of the taps, and a time period between the subsequent taps.
  • the method 1500 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more patterns is allocated for performing an operation.
  • the method 1500 includes operating, by the processing unit, the device by performing the operation allocated to the pattern.
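The first time interval (tap duration) and the second time interval (gap between subsequent taps) of method 1500 can be derived from a timestamped stream of touch and release events, as in the sketch below. The event format and the one-second window are assumptions made only for illustration.

    def taps_from_events(events, window=1.0):
        """events: sorted list of (timestamp_seconds, 'touch' or 'release').
        Returns (durations, gaps) for taps inside the predefined window."""
        durations, gaps = [], []
        touch_time, last_release = None, None
        start = events[0][0] if events else 0.0
        for t, kind in events:
            if t - start > window:
                break                                  # outside the predefined time period
            if kind == "touch":
                if last_release is not None:
                    gaps.append(t - last_release)      # second time interval
                touch_time = t
            elif kind == "release" and touch_time is not None:
                durations.append(t - touch_time)       # first time interval
                last_release, touch_time = t, None
        return durations, gaps

    # Two taps of about 0.10 s and 0.12 s, separated by a 0.20 s gap:
    events = [(0.00, "touch"), (0.10, "release"), (0.30, "touch"), (0.42, "release")]
    print(taps_from_events(events))  # ([0.1, 0.12], [0.2]) up to float rounding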
  • Figure 16 is a flow diagram that illustrates another example of a method 1600 for operating a device, according to an embodiment of the present disclosure.
  • the method 1600 includes detecting, by a processing unit, a tapping input of an object over an imaging unit.
  • a user may input or enter any one of a plethora of taps or tapping patterns by performing a touch event or a release event on an imaging unit of the device.
  • one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like.
  • the method 1600 includes receiving, by the processing unit, one or more images captured by the imaging unit.
  • the processing unit determines a color value of each of the received images.
  • the method 1600 includes comparing, by the processing unit, the color value to a threshold color value.
  • a first threshold color value and a second threshold color value may be predefined.
  • a variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit.
  • the one or more images captured by the imaging unit may be identified and differentiated based on the color of the one or more images: one or more dark images may indicate a touch event and one or more lighter images may indicate a release event.
  • the method 1600 includes determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  • the color value of one or more captured images being higher than the first threshold color value may be identified as a touch event.
  • the one or more captured images in this scenario may be dark or darker due to lack or absence of surrounding light detected by the imaging unit.
  • the color value of one or more captured images being lower than the second threshold color value may be identified as a release event.
  • the one or more captured images in this scenario may be bright or brighter due to presence or abundance of surrounding light detected by the imaging unit.
  • the method 1600 includes determining, by the processing unit, a first time interval between the touch event and the release event within a predefined time period, to determine the duration of the tap.
  • the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
  • the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within the predefined time period, a duration of each of the taps, and a time period between the subsequent taps.
  • the method 1600 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more patterns is allocated for performing an operation (a tolerance-based matching sketch follows the method 1600 steps below).
  • one or more predefined patterns and their corresponding operations may be stored in a memory.
  • the memory may store information regarding tap functions and tap patterns.
  • the memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns.
  • the memory may include a database.
  • one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user’s behavior or preferences. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory.
  • At block 1628, the method 1600 includes operating, by the processing unit, the device by performing the operation allocated to the pattern.
  • the operation may be a general operation or an application-specific operation. In some embodiments of the present disclosure, the operation may pertain to a function or command performed over a network. In some embodiments of the present disclosure, the operation may be a scrolling operation and the processing unit of the device may scroll the content on a display of the device.
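The comparison step of method 1600 can be pictured as a tolerance-based match of the measured intervals against each stored predefined pattern. The pattern records, the tolerance, and the operation names below are illustrative assumptions only, not values from the disclosure.

    TOLERANCE = 0.08  # allowed deviation per interval, in seconds (illustrative)

    PREDEFINED_PATTERNS = {
        "scroll_down":  {"durations": [0.10, 0.10], "gaps": [0.20]},
        "voice_prompt": {"durations": [0.10, 0.40], "gaps": [0.30]},
    }

    def intervals_match(measured, reference, tol=TOLERANCE):
        return (len(measured) == len(reference) and
                all(abs(m - r) <= tol for m, r in zip(measured, reference)))

    def match_pattern(durations, gaps):
        """Return the name of the first predefined pattern that matches, if any."""
        for name, ref in PREDEFINED_PATTERNS.items():
            if intervals_match(durations, ref["durations"]) and intervals_match(gaps, ref["gaps"]):
                return name
        return None

    print(match_pattern([0.11, 0.09], [0.22]))  # scroll_down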

Abstract

A device, a method and a system for operating the device, the device having an imaging unit, are disclosed. In many circumstances, while operating a mobile device during a call, the user tends to change the holding position of the hand by releasing the thumb for other purposes, such as scrolling the screen, which may not be safe for the mobile device. According to some embodiments, the processing unit of the device is configured to detect a tapping input of an object over the imaging unit. The processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps. The processing unit compares the pattern with one or more predefined patterns, wherein each of the one or more patterns is allocated for performing an operation. The processing unit operates the device by performing the operation allocated to the pattern.

Description

[Title established by the ISA under Rule 37.2] DEVICE, METHOD AND SYSTEM FOR OPERATING DEVICE
TECHNICAL FIELD
The present disclosure relates generally to a device, method and system for operating the device, and more particularly to a device having an imaging unit for operating the device.
BACKGROUND
A mobile device, such as a mobile phone, is commonly used for calling. In general, when a user intends to view the content displayed on the mobile device during a call, the user tends to change the holding position of the hand while operating the mobile device by releasing the thumb, and uses the thumb for other purposes such as scrolling the screen. In this situation, the phone is loosely held in the user's hand and may fall and get damaged.
SUMMARY
The following summary is provided to facilitate an understanding of some of the innovative features unique to the disclosed embodiment and is not intended to be a full description. A full appreciation of the various aspects of the embodiments disclosed herein can be gained by taking into consideration the entire specification, claims, drawings, and abstract as a whole.
In a first aspect of the present disclosure, a method for operating a device is disclosed. The method may comprise detecting, by a processing unit, a tapping input of an object over an imaging unit; determining, by the processing unit, a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps; comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more patterns is allocated for performing an operation; and operating, by the processing unit, the device by performing the operation allocated to the pattern.
According to an embodiment in conjunction with the first aspect of the present disclosure, the method may further comprise receiving, by the processing unit, one or more images captured by the imaging unit; determining, by the processing unit, a color value of each of the received images; comparing, by the processing unit, the color value to a threshold color value; and determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
According to an embodiment in conjunction with the first aspect of the present disclosure, the method may further comprise determining, by the processing unit, a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
According to an embodiment in conjunction with the first aspect of the present disclosure, the method may further comprise determining, by the processing unit, a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
According to an embodiment in conjunction with the first aspect of the present disclosure, the operation of one or more patterns varies based on a state of the device.
According to an embodiment in conjunction with the first aspect of the present disclosure, the operation of one or more patterns varies based on an application running on the device.
According to an embodiment in conjunction with the first aspect of the present disclosure, the operation includes a scrolling operation to navigate a movement of a content, displayed on a display, in any direction on the display.
According to an embodiment in conjunction with the first aspect of the present disclosure, the one or more predefined patterns are defined by a user.
In a second aspect of the present disclosure, a system for operating a device is disclosed. The system may comprise an imaging unit; and a processing unit configured to detect a tapping input of an object over the imaging unit, wherein the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps, wherein the processing unit compares the pattern with one or more predefined patterns, wherein each of the one or more patterns is allocated for performing an operation, and wherein the processing unit operates the device by performing the operation allocated to the pattern.
According to an embodiment in conjunction with the second aspect of the present disclosure, the processing unit receives one or more images captured by the imaging unit and determines a color value of each of the received images, and the processing unit compares the color value to a threshold color value, and determines, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
According to an embodiment in conjunction with the second aspect of the present disclosure, the processing unit determines a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
According to an embodiment in conjunction with the second aspect of the present disclosure, the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
According to an embodiment in conjunction with the second aspect of the present disclosure, the operation of one or more patterns varies based on a state of the device.
According to an embodiment in conjunction with the second aspect of the present disclosure, the operation of one or more patterns varies based on an application running on the device.
According to an embodiment in conjunction with the second aspect of the present disclosure, the operation includes a scrolling operation to navigate a movement of a content, displayed on a display screen, in any direction on the display screen.
According to an embodiment in conjunction with the second aspect of the present disclosure, the one or more predefined patterns are defined by a user.
In a third aspect of the present disclosure, a device is disclosed. The device may comprise an imaging unit; a processing unit configured to detect a tapping input of an object over the imaging unit, wherein the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps; and a memory for storing the tapping input of the object, wherein the processing unit compares the pattern with one or more predefined patterns stored in the memory, wherein each of the one or more patterns is allocated for performing an operation, and wherein the processing unit operates the device by performing the operation allocated to the pattern.
According to an embodiment in conjunction with the third aspect of the present disclosure, the processing unit receives one or more images captured by the imaging unit and determines a color value of each of the received images, and the processing unit compares the color value to a threshold color value, and determines, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
According to an embodiment in conjunction with the third aspect of the present disclosure, the processing unit determines a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
According to an embodiment in conjunction with the third aspect of the present disclosure, the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between the subsequent taps.
According to an embodiment in conjunction with the third aspect of the present disclosure, the operation of one or more patterns varies based on a state of the device.
According to an embodiment in conjunction with the third aspect of the present disclosure, the operation of one or more patterns varies based on an application running on the device.
According to an embodiment in conjunction with the third aspect of the present disclosure, the operation includes a scrolling operation to navigate a movement of a content, displayed on a display screen, in any direction on the display screen.
According to an embodiment in conjunction with the third aspect of the present disclosure, the one or more predefined patterns are defined by a user.
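Read together, the three aspects divide the work between an imaging unit that supplies frames, a processing unit that reduces them to a pattern, and a memory that maps each pattern to an operation. The skeleton below is a hypothetical illustration of that division only; for brevity it reduces a pattern to a bare tap count.

    class Device:
        """Hypothetical skeleton: memory maps a pattern key to an operation."""

        def __init__(self, memory):
            self.memory = memory  # pattern key -> callable operation

        def on_pattern(self, tap_count):
            operation = self.memory.get(tap_count)
            if operation is not None:
                operation()  # perform the operation allocated to the pattern

    device = Device({2: lambda: print("scrolling the displayed content"),
                     3: lambda: print("prompting for a voice command")})
    device.on_pattern(2)  # scrolling the displayed content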
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings are included in order to more clearly illustrate the embodiments of the present disclosure and the related art. The drawings included herein provide a further understanding of the disclosure, and are incorporated in and constitute a part of the present disclosure. It is appreciable that the drawings illustrate implementations of the disclosure and, together with the description, serve to explain the principles of the disclosure. It is apparent that the drawings are merely some embodiments of the present disclosure; a person having ordinary skill in this field can obtain other figures from these figures without creative effort.
Figure 1 is an illustration of an example device 100, according to an embodiment of the disclosure.
Figure 2 is a schematic illustration of an example device 200 depicting hardware components, according to an embodiment of the present disclosure.
Figure 3 is an illustration of an exemplary scenario 300, according to an embodiment of the present disclosure, depicting a touch by a user on an imaging unit 360 of a device 340.
Figure 4 is an illustration of an exemplary scenario 400, according to an embodiment of the present disclosure, depicting a finger of a user not in contact with an imaging unit 460 of a device 440.
Figure 5 is an illustration of an exemplary scenario 500, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 560 of a device 540.
Figure 6 is an illustration of another exemplary scenario 600, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 660 of a device 640.
Figure 7 is a timing diagram of an example pattern 700, according to an embodiment of the present disclosure.
Figure 8 is a timing diagram of another example pattern 800, according to an embodiment of the present disclosure.
Figure 9 is a timing diagram of another example pattern 900, according to an embodiment of the present disclosure.
Figure 10 is a flow diagram that illustrates an example method 1000 for defining a pattern by a user, according to an embodiment of the present disclosure.
Figure 11 is an illustrative block diagram of an example system 1100 for performing a scrolling operation on a device 1120, according to an embodiment of the present disclosure.
Figure 12 is a flow diagram that illustrates an example method 1200 disclosing a scrolling operation, according to an embodiment of the present disclosure.
Figure 13 is a flow diagram that illustrates an example of a method 1300 for operating a device, according to an embodiment of the present disclosure.
Figure 14 is a flow diagram that illustrates another example of a method 1400 for operating a device, according to an embodiment of the present disclosure.
Figure 15 is a flow diagram that illustrates another example of a method 1500 for operating a device, according to an embodiment of the present disclosure.
Figure 16 is a flow diagram that illustrates another example of a method 1600 for operating a device, according to an embodiment of the present disclosure.
REFERENCE NUMERAL                                       DESCRIPTION
100, 200, 340, 440, 540, 640, 1120                      Device
110, 210, 1110                                          Display
160, 260, 360, 460, 560, 660, 1160                      Imaging unit
220                                                     Processing unit
230                                                     Input/Output interface module
240                                                     Communication and network interface module
250                                                     Memory
320, 420, 520, 620                                      Finger of a user
300, 400, 500, 600                                      Exemplary scenario
700, 800, 900                                           Exemplary pattern
701, 702, 703, 704, 801, 802, 803, 804, 901, 902, 903   Taps
1100                                                    Exemplary system
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Detailed embodiments and implementations of the claimed subject matters are disclosed herein in detail with the technical matters, structural features, achieved objects, and effects with reference to the accompanying drawings as follows. It shall be understood that the disclosed embodiments and implementations are merely illustrative of the claimed subject matters which may be embodied in various forms. The present disclosure may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments and implementations set forth herein. Rather, these exemplary embodiments and implementations are provided so that description of the present disclosure is thorough and  complete and will fully convey the scope of the present disclosure to those skilled in the art. Specifically, the terminologies in the embodiments of the present disclosure are merely for describing the purpose of the certain embodiment, but not to limit the disclosure. In the description below, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments and implementations.
The particular configurations discussed in the following description are non-limiting examples that can be varied and are cited merely to illustrate at least one embodiment and are not intended to limit the scope thereof.
Embodiments of the present disclosure relate to a method and system enabling a user to operate a device in a handheld position efficiently without compromising with the safety of the device.
The technical solutions of the embodiments of the disclosure may be applied to one or more devices. A device may be a portable device or a non-portable device having imaging capabilities. The device may be capable of performing wireless communication. The device may be a handheld device such as a smartphone or non-smart phone, wireless device, mobile phone, tablet, computer, laptop, gaming console, a wearable smart device, notebook, netbook, a television such as a smart television or non-smart television, desktop and/or the like. The device may also include a wireless organizer, pager, personal digital assistant, handheld wireless communication device, wirelessly enabled computer, portable gaming device or any other portable electronic device with processing and communication capabilities. The device may include a device with peripheral devices such as a display, printer, touchscreen, projector, digital watch, camera, digital scanner, and any other type of auxiliary device that may communicate with other devices. In general, the term device can be broadly defined to encompass any electronic, computing, and/or telecommunications device, or combination of devices having imaging capabilities.
A processing unit may include without limitation, microcontroller, microprocessor, embedded processor, media processor, graphics processing unit (GPU) , central processing unit (CPU) , application processor, digital signal processor (DSP) , reduced instruction set computing (RISC) processor, system on a chip (SoC) , baseband processor, field programmable gate array (FPGA) , programmable logic device (PLD) , state machine, gated logic, discrete hardware circuit, and other suitable hardware configured to facilitate the innovative features as disclosed in the present detailed description. The processing unit for example may execute the instruction (s) for the functioning of the device. The processing unit may include an integrated graphics processor that processes the rendered data into an image in pixels for display.
A memory may be any form of storage either within or outside the device. The memory may store information regarding tap functions and tap patterns. The memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns. The memory may include a database. In some embodiments of the present disclosure, the memory may also be a combination of one or more storage available internally or externally. For example, flash memory, random-access memory (RAM) , read-only memory (ROM) , compact disc read-only memory (CDROM) , electro-optical memory like compact disk or digital versatile disk (DVD) , smart card magneto-optical memory, erasable programmable read-only memory (EPROM) , and electrically-erasable programmable read-only memory (EEPROM) , and/or the like. In some embodiments of the present disclosure, the memory may store or carry the source code or instruction for executing required tasks. In some embodiments of the present disclosure, a carrier wave may carry content or data including those used in transmitting and receiving electronic data such as electronic mail (e-mail) or in accessing a computer network such as the Internet or a local area network (LAN) . In some embodiments of the present disclosure, the memory may be a cloud storage and/or the like that may be accessible via the internet.
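As one purely illustrative possibility for such storage (an assumption, not part of the disclosure), the predefined patterns and their allocated operations could be serialized to a simple file; the file name and keys below are hypothetical.

    import json

    def save_patterns(patterns, path="tap_patterns.json"):
        """Persist predefined patterns and their allocated operations."""
        with open(path, "w") as f:
            json.dump(patterns, f, indent=2)

    def load_patterns(path="tap_patterns.json"):
        with open(path) as f:
            return json.load(f)

    save_patterns({"double_tap": "scroll_down", "long_short_tap": "refresh_page"})
    print(load_patterns())  # {'double_tap': 'scroll_down', 'long_short_tap': 'refresh_page'}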
A display may be a touch-sensitive or presence-sensitive display. In some embodiments of the present disclosure, the display includes an input/output interface module (I/O interface module) . In some embodiments of the present disclosure, the display may provide an output to the user, for example, display contents, including without limitation, an image or a video image and/or the like. In some embodiments of the present disclosure, the display may include or be integrated with a touch screen or touch sensitive overlay for receiving touch input from the user. In some embodiments of the present disclosure, the display may also be capable of receiving a user input from a stylus, fingertip, or other means of gesture input. In some embodiments of the present disclosure, the display may be a computer monitor, for example, a personal computer, with an internal or external display operatively connected. In yet another exemplary embodiment, the display may be a display device, such as an LCD TV or projector and/or the like.
An Input/Output interface module (I/O interface module) refers to any means or set of commands or menus through which a user may communicate with the device. In some embodiments of the present disclosure, the I/O interface module may be a virtual keyboard or any other means through which a user may input information to the device. The I/O interface module may enable the device to communicate with the user for exchanging data or for establishing connection with  the devices. The I/O interface module may enable the device to connect with various I/O peripherals. The peripherals for example may include keyboard, mouse, camera, touch screen (e.g., display) , a microphone, and may also include one or more output devices such as a display screen (e.g., display) and a speaker. The I/O interface module may enable the user to navigate, view, edit and perform several other operations to notification banners, badges, application program interface (API) , files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like.
An imaging unit may refer to an image capturing device, including without limitation, a camera. The imaging unit may include imaging sensors. The imaging unit may be an inbuilt imaging unit or an external imaging unit. The imaging unit may be a camera having one or more lenses for capturing images or video images. The imaging unit may be positioned or located on the rear side of the device, or on the sides of the device like, top, bottom, left or right side. The imaging unit may be positioned on the front side of the device. The imaging unit may include various hardware electronic circuitries, optical components to aid image capturing and processing functions. The imaging unit may be capable of recognizing motions and gestures.
Figure 1 is an illustration of an example device 100, with a display 110, according to an embodiment of the present disclosure. Figure 1 illustrates the device 100 with the display 110 and an imaging unit 160 located on the rear face of the device 100. The device 100 may be any type of electronics device having a user interface, including without limitation, personal computer, handheld device, cell phone, consumer electronic device, multimedia device and/or the like.
In some embodiments of the present disclosure, the display 110 may be a touch-sensitive or presence-sensitive display. In some embodiments of the present disclosure, the display 110 includes an I/O interface module. The I/O interface module refers to any means or set of commands or menus through which a user may communicate with the device. In some embodiments of the present disclosure, the I/O interface module may be a virtual keyboard or any other means through which a user may input information to the device. Further, in some embodiments of the present disclosure, the display may provide output to the user, for example, display contents, including without limitation, an image or a video image and/or the like. In some embodiments of the present disclosure, the display 110 may display contents like image, video image, audio file, files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like. In some embodiments of the present disclosure, the display 110 may include or be integrated with a touch screen or touch sensitive overlay for receiving touch input from the user. The I/O interface module may enable the user to navigate, view, edit and perform several other operations to notification banners, badges, application program interface (API) , files and documents such as portable document format (PDF) files, word files, spreadsheets, powerpoint presentations, screenshots, JPEG (Joint Photographic Experts Group) , PNG (Portable Network Graphics) , GIF (Graphics Interchange Format) , SVG (Scalable Vector Graphics) , MP4 (Moving Picture Experts Group) and/or the like.
In an embodiment of the present disclosure, the imaging unit 160 may be a unit with imaging capabilities such as an  image capturing device like a camera positioned on the rear face of the device 100. In some embodiments of the present disclosure, the camera may be a digital camera. In some embodiments of the present disclosure, the camera may be an inbuilt camera or an external camera. In some embodiments of the present disclosure, the camera may have one or more lenses for capturing images or video images. In some embodiments of the present disclosure, the camera may be positioned or located on the rear side of the device 100, or on the sides of the device like, top, bottom, left or right side. In some embodiments of the present disclosure, the camera may be positioned on the front side of the device. In some embodiments of the present disclosure, the camera may include various hardware electronic circuitries, optical components to aid image capturing and processing functions. In some embodiments of the present disclosure, the imaging unit may be capable of detecting motions and gestures. In some embodiments of the present disclosure, the imaging unit may include an imaging sensor.
In some embodiments of the present disclosure, the device 100 may be connected to a network. The network may be further connected to one or more devices. In some embodiments of the present disclosure, the one or more devices may be connected to each other through the network. In some embodiments of the present disclosure, the network may refer to a Public Land Mobile Network (PLMN) , a Device to Device (D2D) network, a Machine to Machine/Man (M2M) network, local area network (LAN) , wireless local area network (WLAN) , wide area network (WAN) , wireless wide area network (WWAN) , the internet, a mobile network or another network, or a combination thereof. In some embodiments, the network may be a core network of a cellular service provider, a telecommunication network such as a public switched telephone network (PSTN) .
Figure 2 is a schematic illustration of an example device 200 depicting hardware components, according to an embodiment of the present disclosure. In an embodiment of the present disclosure, the device 200 may include a display 210, a processing unit 220, an I/O interface module (input/output interface module) 230, a communication system and network interface module 240, an imaging unit 260 and a memory 250.
In an embodiment of the present disclosure, the processing unit 220 may control the overall functioning and control of the device 200. The processing unit 220 herein can be defined as the central processing unit i.e. CPU that controls and operates the functioning of the device 200. In some embodiments of the present disclosure, the processing unit 220 may include without limitation, an integrated graphics processor that processes the rendered data into an image in pixels for display. In some embodiments of the present disclosure, the processing unit 220 may include without limitation, microcontroller, microprocessor, embedded processors, media processors, graphics processing units (GPUs) , central processing units (CPUs) , application processors, digital signal processors (DSPs) , reduced instruction set computing (RISC) processors, systems on a chip (SoC) , baseband processors, field programmable gate arrays (FPGAs) , programmable logic devices (PLDs) , state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to facilitate the innovative features as disclosed in the present detailed description. The processing unit 220 for example may execute the instruction (s) for the functioning of the device 200.
In some embodiments of the present disclosure, the communication system and network interface module 240 enables the device 200 to connect to a wired or wireless network. In some embodiments of the present disclosure, the communication system and network interface module 240 enables large-scale network communications through routable protocols, such as Internet Protocol (IP) . In some embodiment of the present disclosure, the communication system and network interface module 240, may allow the device 200 to communicate with one or more devices. The communication of data including image data, voice data or contents are achieved through this communication system and network interface module 240. In some embodiments of the present disclosure, the communication system and network interface module 240 may use different wireless communication techniques such as WiFi, Bluetooth, Wireless USB, capacitive coupling communications and/or the like. In some embodiments of the present disclosure, the communication system and network interface module 240 may also allow a wireless data communication, which can be a shared communication like a wireless hotspot or a direct WiFi provided by any one device to the nearby devices. In some embodiments of the present disclosure, the module 240 may be an external module that connects one device with other devices. The external module may be connected to a device by for example USB or other direct connection. In some embodiments of the present disclosure, the communication system and network interface module 240 may be communicatively coupled to the processing unit 220.
In an embodiment of the present disclosure, the memory 250 may be any form of storage either within or outside the device 200. The memory may store information regarding tap functions and tap patterns. The memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns. The memory may include a database. In some embodiments of the present disclosure, the memory may also be a combination of one or more storage available internally or externally. For example, flash memory, random-access memory (RAM) , read-only memory (ROM) , compact disc read-only memory (CDROM) , electro-optical memory like compact disk or digital versatile disk (DVD) , smart card magneto-optical memory, erasable programmable read-only memory (EPROM) , and electrically-erasable programmable read-only memory (EEPROM) , and/or the like. In some embodiments of the present disclosure, the memory may store or carry the source code or instruction for executing required tasks. In an embodiment of the present disclosure, the memory 250 may be communicatively coupled to the processing unit 220.
In an embodiment of the present disclosure, the display 210 may be a built-in or integrated display to present a content to a viewer or a user. In some embodiments of the present disclosure, the I/O interface module 230 may enable the device 200 to communicate with the user for exchanging data or for establishing connection with the devices 200. In some embodiments of the present disclosure, the I/O interface module 230 may enable the device 200 to connect with various I/O peripherals. The peripherals for example may include keyboard, mouse, camera, touch screen (e.g., display 210) , a microphone, and may also include one or more output devices such as a display screen (e.g., display 210) and a speaker. In an embodiment of the present disclosure, the display 210 may be communicatively coupled to the processing unit 220.
In an embodiment of the present disclosure, the device 200 may include the imaging unit 260. In some embodiments of the present disclosure, the imaging unit may enable the imaging capabilities of the device. In some embodiments of the present disclosure, the imaging unit may be capable of capturing images or video images. In some embodiments of the present disclosure, the imaging unit may be capable of processing images or video images. In some embodiments of the present disclosure, the imaging unit may capture image attributes such as pixel attributes, illumination, color information and/or the like. In some embodiments of the present disclosure, the imaging unit may be any image capturing device, such as a camera. In some embodiments of the present disclosure, the imaging unit may be a digital camera. In some embodiments of the present disclosure, the imaging unit may be an inbuilt camera or an external camera. In some embodiments of the present disclosure, the imaging unit may have one or more lenses for capturing images or video images. In some embodiments of the present disclosure, the imaging unit may be positioned or located on the rear side of the device, or on the sides of the device, such as the top, bottom, left or right side. In some embodiments of the present disclosure, the imaging unit may be positioned on the front side of the device. In some embodiments of the present disclosure, the imaging unit may include various hardware electronic circuitries and optical components to aid image capturing and processing functions. In an embodiment of the present disclosure, the imaging unit 260 may be communicatively coupled to the processing unit 220.
Figure 3 is an illustration of an exemplary scenario 300, according to an embodiment of the present disclosure, depicting a touch by a user on an imaging unit 360 of a device 340. In an aspect of the present disclosure, the user may hold the device 340 in his right or left hand. While generally operating the device or while performing a specific function, such as making a call, the user may tap the imaging unit 360 to operate the device without compromising the safety of the device 340. In an aspect of the present disclosure, during the touch, the finger 320 of the user or any object covers the imaging unit 360. In another aspect of the disclosure, during the touch, the imaging unit 360 may be configured to capture a plurality of images in which the images may have a transition from a light color to a darker color. In the scenario 300 disclosed in Figure 3, when the finger 320 of the user touches the imaging unit 360, due to the absence of the surrounding light, the image captured by the imaging unit 360 may be darker or black in color. In some embodiments of the present disclosure, the color of the images captured by the imaging unit 360 is defined by a color value. In the scenario 300 as disclosed in Figure 3, when the finger 320 of the user touches the imaging unit 360, the one or more captured images are dark or darker and the color value of the one or more images may be higher. In some embodiments of the present disclosure, the lowest color value may be assigned a value 0 (zero) for a complete white image. In some embodiments of the present disclosure, the highest color value may be assigned a value 1 (one) for a complete black image. In some embodiments of the present disclosure, a first threshold color value and a second threshold color value may be predefined. In some embodiments of the disclosure, the processing unit of the device 340 may be provided with the first and the second threshold color values. In some embodiments of the present disclosure, the first and the second threshold color values may be stored in a memory. In some embodiments of the disclosure, the color value of one or more images captured by the imaging unit 360 may be compared with the one or more threshold color values. A variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit 360. In some embodiments of the present disclosure, the color value of the subsequent captured images observed to be higher than the first threshold color value may be considered to indicate a touch event, that is, the user has touched the imaging unit 360 with his finger 320.
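The light-to-dark transition described for Figure 3 can be simulated numerically. The sketch below assumes 8-bit grayscale frames and the 0 (complete white) to 1 (complete black) color-value convention introduced above; the threshold is an illustrative placeholder.

    import numpy as np

    FIRST_THRESHOLD = 0.85  # illustrative value for a covered lens

    def color_value(frame):
        return 1.0 - frame.mean() / 255.0  # 0.0 = complete white, 1.0 = complete black

    # Simulated capture: lens open (bright), then covered by the finger (dark).
    frames = [np.full((8, 8), 230, np.uint8),
              np.full((8, 8), 180, np.uint8),
              np.full((8, 8), 12, np.uint8)]
    values = [color_value(f) for f in frames]
    print([round(v, 2) for v in values])          # [0.1, 0.29, 0.95]
    print([v > FIRST_THRESHOLD for v in values])  # [False, False, True]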
Figure 4 is an illustration of an exemplary scenario 400, according to an embodiment of the present disclosure, depicting a finger 420 of a user not in contact with an imaging unit 460 of a device 440. The scenario 400 as disclosed in Figure 4 illustrates the finger 420 of the user in a position wherein it is not connected to or in contact with the imaging unit 460 of the device 440. In an aspect of the disclosure, a tap input may be defined by a sequence of a touch event on the imaging unit 460 and a release event from the imaging unit 460. The release event may include a scenario wherein the finger is not connected to or is disconnected from the imaging unit. In some embodiments, an exemplary scenario may disclose an object held by the user not in contact with the imaging unit of the device. In the scenario 400, the imaging unit 460 may capture one or more images. In some embodiments, the imaging unit 460 may capture a plurality of images to determine the relative color value of the plurality of images. On account of the disconnect between the imaging unit 460 and the finger 420 of the user, the images captured by the imaging unit 460 may be white or light or bright in color because of the presence of the surrounding light. In some embodiments of the present disclosure, the color of the images captured by the imaging unit 460 is defined by a color value. In the scenario 400 as disclosed in Figure 4, the one or more captured images may have a decreased color value. In some embodiments, the color value of the plurality of images captured by the imaging unit 460 may be compared with the second threshold color value to identify the disconnect between the imaging unit 460 and the finger 420, which may disclose a release event of the finger 420 of the user from the imaging unit 460, that is, the user has removed his finger 420 from the imaging unit 460.
Figure 5 is an illustration of an exemplary scenario 500, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 560 of a device 540. Figure 5 discloses a pattern with a plurality of events including one or more touch and release events. When a touch event or a release event occurs on the imaging unit 560 of the device 540, one or more images are captured by the imaging unit 560. In some embodiments of the present disclosure, a first threshold color value and a second threshold color value may be predefined. A variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit 560. In some embodiments of the present disclosure, the one or more images captured by the imaging unit 560 may be identified and differentiated based on the color of the one or more images: one or more dark images may indicate a touch event and one or more lighter images may indicate a release event. In an aspect of the present disclosure, at time T equals t0, the user touches the imaging unit 560 for a touch event, that is, the color value of the images captured during t0 may be greater than the first threshold color value. Further, at time T equals t1, the user removes his finger 520 from the imaging unit 560 for a release event, that is, the color value of the images captured during t1 may be lower than the second threshold color value. Furthermore, at time T equals t2, the user again touches the imaging unit 560 for a subsequent touch event, that is, the color value of the images captured during t2 may also be greater than the first threshold color value. In an aspect of the present disclosure, the occurrence of a touch event followed by a release event forms a tap input. In an aspect of the disclosure, the time between the touch event and the release event may be defined as a first time interval that further defines the duration of the input tap. In an aspect of the disclosure, the time between the release event and the subsequent touch event may be defined as a second time interval that further defines the time period between the subsequent tap inputs. In an aspect of the present disclosure, the exemplary pattern discloses a tap input between the times t0 and t1 and a tap pattern between the times t0 and t2.
Figure 6 is an illustration of another exemplary scenario 600, according to an embodiment of the present disclosure, depicting a pattern of touch and release by a user on an imaging unit 660 of a device 640. Figure 6 discloses a pattern with a plurality of events including one or more touch and release events. When a touch event or a release event occurs on the imaging unit 660 of the device 640, one or more images are captured by the imaging unit 660. In some embodiments of the present disclosure, a first threshold color value and a second threshold color value may be predefined. A variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit 660. In some embodiments of the present disclosure, the one or more images captured by the imaging unit 660 may be identified and differentiated based on the color of the one or more images: one or more dark images may indicate a touch event and one or more lighter images may indicate a release event. In an aspect of the present disclosure, at time T equals t0, the finger 620 of the user is removed from the imaging unit 660 of the device for a release event, that is, the color value of the images captured during t0 may be lower than the second threshold color value. Further, at time T equals t1, the user touches the imaging unit 660 for a touch event, that is, the color value of the images captured during t1 may be greater than the first threshold color value. Furthermore, at time T equals t2, the user again releases his finger 620 from the imaging unit 660 for a subsequent release event, that is, the color value of the images captured during t2 may again be lower than the second threshold color value. In an aspect of the present disclosure, the exemplary pattern discloses a tap input between the times t1 and t2.
Figure 7 is a timing diagram of an example pattern 700, according to an embodiment of the present disclosure. Figure 7 discloses the pattern 700 in a predefined time period tp. The pattern 700 as disclosed in Figure 7 may have four taps, namely 701, 702, 703 and 704. In an aspect of the disclosure, the first tap 701 may have a duration d0, the second tap 702 may have a duration d1, the third tap 703 may have a duration d2 and the fourth tap 704 may have a duration d3. The time interval or time period between the first tap 701 and the subsequent second tap 702 is t0. The time interval or time period between the second tap 702 and the subsequent third tap 703 is t1. The time interval or time period between the third tap 703 and the subsequent fourth tap 704 is t2. In some embodiments of the present disclosure, the duration of each of the taps in the one or more taps may be the same. In some embodiments of the present disclosure, the time period between the subsequent taps may be different. In some embodiments of the present disclosure, the time period between the subsequent taps may be the same. The user may create a plethora of patterns by varying the duration of the taps, the time period between subsequent taps, and/or the like.
Figure 8 is a timing diagram of another example pattern 800, according to an embodiment of the present disclosure. Figure 8 discloses the pattern 800 in a predefined time period tp. The pattern 800 as disclosed in Figure 8 may have four taps, namely, 801, 802, 803 and 804. In an aspect of the disclosure, the first tap 801 may have a duration d0, the second tap 802 may have a duration d1, the third tap 803 may have a duration d2 and the fourth tap 804 may have a duration d3. The time interval or time period between the first tap 801 and the subsequent second tap 802 is t0. The time interval between the second tap 802 and the subsequent third tap 803 is t1. The time interval between the third tap 803 and the subsequent fourth tap 804 is t2. In some embodiments of the present disclosure, the duration of each of the taps, namely, d0, d1, d2, and d3, may be the same. In some of the embodiments of the present disclosure, the time period between the first tap 801 and the second tap 802, that is, t0, may be greater than the time periods t1 and t2, or the time period t0 may be equal to the time period t2 and may be greater than the time period t1. In some embodiments of the present disclosure, the patterns may have taps with equal duration and with unequal time gaps between subsequent taps.
Figure 9 is a timing diagram of another example pattern 900, according to an embodiment of the present disclosure. Figure 9 discloses the pattern 900 in a predefined time period tp. The pattern 900 as disclosed in Figure 9 may have three taps, namely, 901, 902 and 903. In an aspect of the disclosure, the first tap 901 may have a duration d0, the second tap 902 may have a duration d1, and the third tap 903 may have a duration d2. The time interval or time period between the first tap 901 and the subsequent second tap 902 is t0. The time gap between the second tap 902 and the subsequent third tap 903 is t1. In some embodiments of the present disclosure, the duration of each of the taps, that is, d0, d1, and d2, may vary. For example, the duration d0 of the first tap 901 may be greater than the duration d1 and/or d2 of the second tap 902 and the third tap 903, respectively, or the duration d0 may be equal to the duration d1 but greater than the duration d2. In some of the embodiments of the present disclosure, the time period t0 between the first tap 901 and the second tap 902 may be less than the time period t1 between the second tap 902 and the third tap 903. In some embodiments of the present disclosure, the patterns may have taps with unequal duration and/or with unequal time gaps between subsequent taps.
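Within the predefined period tp, a pattern such as 900 is fully characterized by its tap durations and inter-tap gaps. The sketch below encodes that shape as data; the numeric values are illustrative and preserve only the relations d0 > d1 = d2 and t0 < t1 described above.

    # Illustrative encoding of pattern 900: d0 > d1 = d2 and t0 < t1.
    pattern_900 = {
        "durations": [0.30, 0.15, 0.15],  # d0, d1, d2 in seconds
        "gaps":      [0.10, 0.25],        # t0, t1 in seconds
    }

    span = sum(pattern_900["durations"]) + sum(pattern_900["gaps"])
    assert span <= 1.0, "the pattern must fit inside the predefined period tp"
    print(f"pattern 900 spans {span:.2f} s of a 1.00 s window")  # 0.95 s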
Figure 10 is a flow diagram that illustrates an example method 1000 for defining a pattern by a user, according to an embodiment of the present disclosure. At block 1010, the method 1000 includes requesting, by a processing unit, a user to input a pattern. At block 1012, the method 1000 includes receiving, by the processing unit, the pattern from the user. In some embodiments of the present disclosure, the user may input or enter any one of a plethora of patterns by performing a touch event or a release event on an imaging unit of a device. In some embodiments of the present disclosure, one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like. At block 1014, the method 1000 includes determining, by the processing unit, a duration of the pattern. At block 1016, the method 1000 includes determining, by the processing unit, a number of tap(s), a duration of each tap, and a time period between subsequent taps in the pattern. At block 1018, the method 1000 includes prompting, by the processing unit, the user to re-enter the pattern for verification and confirmation. In some embodiments of the present disclosure, the prompt may be in the form of a notification, push notification, dialogue box, alert and/or the like. The user may re-enter the pattern using a finger or an object on the imaging unit of the device. At block 1020, the processing unit assesses whether the pattern re-entered by the user matches the pattern received by the processing unit at block 1012, to verify and confirm the pattern. If, at block 1020, the processing unit assesses that the pattern is not verified and confirmed, the processing unit displays an error message at block 1022 to the user and prompts the user to re-enter the pattern for verification and confirmation, as at block 1018. In some embodiments of the present disclosure, if the pattern provided by the user does not match the pattern received by the processing unit at block 1012 for a predefined number of attempts, the processing unit may re-initiate the method 1000 and request the user to input a pattern, as disclosed at block 1010. If, at block 1020, the processing unit assesses that the pattern re-entered by the user matches the pattern received by the processing unit at block 1012, the processing unit stores the pattern in a memory of the device, as disclosed at block 1024. At block 1026, the method 1000 includes requesting, by the processing unit, the user to allocate the pattern to an operation. In some embodiments, the processing unit may communicate one or more predefined operations to the user for allocation to the pattern. In some embodiments of the present disclosure, the one or more predefined operations may be communicated to the user by way of a notification, push notification, dialogue box, list, alert and/or the like. In some embodiments of the present disclosure, the one or more predefined operations may be based on the user's behavior or preferences. In some embodiments of the present disclosure, the one or more predefined operations may be based on accessibility functions of the device. In some embodiments of the present disclosure, the operation may be a general operation or an application-specific operation.
In some embodiments of the present disclosure, the user may input a text or voice command using an I/O interface module of the device, and the processing unit may communicate a related operation to the user via the I/O interface module of the device. At block 1028, the method 1000 includes receiving, by the processing unit, the operation allocated to the pattern from the user. At block 1030, the method 1000 includes storing, by the processing unit, the allocated operation in connection with the stored pattern in the memory.
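By way of a non-limiting illustration, blocks 1010 to 1030 could be realized along the lines of the Python sketch below; the function names, the three-attempt limit and the use of a dictionary as the memory are assumptions made for the example.

    def define_pattern(read_pattern, choose_operation, memory, max_attempts=3):
        # Blocks 1010-1012: request and receive a pattern from the user.
        pattern = read_pattern()
        # Blocks 1018-1022: prompt for re-entry until verified, up to a
        # predefined number of attempts.
        for _ in range(max_attempts):
            if read_pattern() == pattern:
                memory[pattern] = choose_operation()  # blocks 1024-1030
                return pattern
            print("Error: patterns do not match, please re-enter.")
        # Attempts exhausted: re-initiate the method, as at block 1010.
        return define_pattern(read_pattern, choose_operation, memory, max_attempts)

    # Illustrative wiring: a pattern is a hashable (durations, gaps) tuple.
    memory = {}
    taps = iter([((0.1, 0.1), (0.2,)), ((0.1, 0.1), (0.2,))])
    define_pattern(lambda: next(taps), lambda: "scroll", memory)
    assert memory == {((0.1, 0.1), (0.2,)): "scroll"}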
Figure 11 is an illustrative block diagram of an example system 1100 for performing a scrolling operation on a device 1120, according to an embodiment of the present disclosure. The device 1120 may have a display 1110 for displaying content. In some embodiments of the present disclosure, the content may be notification banners, badges, application program interface (API) content, files and documents such as portable document format (PDF) files, Word files, spreadsheets, PowerPoint presentations, screenshots, JPEG (Joint Photographic Experts Group) images, PNG (Portable Network Graphics) images, GIF (Graphics Interchange Format) images, SVG (Scalable Vector Graphics) images, MP4 (Moving Picture Experts Group) videos and/or the like. The user may tap a pattern on the imaging unit 1160, and based on the verification of the pattern against the predefined pattern allocated to a scrolling operation, the processing unit of the device 1120 may enable performing an operation of scrolling the content on the display 1110. The scrolling operation may provide a navigation movement of the content displayed on the display screen or display 1110. The scrolling may be in any direction on the display 1110 including, without any limitation, top-down, left-right and/or the like. In some embodiments of the present disclosure, the user may again tap another pattern, which may be the same or different, to close the content displayed on the display 1110.
Figure 12 is a flow diagram that illustrates an example method 1200 disclosing a scrolling operation, according to an embodiment of the present disclosure. At block 1210, the method 1200 includes detecting, by a processing unit, whether pattern detection is enabled in a device. In some embodiments of the present disclosure, pattern detection may be implemented as a pattern detection module comprising instructions to enable pattern detection. In some embodiments of the present disclosure, the instructions may be stored in a memory of the device. In some embodiments of the present disclosure, the pattern detection module may not always be enabled, to avoid unnecessary activation of the operation. In some embodiments of the present disclosure, the user may choose to enable the pattern detection module as per preferences or requirements. If, at block 1210, pattern detection is not enabled in the device, the processing unit may not perform any activity, that is, nothing is to be done. If, at block 1210, pattern detection is enabled in the device, the processing unit detects a tapping pattern over an imaging unit, as disclosed at block 1214. At block 1216, the processing unit assesses whether the tapping pattern is detected. If, at block 1216, the tapping pattern is not detected by the processing unit, the processing unit may continue to detect the tapping pattern, as disclosed at block 1214. If, at block 1216, the tapping pattern is detected, the processing unit determines, at block 1218, characteristics of the pattern including, without any limitation, the number of taps, the duration of each tap, the time period between subsequent taps and/or the like. In some embodiments of the present disclosure, the characteristics of the pattern may also include the tap pattern along with inputs from I/O interface modules including, without limitation, a microphone, touch screen, keypad or keyboard, switches and/or the like. This may enable a user to perform multiple operations by using the imaging unit. At block 1220, the processing unit determines whether the total duration of the tapping pattern is within a predefined time period of a predefined pattern. In some embodiments of the present disclosure, the predefined time period may be adjusted by the user and stored in the memory of the device. In some embodiments of the present disclosure, the predefined time period may be 1 second or 2 seconds, as per the user's needs, preferences and accessibility. If, at block 1220, the total duration of the tapping pattern is not within the predefined time period, the processing unit may ignore the pattern and continue to detect the pattern, as disclosed at block 1214. If, at block 1220, the total duration of the tapping pattern is within the predefined time period, the processing unit compares, at block 1222, the characteristics of the detected pattern with the predefined pattern stored in the memory of the device. In some embodiments of the present disclosure, the predefined patterns may be adjusted by the user as per the need, or may be stored in the memory during the manufacturing process of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user's behavior or preferences.
In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory. At block 1224, the processing unit checks whether the detected tapping pattern matches at least one predefined pattern. If, at block 1224, the detected tapping pattern does not match any predefined pattern, the processing unit may not perform any activity, that is, nothing is to be done, and the processing unit ignores the pattern and continues to detect the pattern, as disclosed at block 1214. If, at block 1224, the detected tapping pattern matches at least one predefined pattern, the processing unit enables the device to perform the operation of the scrolling function, as disclosed at block 1226. In some embodiments of the present disclosure, the user may perform other operations relating to, without any limitation, emails, the gallery, the contact list, messages or any other mobile application that the user may intend to operate. In some embodiments of the present disclosure, the operation for one or more patterns may vary based on the application running on the device at the time of the input of the pattern by the user. For example, a tap pattern may correspond to an operation of a scrolling function if the application running at the time of the input of the pattern by the user is an internet browser, and the tap pattern may correspond to an operation of a zooming function if the application running at the time of the input of the pattern by the user is an image viewer. In some embodiments of the present disclosure, the operation for one or more patterns may vary based on the state of the device at the time of the input of the pattern by the user. For example, a tap pattern may correspond to an operation of a calling function if the device is locked, and the tap pattern may correspond to an operation of a scrolling function if the device is unlocked.
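One possible, purely illustrative reading of blocks 1210 to 1226 is the Python detection loop sketched below; the list of candidate patterns, the exact-equality comparison and the callback-based memory are assumptions of the example, not requirements of the disclosure.

    def detection_loop(enabled, detected_patterns, memory, t_p=1.0):
        # Block 1210: if pattern detection is not enabled, nothing is to be done.
        if not enabled:
            return
        # Blocks 1214-1216: consume candidate tapping patterns as detected.
        for durations, gaps in detected_patterns:
            if sum(durations) + sum(gaps) > t_p:
                continue  # block 1220: outside the predefined time period, ignore
            operation = memory.get((durations, gaps))  # blocks 1222-1224: compare
            if operation is not None:
                operation()  # block 1226: e.g. perform the scrolling operation

    # Illustrative run: a single matching pattern triggers the scroll callback.
    scrolled = []
    memory = {((0.1, 0.1), (0.2,)): lambda: scrolled.append(True)}
    detection_loop(True, [((0.1, 0.1), (0.2,))], memory)
    assert scrolled == [True]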
Figure 13 is a flow diagram that illustrates an example of a method 1300 for operating a device, according to an embodiment of the present disclosure. At block 1310, the method 1300 includes detecting, by a processing unit, a tapping input of an object over an imaging unit. In some embodiments of the present disclosure, a user may input or enter any one of a plethora of taps or tapping patterns by performing a touch event or a release event on an imaging unit of the device. In some embodiments of the present disclosure, one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like. At block 1320, the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps. At block 1330, the method 1300 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be stored in a memory. The memory may store information regarding tap functions and tap patterns. The memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns. In some embodiments of the present disclosure, the memory may include a database. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user's behavior or preferences. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory. At block 1340, the method 1300 includes operating, by the processing unit, the device by performing the operation allocated to the pattern. In some embodiments of the present disclosure, the operation may be a general operation or an application-specific operation. In some embodiments of the present disclosure, the operation may pertain to a function or command performed over a network. In some embodiments of the present disclosure, the operation may be a scrolling operation, and the processing unit of the device may scroll the content on a display of the device. In some embodiments of the present disclosure, the operation may initiate a prompt to the user to input a voice command. In some embodiments of the present disclosure, the operation may initiate a predefined voice command to the device to perform a function or carry out an action. In some embodiments of the present disclosure, the operation for one or more patterns may vary based on the application running on the device at the time of the input of the pattern by the user.
For example, a tap pattern may correspond to an operation of a refresh function if the application running at the time of the input of the pattern by the user is an internet browser, and the tap pattern may correspond to an operation of a zooming function if the application running at the time of the input of the pattern by the user is an image viewer. In some embodiments of the present disclosure, the operation for one or more patterns may vary based on the state of the device at the time of the input of the pattern by the user. For example, a tap pattern may correspond to an operation of a calling function if the device is locked, and the tap pattern may correspond to an operation of closing the running application if the device is unlocked.
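The context dependence described above can be pictured as a small lookup keyed on both the pattern and the current context. The sketch below is illustrative only; the pattern identifiers, context names and operation names are invented for the example.

    # Hypothetical bindings: the same tap pattern resolves to different
    # operations depending on the running application or the device state.
    CONTEXT_BINDINGS = {
        ("double_tap", "internet_browser"): "refresh",
        ("double_tap", "image_viewer"): "zoom",
        ("long_tap", "locked"): "call",
        ("long_tap", "unlocked"): "close_running_application",
    }

    def resolve_operation(pattern_id, context):
        # Returns None when no operation is bound to this pattern in this context.
        return CONTEXT_BINDINGS.get((pattern_id, context))

    assert resolve_operation("double_tap", "image_viewer") == "zoom"
    assert resolve_operation("long_tap", "locked") == "call"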
Figure 14 is a flow diagram that illustrates another example of a method 1400 for operating a device, according to an embodiment of the present disclosure. At block 1410, the method 1400 includes detecting, by a processing unit, a tapping input of an object over an imaging unit. At block 1420, the method 1400 includes receiving, by the processing unit, one or more images captured by the imaging unit. At block 1430, the method 1400 includes determining, by the processing unit, a color value of each of the received images. In some embodiments of the present disclosure, the color value may be indicative of the relative lightness or darkness of the one or more images captured by the imaging unit. At block 1440, the method 1400 includes comparing, by the processing unit, the color value to a threshold color value. The one or more images captured by the imaging unit may have a color value either less than or greater than a predefined threshold color value. In some embodiments of the present disclosure, a first threshold color value and a second threshold color value may be predefined. In some embodiments of the present disclosure, the first threshold color value and the second threshold color value may be adjusted either manually by the user or automatically by the processing unit based on dynamic factors, such as the surroundings. A variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit. At block 1450, the method 1400 includes determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison. In some embodiments of the present disclosure, the color value of one or more images captured by the imaging unit may be higher than the first threshold color value, and the same may be determined by the processing unit as a touch event. The one or more captured images in this scenario may be dark or darker due to a lack or absence of surrounding light detected by the imaging unit. In some embodiments of the present disclosure, the color value of one or more images captured by the imaging unit may be lower than the second threshold color value, and the same may be determined by the processing unit as a release event. The one or more captured images in this scenario may be bright or brighter due to the presence or abundance of surrounding light detected by the imaging unit. At block 1460, the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps. In some embodiments of the present disclosure, the predefined time period may be, for example, one second. In some embodiments of the present disclosure, the predefined time period may be adjusted (increased or decreased) based on the accessibility and comfort of the user. In some embodiments of the present disclosure, the predefined time period may be adjusted either manually by the user or automatically by the processing unit. In some embodiments of the present disclosure, one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like.
At block 1470, the method 1400 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be stored in a memory. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user's behavior, preferences or accessibility. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory. At block 1480, the method 1400 includes operating, by the processing unit, the device by performing the operation allocated to the pattern. In some embodiments of the present disclosure, the operation may be a general operation or an application-specific operation. In some embodiments of the present disclosure, the operation for one or more patterns may vary based on the application running on the device at the time of the input of the pattern by the user or the state of the device at the time of the input of the pattern by the user. In some embodiments of the present disclosure, the operation may pertain to a function or command performed over a network. In some embodiments of the present disclosure, the operation may be a scrolling operation, and the processing unit of the device may scroll the content on a display of the device.
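For the color-value comparison of blocks 1430 to 1450, one plausible sketch treats the mean pixel intensity of each captured frame as the color value, under the convention that a covered lens yields a dark frame. The threshold values below are assumptions chosen only for illustration, consistent with the disclosure's statement that the thresholds may be adjusted manually or automatically.

    import numpy as np

    # Illustrative thresholds on a 0-255 luminance scale (assumed values).
    FIRST_THRESHOLD = 40    # at or below this, treat the frame as a touch event
    SECOND_THRESHOLD = 90   # at or above this, treat the frame as a release event

    def classify_frame(frame):
        # Block 1430: determine the color value of the image.
        color_value = float(np.mean(frame))
        # Blocks 1440-1450: compare against the thresholds and classify.
        if color_value <= FIRST_THRESHOLD:
            return "touch"    # dark frame: little surrounding light reaches the lens
        if color_value >= SECOND_THRESHOLD:
            return "release"  # bright frame: the lens is uncovered
        return None           # between thresholds: leave the frame unclassified

    dark = np.full((8, 8), 10, dtype=np.uint8)
    bright = np.full((8, 8), 200, dtype=np.uint8)
    assert classify_frame(dark) == "touch" and classify_frame(bright) == "release"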
Figure 15 is a flow diagram that illustrates another example of a method 1500 for operating a device, according to an embodiment of the present disclosure. At block 1510, the method 1500 includes detecting, by a processing unit, a tapping input of an object over an imaging unit. At block 1520, the method 1500 includes receiving, by the processing unit, one or more images captured by the imaging unit. At block 1530, the processing unit determines a color value of each of the received images. At block 1540, the method 1500 includes comparing, by the processing unit, the color value to a threshold color value. At block 1550, the method 1500 includes determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison. At block 1560, the method 1500 includes determining, by the processing unit, a first time interval between the touch event and the release event within a predefined time period, to determine the duration of the tap. At block 1570, the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within the predefined time period, a duration of each of the taps, and a time period between the subsequent taps. At block 1580, the method 1500 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation. At block 1590, the method 1500 includes operating, by the processing unit, the device by performing the operation allocated to the pattern.
Figure 16 is a flow diagram that illustrates another example of a method 1600 for operating a device, according to an embodiment of the present disclosure. At block 1610, the method 1600 includes detecting, by a processing unit, a tapping input of an object over an imaging unit. In some embodiments of the present disclosure, a user may input or enter any one of a plethora of taps or tapping patterns by performing a touch event or a release event on an imaging unit of the device. In some embodiments of the present disclosure, one or more patterns may be differentiated by the intensity of the touch event, the frequency of the touch and release events, the time variation between the touch and release events, the relative sequence of the touch and release events and/or the like. At block 1612, the method 1600 includes receiving, by the processing unit, one or more images captured by the imaging unit. At block 1614, the processing unit determines a color value of each of the received images. At block 1616, the method 1600 includes comparing, by the processing unit, the color value to a threshold color value. In some embodiments of the present disclosure, a first threshold color value and a second threshold color value may be predefined. A variation in the color value of the one or more captured images from the one or more threshold color values may be indicative of a touch event or a release event having occurred on the imaging unit. In some embodiments of the present disclosure, the one or more images captured by the imaging unit may be identified and differentiated based on the color of the one or more images, such that one or more dark images may indicate a touch event and one or more lighter images may indicate a release event. At block 1618, the method 1600 includes determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison. In some embodiments of the present disclosure, the color value of one or more captured images being higher than the first threshold color value may be identified as a touch event. The one or more captured images in this scenario may be dark or darker due to a lack or absence of surrounding light detected by the imaging unit. In some embodiments of the present disclosure, the color value of one or more captured images being lower than the second threshold color value may be identified as a release event. The one or more captured images in this scenario may be bright or brighter due to the presence or abundance of surrounding light detected by the imaging unit. At block 1620, the method 1600 includes determining, by the processing unit, a first time interval between the touch event and the release event within a predefined time period, to determine the duration of the tap. At block 1622, the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between subsequent taps. At block 1624, the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within the predefined time period, a duration of each of the taps, and a time period between the subsequent taps. At block 1626, the method 1600 includes comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation.
In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be stored in a memory. The memory may store information regarding tap functions and tap patterns. The memory may store information regarding one or more allocated operations of one or more tap functions or tap patterns. In some embodiments of the present disclosure, the memory may include a database. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be initially determined and stored by the processing unit in the memory. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on the user's behavior or preferences. In some embodiments of the present disclosure, the one or more predefined patterns and their corresponding operations may be determined by the processing unit based on accessibility functions of the device. In some embodiments of the present disclosure, one or more predefined patterns and their corresponding operations may be determined by the user and stored by the processing unit in the memory. At block 1628, the method 1600 includes operating, by the processing unit, the device by performing the operation allocated to the pattern. In some embodiments of the present disclosure, the operation may be a general operation or an application-specific operation. In some embodiments of the present disclosure, the operation may pertain to a function or command performed over a network. In some embodiments of the present disclosure, the operation may be a scrolling operation, and the processing unit of the device may scroll the content on a display of the device.
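Blocks 1620 to 1624 effectively reduce a chronological stream of touch and release events to tap durations (first time intervals) and inter-tap gaps (second time intervals). The following sketch, using millisecond timestamps and invented names, is one non-limiting way to express that reduction.

    def events_to_pattern(events, t_p_ms=1000):
        # events: chronological ('touch' | 'release', timestamp_ms) pairs.
        # Returns (durations, gaps) in milliseconds, or None when the pattern
        # does not fit within the predefined time period t_p_ms.
        durations, gaps = [], []
        last_touch = last_release = None
        for kind, ts in events:
            if kind == "touch":
                if last_release is not None:
                    gaps.append(ts - last_release)  # block 1622: second interval
                last_touch = ts
            elif kind == "release" and last_touch is not None:
                durations.append(ts - last_touch)   # block 1620: first interval
                last_release = ts
        if sum(durations) + sum(gaps) > t_p_ms:
            return None  # the tapping must occur within the predefined time period
        return tuple(durations), tuple(gaps)

    # Two 100 ms taps separated by a 200 ms gap.
    events = [("touch", 0), ("release", 100), ("touch", 300), ("release", 400)]
    assert events_to_pattern(events) == ((100, 100), (200,))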

Claims (24)

  1. A method for operating a device, the method comprising:
    detecting, by a processing unit, a tapping input of an object over an imaging unit;
    determining, by the processing unit, a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps;
    comparing, by the processing unit, the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation; and
    operating, by the processing unit, the device by performing the operation allocated to the pattern.
  2. The method of claim 1, further comprising:
    receiving, by the processing unit, one or more images captured by the imaging unit;
    determining, by the processing unit, a color value of each of the received images;
    comparing, by the processing unit, the color value to a threshold color value; and
    determining, by the processing unit, for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  3. The method of claim 2, further comprising:
    determining, by the processing unit, a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
  4. The method of claim 3, further comprising:
    determining, by the processing unit, a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between subsequent taps.
  5. The method of claim 1, wherein the operation of one or more patterns varies based on a state of the device.
  6. The method of claim 1, wherein the operation of one or more patterns varies based on an application running on the device.
  7. The method of claim 1, wherein the operation includes a scrolling operation to navigate a movement of a content, displayed on a display, in any direction on the display.
  8. The method of claim 1, wherein the one or more predefined patterns are defined by a user.
  9. A system for operating a device comprising:
    an imaging unit; and
    a processing unit configured to detect a tapping input of an object over the imaging unit, wherein the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps,
    wherein the processing unit compares the pattern with one or more predefined patterns, wherein each of the one or more predefined patterns is allocated for performing an operation, and
    wherein the processing unit operates the device by performing the operation allocated to the pattern.
  10. The system of claim 9, wherein the processing unit receives one or more images captured by the imaging unit and determines a color value of each of the received images, and
    wherein the processing unit compares the color value to a threshold color value, and determines for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  11. The system of claim 10, wherein the processing unit determines a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
  12. The system of claim 11, wherein the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between subsequent taps.
  13. The system of claim 9, wherein the operation of one or more patterns varies based on a state of the device.
  14. The system of claim 9, wherein the operation of one or more patterns varies based on an application running on the device.
  15. The system of claim 9, wherein the operation includes a scrolling operation to navigate a movement of a content, displayed on a display screen, in any direction on the display screen.
  16. The system of claim 9, wherein the one or more predefined patterns are defined by a user.
  17. A device comprising:
    an imaging unit; and
    a processing unit configured to detect a tapping input of an object over the imaging unit, wherein the processing unit determines a pattern of the detected tapping input over the imaging unit based on a number of taps within a predefined time period, a duration of each of the taps, and a time period between the subsequent taps; and
    a memory for storing the tapping input of the object,
    wherein the processing unit compares the pattern with one or more predefined patterns stored in the memory, wherein each of the one or more predefined patterns is allocated for performing an operation, and
    wherein the processing unit operates the device by performing the operation allocated to the pattern.
  18. The device of claim 17, wherein the processing unit receives one or more images captured by the imaging unit and determines a color value of each of the received images, and
    wherein the processing unit compares the color value to a threshold color value, and determines for each of the images, whether the image corresponds to a touch of the object on the imaging unit, or a release of the object from the imaging unit, based on the comparison.
  19. The device of claim 18, wherein the processing unit determines a first time interval between the touch and the release within the predefined time period, to determine the duration of the tap.
  20. The device of claim 19, wherein the processing unit determines a second time interval between the release and the subsequent touch within the predefined time period, to determine the time period between subsequent taps.
  21. The device of claim 17, wherein the operation of one or more patterns varies based on a state of the device.
  22. The device of claim 17, wherein the operation of one or more patterns varies based on an application running on the device.
  23. The device of claim 17, wherein the operation includes a scrolling operation to navigate a movement of a content, displayed on a display screen, in any direction on the display screen.
  24. The device of claim 17, wherein the one or more predefined patterns are defined by a user.

Applications Claiming Priority

Application Number: IN202041036156; Priority Date: 2020-08-21

Publications (1)

Publication Number: WO2022037247A1; Publication Date: 2022-02-24
