US11531443B2 - Method, apparatus, and storage medium for determining relative position relationship of click event - Google Patents
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G06K9/6215—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
Definitions
- This application relates to data processing technologies, and in particular, to a method, apparatus, and storage medium for processing information in a view.
- An interface view of an application program includes an image view.
- An image region displayed in the image view may be an irregular region. That is, an image displayed in the image view has an irregular shape.
- There are problems/issues associated with determining whether a click event occurs in the irregular region, for example but not limited to, low efficiency and/or poor accuracy.
- the present disclosure describes various embodiments for determining a relative position relationship of a click event in an image view, addressing at least some of the problems/issues discussed above, and improving accuracy and providing a better user experience.
- Embodiments of this application provide a method, an apparatus, and storage medium for determining a relative position relationship of a click event in a view, to improve the accuracy of processing a click event in an image view.
- the present disclosure describes a method for determining a relative position relationship of a click event.
- the method includes receiving, by a device, a click event in an interface view, the interface view comprising an image region configured to respond to the click event.
- the device includes a memory storing instructions and a processor in communication with the memory.
- the method also includes determining, by the device, a click position of the click event in the interface view; obtaining, by the device, color information of a pixel corresponding to the click position based on the click position of the click event in the interface view; determining, by the device according to the color information of the pixel corresponding to the click position, a relative position relationship between the click position and the image region; and processing, by the device, the click event based on the relative position relationship.
- the present disclosure describes an apparatus for determining a relative position relationship of a click event.
- the apparatus includes a memory storing instructions; and a processor in communication with the memory.
- the processor executes the instructions, the processor is configured to cause the apparatus to: receive a click event in an interface view, the interface view comprising an image region configured to respond to the click event, determine a click position of the click event in the interface view, obtain color information of a pixel corresponding to the click position based on the click position of the click event in the interface view, determine, according to the color information of the pixel corresponding to the click position, a relative position relationship between the click position and the image region, and process the click event based on the relative position relationship.
- the present disclosure describes a non-transitory computer readable storage medium, storing computer readable instructions.
- the computer readable instructions when executed by a processor, are configured to cause the processor to perform: receiving a click event in an interface view, the interface view comprising an image region configured to respond to the click event; determining a click position of the click event in the interface view; obtaining color information of a pixel corresponding to the click position based on the click position of the click event in the interface view; determining, according to the color information of the pixel corresponding to the click position, a relative position relationship between the click position and the image region; and processing the click event based on the relative position relationship.
- An embodiment of this application provides a method for processing information in a view, performed by an electronic device, including:
- the interface view including an image region for responding to the click event
- An embodiment of this application provides an apparatus for processing information in a view, including:
- a receiving unit configured to receive a click event for an interface view, the interface view including an image region for responding to the click event
- a positioning unit configured to determine a click position of the click event in the interface view
- an obtaining unit configured to obtain color information of a pixel corresponding to the click position based on the click position
- a determination unit configured to determine a relative position relationship between the click position and the image region according to the obtained color information of the pixel
- a processing unit configured to process the click event based on the determined relative position relationship.
- An embodiment of this application provides an apparatus for processing information in a view, including:
- a memory configured to store executable instructions
- a processor configured to implement the method for processing information in a view provided in the embodiments of this application when executing the executable instructions stored in the memory.
- An embodiment of this application provides a storage medium, storing executable instructions used for enabling a memory to perform the method for processing information in a view provided in the embodiments of this application.
- FIG. 1 is a schematic structural diagram of an apparatus for processing information in a view according to an embodiment of this application.
- FIG. 2 is a schematic structural diagram of a user interface kit (UIKit) according to an embodiment of this application.
- FIG. 3 is a schematic diagram of transferring a click event in an interface view according to an embodiment of this application.
- FIG. 4 is a schematic diagram of an irregular image region in an interface view according to an embodiment of this application.
- FIG. 5 is a schematic diagram of processing a click event in an image view according to an embodiment of this application.
- FIG. 6 is a schematic flowchart of a method for processing information in a view according to an embodiment of this application.
- FIG. 7 is a schematic diagram of an interface view of live streaming according to an embodiment of this application.
- FIG. 8 is a schematic diagram of different image filling manners of an interface view according to an embodiment of this application.
- FIG. 9 is a schematic diagram of determining a position of a click position relative to an image region according to an embodiment of this application.
- FIG. 10 is a schematic diagram of a scenario of a method for processing information in a view according to an embodiment of this application.
- FIG. 11 is a schematic diagram of a scenario of a method for processing information in a view according to an embodiment of this application.
- FIG. 12 is a schematic flowchart of a method for processing information in a view according to an embodiment of this application.
- FIG. 13 is a schematic flowchart of picture view initialization according to an embodiment of this application.
- FIG. 14 is a schematic structural diagram of an apparatus for processing information in a view according to an embodiment of this application.
- the terms “comprise”, “include”, and any variants thereof are intended to cover a non-exclusive inclusion. Therefore, a method or an apparatus that includes a series of elements not only includes such elements provided clearly, but also includes other elements not listed expressly, or may include inherent elements for implementing the method or apparatus. Without more limitations, an element limited by “include a/an . . . ” does not exclude other related elements (for example, steps in the method, or units in the apparatus, the units herein may be some circuits, some processors, some programs or software, and the like) existing in the method or the device that includes the element.
- the method provided in the embodiments of this application includes a series of steps, but the method provided in the embodiments of this application is not limited to the provided steps.
- the apparatus provided in the embodiments of this application includes a series of units, but the apparatus provided in the embodiments of this application is not limited to the clearly provided units, and may further include a unit that needs to be set to obtain related information or perform processing based on information.
- When an image displayed in an image view has an irregular shape and the image view receives a click event from a user, the irregular region is usually approximated as a regular region to conveniently determine whether the click event occurs in the irregular region, and the determination and corresponding processing are then performed.
- However, this processing manner requires complex operations and is not accurate enough.
- the present disclosure describes various embodiments for providing a method and an apparatus for processing information in a view and a storage medium, improving the accuracy of processing a click event in an image view.
- a relative position relationship between a click position and an image region is determined based on color information of a pixel corresponding to the click position, and it is not necessary to make any adjustment or change to the shape of the image region, so that the relative position relationship between the click position and the image region can be accurately determined.
- this application is simpler to implement and has higher applicability.
- the apparatus for processing information in a view provided in the embodiments of this application may be implemented in a manner of hardware, software, or a combination of hardware and software.
- the following describes various exemplary implementations of the apparatus for processing information in a view provided in the embodiments of this application.
- FIG. 1 is a schematic structural diagram of the hardware of an apparatus 100 for processing information in a view according to an embodiment of this application.
- the apparatus for processing information in a view may be an electronic device with a data processing capability and a display apparatus. It may be understood that FIG. 1 only shows the exemplary structure of the apparatus for processing information in a view rather than the entire structure. A part of the structure or the entire structure shown in FIG. 1 may be implemented as required.
- the apparatus 100 for processing information in a view includes at least one processor 110 , a memory 140 , at least one network interface 120 , and a user interface 130 .
- Various components in the apparatus for processing information in a view are coupled together by a bus system 150 . It may be understood that the bus system 150 is configured to implement connection and communication between the components. In addition to a data bus, the bus system 150 further includes a power bus, a control bus, and a status signal bus. However, for ease of clear description, all types of buses are marked as the bus system 150 in FIG. 1 .
- the user interface 130 may include a display, a keyboard, a mouse, a trackball, a click wheel, a key, a button, a touchpad, or a touch screen.
- the memory 140 may be a volatile memory or a non-volatile memory, or may include a volatile memory and a non-volatile memory.
- the non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a flash memory, and the like.
- the volatile memory may be a random access memory (RAM), used as an external cache. According to exemplary but not limited descriptions, many forms of RAMs are available, for example, a static random access memory (SRAM), a synchronous static random access memory (SSRAM), and the like.
- the memory 140 described in this embodiment of this application aims to include these memories and any other suitable type of memories.
- the processor 110 may be an integrated circuit chip having a signal processing capability, for example, a general purpose processor, a digital signal processor (DSP), or another programmable logic device (PLD), discrete gate, transistor logic device, or discrete hardware component.
- the general purpose processor may be a microprocessor, any conventional processor, or the like.
- the memory 140 can store executable instructions 1401 to support operations of the apparatus 100 for processing information in a view.
- the executable instructions include: a program for operating on the apparatus 100 for processing information in a view; software modules in various forms such as plug-ins and scripts; and programs that may include an operating system and application programs, the operating system including various system programs such as a framework layer, a kernel library layer, and a driver layer, which are configured to implement various basic services and process hardware-based tasks.
- An interface view is an instantiation of an interface view class (a process of creating an object with a class is referred to as instantiation), which defines a display region (for example, a rectangular region) on a screen, processes a touch event in the display region, and can display/update content based on the touch event.
- the display of the application program is implemented through a UIKit, and the UIKit is obtained by encapsulating a graphics library such as the Open Graphics Library (OpenGL).
- FIG. 2 is a schematic structural diagram of the UIKit according to an embodiment of this application.
- a root object (NSObject 280 ) is an instantiation of the root class of an application program code structure (the NSObject is not inherited from any class in the application program).
- An order based on an inheritance relationship of the NSObject is: the NSObject, a user interface responder (UIResponder 282 ), a user interface view controller 284 , and a child view that is inherited from the interface view 286 .
- the interface view may be an image view, a scroll view, a tab bar view, and the like.
- the image view (UIImageView 292 ) is inherited as a visual control from the interface view (UIView), and is used for displaying various formats of images. Each view is placed on a parent view of the view and manages child views of the view. If two sibling child views overlap, the view added later (that is, the view arranged at the back of the child view array) appears on top of the other view.
- The child views included in the interface view as the parent view are then exemplarily described according to FIG. 2 .
- the UIKit shown in FIG. 2 is only an example. According to different versions of the UIKit and devices/operating systems applicable to the UIKit, the inheritance relationship and the views shown in FIG. 2 have differences that may be easily implemented. Therefore, FIG. 2 is not to be considered as a limitation to the UIKit applicable to the embodiments of this application.
- the window is a user interface window (UIWindow 290 ) object.
- An application program only has one UIWindow object that is a root container of all views and is used as a container to provide a region to display the interface view and may cooperate with an interface view controller (UIViewController) to implement support for the rotation of display directions.
- the views include: an image view, used for displaying various formats of images; a text view (UITextView 293 ), used for displaying a text; a label view (UILabel 294 ), used for displaying various labels; a web view (UIWebView 295 ), used for displaying web page content; and a scroll view (UIScrollView), used for displaying content in a scrolling manner.
- the bars include: a user interface navigation bar (UINavigationBar 298 ), used for providing page operations such as “Next” and “Back”; a user interface tab bar (UITabBar), used for displaying tabs with various functions; a user interface search bar (UISearchBar), used for providing search operations; and a user interface tool bar (UIToolBar 299 ), used for providing various tools on the user interface.
- the interface view is a basic element that forms a screen display, has a position attribute and a rectangle with a particular size, usually has a background color, and is not movable.
- the interface view may also include content such as a character string or a picture.
- the UILabel is an interface view including a character string
- the UIImageView is an interface view including a picture.
- FIG. 3 is a schematic diagram of transferring a click event in an interface view according to an embodiment of this application.
- a rectangular region numbered 1 is an interface view 1 (root view) as a visual parent control
- rectangular regions numbered 2 to 4 are child views that are inherited from the view 1 (views 2 - 1 , 2 - 2 , 3 - 1 , and 3 - 2 are exemplarily shown in the figure).
- the view 3 - 1 is a child view of the view 2 - 1
- a view 4 is the child view of the view 3 - 1 .
- an application program transfers the click event to a window, and the window determines whether the click event occurs in the window. If it is determined that the click event occurs in the window (that is, the window can receive the click event), the window may traverse its child controls from back to front to find the most suitable view, since a view that is added later is placed on the top.
- the window transfers the click event to the root view 1 .
- the view 1 determines, according to a determination logic of the view, whether the view needs to respond to the click event.
- Each layer of view has a corresponding determination logic. That is, the determination logics of the layers of view are independent of each other.
- a transfer order of the click event is the view 1 , the view 2 - 1 , the view 3 - 1 , and the view 4 .
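The back-to-front traversal that finds the most suitable view can be sketched as follows. This is a minimal illustration of the idea, not the actual UIKit implementation; the `View` class and its `hit_test` method are hypothetical, and all frames are assumed to be expressed in window coordinates for simplicity.

```python
class View:
    """Hypothetical minimal view: a rectangle (in window coordinates) with child views."""

    def __init__(self, name, x, y, w, h):
        self.name, self.frame, self.children = name, (x, y, w, h), []

    def add_child(self, child):
        self.children.append(child)  # a view added later appears on top

    def contains(self, px, py):
        x, y, w, h = self.frame
        return x <= px < x + w and y <= py < y + h

    def hit_test(self, px, py):
        """Return the deepest view under the point, traversing children back to front."""
        if not self.contains(px, py):
            return None
        for child in reversed(self.children):  # newest (topmost) child first
            hit = child.hit_test(px, py)
            if hit is not None:
                return hit
        return self  # no child claims the point; this view is the most suitable
```

With views nested as view 1 → view 2-1 → view 3-1 → view 4, a click inside view 4 would be resolved to view 4, mirroring the transfer order described above.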
- an image view may be understood as a tab.
- the image view needs to support a clickable area comprising the irregular region.
- FIG. 4 is a schematic diagram of an irregular image region in an interface view according to an embodiment of this application. Referring to FIG. 4 , the region (that is, the display region of the image view) is a rectangle 480 . A picture of a pentagram 482 is displayed in the image view, and only the click event that occurs in the pentagram may be considered as a click on the tab.
- FIG. 5 is a schematic diagram of processing a click event in an image view according to an embodiment of this application.
- a picture displayed in the image view is an irregular region numbered 501 .
- the image view responds to the click event only when the click event occurs in the irregular region 501 .
- the irregular region 501 is approximated as a regular circle 502 , and it is then determined, according to position information of the click event, whether the click event occurs in the regular circle 502 .
- the determination of whether the click event occurs in the irregular region 501 is converted into the determination of whether the click event occurs in the regular circle region 502 , to further process the click event.
- Because the graph region on which this processing manner is based is not the original image region in the image view and cannot match the original graph outline, the accuracy of the determination is not high.
- In addition, if the user changes picture materials in the view and the view then receives the click event, it is necessary to regenerate a regular region.
- The generation of the regular region often requires complex mathematical geometry and complex operations.
- FIG. 6 is an exemplary schematic flowchart of a method for processing information in a view according to an embodiment of this application, including step 101 to step 105 , which are described respectively below.
- Step 101 Receive a click event in an interface view, the interface view including an image region configured to respond to the click event.
- the interface view is not limited to a root view.
- the interface view may also be a child view including the image region that can respond to the click event.
- FIG. 7 is a schematic diagram of an interface view of live streaming according to an embodiment of this application.
- a current interface view corresponds to an attractiveness matching page.
- the interface view includes a picture region for responding to the click event from the user, that is, the irregular region formed by the four sides numbered 701 to 704 shown in FIG. 7 , which includes a “good looker” region.
- When the click event occurs in the irregular region, a UIViewController responds to the click event to perform a page jump.
- Step 102 Determine a click position of the click event in the interface view.
- the interface view detects the click event and determines the position of the click event in the interface view. For example, the size of the display region of the interface view on the screen is 320 ⁇ 480 pixels.
- when the user clicks the center of the display region, the click position of the click event in the interface view is (160, 240).
- Step 103 Obtain color information of a pixel corresponding to the click position based on the click position.
- the color information of the pixel corresponding to the click position may be obtained in the following manner:
- the position of the click position relative to the image region for responding to the click event in the interface view is determined; according to the relative position, a pixel position corresponding to the relative position is determined; and the color information of the pixel corresponding to the pixel position is obtained.
- the position of the click position relative to the image region for responding to the click event in the interface view may be determined in the following manner:
- the image filling manner of the interface view is obtained; and the position of the click position relative to the image region is determined based on the image filling manner and the click position.
- the image filling manner may be understood as the position and size of the image in the interface view.
- the size of the image in the interface view involves stretching of the image
- the position of the image in the interface view involves the layout of an image.
- the size of the image in the interface view may be one of the following: keeping the original size of the image unchanged, stretching the image in the horizontal direction, or stretching the image in the vertical direction.
- the position of the image in the interface view is one or more of the following relationships: the image is located at the left of the region displayed in the interface view, the image is located at the right of the region displayed in the interface view, the image is located at the top of the region displayed in the interface view, the image is located at the bottom of the region displayed in the interface view, and the image is displayed in the center of the interface view.
- the image When the image is located at the left of the region displayed in the interface view, the image is located at the left of the region displayed in the interface view and is in contact with an edge of the region; when the image is located at the right of the region displayed in the interface view, the image is located at the right of the region displayed in the interface view and is in contact with an edge of the region; when the image is located at the top of the region displayed in the interface view, the image is located at the top of the region displayed in the interface view and is in contact with an edge of the region; when the image is located at the bottom of the region displayed in the interface view, the image is located at the bottom of the region displayed in the interface view and is in contact with an edge of the region; and when the image is displayed in the center of the interface view, the image may be or may not be in contact with an edge of the region displayed in the interface view.
- FIG. 8 is a schematic diagram of different image filling manners of an interface view according to an embodiment of this application.
- the size of the picture (pentagram) in the horizontal direction is the same as the size of the region displayed in the interface view in the horizontal direction. That is, the picture is stretched in the horizontal direction.
- the picture is located at the top of the region displayed in the interface view.
- the size of the picture (pentagram) is kept unchanged, and the picture is displayed at the left and top of the region displayed in the interface view. Still, as shown in FIG. 8 , the size of the picture (pentagram) may be kept unchanged while the picture is displayed in the center of the region displayed in the interface view and is not in contact with the edge of the region displayed in the interface view.
- the other image filling manners shown in FIG. 8 are not described herein one by one.
- the filling manner (referred to as a filling manner 1 ) in FIG. 8 ( g ) and the filling manner (referred to as a filling manner 2 ) in FIG. 9 are used as an example for description.
- the size of the picture (pentagram) in the vertical direction is the same as the size of the region displayed in the interface view in the vertical direction, and the picture is displayed in the center of the interface view.
- the size of the picture (pentagram) in the vertical direction is the same as the size of the region displayed in the interface view in the vertical direction, and the picture is displayed at the right in the interface view.
- the filling manner 2 in FIG. 9 is used as an example to describe the process of determining the position of the click position relative to the image region 910 for responding to the click event in the interface view.
- a coordinate position of the click position on the screen of the electronic device may be converted into a coordinate position of the click position relative to the image region for responding to the click event in the interface view.
- FIG. 9 is a schematic diagram of determining a position of a click position relative to an image region according to an embodiment of this application.
- the click position of the click event in the interface view is (50, 30). Because the current image filling manner is the filling manner 2 , the sizes of the pentagram (that is, the image region for responding to the click event in the interface view) in the view in the vertical and horizontal directions are both 60, and the coordinate position of the click position relative to the pentagram is (10, 30). Through the determination of the relative position, it may be determined whether the click event occurs in the picture region of the view (for example, the rectangular region shown in FIG. 9 ).
- the picture region includes an image region (for example, the picture of the pentagram shown in FIG. 9 ) that can respond to the click event and the region that cannot respond to the click event (for example, the region outside the picture of the pentagram shown in FIG. 9 ).
- Referring to FIG. 7 , in the foregoing determination manner, when the user clicks the central region of the screen, it may be determined that the click event is located inside the picture region. However, because only the irregular region formed by the sides numbered 701 to 704 can respond to the click event, the current view control (root view) cannot respond to the click event at the center of the screen, and the event may be transferred to a child view, that is, a next layer of view of the current view.
- The determining of a pixel position corresponding to the relative position is further described.
- the pixel position of the relative position on the picture is (10 ⁇ (90/60), 30 ⁇ (90/60)), wherein 90 is the original pixel size of the picture and 60 is the displayed pixel size of the picture in the interface view.
- the pixel position of the relative position on the picture is (15, 45).
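The scaling step can be sketched as follows; the function name is ours, and it simply applies the ratio of original to displayed pixel size from the example:

```python
def to_pixel_position(rel_x, rel_y, original_size, displayed_size):
    """Map a click position, relative to the displayed image, onto the
    original picture's pixel grid using the ratio of original size to
    displayed size."""
    scale = original_size / displayed_size
    return (rel_x * scale, rel_y * scale)

# Using the figures from the example: relative position (10, 30),
# original pixel size 90, displayed size 60 -> scale 90/60 = 1.5.
print(to_pixel_position(10, 30, 90, 60))  # (15.0, 45.0)
```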
- the original pixel information of the image displayed in the interface view and color information corresponding to each pixel are pre-stored.
- color information corresponding to the pixel may be determined by searching stored picture information.
- the color information herein may be red-green-blue-alpha (RGBA) information, but is certainly not limited to RGBA; for example, it may be an HSV value.
- RGBA is a color space representing Red, Green, Blue, and Alpha.
- HSV is a color space created according to an intuitive feature of colors. Parameters of colors in the space are: Hue, Saturation, and Value.
- Step 104 Determine, according to the obtained color information of the pixel, a relative position relationship between the click position and the image region for responding to the click event in the interface view.
- the relative position relationship between the click position and the image region for responding to the click event in the interface view may be determined in the following manner:
- a similarity between the obtained color information of the pixel and color information of a preset color is calculated, the preset color being different from the color of the image region that can respond to the click event.
- the similarity threshold may be a pre-set similarity threshold.
- the similarity threshold may alternatively be a threshold adjustable in real time during the process. Here, the calculated similarity reaching the similarity threshold means that the calculated similarity is equal to or larger than the similarity threshold.
- When the calculated similarity does not reach the similarity threshold, it is determined that the click position is inside the image region.
- In this case, the calculated similarity is smaller than the similarity threshold.
- The transparent transmission of the click event is first described. If the current view control determines that it does not need to respond to the received click event, the click event is transparently transmitted (transferred) to the next layer of view, that is, a child view control, which continues to perform the determination.
- When the color at the click position matches a set color, the click event may be transparently transmitted, the set color herein being the preset color.
- the preset color may be set as a transparent color; the preset color is chosen to be different from the color of the image region that responds to the click event.
- the preset color may be the color of the region, except for the image region, displayed in the current view. For example, the color of the image region responding to the click event is yellow, and the preset color is white.
- the similarity between the color information of the pixel and the color information of the preset color may be calculated in the following manner:
- a hue-saturation-value (HSV) value corresponding to the pixel is obtained; and a similarity between the HSV value corresponding to the pixel and an HSV value corresponding to the preset color is calculated.
- HSV hue-saturation-value
- Obtaining the HSV value of the pixel is first described. Because the color information of the image is pre-stored, if the stored color information of the image is the RGBA information of each pixel in the image, an RGB value corresponding to the pixel may be obtained, and the RGB value of the pixel is converted into the corresponding HSV value. If the stored color information of the image is the HSV value of each pixel in the image, the HSV value corresponding to the pixel may be obtained directly by searching the stored information.
- a distance in an HSV color space between the HSV value corresponding to the pixel and the HSV value corresponding to the preset color may be calculated; and the similarity may be calculated based on the distance.
- the similarity may be inversely related to the distance.
- a larger distance may indicate a smaller similarity; and a smaller distance may indicate a larger similarity.
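A sketch of this distance-based similarity, under the assumption that HSV triples are compared as Cartesian points (the text does not fix a particular distance formula, and the `1/(1 + d)` mapping below is only one way to make similarity inversely related to distance):

```python
import colorsys
import math

def rgb_to_hsv(r, g, b):
    """Convert 0-255 RGB channels to an HSV triple with h in [0, 1),
    s and v in [0, 1]."""
    return colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)

def hsv_distance(hsv_a, hsv_b):
    """Euclidean distance between two HSV triples, treating HSV as a
    Cartesian space (a simplification; cylindrical mappings are also common)."""
    return math.dist(hsv_a, hsv_b)

def similarity(hsv_a, hsv_b):
    """Map distance to a similarity in (0, 1]: smaller distance -> larger
    similarity, larger distance -> smaller similarity."""
    return 1.0 / (1.0 + hsv_distance(hsv_a, hsv_b))

white = rgb_to_hsv(255, 255, 255)   # hypothetical preset color
yellow = rgb_to_hsv(255, 255, 0)    # hypothetical image-region color
print(similarity(white, white))     # 1.0 (identical colors)
print(similarity(white, yellow))    # smaller than 1.0
```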
- the similarity between the color information of the pixel and the color information of the preset color may further be calculated in the following manner:
- Alternatively, the RGBA value corresponding to the pixel is obtained, and a similarity between the RGBA value corresponding to the pixel and an RGBA value corresponding to the preset color is calculated. For example, it may first be determined whether the difference between the Alpha value corresponding to the pixel and the Alpha value of the preset color is less than a set threshold. When the difference between the Alpha values is not less than the set threshold, the RGBA similarity definitely cannot reach the set similarity threshold. When the difference between the Alpha values is less than the set threshold, the difference in RGB is further compared: when the difference in RGB is less than a set threshold, the similarity is considered to reach the set similarity threshold; otherwise, it is considered not to reach the set similarity threshold.
- the similarity may be calculated based on the difference in RGB and/or the difference in Alpha value.
- the similarity may be inversely related to the difference in RGB and/or the difference in Alpha value.
- a larger difference in RGB and/or a larger difference in Alpha value may indicate a smaller similarity; and a smaller difference in RGB and/or a smaller difference in Alpha value may indicate a larger similarity.
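The two-stage RGBA comparison described above might be sketched as follows; the threshold values and the use of a maximum per-channel RGB difference are illustrative choices, not specified in the text:

```python
def rgba_reaches_threshold(pixel, preset, alpha_threshold=16, rgb_threshold=30):
    """Two-stage RGBA comparison: first compare Alpha; only if the Alpha
    values are close, compare the RGB channels.  Threshold values here are
    illustrative."""
    pr, pg, pb, pa = pixel
    qr, qg, qb, qa = preset
    if abs(pa - qa) >= alpha_threshold:
        # Alpha differs too much: the similarity cannot reach the threshold.
        return False
    # Compare RGB, here by the maximum per-channel difference.
    rgb_diff = max(abs(pr - qr), abs(pg - qg), abs(pb - qb))
    return rgb_diff < rgb_threshold

transparent = (0, 0, 0, 0)          # hypothetical preset color
opaque_yellow = (255, 255, 0, 255)  # hypothetical image-region pixel
print(rgba_reaches_threshold(transparent, transparent))    # True
print(rgba_reaches_threshold(opaque_yellow, transparent))  # False
```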
- Step 105 Process the click event based on the determined relative position relationship.
- the click event is transparently transmitted to the next layer of view, that is, the child view of the current view, and the next layer of view determines, according to a determination logic (for example, according to a position relationship of the click event), whether to respond to the click event, until the view that can respond to the click event is determined.
- When it is determined that the similarity between the pixel color of the click position and the preset color does not reach the set similarity threshold, and the click position is therefore in the image region for responding to the click event, the view control responds to the click event, for example, by performing a corresponding page jump as in the scenario in FIG. 7.
- FIG. 10 is a schematic diagram of a scenario of a method for processing information in a view according to an embodiment of this application.
- a view component of a frame animation player invokes a playing method of a frame animation to update an interface.
- each frame of the animation exists as a view.
- a view control receives a click event, and a relative position relationship between a click position and an image region (for example, a region 1001 corresponding to a little robot in FIG. 10) is determined.
- a preset color (a color corresponding to a region 1002 ) is transparent, and a similarity between the color information of the pixel of the click position and the color information of the preset color is calculated.
- When the similarity reaches the similarity threshold, it may be known that the click position is outside the little robot, and the click event is transparently transmitted to the next frame of the animation while the next frame is played.
- When the similarity does not reach the similarity threshold, it may be known that the click position is inside the little robot, and a response is made to the click event, for example, a jump from the current page to another page is performed.
- FIG. 11 is a schematic diagram of a scenario of a method for processing information in a view according to an embodiment of this application.
- a view component of a video player invokes a transparent channel video playing method to update an interface.
- the video is rendered above a user interface, and through the transparent parts of the video the user may see icons displayed in the user interface beneath it.
- a view control receives a click event, and a relative position relationship between a click position and an image region (for example, a region corresponding to a little robot in FIG. 11) is determined.
- a similarity between the color information of the pixel of the click position and the color information of the preset color is calculated.
- When the similarity reaches the similarity threshold, it may be known that the click position is outside the little robot, and the click event is transparently transmitted to the view corresponding to the icon.
- When the similarity does not reach the similarity threshold, it may be known that the click position is inside the little robot, and a response is made to the click event, for example, content in the current page is scaled up.
- FIG. 12 is an exemplary schematic flowchart of a method for processing information in a view according to an embodiment of this application. Referring to FIG. 12 , the method for processing information in a view provided in this embodiment of this application includes the following steps:
- Step 201 Obtain a click position of a view.
- the view herein may be understood as an image view that can display a picture, where the display region of the picture is used for responding to click events.
- a view component receives the click event and locates the position of the click event (that is, a click position).
- the click position herein is a position of the click event relative to the image view with reference to the view interface.
- Step 202 Calculate a pixel position coordinate of the click position according to a picture filling manner and the click position.
- the pixel position coordinate herein is a pixel coordinate corresponding to a position of the click position relative to the picture in the view.
- Step 203 Determine, according to the calculated pixel position, whether the click position is in a picture, and if the click position is in the picture, perform step 204 , or otherwise perform step 210 .
- Step 204 Obtain RGBA information of the pixel on the picture corresponding to the click position and RGBA information of the preset color.
- The color information of the picture (for example, the RGBA information corresponding to each pixel in the picture) is pre-stored.
- The picture filling manner of the view describes the stretching and layout of the picture in the view; for example, the picture is placed at the center and scaled up by 1.5 times.
- the preset color is transparent.
- FIG. 13 is a schematic flowchart of picture view initialization according to an embodiment of this application. Referring to FIG. 13 , the picture view initialization includes the following steps:
- Step 301 Set a picture filling manner of a picture view.
- Step 302 Set color information of a preset color.
- the preset color herein is transparent by default, and the corresponding color information may be RGBA information of the preset color.
- Step 303 Set a picture of the picture view.
- Step 304 Store color information of the picture view in the form of an RGBA array of the pixel.
- Step 205 Determine whether a difference between the Alpha value of the pixel on the picture corresponding to the click position and the Alpha value of the preset color is less than a preset threshold, and if the difference is less than the preset threshold, perform step 206 , or otherwise perform step 210 .
- Step 206 Convert the RGB value of the pixel on the picture corresponding to the click position into the corresponding HSV value, and convert the RGB value of the preset color into the corresponding HSV value.
- Although RGB expresses colors directly, RGB values are prone to deviations. Therefore, it is more accurate to convert RGB into HSV to compare the similarity.
- Step 207 Calculate a spatial distance between the HSV value of the pixel corresponding to the click position and the HSV value corresponding to the preset color.
- The distance in the HSV space may be calculated as the Euclidean distance between the two HSV values.
- Step 208 Determine whether the calculated distance is less than a preset distance threshold, and if the distance is not less than the preset distance threshold, perform step 209 , or if the distance is less than the preset distance threshold, perform step 210 .
- Step 209 Process the click event, and perform step 211 .
- When it is determined that the spatial distance between the HSV value of the pixel corresponding to the click position and the HSV value corresponding to the preset color is not less than the preset distance threshold, the click position does not fall into the region corresponding to the preset color; that is, the click position is in the picture region for responding to the click event, and the view component responds to the click event, for example, by performing a page jump.
- Step 210 Transparently transmit the click event to a next layer of view of the current layer of view.
- the view component transparently transmits the click event to the next layer of view of the current layer of view for processing.
- the next layer of view is a picture view for displaying a picture in which a portrait is located, the view responds to the click event, and the picture in which the portrait is located is scaled up and displayed.
- In this way, a click event in an irregular image region can be accurately processed; the implementation is simple and has high applicability.
- Step 211 End this processing procedure.
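Combining steps 205 to 208 with the outcomes of steps 209 and 210, the decision might be sketched as follows. The threshold values are illustrative, and the Alpha branch follows the textual description earlier (an Alpha difference not less than the set threshold means the pixel cannot match the preset color, so the view responds):

```python
import colorsys
import math

def handle_click(pixel_rgba, preset_rgba, alpha_threshold=16, distance_threshold=0.1):
    """Return 'respond' when the clicked pixel differs enough from the preset
    color, otherwise 'pass-through' (transparent transmission to a child view)."""
    # Alpha pre-check: a large Alpha difference means the pixel cannot
    # match the (transparent) preset color, so the view should respond.
    if abs(pixel_rgba[3] - preset_rgba[3]) >= alpha_threshold:
        return "respond"
    # Convert both RGB values to HSV.
    to_hsv = lambda c: colorsys.rgb_to_hsv(c[0] / 255, c[1] / 255, c[2] / 255)
    # Euclidean distance in HSV space.
    dist = math.dist(to_hsv(pixel_rgba), to_hsv(preset_rgba))
    # Large distance -> not the preset color -> respond; otherwise transmit.
    return "respond" if dist >= distance_threshold else "pass-through"

print(handle_click((255, 255, 0, 255), (0, 0, 0, 0)))  # respond
print(handle_click((0, 0, 0, 0), (0, 0, 0, 0)))        # pass-through
```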
- An apparatus for processing information in a view provided in the embodiments of this application is further described.
- the apparatus for processing information in a view provided in the embodiments of this application is implemented by using a combination of hardware and software.
- the apparatus for processing information in a view provided in the embodiments of this application may be implemented as a software module, in various forms, executed directly by a processor 1410 of an electronic device 1400.
- the software module may be in a storage medium, and the storage medium is in a memory 1440 .
- the processor 1410 reads executable instructions included in the software module in the memory 1440 , and the method for processing information in a view provided in the embodiments of this application is completed in combination with necessary hardware (for example, including the processor 1410 and other components such as a network interface 1420 and a user interface 1430 that are connected to a bus 1450 ).
- the apparatus includes:
- a receiving unit 11 configured to receive a click event for an interface view, the interface view including an image region for responding to the click event;
- a positioning unit 12 configured to determine a click position of the click event in the interface view;
- an obtaining unit 13 configured to obtain color information of a pixel corresponding to the click position based on the click position;
- a determination unit 14 configured to determine a relative position relationship between the click position and the image region according to the obtained color information of the pixel; and
- a processing unit 15 configured to process the click event based on the determined relative position relationship.
- the determination unit 14 is further configured to calculate a similarity between the color information of the pixel and color information of a preset color, the preset color being different from the color of the image region;
- the determination unit 14 is further configured to: obtain a hue-saturation-value (HSV) value corresponding to the pixel; and
- the determination unit 14 is further configured to: obtain a red-green-blue (RGB) value corresponding to the pixel; and
- the determination unit 14 is further configured to calculate a distance in an HSV color space between the HSV value corresponding to the pixel and the HSV value corresponding to the preset color, the distance being used for representing the similarity.
- the determination unit 14 is further configured to: obtain an RGBA value corresponding to the pixel;
- the processing unit 15 is further configured to: transparently transmit the click event to a child view of the interface view when the click position is outside the image region;
- the obtaining unit 13 is further configured to: determine a position of the click position relative to the image region based on the click position;
- the obtaining unit 13 is further configured to: obtain an image filling manner of the interface view;
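The cooperation of the units above might be sketched as a single class; all names and the placeholder similarity function are ours, not from the text:

```python
class ViewInfoProcessor:
    """Sketch collapsing the apparatus's units (receiving/positioning,
    obtaining, determination, processing) into one class."""

    def __init__(self, pixels, preset_color, threshold):
        self.pixels = pixels       # pre-stored per-pixel color information
        self.preset = preset_color
        self.threshold = threshold

    @staticmethod
    def similarity(a, b):
        # Placeholder similarity: 1.0 for identical colors, else 0.0.
        return 1.0 if a == b else 0.0

    def on_click(self, x, y, respond, pass_through):
        color = self.pixels[y][x]                  # obtaining unit
        sim = self.similarity(color, self.preset)  # determination unit
        if sim >= self.threshold:   # click matches the preset color
            pass_through(x, y)      # processing unit: transmit to child view
        else:
            respond(x, y)           # processing unit: respond to the click

# "T" marks transparent (preset) pixels, "Y" a pixel of the image region.
proc = ViewInfoProcessor([["T", "T"], ["T", "Y"]], "T", threshold=0.5)
proc.on_click(1, 1, respond=lambda x, y: print("respond"),
              pass_through=lambda x, y: print("pass through"))  # respond
```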
- An embodiment of this application further provides a storage medium storing an executable program, and the executable program implements the foregoing method for processing information in a view in the embodiments of this application when executed by the processor.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- User Interface Of Digital Computer (AREA)
- Multimedia (AREA)
Abstract
Description
Claims (17)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201811471442.9A CN111273971B (en) | 2018-12-04 | 2018-12-04 | Method and device for processing information in view and storage medium |
| CN201811471442.9 | 2018-12-04 | ||
| PCT/CN2019/117450 WO2020114210A1 (en) | 2018-12-04 | 2019-11-12 | Method and apparatus for processing information in view, and storage medium |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/117450 Continuation WO2020114210A1 (en) | 2018-12-04 | 2019-11-12 | Method and apparatus for processing information in view, and storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210181899A1 US20210181899A1 (en) | 2021-06-17 |
| US11531443B2 true US11531443B2 (en) | 2022-12-20 |
Family
ID=70974500
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/183,818 Active US11531443B2 (en) | 2018-12-04 | 2021-02-24 | Method, apparatus, and storage medium for determining relative position relationship of click event |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US11531443B2 (en) |
| CN (1) | CN111273971B (en) |
| WO (1) | WO2020114210A1 (en) |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111859001B (en) * | 2020-07-06 | 2022-05-31 | Oppo(重庆)智能科技有限公司 | Image similarity detection method and device, storage medium and electronic equipment |
| CN112181760B (en) * | 2020-09-10 | 2024-03-22 | 北京三快在线科技有限公司 | Anomaly detection method and device |
| CN112379976A (en) * | 2020-11-16 | 2021-02-19 | 上海瑞家信息技术有限公司 | Event processing method and device, mobile terminal and storage medium |
| CN112862582B (en) * | 2021-02-18 | 2024-03-22 | 深圳无域科技技术有限公司 | User portrait generation system and method based on financial wind control |
| CN114265530A (en) * | 2021-12-08 | 2022-04-01 | 贵阳语玩科技有限公司 | Button construction and response method, device and terminal based on iOS system |
| CN117251081A (en) * | 2022-08-31 | 2023-12-19 | 腾讯科技(深圳)有限公司 | Detection method, device, computer equipment and storage medium for picking up objects |
| CN116560958B (en) * | 2023-04-24 | 2024-03-01 | 重庆赛力斯凤凰智创科技有限公司 | Implementation method, device, terminal and storage medium for judging event occurrence position |
| WO2025059917A1 (en) * | 2023-09-20 | 2025-03-27 | 京东方科技集团股份有限公司 | Screen input event processing method and apparatus, storage medium, and electronic device |
Citations (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060095245A1 (en) | 2004-10-30 | 2006-05-04 | Von Ruff Alvin J | User interface mapping |
| KR100835212B1 (en) | 2007-01-24 | 2008-06-05 | 삼성전자주식회사 | Input recognition device and method of mobile terminal |
| US20080175433A1 (en) | 2006-01-13 | 2008-07-24 | The Boeing Company | Method for determining a set of boundary points in a lattice |
| CN101593110A (en) | 2009-06-17 | 2009-12-02 | 厦门敏讯信息技术股份有限公司 | A kind of judge coordinate points whether belong to the zone method |
| CN101661171A (en) | 2008-08-26 | 2010-03-03 | 乐金显示有限公司 | Method for setting compensation region for irregular defect region in image display device |
| CN102855132A (en) | 2011-06-30 | 2013-01-02 | 深圳市大族激光科技股份有限公司 | Method and system for selection of graphic objects |
| US20130019182A1 (en) * | 2011-07-14 | 2013-01-17 | Microsoft Corporation | Dynamic context based menus |
| CN103543923A (en) | 2013-07-23 | 2014-01-29 | Tcl集团股份有限公司 | Control clicking event processing method and system |
| CN103577322A (en) | 2012-08-08 | 2014-02-12 | 腾讯科技(深圳)有限公司 | Click testing method and device |
| US20150109323A1 (en) * | 2013-10-18 | 2015-04-23 | Apple Inc. | Interactive black and white image editing |
| CN104978739A (en) | 2015-04-29 | 2015-10-14 | 腾讯科技(深圳)有限公司 | Image object selection method and apparatus |
| US20170075544A1 (en) * | 2015-09-14 | 2017-03-16 | Adobe Systems Incorporated | Probabilistic Determination of Selected Image Portions |
| CN106843633A (en) | 2016-12-01 | 2017-06-13 | 湖北荆楚网络科技股份有限公司 | It is a kind of judge contact whether the method inside irregular polygon |
| CN106951138A (en) | 2017-03-08 | 2017-07-14 | 海南凯迪网络资讯股份有限公司 | A kind of method and device of icon obfuscation |
| US20170236001A1 (en) * | 2016-02-11 | 2017-08-17 | Daniel M. McLean | Device and method for transforming a facial image into a set of recognizable emoticons |
| CN107784301A (en) | 2016-08-31 | 2018-03-09 | 百度在线网络技术(北京)有限公司 | Method and apparatus for identifying character area in image |
| CN108319898A (en) | 2018-01-02 | 2018-07-24 | 中国神华能源股份有限公司 | Alarm method, medium and the electronic equipment of coalcutter work condition abnormality |
| US20190147279A1 (en) * | 2017-11-13 | 2019-05-16 | Aupera Technologies, Inc. | System of a video frame detector for video content identification and method thereof |
| US20210209153A1 (en) * | 2018-09-20 | 2021-07-08 | Hengzhong ZHANG | Adding system for adding scent information to digital photographs, and adding method for using the same |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080177994A1 (en) * | 2003-01-12 | 2008-07-24 | Yaron Mayer | System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows |
| CN100550059C (en) * | 2007-12-21 | 2009-10-14 | 炬力集成电路设计有限公司 | A vector graphics acceleration method and multimedia player |
| US20140245195A1 (en) * | 2013-02-25 | 2014-08-28 | International Business Machines Corporation | Duplicating graphical widgets |
| CN105335136B (en) * | 2014-07-16 | 2019-08-09 | 阿里巴巴集团控股有限公司 | The control method and device of smart machine |
| JP6459545B2 (en) * | 2015-01-21 | 2019-01-30 | 株式会社リコー | Image processing apparatus, image processing system, and image processing method |
| CN108845924B (en) * | 2017-05-10 | 2021-04-23 | 平安科技(深圳)有限公司 | Control response area display control method, electronic device, and storage medium |
- 2018-12-04 CN CN201811471442.9A patent/CN111273971B/en active Active
- 2019-11-12 WO PCT/CN2019/117450 patent/WO2020114210A1/en not_active Ceased
- 2021-02-24 US US17/183,818 patent/US11531443B2/en active Active
Patent Citations (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060095245A1 (en) | 2004-10-30 | 2006-05-04 | Von Ruff Alvin J | User interface mapping |
| US20080175433A1 (en) | 2006-01-13 | 2008-07-24 | The Boeing Company | Method for determining a set of boundary points in a lattice |
| KR100835212B1 (en) | 2007-01-24 | 2008-06-05 | 삼성전자주식회사 | Input recognition device and method of mobile terminal |
| CN101661171A (en) | 2008-08-26 | 2010-03-03 | 乐金显示有限公司 | Method for setting compensation region for irregular defect region in image display device |
| CN101593110A (en) | 2009-06-17 | 2009-12-02 | 厦门敏讯信息技术股份有限公司 | A kind of judge coordinate points whether belong to the zone method |
| CN102855132A (en) | 2011-06-30 | 2013-01-02 | 深圳市大族激光科技股份有限公司 | Method and system for selection of graphic objects |
| US20130019182A1 (en) * | 2011-07-14 | 2013-01-17 | Microsoft Corporation | Dynamic context based menus |
| CN103577322A (en) | 2012-08-08 | 2014-02-12 | 腾讯科技(深圳)有限公司 | Click testing method and device |
| CN103543923A (en) | 2013-07-23 | 2014-01-29 | Tcl集团股份有限公司 | Control clicking event processing method and system |
| US20150109323A1 (en) * | 2013-10-18 | 2015-04-23 | Apple Inc. | Interactive black and white image editing |
| CN104978739A (en) | 2015-04-29 | 2015-10-14 | 腾讯科技(深圳)有限公司 | Image object selection method and apparatus |
| US20170075544A1 (en) * | 2015-09-14 | 2017-03-16 | Adobe Systems Incorporated | Probabilistic Determination of Selected Image Portions |
| US20170236001A1 (en) * | 2016-02-11 | 2017-08-17 | Daniel M. McLean | Device and method for transforming a facial image into a set of recognizable emoticons |
| CN107784301A (en) | 2016-08-31 | 2018-03-09 | 百度在线网络技术(北京)有限公司 | Method and apparatus for identifying character area in image |
| CN106843633A (en) | 2016-12-01 | 2017-06-13 | 湖北荆楚网络科技股份有限公司 | It is a kind of judge contact whether the method inside irregular polygon |
| CN106951138A (en) | 2017-03-08 | 2017-07-14 | 海南凯迪网络资讯股份有限公司 | A kind of method and device of icon obfuscation |
| US20190147279A1 (en) * | 2017-11-13 | 2019-05-16 | Aupera Technologies, Inc. | System of a video frame detector for video content identification and method thereof |
| CN108319898A (en) | 2018-01-02 | 2018-07-24 | 中国神华能源股份有限公司 | Alarm method, medium and the electronic equipment of coalcutter work condition abnormality |
| US20210209153A1 (en) * | 2018-09-20 | 2021-07-08 | Hengzhong ZHANG | Adding system for adding scent information to digital photographs, and adding method for using the same |
Non-Patent Citations (6)
| Title |
|---|
| Chinese Office Action with concise English explanation regarding 201811471442.9 dated Dec. 3, 2021, 11 pages. |
| Chinese Office Action with concise English explanation regarding 201811471442.9 dated May 8, 2021, 9 pgs. |
| International Search Report with English translation and Written Opinion for corresponding PCT/CN2019/117450 dated Jan. 23, 2020. |
| Xiao pupu, "Recognition of Irregular graphics click events is better," with English summary by google translation, May 21, 2016, 13 pgs. |
| Xiaoyao, Wuji, Irregular Region Responding to Client Event, Sep. 30, 2015, with English abstract from google translation, https://blog.csdn.net/yy471101598/article/details/48826807. |
| Xiaoyao-Wuji, "Irregular areas respond to click events," with Abstract, Sep. 30, 2015, 10 pgs. |
Also Published As
| Publication number | Publication date |
|---|---|
| CN111273971A (en) | 2020-06-12 |
| US20210181899A1 (en) | 2021-06-17 |
| WO2020114210A1 (en) | 2020-06-11 |
| CN111273971B (en) | 2022-07-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11531443B2 (en) | Method, apparatus, and storage medium for determining relative position relationship of click event | |
| US9912874B2 (en) | Real-time visual effects for a live camera view | |
| EP3910598B1 (en) | Graphic typesetting method and related device | |
| CN113645494B (en) | Screen fusion method, display device, terminal device and server | |
| US10878577B2 (en) | Method, system and apparatus for segmenting an image of a scene | |
| US20240394842A1 (en) | Data Generation Method and Apparatus, Device, and Storage Medium | |
| WO2022222510A1 (en) | Interaction control method, terminal device, and storage medium | |
| CN116778507A (en) | Table recognition method, device and computer-readable storage medium | |
| US12322010B2 (en) | Logo labeling method and device, update method and system of logo detection model, and storage medium | |
| CN114627494B (en) | A method and display device for locating a hand area | |
| CN117408870A (en) | Image processing method, device, electronic equipment and readable storage medium | |
| HK40025255B (en) | Method and apparatus for processing information in view, and storage medium | |
| HK40025255A (en) | Method and apparatus for processing information in view, and storage medium | |
| CN112541903A (en) | Page comparison method and device, electronic equipment and computer storage medium | |
| US12493357B2 (en) | Method for gesture manipulation on a display and host | |
| TWI883984B (en) | Method for gesture manipulation on a display and host | |
| US12563153B2 (en) | Data processing method and apparatus, and electronic device | |
| CN112508774B (en) | Image processing method and device | |
| CN118921446A (en) | Projection image generation method, apparatus and storage medium | |
| WO2023082423A1 (en) | Display method and apparatus, and device, system and readable storage medium | |
| CN116935431A (en) | Determination method of centroid of human hand region and display device | |
| CN121210011A (en) | Navigation bar processing methods, devices, electronic devices, and storage media | |
| TW202548686A (en) | Method for gesture manipulation on a display and host | |
| CN117931034A (en) | Display device and image generation method | |
| CN115113795A (en) | Virtual keyboard calibration method, device, electronic device and medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED, CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHEN, ZHI;REEL/FRAME:055392/0609 Effective date: 20210220 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |