US20150286282A1 - Electronic system
- Publication number
- US20150286282A1 (U.S. application Ser. No. 14/744,366)
- Authority
- US
- United States
- Prior art keywords
- region
- image
- picture
- electronic system
- noisy
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G06K9/00355—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Definitions
- The present invention relates to an electronic system.
- Image sensors generate images and can be applied for different purposes.
- For example, an image sensor can be applied in a monitoring system in which a suspect hidden among people can be identified through analyzing the images generated by the image sensor.
- An image sensor can also be used in a video game console. The image sensor of a video game console can be moved and the movement of the image sensor can be calculated by analyzing the patterns of the images generated by the image sensor.
- An image sensor usually comprises a plurality of sensing elements, which may convert light into voltages.
- An extra circuit is used to convert the voltages into digital data.
- Using an image sensor to track the movement of an object relative to the image sensor to generate operational commands in an electronic device is a common input method.
- The object may reflect light to the image sensor to form a bright image on a picture generated by the image sensor.
- The movement of the object or the coordinates of the object relative to the image sensor can be obtained by correlating pictures generated by the image sensor or by calculating the coordinates of the bright images in the pictures.
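As an illustration of the coordinate-calculation step described above, the sketch below locates a bright object image by thresholding pixel intensities and taking the centroid of the bright pixels. The function name and the threshold value are assumptions for illustration; the patent does not specify a particular algorithm.

```python
# Illustrative sketch (not from the patent): locate a bright object
# image by thresholding and taking the centroid of the bright pixels.
# The threshold value of 200 is an assumption.

def bright_image_centroid(picture, threshold=200):
    """Return the (x, y) centroid of pixels brighter than threshold,
    or None if no pixel qualifies."""
    xs, ys, count = 0, 0, 0
    for y, row in enumerate(picture):
        for x, value in enumerate(row):
            if value > threshold:
                xs += x
                ys += y
                count += 1
    if count == 0:
        return None
    return (xs / count, ys / count)
```

The movement of the object between two successive pictures could then be estimated as the difference between the two centroids.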
- When the image sensor is capturing images, it receives the light reflected from the object and simultaneously receives light from the environment.
- The light in the environment may be a source of noise in the pictures.
- High intensity environmental light may produce a bright image, which may be misinterpreted as a bright image produced by an object.
- Moreover, when the bright image produced by an object and the bright image produced by high-intensity environmental light overlap, the image sensor may not correctly identify the bright image produced by the object. As a result, either incorrect coordinate data may be determined or coordinate data cannot be determined.
- One embodiment of the present invention provides an electronic system, which comprises an image-sensing device and a processor.
- The image-sensing device comprises an image-sensing area that generates a picture comprising a noisy region.
- The processor is coupled with the image-sensing device and configured to select a tracking region from the portion of the picture outside the noisy region, wherein the tracking region corresponds to an operative region of the image-sensing area.
- Another embodiment of the present invention discloses an electronic system that comprises an image-sensing device and a processor.
- The image-sensing device generates a picture comprising a noisy region.
- The processor is coupled with the image-sensing device.
- The processor is configured to determine an edge of the noisy region, to determine a portion of the picture outside the noisy region by the edge of the noisy region, and to determine a plurality of tracking regions from the portion of the picture.
- The processor is further configured to select a tracking region with a maximum area from the plurality of tracking regions.
- Another embodiment of the present invention discloses an electronic system that comprises an image-sensing device and a processor coupled with the image-sensing device.
- The image-sensing device generates a picture comprising a noisy region.
- The processor is configured to determine coordinate data by the noisy region and to determine a tracking region using the coordinate data.
- FIG. 1 is a schematic view showing an electronic system according to one embodiment of the present invention.
- FIG. 2 is a block diagram of the electronic system according to one embodiment of the present invention.
- FIG. 3 is a schematic view showing an image-sensing device according to one embodiment of the present invention.
- FIG. 4 is a schematic view showing the mapping between an image-sensing area and a screen according to one embodiment of the present invention.
- FIG. 5 is a schematic view showing a circuit of the lighting device according to one embodiment of the present invention.
- FIGS. 6A to 6C demonstrate a plurality of successively generated pictures according to one embodiment of the present invention.
- FIG. 7 is a diagram showing pixel intensity levels along line 7 of FIG. 6B.
- FIG. 8 is a schematic view showing a picture including a noisy region according to one embodiment of the present invention.
- FIG. 9 is a schematic view showing a picture including a noisy region according to another embodiment of the present invention.
- FIG. 10 is a schematic view showing a picture including a plurality of noisy regions according to one embodiment of the present invention.
- FIG. 11 is a schematic view showing a picture including a plurality of noisy regions according to another embodiment of the present invention.
- FIG. 12 is a schematic view showing the mapping between a tracking region/operative region and a screen according to one embodiment of the present invention.
- FIG. 13A is a schematic view showing a picture taken when a light source is turned off according to one embodiment of the present invention.
- FIG. 13B is a schematic view showing a picture taken when a light source is turned on according to one embodiment of the present invention.
- FIG. 13C is a schematic view showing a subtraction picture obtained by the subtraction of the pictures of FIGS. 13A and 13B according to one embodiment of the present invention.
- FIG. 1 is a schematic view showing an electronic system 1 according to one embodiment of the present invention.
- The electronic system 1 comprises an image-sensing device 11, which can produce a picture.
- The image-sensing device 11 may be configured to acquire and then analyze a characteristic value of the picture.
- The image-sensing device 11 is configured to generate a picture, which includes an image created by an object 13.
- The electronic system 1 can execute a corresponding application according to the result of analyzing the image of the object 13 in the picture.
- In FIG. 1, the object 13 is depicted as a finger; however, the present invention is not limited to such an embodiment.
- The object 13 reflects light to form an object image in a picture.
- The electronic system 1 may comprise a screen 12, and the electronic system 1 is configured to control a cursor 121 on the screen 12 by the result of analyzing the images of the object 13 in the pictures. In some embodiments, the image of the object 13 is mapped to the cursor 121 on the screen 12.
- Although the electronic system 1 is depicted as a car's electronic system, the present invention is not limited to such an embodiment.
- The electronic system 1 can be applied in other vehicles or in non-vehicle applications.
- The electronic system 1 may be deployed in any portable product.
- The electronic system 1 may be disposed in any electronic product that is stationarily deployed in a location.
- The electronic system 1 may comprise a lighting device 14.
- The lighting device 14 is configured to allow the object 13 to form a bright image on the picture when the image-sensing device 11 is taking the picture.
- The image-sensing device 11 and the lighting device 14 can be integrally formed into a module.
- FIG. 2 is a block diagram of the electronic system 1 according to one embodiment of the present invention and FIG. 3 is a schematic view showing an image-sensing device 11 according to one embodiment of the present invention.
- The electronic system 1 may comprise a processor 21, which can be coupled with the image-sensing device 11.
- The processor 21 may be configured to analyze the picture generated by the image-sensing device 11.
- The processor 21 may be configured to correlate the object images of a plurality of pictures to calculate the displacement of the object 13. In some embodiments, the processor 21 is configured to calculate coordinate data relating to the object images of the pictures and to map the coordinate data onto the screen 12.
- The image-sensing device 11 may comprise a plurality of sensing elements 311, which are closely arranged to form an image-sensing area 31.
- Each sensing element 311 may comprise a CMOS sensing element, a CCD sensing element, or the like.
- The processor 21 may determine mapping data between the image-sensing area 31 and the screen 12 to establish the mapping relationship between the image-sensing area 31 and the screen 12, as shown in FIG. 4.
- The mapping data may comprise a ratio (W/w or H/h) of corresponding sides of the image-sensing area 31 and the screen 12.
- In some embodiments, the mapping between the image-sensing area 31 and the screen 12 is determined by a more complex method, in which the mapping data may comprise conversion coefficients.
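The simple ratio-based mapping (W/w, H/h) described above might be sketched as follows. The dimensions used in the example are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of the ratio-based mapping between the image-sensing
# area (w x h) and the screen (W x H): a sensor coordinate is scaled
# by W/w horizontally and H/h vertically.

def map_to_screen(sensor_x, sensor_y, sensor_size, screen_size):
    """Map a coordinate on the image-sensing area onto the screen."""
    w, h = sensor_size
    W, H = screen_size
    return (sensor_x * W / w, sensor_y * H / h)
```

For example, with a 32x24 sensing area and a 1920x1080 screen, the center of the sensing area maps to the center of the screen.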
- FIG. 5 is a schematic view showing a circuit of the lighting device 14 according to one embodiment of the present invention.
- The lighting device 14 may emit intermittent light.
- The lighting device 14 may comprise a light flashing circuit 51, which may be coupled with a light emitting element 52.
- A power supply Vcc is configured to provide the light flashing circuit 51 with electrical power.
- The light flashing circuit 51 is configured to provide the light emitting element 52 with intermittent electrical signals.
- The light emitting element 52 may comprise a light bulb, a light emitting diode or other suitable light sources.
- The light flash frequency of the lighting device 14 may be determined by the frame rate of the image-sensing device 11 such that, in a plurality of pictures, the image of the object 13 can be easily distinguished from the noisy region occurring in each picture. For example, as shown in FIGS. 6A to 6C, when the light flash frequency of the lighting device 14 is half of the frame rate of the image-sensing device 11, in a plurality of successive pictures 61 to 63, the object image 64 alternately occurs in the pictures 61 and 63.
- The image-sensing device 11 may be subject to interference and generate pictures 61 to 63, each comprising a noisy region 65.
- The interference may come from non-uniform environmental light or heat, in which the environmental light may be generated by the Sun or a light source.
- The interference may directly or indirectly cause the pictures 61 to 63 to include noisy regions 65.
- The electronic system 1 can distinguish the noisy region 65 from an object image.
- The light flash frequency of the lighting device 14 is adjusted such that the object image 64 is alternately formed in the series of pictures 61 to 63, and the electronic system 1 can distinguish the object image 64 from the noisy regions 65 by the frequencies at which they appear.
- The electronic system 1 can compare the pictures 61 to 63 and determine that the object image 64 is formed in the picture of FIG. 6A but not in the picture of FIG. 6B, and consequently determine that the object image 64 is created by an object. Alternatively, after the comparison, the electronic system 1 can determine that FIGS. 6A to 6C all have a noisy region 65 and, therefore, that the noisy region 65 is not created by an object.
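A minimal sketch of this comparison, assuming each candidate region has already been detected per picture and reduced to a presence flag: a region present in every successive picture is treated as noise, while one that appears only intermittently is treated as the object image (consistent with a flash frequency of half the frame rate). The helper name and the reduction to boolean flags are assumptions.

```python
# Hypothetical helper: classify a candidate region by how often it
# appears across successive pictures. Present everywhere -> noise;
# present intermittently -> object image (the lighting device flashes
# at half the frame rate, so the object image alternates).

def classify_region(appearances):
    """appearances: list of booleans, one per successive picture,
    indicating whether the region is present in that picture."""
    if all(appearances):
        return "noise"
    if any(appearances):
        return "object"
    return "absent"
```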
- The processor 21 can determine whether there is a noisy region 65 from a plurality of pictures generated by the image-sensing device 11 when the lighting device 14 is turned off.
- When the processor 21 determines that there are noisy regions 65 in a plurality of pictures generated by the image-sensing device 11 while the lighting device 14 is turned off, the processor 21 then proceeds to the step of selecting a tracking region.
- The brightness level of the noisy region 65 is higher than that of the background 66; however, the present invention is not limited to such an embodiment.
- The boundary pixel 651 of a noisy region 65 can be determined by several methods. In some embodiments, as shown in FIG. 7, a boundary pixel 651b of a noisy region 65b can be any pixel on a signal edge 71.
- The boundary pixel 651 of the noisy region 65 can also be a pixel that is not on the signal edge 71.
- In some embodiments, a boundary pixel 651a is selected and the noisy region 65a is determined, wherein the boundary pixel 651a can be a pixel located inside the signal edge 71 but adjacent to the signal edge 71.
- In other embodiments, a boundary pixel 651c is selected and the noisy region 65c is determined, wherein the boundary pixel 651c is outside the signal edge 71 but adjacent to the signal edge 71.
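Along a single scan line such as line 7 of FIG. 6B, pixels on a signal edge can be found where the intensity profile crosses a threshold. The one-dimensional sketch below assumes the noisy region is brighter than the background 66; the threshold value and function name are illustrative assumptions.

```python
# Sketch: find indices lying on a rising or falling signal edge of a
# 1-D intensity profile, i.e. where the intensity crosses a threshold
# between neighbouring pixels. Boundary pixels just inside (651a-style)
# or just outside (651c-style) the edge are then its neighbours.

def signal_edges(intensities, threshold):
    """Return indices where the profile crosses the threshold."""
    edges = []
    for i in range(1, len(intensities)):
        prev_hi = intensities[i - 1] >= threshold
        cur_hi = intensities[i] >= threshold
        if prev_hi != cur_hi:
            edges.append(i)
    return edges
```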
- The processor 21 may select a tracking region 81 from the background portion 66 of the picture 62 outside of the noisy region 65.
- The processor 21 uses the tracking region 81 to track the position or movement of an object. As a result, the tracking of an object will not be affected by the noisy region.
- The processor 21 may first identify the noisy region 65 and then determine coordinate data or a position 82, wherein the position 82 can be used to determine a tracking region 81.
- The processor 21 may first determine the boundary pixels 651 defining the noisy region 65.
- In some embodiments, the processor 21 determines the boundary pixel 651 having the maximum x coordinate.
- The processor 21 may select the boundary pixel 651 with a maximum x coordinate, or a pixel adjacent to the boundary pixel 651 with the maximum x coordinate, as the position 82 for determining the tracking region 81.
- In some embodiments, the boundary pixel with a minimum x coordinate or the pixel adjacent to the boundary pixel with the minimum x coordinate is selected.
- In some embodiments, the pixel with a minimum y coordinate or the pixel adjacent to the pixel with the minimum y coordinate is selected.
- When the noisy region 65 is on the lower side of the picture 62, the pixel with a maximum y coordinate or the pixel adjacent to the pixel with the maximum y coordinate is selected.
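The per-side rules above might be condensed into a small hypothetical helper: depending on which side of the picture the noisy region occupies, the extreme x or y coordinate among its boundary pixels (or a pixel adjacent to it) serves as the position 82. The association of each side with an extreme follows the lower-side example in the text and is otherwise an assumption.

```python
# Hypothetical helper for choosing position 82 from the boundary
# pixels of a noisy region, given which side of the picture the
# region occupies. Side-to-extreme mapping is an assumption.

def select_anchor(boundary_pixels, side):
    """boundary_pixels: iterable of (x, y) tuples."""
    xs = [p[0] for p in boundary_pixels]
    ys = [p[1] for p in boundary_pixels]
    if side == "left":    # assumed: free space lies to the right
        return max(xs)
    if side == "right":   # assumed: free space lies to the left
        return min(xs)
    if side == "upper":   # assumed: minimum y faces the free space
        return min(ys)
    return max(ys)        # noisy region on the lower side: maximum y
```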
- In some embodiments, a position adjacent to a corner of the picture and to the noisy region is determined for defining a tracking region.
- A maximum x coordinate is determined from the coordinate data of the boundary pixels 921 of the noisy region 92 of the picture 91.
- A number "a" is determined by the maximum x coordinate, wherein the number "a" can be the maximum x coordinate or a number close to the maximum x coordinate.
- A minimum y coordinate is determined from the coordinate data of the boundary pixels 921 of the noisy region 92.
- A number "b" is determined by the minimum y coordinate, wherein the number "b" can be the minimum y coordinate or a number close to the minimum y coordinate. Consequently, the processor 21 can determine a pixel position P(a, b) that can be used to determine a tracking region 93.
- The tracking region 93 can be obtained by another method. First, a first maximum region (i.e., the candidate tracking region 95 of FIG. 9) is determined, wherein the first maximum region comprises a plurality of successive rows of pixels that do not include any pixel of the noisy region. Next, a second maximum region (i.e., the candidate tracking region 94 of FIG. 9) is determined, wherein the second maximum region comprises a plurality of successive columns of pixels that do not include any pixel of the noisy region. Finally, the intersection region of the first and second maximum regions is determined. In the present embodiment, the intersection region determined by the above-mentioned steps can be considered as the tracking region 93.
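The row/column intersection method just described might be sketched as follows: find the longest run of rows free of noisy pixels, the longest run of columns free of noisy pixels, and intersect them. The coordinate conventions and the longest-run selection rule are assumptions consistent with the "maximum region" wording.

```python
# Sketch of the intersection method: the first maximum region is the
# longest run of rows with no noisy pixel, the second is the longest
# run of columns with no noisy pixel; their intersection is taken as
# the tracking region. Returned as (row_start, row_end, col_start,
# col_end) with exclusive ends.

def tracking_region_by_intersection(noisy_pixels, width, height):
    noisy_rows = {y for x, y in noisy_pixels}
    noisy_cols = {x for x, y in noisy_pixels}

    def longest_clear_run(noisy, size):
        best = (0, 0)
        start = 0
        for i in range(size + 1):
            if i == size or i in noisy:
                if i - start > best[1] - best[0]:
                    best = (start, i)
                start = i + 1
        return best

    row_run = longest_clear_run(noisy_rows, height)
    col_run = longest_clear_run(noisy_cols, width)
    return (row_run[0], row_run[1], col_run[0], col_run[1])
```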
- The tracking region can be the region with a maximum area selected from a plurality of candidate rectangular regions.
- In some embodiments, the processor 21 determines a candidate tracking region 94 using the number "a" and another candidate tracking region 95 using the number "b". Next, the processor 21 determines the region with the maximum area from the candidate tracking region 94 and the candidate tracking region 95.
- The plurality of candidate rectangular regions can be determined by another method. Referring to FIG. 10, in some embodiments, a plurality of boundary pixels 922 of a noisy region 92 and a plurality of boundary pixels 101 of a noisy region 100 can be determined. Next, a pixel is selected from the boundary pixels 922 as one of the opposite corner points of a candidate tracking region 102, a pixel is selected from the boundary pixels 101 as the other opposite corner point of the candidate tracking region 102, and the area of the candidate tracking region 102 is calculated. The pixel selection and the area calculation continue in this manner until all possible combinations are exhausted. Finally, the processor 21 determines the rectangular tracking region with the maximum area and records the coordinate data of the corresponding boundary pixels 922 and 101.
- The tracking region can be the rectangular tracking region with a maximum area that can be obtained from the portion of a picture outside the noisy regions.
- In some embodiments, the processor 21 selects a pixel from the boundary pixels of the noisy region 92 and a pixel from the boundary pixels of the noisy region 100 as opposite corner points of a candidate rectangular tracking region, and calculates the area of the candidate rectangular tracking region. This step is repeated with other pairs of boundary pixels of the noisy regions 92 and 100 until all possible combinations are exhausted. Finally, the processor 21 selects the rectangular tracking region with the maximum area and records the coordinate data of the corresponding boundary pixels 922 and 101.
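The exhaustive corner-pair search can be sketched as below. For simplicity, the candidate's area is taken as the bounding box spanned by the two chosen pixels; the check that the resulting rectangle actually avoids both noisy regions is omitted here and would be needed in practice.

```python
# Sketch of the exhaustive search over corner pairs: one pixel from
# each noisy region's boundary serves as an opposite corner of a
# candidate rectangle, and the pair spanning the maximum area is kept.

def max_area_candidate(boundary_a, boundary_b):
    """boundary_a, boundary_b: lists of (x, y) boundary pixels of the
    two noisy regions. Returns (best_area, (corner_a, corner_b))."""
    best_area, best_pair = -1, None
    for (ax, ay) in boundary_a:
        for (bx, by) in boundary_b:
            area = abs(ax - bx) * abs(ay - by)
            if area > best_area:
                best_area, best_pair = area, ((ax, ay), (bx, by))
    return best_area, best_pair
```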
- Pixels 111 and 112 of the edges of the picture that are adjacent to the noisy region 92 are selected, wherein the pixels 111 and 112 can be boundary pixels of the noisy region 92 or pixels adjacent to the boundary of the noisy region 92.
- Pixels 113 and 114 of the edges of the picture that are adjacent to the noisy region 100 are selected, wherein the pixels 113 and 114 can be boundary pixels of the noisy region 100 or pixels adjacent to the boundary of the noisy region 100.
- The picture can be divided into a plurality of rectangular sections 1101 to 1106 using the pixels 111 and 112 and the pixels 113 and 114.
- The rectangular sections 1101 to 1106 can be used to form a plurality of candidate rectangular tracking regions.
- In the present embodiment, the rectangular tracking region with the maximum area is chosen; however, the final choice can be adjusted according to the user's design and requirements.
- The processor 21, when it obtains a tracking region 200, determines an operative region 201 of the image-sensing device 11 corresponding to the tracking region 200.
- The processor 21 can also calculate mapping data between the operative region 201 (or the tracking region 200) and the screen 12 to obtain a new mapping relationship.
- The electronic system 1 can use the operative region 201 to track the position or movement of an object, and the tracking result can be used to perform full-screen operations. Therefore, even when a local area 202 does not function well, the situation in which a cursor cannot be properly operated in a local area of the screen will not occur.
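Remapping the tracking/operative region onto the full screen, so that full-screen cursor operation is preserved, can be sketched as a normalize-and-scale step. The (left, top, width, height) layout of the region and the example dimensions are assumed conventions.

```python
# Sketch: map a position inside the tracking/operative region onto the
# full screen, so full-screen operation remains possible even when
# only part of the sensing area is used.

def region_to_screen(x, y, region, screen_size):
    """region: (left, top, width, height) of the tracking region."""
    left, top, rw, rh = region
    W, H = screen_size
    return ((x - left) * W / rw, (y - top) * H / rh)
```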
- In some embodiments, the processor 21 may decide not to use an operative region or a tracking region.
- The processor 21 may compare a characteristic value of a tracking region with a threshold. When the characteristic value is less than the threshold, the processor 21 may stop outputting coordinate data of the object or moving distances of the object, because the electronic system 1 would then be too sensitive to the movement of the object.
- The characteristic value may comprise a dimension of the tracking region, wherein the dimension is a length, a diagonal or the like. In some embodiments, the characteristic value may comprise the area of the tracking region.
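The characteristic-value check might look like the following, using the region's area as the characteristic value; the threshold of 100 pixels and the function name are illustrative assumptions.

```python
# Sketch: decide whether a tracking region is usable by comparing a
# characteristic value (here, its area) against a threshold. If the
# region is too small, coordinate output would be too sensitive and
# is suppressed.

def tracking_region_usable(region_width, region_height, min_area=100):
    """Return True if the region's area meets the (assumed) threshold."""
    return region_width * region_height >= min_area
```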
- The operation mode of the electronic system 1 may be changed.
- The electronic system 1 may use the bright image of an object to calculate the coordinate or moving distance data of the object.
- For example, the operation mode may be changed when the processor 21 determines that a characteristic value of a tracking region is less than a threshold.
- The image-sensing device 11 generates a first picture (P1), wherein the first picture (P1) is taken when the lighting device 14 is turned off so that the first picture (P1) comprises a dark image of the object.
- The image-sensing device 11 generates a second picture (P2), wherein the second picture (P2) is taken when the lighting device 14 is turned on so that the second picture (P2) comprises a bright image of the object.
- The processor 21 then subtracts the second picture (P2) from the first picture (P1) to obtain a subtraction picture (P1−P2), which includes a dark image of the object.
- The processor 21 determines the coordinate or moving distance data of the object according to the dark image of the object in the subtraction picture.
- A dark image 133″ (FIG. 13C) of the hand portion can be obtained.
- The dark image 133″ (FIG. 13C) has grey levels lower than those of its surrounding area.
- Alternatively, the first picture can be subtracted from the second picture, and a bright image of the hand portion can be obtained, wherein the bright image has grey levels higher than those of its surrounding area.
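The P1 − P2 subtraction can be sketched pixel-wise. Ambient light common to both pictures largely cancels, while the object area, which is brighter in P2 (lighting device on), comes out with lower values, i.e. as a dark image. Keeping signed values rather than clamping at zero is an implementation assumption.

```python
# Sketch of the subtraction picture P1 - P2: P1 is taken with the
# lighting device off, P2 with it on. Ambient light cancels; the area
# lit by the lighting device in P2 yields lower (negative) values in
# the result, i.e. a dark image of the object.

def subtract_pictures(p1, p2):
    """Pixel-wise P1 - P2 over two equal-sized 2-D grey-level grids."""
    return [[a - b for a, b in zip(r1, r2)]
            for r1, r2 in zip(p1, p2)]
```

Subtracting in the other order (P2 − P1) would instead yield a bright image of the object, as noted above.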
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
An electronic system includes an image-sensing device and a processor coupled with the image-sensing device. The image-sensing device includes an image-sensing area configured to generate a picture. The picture includes a noisy region. The processor is configured to select a tracking region from the portion of the picture outside of the noisy region. The tracking region corresponds to an operative region of the image-sensing area.
Description
- This application is a continuation application of U.S. patent application Ser. No. 13/926,724, filed Jun. 25, 2013.
- The present application is based on, and claims priority from, Taiwan Patent Application Serial Number 101133606, filed on Sep. 14, 2012, the disclosure of which is hereby incorporated by reference herein in its entirety.
- The following description is presented to enable any person skilled in the art to make and use the disclosed embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the disclosed embodiments. Thus, the disclosed embodiments are not limited to the embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein.
-
FIG. 1 is a schematic view showing an electronic system 1 according to one embodiment of the present invention. As shown in FIG. 1, the electronic system 1 comprises an image-sensing device 11, which can produce a picture. The image-sensing device 11 may be configured to acquire and then analyze a characteristic value of the picture. - In some embodiments, the image-sensing
device 11 is configured to generate a picture, which includes an image created by an object 13. The electronic system 1 can execute a corresponding application according to the result of analyzing the image of the object 13 in the picture. - In
FIG. 1, the object 13 is depicted as a finger; however, the present invention is not limited to such embodiment. - In some embodiments, the
object 13 reflects light to form an object image in a picture. - In some embodiments, the
electronic system 1 may comprise a screen 12, and the electronic system 1 is configured to control a cursor 121 on the screen 12 by the result of analyzing the images of the object 13 in the pictures. In some embodiments, the image of the object 13 is mapped to the cursor 121 on the screen 12. - In the embodiment of
FIG. 1, although the electronic system 1 is depicted as a car's electronic system, the present invention is not limited to such embodiment. The electronic system 1 can be applied in other vehicles or non-vehicles. The electronic system 1 may be deployed in any portable product. The electronic system 1 may be disposed in any electronic product that is stationarily deployed in any location. - In some embodiments, the
electronic system 1 may comprise a lighting device 14. The lighting device 14 is configured to allow the object 13 to form a bright image on the picture when the image-sensing device 11 is taking a picture. In some embodiments, the image-sensing device 11 and the lighting device 14 can be integrally formed into a module. -
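As a rough illustration of why such a lighting device is useful when it emits intermittent light (an idea the specification develops with FIGS. 6A to 6C): a real object reflects light only in the illuminated frames, while a region caused by steady interference persists in every frame. The following is a minimal sketch under that assumption; the per-frame presence flags are invented inputs, not part of the patent.

```python
# Sketch: if the lighting device flashes on alternate frames, a real
# object's reflection appears only in the illuminated frames, while a
# noisy region caused by steady interference appears in every frame.
# The boolean presence flags (one per successive picture) are assumed input.

def classify_region(presence):
    """presence: booleans, one per successive picture (lit, unlit, lit, ...)."""
    if all(presence):
        return "noise"    # present in every picture
    if all(presence[0::2]) and not any(presence[1::2]):
        return "object"   # present only in the illuminated pictures
    return "unknown"

kind_a = classify_region([True, False, True])   # alternates with the flash
kind_b = classify_region([True, True, True])    # persists in every frame
```

A region that alternates with the flash is classified as an object, while a persistent one is treated as noise.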
FIG. 2 is a block diagram of the electronic system 1 according to one embodiment of the present invention, and FIG. 3 is a schematic view showing an image-sensing device 11 according to one embodiment of the present invention. Referring to FIGS. 1-3, the electronic system 1 may comprise a processor 21, which can be coupled with the image-sensing device 11. The processor 21 may be configured to analyze the picture generated by the image-sensing device 11. - In some embodiments, the
processor 21 may be configured to correlate the object images of a plurality of pictures to calculate the displacement of the object 13. In some embodiments, the processor 21 is configured to calculate coordinate data relating to the object images of pictures and map the coordinate data onto the screen 12. - In some embodiments, as shown in
FIG. 3, the image-sensing device 11 may comprise a plurality of sensing elements 311, which are closely arranged to form an image-sensing area 31. In some embodiments, the sensing element 311 comprises a CMOS sensing element, a CCD sensing element or the like. - The
processor 21 may determine mapping data between the image-sensing area 31 and the screen 12 to establish the mapping relationship between the image-sensing area 31 and the screen 12 as shown in FIG. 4. In some embodiments, the mapping data may comprise a ratio (W/w or H/h) of corresponding sides of the image-sensing area 31 and the screen 12. In some embodiments, the mapping between the image-sensing area 31 and the screen 12 is determined by a more complex method, in which the mapping data may comprise conversion coefficients. -
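The proportional mapping described above can be sketched as follows. The sensor and screen resolutions used here are illustrative assumptions, not values taken from the patent.

```python
# Sketch of mapping a point in the image-sensing area (w x h) onto the
# screen (W x H) using the side ratios W/w and H/h mentioned above.
# All resolution values below are assumed example numbers.

def map_to_screen(x, y, sensor_size, screen_size):
    w, h = sensor_size
    W, H = screen_size
    return x * (W / w), y * (H / h)

# A point at the center of a 128x96 sensing area lands at the center
# of a 1280x720 screen.
sx, sy = map_to_screen(64, 48, (128, 96), (1280, 720))
```

The same two ratios suffice for the simple case; the "more complex method" with conversion coefficients would replace the two independent scalings with a general coordinate transform.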
FIG. 5 is a schematic view showing a circuit of the lighting device 14 according to one embodiment of the present invention. Referring to FIGS. 1 and 5, the lighting device 14 may emit intermittent light. The lighting device 14 may comprise a light flashing circuit 51, which may be coupled with a light emitting element 52. A power supply Vcc is configured to provide the light flashing circuit 51 with electrical power. The light flashing circuit 51 is configured to provide the light emitting element 52 with intermittent electrical signals. The light emitting element 52 may comprise a light bulb, a light emitting diode or other suitable light sources. - The light flash frequency of the
lighting device 14 may be determined by the frame rate of the image-sensing device 11 such that, in a plurality of pictures, the image of the object 13 can be easily distinguished from the noisy region occurring in each picture. For example, as shown in FIGS. 6A to 6C, when the light flash frequency of the lighting device 14 is half of the frame rate of the image-sensing device 11, the object image 64 alternately occurs in the successive pictures 61 to 63. - Referring to
FIGS. 6A to 6C, under some circumstances, the image-sensing device 11 is subject to interference and generates pictures 61 to 63, each comprising a noisy region 65. The interference may come from non-uniform environmental light or heat, in which the environmental light may be generated by the sun or a light source. The interference may directly or indirectly cause the pictures 61 to 63 to include noisy regions 65. - The
electronic system 1 can distinguish the noisy region 65 from an object image. In some embodiments, when the noisy regions 65 continue occurring in a series of pictures 61 to 63, the light flash frequency of the lighting device 14 is adjusted such that the object image 64 is alternately formed in the series of pictures 61 to 63, and the electronic system 1 can distinguish the object image 64 from the noisy regions 65 by their frequencies of appearance. - In addition to the above method, other methods may be applicable. Referring to
FIGS. 6A to 6C, the electronic system 1 can compare the pictures 61 to 63 and determine that the object image 64 is formed in the picture of FIG. 6A but not in the picture of FIG. 6B, and consequently determine that the object image 64 is created by an object. Alternatively, after comparison, the electronic system 1 can determine that FIGS. 6A to 6C all have a noisy region 65 and, therefore, determine that the noisy region 65 is not created by an object. - In some embodiments, when the
lighting device 14 is turned off, an object will not form an image on a picture generated by the image-sensing device 11. Accordingly, the processor 21 can determine whether there is a noisy region 65 from a plurality of pictures generated by the image-sensing device 11 while the lighting device 14 is turned off. - In some embodiments, when the
processor 21 determines that there are noisy regions 65 in a plurality of pictures generated by the image-sensing device 11 while the lighting device 14 is turned off, the processor 21 then proceeds to the step of selecting a tracking region. - As shown in
FIGS. 6B and 7, in the present embodiment, the brightness level of the noisy region 65 is higher than that of the background 66; however, the present invention is not limited to such embodiment. The boundary pixel 651 of a noisy region 65 can be determined by several methods. In some embodiments, as shown in FIG. 7, a boundary pixel 651b of a noisy region 65b can be any pixel on a signal edge 71. - Furthermore, the
boundary pixel 651 of the noisy region 65 can be a pixel that is not on the signal edge 71. In some embodiments, a boundary pixel 651a is selected and the noisy region 65a is determined, wherein the boundary pixel 651a can be a pixel located just inside the signal edge 71 and adjacent to it. In some embodiments, a boundary pixel 651c is selected and the noisy region 65c is determined, wherein the boundary pixel 651c is just outside the signal edge 71 but adjacent to it. - Referring to
FIGS. 3 and 8, when the processor 21 determines that there is a noisy region 65 in the picture 62, the processor 21 may select a tracking region 81 from the background portion 66 of the picture 62 outside the noisy region 65. The processor 21 then uses the tracking region 81 to track the position or movement of an object. As a result, the tracking of an object will not be affected by the noisy region. - The
processor 21 may first identify the noisy region 65 and then determine coordinate data or a position 82, wherein the position 82 can be used to determine a tracking region 81. In some embodiments, if the lower-left corner of the picture 62 is considered the origin O and the noisy region 65 is on the left side of the picture 62, as shown in FIG. 8, the processor 21 may first determine the boundary pixels 651 defining the noisy region 65. Next, the processor 21 determines the boundary pixel 651 having the maximum x coordinate. Thereafter, the processor 21 may select the boundary pixel 651 with the maximum x coordinate, or a pixel adjacent to it, as the position 82 for determining the tracking region 81. In some embodiments, if the noisy region 65 is on the right side of the picture 62, the boundary pixel with the minimum x coordinate, or the pixel adjacent to it, is selected. Similarly, if the noisy region 65 is on the upper side of the picture 62, the pixel with the minimum y coordinate, or the pixel adjacent to it, is selected. If the noisy region 65 is on the lower side of the picture 62, the pixel with the maximum y coordinate, or the pixel adjacent to it, is selected. - When the noisy region is on a corner of a picture, a position adjacent to the corner and the noisy region is determined for defining a tracking region. In some embodiments, as shown in
FIG. 9, a maximum x coordinate is determined from the coordinate data of the boundary pixels 921 of the noisy region 92 of the picture 91, and a number "a" is determined from the maximum x coordinate, wherein the number "a" can be the maximum x coordinate or a number close to it. Next, a minimum y coordinate is determined from the coordinate data of the boundary pixels 921 of the noisy region 92, and a number "b" is determined from the minimum y coordinate, wherein the number "b" can be the minimum y coordinate or a number close to it. Consequently, the processor 21 can determine a pixel position P(a, b) that can be used to determine a tracking region 93. - In addition, the tracking
region 93 can be obtained by another method. First, a first maximum region (i.e., the candidate tracking region 95 of FIG. 9) is determined, wherein the first maximum region comprises a plurality of successive rows of pixels that do not include any pixel of the noisy region. Next, a second maximum region (i.e., the candidate tracking region 94 of FIG. 9) is determined, wherein the second maximum region comprises a plurality of successive columns of pixels that do not include any pixel of the noisy region. Finally, the intersection of the first and second maximum regions is determined. In the present embodiment, the intersection region determined by the above-mentioned steps can be considered the tracking region 93. - The tracking region can be the region with a maximum area selected from a plurality of candidate rectangular regions. Referring to
FIG. 9, in some embodiments, the processor 21 determines a candidate tracking region 94 using the number "a" and another candidate tracking region 95 using the number "b". Next, the processor 21 selects the region with the maximum area from the candidate tracking region 94 and the candidate tracking region 95. - The plurality of candidate rectangular regions can be determined by another method. Referring to
FIG. 10, in some embodiments, a plurality of boundary pixels 922 of a noisy region 92 and a plurality of boundary pixels 101 of a noisy region 100 can be determined. Next, a pixel is selected from the boundary pixels 922 as one of the opposite corner points of a candidate tracking region 102, a pixel is selected from the boundary pixels 101 as the other opposite corner point of the candidate tracking region 102, and the area of the candidate tracking region 102 is calculated. The pixel selection and the area calculation continue in this manner until all possible combinations are exhausted. Finally, the processor 21 determines the rectangular tracking region with the maximum area and records the coordinate data of the corresponding boundary pixels. - The tracking region can be the rectangular tracking region with a maximum area that can be obtained from the portion of a picture outside the noisy region. Referring to
FIG. 10, in some embodiments, the processor 21 selects a pixel from the boundary pixels of the noisy region 92 and a pixel from the boundary pixels of the noisy region 100 as opposite corner points of a candidate rectangular tracking region, and calculates the area of the candidate rectangular tracking region. This step is repeated for every other combination of two boundary pixels of the noisy regions 92 and 100 until all combinations are exhausted. Finally, the processor 21 selects the rectangular tracking region with the maximum area and records the coordinate data of the corresponding boundary pixels. - When noisy regions are on the two corners of a picture, the following method can be applied to obtain a tracking region. Referring to
FIG. 11, pixels relating to the noisy region 92 are selected, wherein these pixels can be pixels on the boundary of the noisy region 92 or pixels adjacent to that boundary. Pixels relating to the noisy region 100 are likewise selected, wherein these pixels can be pixels on the boundary of the noisy region 100 or pixels adjacent to that boundary. The picture can be divided into a plurality of rectangular sections 1101 to 1106 using the selected pixels, and the rectangular sections 1101 to 1106 can be used to form a plurality of candidate rectangular tracking regions. Usually, the rectangular tracking region with the maximum area is chosen; however, the final choice can be adjusted according to the user's design and requirements. - Referring to
FIG. 12, when a local area 202 of an image-sensing device 11 is adversely affected and generates large noise, object images in the region of a picture corresponding to the local area 202 cannot be properly determined and distinguished. As a result, a cursor that moves into the area of the screen mapped to the local area 202 cannot be normally manipulated. To solve this issue, in one embodiment, the processor 21, after it obtains a tracking region 200, determines an operative region 201 of the image-sensing device 11 corresponding to the tracking region 200. The processor 21 can also calculate mapping data between the operative region 201 (or the tracking region 200) and the screen 12 to obtain a new mapping relationship. Accordingly, the electronic system 1 can use the operative region 201 to track the position or movement of an object, and the tracking result can be used to perform full-screen operations. Therefore, even when a local area 202 does not function well, the cursor can still be properly operated over the entire screen. - Under some circumstances, the
processor 21 may decide not to use an operative region or a tracking region. In some embodiments, the processor 21 may compare a characteristic value of a tracking region with a threshold. When the characteristic value is less than the threshold, the processor 21 may stop outputting coordinate data or moving distances of the object, because the electronic system 1 would otherwise be too sensitive to the movement of an object. - In some embodiments, the characteristic value may comprise a dimension of the tracking region, wherein the dimension is a length, a diagonal or the like. In some embodiments, the characteristic value may comprise the area of the tracking region.
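The gating step described above can be sketched as follows. The use of area as the characteristic value follows the text; the threshold number itself is an illustrative assumption.

```python
# Sketch of the characteristic-value check: when the tracking region is
# too small (here measured by area), coordinate output is suppressed so
# that the system does not become overly sensitive to small movements.
# The threshold value is an assumed number for illustration.

def should_output(tracking_region, area_threshold=1000):
    """tracking_region: (x0, y0, width, height) in pixels."""
    _, _, w, h = tracking_region
    return w * h >= area_threshold

large_ok = should_output((0, 0, 50, 40))   # area 2000: output allowed
tiny_ok = should_output((0, 0, 10, 20))    # area 200: output suppressed
```

A length or diagonal of the region could be compared against a threshold in the same way, per the alternatives listed above.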
- Referring to
FIG. 1, under some circumstances, when the operative region or the tracking region is too small, the operation mode of the electronic system 1 may be changed. In some embodiments, the electronic system 1 may use the bright image of an object to calculate the coordinate or moving distance data of the object. When the processor 21 determines that a characteristic value of a tracking region is less than a threshold, the image-sensing device 11 generates a first picture (P1), wherein the first picture (P1) is taken when the lighting device 14 is turned off so that the first picture (P1) comprises a dark image of the object. Moreover, the image-sensing device 11 generates a second picture (P2), wherein the second picture (P2) is taken when the lighting device 14 is turned on so that the second picture (P2) comprises a bright image of the object. The processor 21 then subtracts the second picture (P2) from the first picture (P1) to obtain a subtraction picture (P1-P2), which includes a dark image of the object. The processor 21 then determines the coordinate or moving distance data of the object according to the dark image of the object in the subtraction picture. - For example, as shown in
FIGS. 1, 5, and 13A to 13C, when the operative region or the tracking region 131 is too small, it indicates that the noisy region 132 is large. Under such a situation, when the first picture (P1) is taken without the light emitting element 52 being turned on, the hand portion of a user may block a portion of the light that creates the noisy region 132 and thus forms a dark image 133 (FIG. 13A). Alternatively, when the second picture (P2) is generated with the light emitting element 52 turned on, the hand portion of the user may form a bright image 133′ (FIG. 13B) on the second picture (P2). When the second picture (P2) is subtracted from the first picture (P1-P2), a dark image 133″ (FIG. 13C) of the hand portion can be obtained. The above embodiment is explained using a hand portion; however, the present invention is not limited to using a hand portion. Specifically, the dark image 133″ (FIG. 13C) has grey levels lower than those of its surrounding area. In another embodiment, the first picture can be subtracted from the second picture, and a bright image of the hand portion can be obtained, wherein the bright image has grey levels higher than those of its surrounding area. - The above embodiments are demonstrated using rectangular tracking regions; however, the present invention is not limited to using rectangular tracking regions, and tracking regions with other shapes are applicable as well.
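The subtraction step can be sketched with tiny one-dimensional "pictures"; all intensity values below are invented for illustration.

```python
# Sketch of the P1 - P2 subtraction: P1 is taken with the lighting off
# (the hand blocks part of the interfering light, so it is dark), and P2
# with the lighting on (the hand reflects light and is bright).
# Subtracting P2 from P1 leaves a strongly negative region where the
# hand is, i.e. a dark image. All pixel values are assumed examples.

def subtract(p1, p2):
    return [a - b for a, b in zip(p1, p2)]

p1 = [200, 200, 50, 200]    # lighting off: dark hand at index 2
p2 = [200, 200, 220, 200]   # lighting on: bright hand at index 2
diff = subtract(p1, p2)
hand_index = diff.index(min(diff))   # the dark image marks the hand
```

Reversing the operands (P2 minus P1), as in the last-mentioned embodiment, would instead yield a bright image at the same location.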
- It will be apparent to those skilled in the art that various modifications can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with the true scope of the disclosure being indicated by the following claims and their equivalents.
Claims (10)
1. An electronic system comprising:
an image-sensing device comprising an image-sensing area that generates a picture comprising a noisy region; and
a processor coupled with the image-sensing device, configured to select a tracking region from the portion of the picture outside the noisy region, wherein the tracking region corresponds to an operative region of the image-sensing area.
2. The electronic system of claim 1 , comprising a screen, wherein the processor is configured to determine mapping data between the screen and the operative region.
3. The electronic system of claim 1 , wherein an edge of the tracking region is adjacent to or passes through a boundary pixel of the noisy region.
4. The electronic system of claim 1 , wherein the processor is configured to determine coordinate data of boundary pixels of the noisy region and to determine the tracking region using the coordinate data of the boundary pixels.
5. The electronic system of claim 1 , wherein the processor is configured to use a characteristic value of the tracking region to determine whether to output coordinate data.
6. The electronic system of claim 1 , further comprising a lighting device, wherein the processor is configured to subtract a second picture from a first picture when a characteristic value of the tracking region is less than a threshold, wherein the first picture comprises a dark image of an object taken when the lighting device is turned off, and the second picture comprises a bright image of the object taken when the object is illuminated by the lighting device.
7. The electronic system of claim 1 , further comprising a lighting device configured to emit intermittent light, which allows an object to alternately form bright images in a plurality of successive pictures.
8. The electronic system of claim 7 , wherein the noisy region occurs in each picture of the plurality of successive pictures.
9. An electronic system comprising:
an image-sensing device generating a picture comprising a noisy region; and
a processor coupled with the image-sensing device, wherein the processor is configured to determine coordinate data by the noisy region and to determine a tracking region using the coordinate data.
10. The electronic system of claim 9 , further comprising a screen, wherein the processor is configured to determine mapping data between the screen and the tracking region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/744,366 US20150286282A1 (en) | 2012-09-14 | 2015-06-19 | Electronic system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW101133606 | 2012-09-14 | ||
TW101133606A TWI510087B (en) | 2012-09-14 | 2012-09-14 | Electronic system |
US13/926,724 US9092063B2 (en) | 2012-09-14 | 2013-06-25 | Electronic system |
US14/744,366 US20150286282A1 (en) | 2012-09-14 | 2015-06-19 | Electronic system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/926,724 Continuation US9092063B2 (en) | 2012-09-14 | 2013-06-25 | Electronic system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150286282A1 true US20150286282A1 (en) | 2015-10-08 |
Family
ID=50274511
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/926,724 Active 2033-10-17 US9092063B2 (en) | 2012-09-14 | 2013-06-25 | Electronic system |
US14/744,366 Abandoned US20150286282A1 (en) | 2012-09-14 | 2015-06-19 | Electronic system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/926,724 Active 2033-10-17 US9092063B2 (en) | 2012-09-14 | 2013-06-25 | Electronic system |
Country Status (2)
Country | Link |
---|---|
US (2) | US9092063B2 (en) |
TW (1) | TWI510087B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9927915B2 (en) * | 2014-09-26 | 2018-03-27 | Cypress Semiconductor Corporation | Optical navigation systems and methods for background light detection and avoiding false detection and auto-movement |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6128003A (en) * | 1996-12-20 | 2000-10-03 | Hitachi, Ltd. | Hand gesture recognition system and method |
US20060050927A1 (en) * | 2002-01-16 | 2006-03-09 | Marcus Klomark | Camera arrangement |
US20090091710A1 (en) * | 2007-10-05 | 2009-04-09 | Huebner Kenneth J | Interactive projector system and method |
US20130063336A1 (en) * | 2011-09-08 | 2013-03-14 | Honda Motor Co., Ltd. | Vehicle user interface system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI471521B (en) * | 2010-07-23 | 2015-02-01 | Pixart Imaging Inc | Displacement estimation method and displacement estimation device using the same |
TWI420906B (en) * | 2010-10-13 | 2013-12-21 | Ind Tech Res Inst | Tracking system and method for regions of interest and computer program product thereof |
TWI484823B (en) * | 2011-02-18 | 2015-05-11 | Pixart Imaging Inc | Image system and denoising method thereof |
Also Published As
Publication number | Publication date |
---|---|
TWI510087B (en) | 2015-11-21 |
US9092063B2 (en) | 2015-07-28 |
US20140079284A1 (en) | 2014-03-20 |
TW201412107A (en) | 2014-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10255682B2 (en) | Image detection system using differences in illumination conditions | |
KR101465835B1 (en) | Correcting for ambient light in an optical touch-sensitive device | |
JP6240609B2 (en) | Vision-based interactive projection system | |
TWI450154B (en) | Optical touch system and object detection method therefor | |
US10354413B2 (en) | Detection system and picture filtering method thereof | |
RU2456659C2 (en) | Image capturing device, image display and capturing device and electronic device | |
WO2020059565A1 (en) | Depth acquisition device, depth acquisition method and program | |
US9958961B2 (en) | Optical pointing system | |
US20110164191A1 (en) | Interactive Projection Method, Apparatus and System | |
US20120274606A1 (en) | Optical navigation system with object detection | |
US20110148822A1 (en) | Three-Dimensional Space Touch Apparatus Using Multiple Infrared Cameras | |
TWI441060B (en) | Image processing method for optical touch system | |
US20110050644A1 (en) | Touch system and pointer coordinate detection method therefor | |
US10803625B2 (en) | Detection system and picturing filtering method thereof | |
US20140306934A1 (en) | Optical touch panel system, optical apparatus and positioning method thereof | |
US10748019B2 (en) | Image processing method and electronic apparatus for foreground image extraction | |
US9092063B2 (en) | Electronic system | |
US20140085264A1 (en) | Optical touch panel system, optical sensing module, and operation method thereof | |
RU2602829C2 (en) | Assessment of control criteria from remote control device with camera | |
US10379677B2 (en) | Optical touch device and operation method thereof | |
US20220091265A1 (en) | Mobile robot generating resized region of interest in image frame and using dual-bandpass filter | |
KR100917615B1 (en) | Method and apparatus for detecting location of laser beam with minimized error using mono-camera | |
JP6390163B2 (en) | Information processing apparatus, information processing method, and program | |
US9389731B2 (en) | Optical touch system having an image sensing module for generating a two-dimensional image and converting to a one-dimensional feature | |
US9234756B2 (en) | Object tracking device capable of removing background noise and operating method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PIXART IMAGING INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAO, MING TSAN;YANG, SHU SIAN;CHENG, HAN PING;REEL/FRAME:035867/0078 Effective date: 20130621 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |