JP5117613B1 - Video processing apparatus, video processing method, and storage medium - Google Patents

Video processing apparatus, video processing method, and storage medium Download PDF

Info

Publication number
JP5117613B1
JP5117613B1 (granted from application JP2011270040A)
Authority
JP
Japan
Prior art keywords
mark
viewing zone
viewer
parallax
test pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2011270040A
Other languages
Japanese (ja)
Other versions
JP2013123094A (en)
Inventor
入江 豊
諸星 利弘
Original Assignee
株式会社東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東芝
Priority to JP2011270040A
Application granted
Publication of JP5117613B1
Publication of JP2013123094A
Application status: Expired - Fee Related

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays, using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N 13/327 Calibration thereof
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/04 Diagnosis, testing or measuring for television systems or their details, for receivers
    • G02B 30/27

Abstract

A video processing apparatus, a video processing method, and a storage medium that enable a viewer to easily check the viewing zone are provided.
According to an embodiment, a video processing apparatus includes a test pattern control unit that outputs, to a display unit capable of stereoscopic display, a test pattern for indicating whether or not the viewer is substantially at the center of a viewing zone in which the viewer can view video stereoscopically. The test pattern includes first to sixth parallax images respectively corresponding to first to sixth viewpoints arranged in order from left to right. The third and fourth parallax images include a first mark, the first and sixth parallax images include a second mark different from the first mark, and the second and fifth parallax images do not include the first mark and the second mark.
[Selected drawing] FIG. 1

Description

  Embodiments described herein relate generally to a video processing apparatus, a video processing method, and a storage medium.

  In recent years, stereoscopic image display devices (so-called naked-eye 3D televisions) that allow viewers to view stereoscopic images with the naked eye without using special glasses have become widespread. This stereoscopic video display device displays a plurality of images with different viewpoints. The light rays of these images are guided to the viewer's eyes by controlling the output direction by, for example, a parallax barrier or a lenticular lens. If the viewer's position is appropriate, the viewer sees different parallax images for the left eye and the right eye, and thus can recognize the image in three dimensions.

  However, with a naked-eye 3D television, there is a problem that the video cannot be viewed stereoscopically depending on the viewer's position. Moreover, while viewing ordinary video, it may be difficult for the viewer to determine whether or not he or she is at an appropriate position where the video can be viewed stereoscopically.

JP 2009-250987 A

  Provided are a video processing device, a video processing method, and a storage medium that enable a viewer to easily check a viewing zone.

  According to the embodiment, the video processing apparatus includes a test pattern control unit that outputs, to a display unit capable of stereoscopic display, a test pattern for indicating whether or not the viewer is substantially at the center of the viewing zone in which the viewer can view video stereoscopically. The test pattern includes first to sixth parallax images respectively corresponding to first to sixth viewpoints arranged in order in the horizontal direction. The third and fourth parallax images include a first mark, the first and sixth parallax images include a second mark different from the first mark, and the second and fifth parallax images do not include the first mark and the second mark.

FIG. 1 is an external view of a video display device 100 according to an embodiment.
FIG. 2 is a block diagram showing a schematic configuration of the video display device 100.
FIG. 3 is a view of a part of the liquid crystal panel 1 and the lenticular lens 2 as seen from above.
FIG. 4 is a diagram schematically showing the viewing zones.
FIG. 5 is a diagram showing an example of the test pattern of the first embodiment.
FIG. 6 is a diagram schematically showing the image recognized by the viewer when the viewer views the parallax image 24 with the right eye and the parallax image 25 with the left eye.
FIG. 7 is a diagram schematically showing the image recognized by the viewer when the viewer views two of the parallax images 21 to 23, 28, and 29 with the right eye and the left eye.
FIG. 8 is a diagram schematically showing how the test pattern appears.
FIG. 9 is a diagram showing an example of the remote control used by the viewer.
FIG. 10 is a flowchart showing an example of the processing operation of the controller 10 in response to remote control operation.
FIG. 11 is a diagram showing an example of the test pattern of the second embodiment.
FIG. 12 is a flowchart showing an example of the processing operation of the controller 10 when the viewing zone is adjusted using the test pattern of FIG. 11.
FIG. 13 is a diagram showing an example of the test pattern of the third embodiment.
FIG. 14 is a flowchart showing an example of the processing operation of the controller 10 in response to remote control operation.
FIG. 15 is a block diagram showing a schematic configuration of a video display device 100', which is a modification of FIG. 2.

  Hereinafter, embodiments will be specifically described with reference to the drawings.

(First embodiment)
FIG. 1 is an external view of a video display device 100 according to an embodiment, and FIG. 2 is a block diagram showing a schematic configuration thereof. The video display device 100 includes a liquid crystal panel 1, a lenticular lens 2, a camera 3, a light receiving unit 4, and a controller 10.

  The liquid crystal panel (display unit) 1 displays a plurality of parallax images that the viewer can observe as a stereoscopic image within the viewing zone. The liquid crystal panel 1 is, for example, a 55-inch panel and has 4K2K (3840 × 2160) pixels. By means such as arranging the lenticular lens obliquely, an arrangement effectively having 11520 (= 1280 × 9) pixels in the horizontal direction and 720 pixels in the vertical direction can be obtained for stereoscopic use. The following description is based on this model in which the number of pixels in the horizontal direction is expanded. In each pixel, three subpixels, that is, an R subpixel, a G subpixel, and a B subpixel, are formed in the vertical direction. The liquid crystal panel 1 is irradiated with light from a backlight device (not shown) provided on its back surface. Each pixel transmits light with a luminance corresponding to a parallax image signal (described later) supplied from the controller 10.
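
  For reference, a short arithmetic sketch (not part of the patent; it assumes the oblique lenticular arrangement lets each RGB sub-pixel column serve as one horizontal sample while grouping three panel rows per stereoscopic row) reproduces the pixel counts quoted above:

    # Hypothetical back-of-the-envelope check of the pixel counts quoted above.
    PANEL_W, PANEL_H = 3840, 2160        # 4K2K panel
    SUBPIXELS_PER_PIXEL = 3              # R, G, B sub-pixels per pixel
    NUM_PARALLAXES = 9

    # Assumption: the oblique lenticular arrangement lets every sub-pixel column
    # act as an independent horizontal sample, at the cost of grouping three rows.
    horizontal_3d = PANEL_W * SUBPIXELS_PER_PIXEL      # 11520
    vertical_3d = PANEL_H // SUBPIXELS_PER_PIXEL       # 720
    per_view_width = horizontal_3d // NUM_PARALLAXES   # 1280

    print(horizontal_3d, vertical_3d, per_view_width)  # 11520 720 1280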

  The lenticular lens (aperture control unit) 2 outputs a plurality of parallax images displayed on the liquid crystal panel 1 (display unit) in a predetermined direction. The lenticular lens 2 has a plurality of convex portions arranged along the horizontal direction of the liquid crystal panel 1, and the number thereof is 1/9 of the number of pixels in the horizontal direction of the liquid crystal panel 1. The lenticular lens 2 is affixed to the surface of the liquid crystal panel 1 so that one convex portion corresponds to nine pixels arranged in the horizontal direction. The light transmitted through each pixel is output in a specific direction with directivity from the vicinity of the top of the convex portion.

  In the following description, an example in which nine pixels are provided corresponding to each convex portion of the lenticular lens 2 and a 9-parallax multi-parallax method can be adopted will be described. In the multi-parallax method, the first to ninth parallax images are displayed on nine pixels corresponding to the respective convex portions. The first to ninth parallax images are images obtained by viewing the subject from nine viewpoints arranged along the horizontal direction of the liquid crystal panel 1. The viewer can stereoscopically view the video through the lenticular lens 2 by viewing one parallax image of the first to ninth parallax images with the left eye and the other parallax image with the right eye. According to the multi-parallax method, the viewing zone can be expanded as the number of parallaxes is increased. The viewing area refers to an area in which an image can be viewed stereoscopically when the liquid crystal panel 1 is viewed from the front of the liquid crystal panel 1.
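
  As an illustration only (the patent does not give an interleaving formula; the function name and the column-major layout below are assumptions), the 9-parallax mapping described above might be sketched as follows:

    import numpy as np

    def interleave_nine_views(views):
        """Hypothetical sketch: weave nine parallax images (each H x W/9, optionally
        with a trailing color channel) into one panel image so that the 9 columns
        under each lenticular convex portion carry parallax images 1..9."""
        num_views = len(views)                     # expected to be 9
        h, w_view = views[0].shape[:2]
        panel = np.zeros((h, w_view * num_views) + views[0].shape[2:],
                         dtype=views[0].dtype)
        for col in range(panel.shape[1]):
            view_idx = col % num_views             # which parallax image this column carries
            src_col = col // num_views             # which column of that parallax image
            panel[:, col] = views[view_idx][:, src_col]
        return panel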

  The liquid crystal panel 1 can also display a two-dimensional image by displaying the same color with nine pixels corresponding to each convex portion.

  In the present embodiment, the viewing zone can be variably controlled by adjusting the relative positional relationship between the convex portions of the lenticular lens 2 and the displayed parallax images, that is, how the parallax images are displayed on the nine pixels corresponding to each convex portion. Control of the viewing zone is described below.

  FIG. 3 is a view of a part of the liquid crystal panel 1 and the lenticular lens 2 as seen from above. The shaded areas in the figure indicate the viewing zone; the video can be viewed stereoscopically when the liquid crystal panel 1 is viewed from within the viewing zone. The other areas are areas where reverse viewing and crosstalk occur, making it difficult to view the video stereoscopically. The closer the viewer is to the center of the viewing zone, the stronger the stereoscopic effect. Even within the viewing zone, if the viewer is at its edge, the stereoscopic effect may not be perceived or reverse viewing may occur.

  FIG. 3 also shows how the viewing zone changes depending on the relative positional relationship between the liquid crystal panel 1 and the lenticular lens 2, more specifically, on the distance between the liquid crystal panel 1 and the lenticular lens 2 or on the amount of horizontal shift between them.

  Actually, since the lenticular lens 2 is attached to the liquid crystal panel 1 with high accuracy, it is difficult to physically change the relative position between the liquid crystal panel 1 and the lenticular lens 2.

  Therefore, in the present embodiment, the relative positional relationship between the liquid crystal panel 1 and the lenticular lens 2 is apparently changed by shifting the display positions of the first to ninth parallax images displayed on the pixels of the liquid crystal panel 1, and the viewing zone is thereby adjusted.

  For example, when the first to ninth parallax images are respectively displayed on the nine pixels corresponding to each convex portion (FIG. 3(A)) and the parallax images are then displayed shifted to the right as a whole (FIG. 3(B)), the viewing zone moves to the left. Conversely, when the parallax images are displayed shifted to the left as a whole, the viewing zone moves to the right.

  Further, when the parallax images are not shifted near the horizontal center and are displayed with an outward shift that increases toward the outer sides of the liquid crystal panel 1 (FIG. 3(C)), the viewing zone moves closer to the liquid crystal panel 1. Pixels between a shifted parallax image and an unshifted one, and between parallax images with different shift amounts, may simply be interpolated appropriately from the surrounding pixels. Conversely to FIG. 3(C), when the parallax images are not shifted near the horizontal center and are displayed with a shift toward the center that increases toward the outer sides of the liquid crystal panel 1, the viewing zone moves away from the liquid crystal panel 1.

  As described above, the viewing zone can be moved in the left-right direction or the front-rear direction with respect to the liquid crystal panel 1 by shifting all or part of the parallax images. In FIG. 3, only one viewing zone is shown for simplicity, but in practice, as shown in FIG. 4, a plurality of viewing zones exist in the viewing area P, and they move in conjunction with one another. The viewing zone is controlled by the controller 10 shown in FIG. 2.
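
  A minimal sketch of this shift-based control, under the assumption that the interleaved panel image is simply re-sampled column by column (the function and parameter names are illustrative, not the patented implementation), could look like this:

    import numpy as np

    def shift_parallax_columns(panel, uniform_shift=0, edge_shift=0):
        """Hypothetical sketch of viewing-zone control by shifting parallax columns.
        uniform_shift: shift (in columns) applied to the whole image; positive values
                       shift the display to the right, moving the viewing zone left.
        edge_shift:    extra shift that grows linearly from 0 at the horizontal center
                       to +/- edge_shift at the edges, moving the viewing zone toward
                       (positive) or away from (negative) the panel."""
        h, w = panel.shape[:2]
        out = np.zeros_like(panel)
        center = (w - 1) / 2.0
        for col in range(w):
            offset = (col - center) / center            # -1 at left edge, +1 at right edge
            shift = uniform_shift + int(round(edge_shift * offset))
            src = col - shift
            if 0 <= src < w:
                out[:, col] = panel[:, src]
            # columns with no source stay black here; the patent interpolates them instead
        return out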

  Returning to FIG. 1, the camera 3 is attached at a predetermined elevation near the lower center of the liquid crystal panel 1, and photographs a predetermined range in front of the liquid crystal panel 1. The captured video is supplied to the controller 10 and used to detect information about the viewer such as the viewer's position and the viewer's face. The camera 3 may shoot either a moving image or a still image.

  The light receiving unit 4 is provided, for example, on the lower left side of the liquid crystal panel 1 and receives infrared signals transmitted from the remote control used by the viewer. The infrared signals include signals indicating, for example, whether to display stereoscopic video or two-dimensional video, whether to use the multi-parallax method or the two-parallax method when displaying stereoscopic video, and whether to control the viewing zone.

  Next, details of the components of the controller 10 will be described. As shown in FIG. 2, the controller 10 includes a tuner decoder 11, a parallax image conversion unit 12, a viewer position detection unit 13, a viewing zone information calculation unit 14, a test pattern control unit 15, an image output unit 16, an image adjustment unit 17, and a storage unit 18. The controller 10 is implemented, for example, as a single IC (Integrated Circuit) and is disposed on the back side of the liquid crystal panel 1. Of course, part of the controller 10 may be implemented in software.

  The tuner decoder (receiving unit) 11 receives an input broadcast wave, selects a channel, and decodes the encoded input video signal. When a data broadcast signal such as an electronic program guide (EPG) is superimposed on the broadcast wave, the tuner decoder 11 extracts it. Alternatively, the tuner decoder 11 receives and decodes an encoded input video signal supplied not from a broadcast wave but from a video output device such as an optical disc playback device or a personal computer. The decoded signal, also called a baseband video signal, is supplied to the parallax image conversion unit 12. When the video display device 100 does not receive broadcast waves and exclusively displays input video signals received from a video output device, a decoder having a decoding function may be provided as the receiving unit instead of the tuner decoder 11.

  The input video signal received by the tuner decoder 11 may be a two-dimensional video signal, or may be a three-dimensional video signal containing left-eye and right-eye images in a format such as frame packing (FP), side-by-side (SBS), or top-and-bottom (TAB). The video signal may also be a three-dimensional video signal including images of three or more parallaxes.

  The parallax image conversion unit 12 converts the baseband video signal into a plurality of parallax image signals in order to stereoscopically display the video. The processing content of the parallax image conversion unit 12 differs depending on whether the baseband video signal is a two-dimensional video signal or a three-dimensional video signal.

  When a two-dimensional video signal or a three-dimensional video signal including images of eight or fewer parallaxes is input, the parallax image conversion unit 12 generates first to ninth parallax image signals based on depth information of each pixel in the video signal. The depth value indicates to what extent each pixel is displayed so as to appear in front of or behind the liquid crystal panel 1. The depth value may be added to the input video signal in advance, or may be generated by performing motion detection, composition identification, human face detection, and the like based on the characteristics of the input video signal. On the other hand, when a three-dimensional video signal including nine parallax images is input, the parallax image conversion unit 12 generates the first to ninth parallax image signals using those images.
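
  The depth-based generation of parallax images is not specified in detail; as a hedged illustration, a simple depth-image-based rendering sketch (the constants and the backward-mapping scheme are assumptions) might look like this:

    import numpy as np

    def generate_parallax_images(image, depth, num_views=9, max_disparity=8):
        """Hypothetical DIBR-style sketch: produce num_views images by shifting each
        pixel horizontally according to its depth value.
        image: H x W x 3 array; depth: H x W array in [0, 1]
        (0 = farthest behind the panel, 1 = nearest in front of it)."""
        h, w = depth.shape
        views = []
        for v in range(num_views):
            # viewpoint factor runs from -1 (one end view) to +1 (the other end view)
            factor = (2.0 * v / (num_views - 1)) - 1.0
            out = np.zeros_like(image)
            cols = np.arange(w)
            for y in range(h):
                disparity = (depth[y] - 0.5) * 2.0 * max_disparity * factor
                src = np.clip((cols - disparity).astype(int), 0, w - 1)
                out[y] = image[y, src]
            views.append(out)
        return views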

  The parallax image signal of the input video signal generated as described above is supplied to the image output unit 16.

  The viewer position detection unit 13 performs face recognition using the video imaged by the camera 3 and acquires the position information. This position information is supplied to the viewing area information calculation unit 14.

  The viewer's position information is expressed, for example, as positions on the X axis (horizontal direction), the Y axis (vertical direction), and the Z axis (direction orthogonal to the liquid crystal panel 1) with the center of the liquid crystal panel 1 as the origin. More specifically, the viewer position detection unit 13 first recognizes a viewer by detecting a face in the video captured by the camera 3. Next, the viewer position detection unit 13 detects the positions on the X axis and the Y axis from the position of the face in the video, and detects the position on the Z axis from the size of the face. When there are a plurality of viewers, the viewer position detection unit 13 may detect the positions of up to a predetermined number of viewers, for example, ten. In this case, when the number of detected faces exceeds ten, the positions of ten viewers are detected in order of proximity to the liquid crystal panel 1, that is, in increasing order of the position on the Z axis.
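
  A rough sketch of this position estimate (the reference face size, reference distance, and pixel-to-centimeter conversion below are invented for illustration) could be written as:

    def estimate_viewer_positions(faces, frame_w, frame_h,
                                  reference_face_px=120, reference_z_cm=200,
                                  max_viewers=10):
        """Hypothetical sketch: convert detected face boxes (x, y, w, h in pixels,
        origin at the top-left of the camera frame) into (X, Y, Z) positions with the
        panel center as origin. X and Y come from the face position in the frame,
        Z from the apparent face size; all constants are assumptions."""
        positions = []
        for (x, y, w, h) in faces:
            cx = x + w / 2.0 - frame_w / 2.0       # horizontal offset from frame center
            cy = frame_h / 2.0 - (y + h / 2.0)     # vertical offset (up is positive)
            z = reference_z_cm * reference_face_px / max(w, 1)  # smaller face -> farther away
            scale = z / reference_z_cm             # rough pixel-to-cm conversion at that depth
            positions.append((cx * scale, cy * scale, z))
        # keep the viewers closest to the panel (smallest Z) first, at most max_viewers
        positions.sort(key=lambda p: p[2])
        return positions[:max_viewers]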

  The method by which the viewer position detection unit 13 detects the viewer's position is not particularly limited; for example, the camera 3 may be an infrared camera, or the viewer's position may be detected using sound waves.

  The viewing area information calculation unit 14 uses the viewer position information supplied from the viewer position detection unit 13 to calculate a control parameter for setting a viewing area in which the detected viewer is accommodated. This control parameter is, for example, an amount by which the parallax image described in FIG. 3 is shifted, and is a single parameter or a combination of a plurality of parameters. Then, the viewing area information calculation unit 14 supplies the calculated control parameter to the image adjustment unit 17.

  More specifically, in order to set a desired viewing zone, the viewing zone information calculation unit 14 uses a viewing zone database in which control parameters are associated with viewing zones set by the control parameters. This viewing area database is stored in the storage unit 18 in advance. The viewing zone information calculation unit 14 searches the viewing zone database to find a viewing zone in which a viewer can be accommodated.
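
  The database lookup can be sketched as follows, assuming each record stores a control parameter together with the rectangular region it produces; the actual database format is not disclosed in the patent:

    def find_viewing_zone(viewers, viewing_zone_db):
        """Hypothetical sketch: return the control parameter of the first database
        entry whose viewing zone contains every detected viewer.
        viewers: list of (X, Y, Z) positions.
        viewing_zone_db: list of dicts like
            {"param": ..., "x_range": (xmin, xmax), "z_range": (zmin, zmax)}."""
        for entry in viewing_zone_db:
            xmin, xmax = entry["x_range"]
            zmin, zmax = entry["z_range"]
            if all(xmin <= x <= xmax and zmin <= z <= zmax for (x, _, z) in viewers):
                return entry["param"]
        return None   # no single viewing zone accommodates all viewers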

  The test pattern control unit 15 outputs a test pattern signal to the image output unit 16 for indicating whether or not the viewer is in a viewing zone where the video can be viewed stereoscopically. The test pattern signal also includes a plurality of parallax images, more specifically, the same number of parallax images as the parallax images output by the parallax image conversion unit 12. This test pattern is one of the features of this embodiment and will be described in detail later.

  The image output unit 16 supplies an input video signal or a parallax image signal of a test pattern signal to the image adjustment unit 17 in accordance with a signal transmitted from the outside of the controller 10 such as a remote controller.

  The image adjustment unit (viewing zone control unit) 17 adjusts the parallax image signals by shifting or interpolating them according to the calculated control parameter in order to control the viewing zone, and then supplies them to the liquid crystal panel 1, where they are displayed.

  The storage unit 18 is a nonvolatile memory such as a flash memory, and stores a viewing area database, a test pattern, and the like. The storage unit 18 may be provided outside the controller 10.

  The controller 10 can automatically set the viewing zone (auto tracking) according to the position of the viewer by the following processing operation. First, the viewer position detection unit 13 detects the position of the viewer using the video taken by the camera 3. The viewing area information calculation unit 14 calculates the control parameter so that the viewing area is set at the detected position of the viewer. Then, the image adjustment unit 17 adjusts the parallax image signal according to the control parameter, and the parallax image corresponding to the adjusted parallax image signal is displayed on the liquid crystal panel 1.

  In this way, by appropriately detecting the viewer's face and controlling the viewing zone as described above, a viewing zone appropriate for the viewer is set in real time. However, depending on the skin color or the like, the viewer's face cannot always be accurately detected, and an appropriate viewing zone may not be set. Therefore, in the present embodiment, the test pattern is displayed on the liquid crystal panel 1 to assist the viewer in moving to the viewing zone.

  FIG. 5 is a diagram illustrating an example of the test pattern according to the first embodiment. The test pattern includes parallax images 21 to 29 as illustrated in FIG. 5. These are parallax images respectively corresponding to the first to ninth viewpoints arranged in the horizontal direction from right to left.

  First, common points of each parallax image will be described.

  In the present embodiment, it is assumed that the test pattern is displayed over the entire surface of the liquid crystal panel 1. The background of the test pattern is therefore not video corresponding to the input video signal but a predetermined image. The background image is arbitrary, but an image whose stereoscopic appearance is easy to recognize is desirable; in the figure, a cloud pattern is shown as an example. In addition, a rectangular area 31 is provided near the center of the test pattern, and rectangular areas 32 and 33 are provided on its left and right sides, respectively, spaced apart from the rectangular area 31.

  Furthermore, the numbers 1 to 5 are displayed in the horizontal direction slightly below the center. These numbers have different depth values depending on their values. For example, "1" is displayed so as to appear in front of the liquid crystal panel 1, "3" on the liquid crystal panel 1, and "5" behind the liquid crystal panel 1. The assignment of depths to the numbers may be changed, and the numbers may also be used for adjusting the stereoscopic effect.
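
  For illustration only (the disparity values and the sign convention are assumptions, not taken from the patent), the depth assignment of the numerals could be expressed as a per-numeral horizontal offset between the left-eye and right-eye renderings, with "3" drawn at screen depth:

    # Hypothetical sketch: map each numeral to a signed disparity (in pixels).
    # Positive values pop out in front of the panel, negative values recede behind it,
    # and 0 places the numeral exactly on the panel surface.
    NUMERAL_DISPARITY = {1: +8, 2: +4, 3: 0, 4: -4, 5: -8}

    def numeral_offsets(numeral):
        """Return (left_eye_dx, right_eye_dx) pixel offsets for drawing a numeral."""
        d = NUMERAL_DISPARITY[numeral]
        return (+d // 2, -d // 2)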

  Next, features of each parallax image will be described.

  In the parallax image 25 at the center (the fourth parallax image recited in the claims) and the parallax image 24 adjacent to its left (the third parallax image recited in the claims), which are suitable for viewing, a double circle (first appropriate mark) is displayed in the rectangular area 31. Furthermore, a circle (second appropriate mark) is displayed in each of the rectangular areas 32 and 33. The double circle and the circles are examples of the first mark and indicate that the viewer is approximately at the center of the viewing zone.

  In the parallax image 23 (the second parallax image recited in the claims) on the left side of the parallax image 24, the parallax image 26 (the fifth parallax image recited in the claims) on the right side of the parallax image 25, and the parallax image 29, no mark is displayed in the rectangular areas 31 to 33.

  In the parallax image 22 (the first parallax image recited in the claims), which is adjacent to the left side of the parallax image 23 and corresponds to the right side of the optimum viewing zone, and in the parallax image 21 adjacent to the left side of the parallax image 22, a left-pointing arrow is displayed in the right half of the rectangular area 31, and an up-down arrow is displayed in the rectangular areas 32 and 33. In the parallax image 27 (the sixth parallax image recited in the claims), which is on the right side of the parallax image 26 and corresponds to the left side of the optimum viewing zone, and in the parallax image 28 on its right side, a right-pointing arrow is displayed in the left half of the rectangular area 31, and an up-down arrow is displayed in the rectangular areas 32 and 33.

  The left-pointing arrow, the right-pointing arrow, and the up-and-down arrow are examples of the second mark, and indicate that the viewer is not at the approximate center of the viewing zone.

  Note that FIG. 5 is merely an example of the test pattern. It is sufficient that the first mark is included in the parallax images 24 and 25, which are close to the center and suitable for viewing, that the second mark is included in the parallax images 22 and 27, which are less suitable for viewing than the parallax images 24 and 25, and that neither the first mark nor the second mark is included in the parallax images 23 and 26. Alternatively, the first mark and the second mark need not be arranged in adjacent parallax images; one or more parallax images containing no mark may exist between them. For example, the up-down arrows in the rectangular areas 32 and 33 may be omitted in the parallax images 22 and 27, and the arrows in the rectangular area 31 may be omitted in the parallax images 21 and 28.
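
  The mark layout of FIG. 5 described above can be summarized as a per-parallax-image table; the following sketch reflects that description, with the region keys and string labels chosen here for illustration:

    # Hypothetical summary of FIG. 5: which mark each of the nine parallax images
    # (21..29) carries in the rectangular areas 31 (center), 32 (left) and 33 (right).
    TEST_PATTERN_MARKS = {
        21: {"area31": "left_arrow_right_half", "area32": "updown_arrow", "area33": "updown_arrow"},
        22: {"area31": "left_arrow_right_half", "area32": "updown_arrow", "area33": "updown_arrow"},
        23: {"area31": None, "area32": None, "area33": None},   # blank buffer next to the center pair
        24: {"area31": "double_circle", "area32": "circle", "area33": "circle"},
        25: {"area31": "double_circle", "area32": "circle", "area33": "circle"},
        26: {"area31": None, "area32": None, "area33": None},   # blank buffer next to the center pair
        27: {"area31": "right_arrow_left_half", "area32": "updown_arrow", "area33": "updown_arrow"},
        28: {"area31": "right_arrow_left_half", "area32": "updown_arrow", "area33": "updown_arrow"},
        29: {"area31": None, "area32": None, "area33": None},   # blank (adjacent to 21 in the repeating lobes)
    }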

  FIG. 6 is a diagram schematically illustrating an image recognized by the viewer when the viewer views the parallax image 24 with the right eye and the parallax image 25 with the left eye. This figure shows a state in which the viewer can most appropriately view the image stereoscopically at a position almost in the center of the viewing area, in other words, a position far from the position for reverse viewing. In this case, a double circle appears in the rectangular area 31 and a circle appears in the rectangular areas 32 and 33. Thereby, the viewer can recognize that he is in an appropriate position.

  Here, the reason why no arrows are displayed in the rectangular areas 31 to 33 of the parallax images 23 and 26 is as follows. In general, at the point where the viewer's right eye or left eye is located, the light rays arriving there do not consist 100% of the light rays of one specific parallax image among the parallax images 21 to 29; because of the nature of light and other factors, light rays of adjacent parallax images are usually mixed in. That is, even when the parallax image 24 is viewed with the right eye, the parallax image 23 slightly enters that eye. Similarly, even when the parallax image 25 is viewed with the left eye, the parallax image 26 slightly enters that eye. Therefore, if any mark such as an arrow were present in the rectangular areas 31 to 33 of the parallax images 23 and 26, the viewer would slightly see the marks of the parallax images 23 and 26 even though the viewer is at an appropriate position, and might mistakenly conclude that he or she is at an inappropriate position.

  In contrast, in the test pattern of the present embodiment, no marks are displayed in the rectangular areas 31 to 33 of the parallax images 23 and 26 adjacent to the parallax images 24 and 25, so as to prevent such interference, and marks are displayed in the parallax images 22 and 27, which are one image further away from the parallax images 24 and 25. Therefore, even if the parallax images 23 and 26 slightly enter the eyes in addition to the parallax images 24 and 25, the viewer can correctly recognize that he or she is at an appropriate position.

  FIG. 7 is a diagram schematically illustrating the image recognized by the viewer when the viewer views two of the parallax images 21 to 23 and 29 with the right eye and the left eye. This figure shows a state in which the viewer is at the right end of the viewing zone or to the right outside the viewing zone, and the video may not be viewed stereoscopically. In this case, a left-pointing arrow is visible in the rectangular area 31 instead of a double circle, and an up-down arrow is visible in the rectangular areas 32 and 33 instead of a circle. The viewer can thereby recognize that he or she is not at an appropriate position. In addition, because the numbers 1 to 5 appear doubled or do not appear three-dimensionally in this order, or because the background cloud pattern appears blurred or doubled, the viewer can also recognize that he or she is not at an appropriate position. For the reasons described above, the parallax image 24 may enter the eyes and the double circle or circles may be slightly visible, and the parallax image 28 may enter the eyes and the right-pointing arrow may be slightly visible.

  The viewer can enter the viewing zone by moving to the left with respect to the liquid crystal panel 1 in accordance with the left-pointing arrow in FIG. 7. Alternatively, the viewer can enter the viewing zone by moving in the front-rear direction with respect to the liquid crystal panel 1 (toward or away from it) in accordance with the up-down arrow.

  Similarly, when the viewer is at the left end in the viewing zone or on the left side outside the viewing zone, the viewer can see a right-pointing arrow. When the viewer moves to the right with respect to the liquid crystal panel 1 in accordance with the right-pointing arrow, the viewer can enter an appropriate position in the viewing zone.

  Here, the reason why the arrows are displayed in the right half of the rectangular area 31 of the parallax image 22 and in the left half of the rectangular area 31 of the parallax image 28 is as follows. For example, even when the parallax image 21 is viewed with one eye and the adjacent parallax image 29 is viewed with the other eye, the parallax image 28 may also enter the eye. Therefore, if a left-pointing arrow were displayed over the entire rectangular area 31 of the parallax image 21 and a right-pointing arrow over the entire rectangular area 31 of the parallax image 28, the two arrows would overlap as shown in FIG. 8, and there is a risk that the viewer could not tell the direction of the arrow, or even what is displayed.

  In contrast, in the test pattern of the present embodiment, an arrow is displayed in only half of the rectangular area 31, so even if both the parallax images 21 and 28 enter the eyes, an arrow is still recognized, as shown in FIG. 8. In this case, the arrow of the parallax image that enters the eyes more strongly (in the example of the figure, the right-pointing arrow of the parallax image 28) is recognized clearly, and the viewer knows to move in accordance with the clearly displayed arrow.

  The display and non-display of this test pattern is controlled according to the remote control operation by the viewer. FIG. 9 is a diagram illustrating an example of a remote control used by a viewer. FIG. 10 is a flowchart showing an example of the processing operation of the controller 10 according to the remote control operation.

  When the viewer presses a predetermined key (for example, the blue key 51) on the remote control while stereoscopic video corresponding to a video signal input from the outside is being displayed, the remote control transmits an infrared test pattern display signal. When the light receiving unit 4 receives this signal (YES in step S1), the image output unit 16 of the controller 10 outputs, to the image adjustment unit 17, the parallax image signal of the test pattern signal output from the test pattern control unit 15 instead of the parallax image signal of the input video signal output from the parallax image conversion unit 12. The image adjustment unit 17 adjusts the parallax image signal of the test pattern signal according to the control parameter and supplies it to the liquid crystal panel 1 (step S2). As a result, the test pattern of FIG. 5 is displayed on the liquid crystal panel 1.

  With this test pattern, the viewer can determine whether or not he / she is in a position where he / she can see the image three-dimensionally.

  Thereafter, when the viewer presses a predetermined key (for example, the end key 52 or the enter key 53) on the remote control while the test pattern is displayed, the remote control transmits an infrared test pattern end signal. When the light receiving unit 4 receives this signal (YES in step S3), the image output unit 16 of the controller 10 outputs, to the image adjustment unit 17, the parallax image signal of the input video signal output from the parallax image conversion unit 12 in place of the parallax image signal of the test pattern signal output from the test pattern control unit 15. The image adjustment unit 17 adjusts the parallax image signal of the input video signal according to the control parameter and supplies it to the liquid crystal panel 1 (step S4). As a result, video corresponding to the input video signal is displayed on the liquid crystal panel 1.
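
  The toggling described in steps S1 to S4 can be sketched as a small selector placed in front of the image adjustment unit; the class and method names below are illustrative and not the patent's implementation:

    class ImageOutputSelector:
        """Hypothetical sketch of the image output unit 16 behaviour in FIG. 10:
        route either the input video's parallax images or the test pattern's parallax
        images to the image adjustment unit, toggled by remote-control keys."""

        def __init__(self, video_source, test_pattern_source, image_adjuster):
            self.video_source = video_source                 # parallax image conversion unit 12
            self.test_pattern_source = test_pattern_source   # test pattern control unit 15
            self.image_adjuster = image_adjuster             # image adjustment unit 17
            self.showing_test_pattern = False

        def on_remote_key(self, key):
            if key == "BLUE" and not self.showing_test_pattern:          # step S1 -> S2
                self.showing_test_pattern = True
            elif key in ("END", "ENTER") and self.showing_test_pattern:  # step S3 -> S4
                self.showing_test_pattern = False

        def output_frame(self):
            source = (self.test_pattern_source if self.showing_test_pattern
                      else self.video_source)
            self.image_adjuster.adjust_and_display(source.next_parallax_images())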

  As described above, in the first embodiment, a test pattern including a plurality of parallax images including different marks is displayed. Therefore, the viewer can easily recognize whether or not the viewer is in an appropriate position according to the visible mark. That is, the double circles and circles of the parallax images 24 and 25 can be seen when near the center of the viewing zone, which is the most appropriate position. At this time, since the parallax images 23 and 26 adjacent to the two central parallax images 24 and 25 do not include an arrow, the arrow is hardly seen, and the viewer can surely recognize that it is near the center of the viewing zone. On the other hand, when not near the center of the viewing zone, the arrows of the parallax images 21, 22, 27, and 28 are visible. Since an arrow is displayed on the left half or the right half of the rectangular area 31, the viewer can surely recognize the direction of the arrow, and the viewer can move to the center of the viewing zone according to the arrow.

(Second Embodiment)
In the first embodiment described above, when the viewer is not at the center of the viewing zone, the viewer moves to the center of the viewing zone. In the second embodiment described below, by contrast, the viewing zone is moved while the viewer watches the test pattern, without the viewer having to move. Since the configuration of the video display device 100 is similar to that of FIG. 1, its illustration is omitted, and the differences are mainly described below.

  The viewing zone information calculation unit 14 according to the present embodiment calculates the control parameters according to the position of the viewer detected by the viewer position detection unit 13, and can also adjust the control parameters in response to a signal input from outside the controller 10, for example from the remote control.

  FIG. 11 is a diagram illustrating an example of a test pattern according to the second embodiment. In FIG. 11, only the parallax image 24a corresponding to the parallax image 24 of FIG. 5 is shown, but actually, the test pattern includes nine parallax images 21a to 29a.

  The parallax image 24a in FIG. 11 is further provided with a rectangular area 34 at the upper left, where explanatory text assisting the viewer's operation is displayed, and rectangular areas 35 and 36 at the upper right, where slide bars described later are displayed. The other parallax images 21a to 29a, not shown in FIG. 11, are likewise obtained by adding the rectangular areas 34 to 36 to the parallax images 21 to 29 of FIG. 5.

  FIG. 12 is a flowchart showing an example of the processing operation of the controller 10 when the viewing zone is adjusted using the test pattern of FIG.

  If the viewer cannot see the video displayed on the liquid crystal panel 1 stereoscopically, the viewer is presumably not near the center of the viewing zone. The viewer then presses the blue key 51 on the remote control, and the test pattern is displayed as in the first embodiment (YES in step S11; step S12).

  While viewing the displayed test pattern, the viewer presses the left/right key 55 and the up/down key 56 on the remote control to move the viewing zone. In response, the remote control generates various viewing zone adjustment signals. When the light receiving unit 4 receives a viewing zone adjustment signal (YES in step S13), the viewing zone information calculation unit 14 recalculates the control parameter and the image adjustment unit 17 adjusts the viewing zone accordingly (step S14). Specifically, the adjustment proceeds as follows.

  When an arrow is visible in the rectangular area 31 instead of a double circle, the viewer presses the left key 55 or the right key 55 on the remote control to move the viewing zone to the left or right. For example, when a left-pointing arrow is visible, the viewer presses the left key 55. In response, as described with reference to FIG. 3(B), the image adjustment unit 17 shifts the display positions of the parallax images to the right, and as a result the viewing zone moves to the left.

  The viewing zone can be adjusted in a predetermined number of steps; each time the left key 55 or the right key 55 is pressed, the viewing zone moves by one step. A circle in the rectangular area 36 schematically shows the current position of the viewing zone: in the initial state the circle is at the center, and it moves to the left or right as the left key 55 or the right key 55 is pressed. The viewer can thereby visually grasp how the viewing zone moves.

  When an up-down arrow is visible in the rectangular areas 32 and 33 instead of a circle, the viewer presses the up key 56 or the down key 56 on the remote control to move the viewing zone in the front-rear direction with respect to the liquid crystal panel 1. When the up key 56 or the down key 56 is pressed, as described with reference to FIG. 3(C), the image adjustment unit 17 shifts the parallax images by different amounts and interpolates appropriately, and as a result the viewing zone moves forward or backward.

  This adjustment, too, can be made in a predetermined number of steps; each time the up key 56 or the down key 56 is pressed, the viewing zone moves by one step. A circle in the rectangular area 35 schematically shows the current position of the viewing zone: in the initial state the circle is at the center, and it moves up or down as the up key 56 or the down key 56 is pressed.
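
  The stepwise adjustment of FIGS. 11 and 12 can be sketched as a pair of clamped step counters that drive the control parameters and the slide bars; the step limits and the parameter mapping below are assumptions:

    class ViewingZoneAdjuster:
        """Hypothetical sketch of the second embodiment's key handling: left/right
        keys step the viewing zone horizontally, up/down keys step it in the
        front-rear direction; the step counters also drive the slide bars shown in
        the rectangular areas 35 and 36."""

        def __init__(self, max_steps=5, shift_per_step=1):
            self.max_steps = max_steps
            self.shift_per_step = shift_per_step
            self.horizontal_step = 0   # 0 = initial (circle at the center of area 36)
            self.depth_step = 0        # 0 = initial (circle at the center of area 35)

        def on_key(self, key):
            if key == "LEFT":
                self.horizontal_step = max(self.horizontal_step - 1, -self.max_steps)
            elif key == "RIGHT":
                self.horizontal_step = min(self.horizontal_step + 1, self.max_steps)
            elif key == "UP":
                self.depth_step = min(self.depth_step + 1, self.max_steps)
            elif key == "DOWN":
                self.depth_step = max(self.depth_step - 1, -self.max_steps)

        def control_parameters(self):
            # Pressing LEFT should move the viewing zone left, which (per FIG. 3(B))
            # corresponds to shifting the displayed parallax images to the right.
            uniform_shift = -self.horizontal_step * self.shift_per_step
            edge_shift = self.depth_step * self.shift_per_step
            return {"uniform_shift": uniform_shift, "edge_shift": edge_shift}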

  When the adjustment of the viewing zone described above is repeated and the viewing zone is adjusted so that the viewer is close to its center, the viewer comes to see a double circle in the rectangular area 31 and circles in the rectangular areas 32 and 33. When the viewing zone adjustment is completed, the viewer presses the end key 52 or the enter key 53 on the remote control. In response, the display of the test pattern ends, and video corresponding to the input video signal is displayed (YES in step S15; step S16).

  At this time, the control parameter recalculated by the viewing zone information calculation unit 14 is held in the storage unit 18 or the image adjustment unit 17. Therefore, the viewer can view the image three-dimensionally in an appropriately adjusted viewing area.

  To improve convenience for the viewer, the controller 10 may perform processing other than that shown in FIG. 12. For example, the viewing zone may be returned to its initial setting when the red key 57 is pressed during steps S11 to S15, or returned to its state before adjustment when the green key 58 is pressed. Alternatively, when the blue key 51 is pressed, the video captured by the camera 3 may be displayed instead of the test pattern.

  As described above, in the second embodiment, viewing zone control as described with reference to FIG. 3 is performed based on the viewer's remote control operation while a test pattern composed of a plurality of parallax images containing different marks is displayed. The viewer can therefore easily adjust the viewing zone to an appropriate position.

(Third embodiment)
In the first and second embodiments described above, a test pattern is displayed on the entire surface of the liquid crystal panel 1. On the other hand, in the third embodiment described below, a test pattern is displayed superimposed on a part of an image. Since the configuration of the video display device 100 is similar to that of FIG. 1, the illustration thereof will be omitted, and the difference will be mainly described below.

  FIG. 13 is a diagram illustrating an example of a test pattern according to the third embodiment. Unlike the test patterns of FIGS. 5 and 11, a video corresponding to a video signal input from the outside is displayed in the background, and rectangular areas 41 to 43 are provided in a part thereof. The marks displayed in the rectangular areas 41 to 43 are the same as the marks displayed in the rectangular areas 31 to 33 in FIG.

  FIG. 14 is a flowchart illustrating an example of the processing operation of the controller 10. When the viewer presses a predetermined key on the remote control while stereoscopic video corresponding to a video signal input from the outside is being displayed, the remote control transmits an infrared test pattern display signal. When the light receiving unit 4 receives this signal (YES in step S21), the image adjustment unit 17 of the controller 10 superimposes the parallax image signal of the test pattern signal output from the test pattern control unit 15 on the parallax image signal of the input video signal output from the parallax image conversion unit 12, and supplies the result to the liquid crystal panel 1 (step S22). More specifically, the video signal output from the image adjustment unit 17 is obtained by overwriting the parallax image of the input video signal with the parallax image of the test pattern signal only in the rectangular areas 41 to 43. Alternatively, the image adjustment unit 17 may blend the parallax images of the input video signal and the test pattern signal at a predetermined ratio in the rectangular areas 41 to 43.
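
  A minimal sketch of the superimposition in step S22, assuming rectangular regions given as (top, left, height, width) tuples and a placeholder blend ratio, could look like this:

    import numpy as np

    def superimpose_test_pattern(video_view, pattern_view, regions, alpha=1.0):
        """Hypothetical sketch of step S22 for one parallax image: copy (alpha=1.0)
        or blend (0 < alpha < 1) the test pattern into rectangular areas such as
        41 to 43, leaving the rest of the input video untouched.
        regions: list of (top, left, height, width) tuples."""
        out = video_view.astype(np.float32).copy()
        pattern = pattern_view.astype(np.float32)
        for (top, left, h, w) in regions:
            roi = (slice(top, top + h), slice(left, left + w))
            out[roi] = alpha * pattern[roi] + (1.0 - alpha) * out[roi]
        return out.astype(video_view.dtype)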

  The test pattern of FIG. 13 is thereby superimposed on the input video. Since this test pattern is displayed at the edges of the liquid crystal panel 1, the viewer can check whether or not he or she is at a position where the video can be viewed stereoscopically while continuing to watch the input video, and can move into the viewing zone if necessary.

  Thereafter, when the viewer presses a predetermined key on the remote controller during the display of the test pattern, the display of the test pattern is terminated and a video corresponding to the input video signal is displayed (YES in step S23, S24).

  Note that either the test pattern of FIG. 5 described in the first embodiment or the test pattern of FIG. 13 may be selectively displayed. In this case, the key for displaying the test pattern of FIG. 5 and the key for displaying the test pattern of FIG. 13 may be set separately on the remote controller.

  Thus, in the third embodiment, the test pattern is superimposed on a part of the video, so the viewer can determine whether he or she is at an appropriate position while enjoying the video.

  In each of the embodiments, the lenticular lens 2 is used and the viewing zone is controlled by shifting the parallax images, but the viewing zone may be controlled by other methods. For example, a parallax barrier may be provided as an aperture control unit 2' instead of the lenticular lens 2. FIG. 15 is a block diagram showing a schematic configuration of a video display device 100', which is a modification of the configuration shown in FIG. 2. As shown in the figure, the controller 10' of the video display device 100' includes a viewing zone control unit 17' instead of the image adjustment unit 17.

  The viewing zone control unit 17' controls the aperture control unit 2' according to the control parameter calculated by the viewing zone information calculation unit 14. In this modification, the control parameters are, for example, the distance between the liquid crystal panel 1 and the aperture control unit 2', the amount of horizontal shift between the liquid crystal panel 1 and the aperture control unit 2', and the like.

  In this modification, the viewing zone is controlled by having the aperture control unit 2' control the output directions of the parallax images displayed on the liquid crystal panel 1. In this way, the viewing zone control unit 17' may control the aperture control unit 2' without the process of shifting the parallax images being performed.

  In each of the embodiments, an example in which the processing operation of the controller 10 is controlled using a remote control has been described; however, buttons may be provided on the main body of the video display device 100, and the control may be performed by pressing those buttons.

  At least a part of the controller 10 described in the above-described embodiment may be configured by hardware or software. When configured by software, a program for realizing at least a part of the functions of the controller 10 may be stored in a recording medium such as a flexible disk or a CD-ROM, and read and executed by a computer. The recording medium is not limited to a removable medium such as a magnetic disk or an optical disk, but may be a fixed recording medium such as a hard disk device or a memory.

  Further, a program that realizes at least a part of the functions of the controller 10 may be distributed via a communication line (including wireless communication) such as the Internet. The program may also be distributed in an encrypted, modulated, or compressed state via a wired line such as the Internet or a wireless line, or may be distributed stored in a recording medium.

  In each of the embodiments, an example of a video processing apparatus capable of handling nine parallaxes has been described. However, the present invention is also applicable to stereoscopic video display apparatuses with five or more parallaxes and, of course, to those with more than nine parallaxes.

  Although several embodiments of the present invention have been described, these embodiments are presented by way of example and are not intended to limit the scope of the invention. These embodiments can be implemented in various other forms, and various omissions, replacements, and changes can be made without departing from the spirit of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are also included in the invention described in the claims and the equivalents thereof.

DESCRIPTION OF SYMBOLS 1 Liquid crystal panel 2 Lenticular lens 3 Camera 4 Light-receiving part 10 Controller 11 Tuner decoder 12 Parallax image conversion part 13 Viewer position detection part 14 Viewing area information calculation part 15 Test pattern control part 16 Image output part 17 Image adjustment part

Claims (20)

  1. A test pattern control unit for outputting a test pattern for indicating whether or not the viewer is substantially in the center of the viewing zone where the video can be viewed stereoscopically to a display unit capable of stereoscopic display;
    The test pattern includes first to sixth parallax images respectively corresponding to first to sixth viewpoints arranged in order in the horizontal direction,
    The third and fourth parallax images include a first mark,
    The first and sixth parallax images include a second mark different from the first mark,
    The video processing device, wherein the second and fifth parallax images do not include the first mark and the second mark.
  2. The first mark is a mark indicating that the viewer is substantially in the center of the viewing zone,
    The video processing apparatus according to claim 1, wherein the second mark is a mark indicating that the viewer is not located at a substantially center of the viewing zone.
  3. The second mark included in the first parallax image is displayed on the right half of the area corresponding to the area where the first mark is displayed in the third parallax image,
    The second mark included in the sixth parallax image is displayed in the left half of the area corresponding to the area where the first mark is displayed in the third parallax image. The video processing apparatus.
  4. The first to sixth viewpoints are arranged in order from right to left,
    The second mark included in the first parallax image is a left-pointing arrow;
    The video processing apparatus according to claim 1, wherein the second mark included in the sixth parallax image is a right-pointing arrow.
  5.   The video processing apparatus according to claim 4, wherein the second mark further includes an up-down arrow displayed at a position different from the left-pointing arrow and the right-pointing arrow.
  6. The first mark is
    A first appropriate mark displayed at substantially the center of the third and fourth parallax images;
    The video processing apparatus according to claim 1, further comprising: a second appropriate mark displayed on the left and right of the first appropriate mark.
  7. The test pattern includes a plurality of numbers,
    The video processing apparatus according to claim 1, wherein each of the plurality of numbers is displayed on the display unit with a depth corresponding to the value.
  8. Among the plurality of numbers,
    at least one is displayed so as to be visible on the display unit,
    at least one is displayed so as to be visible in front of the display unit, and
    at least one is displayed so as to be visible behind the display unit. The video processing apparatus according to claim 7.
  9. A receiving unit for receiving an input video signal input from the outside;
    A parallax image conversion unit that converts the input video signal into a plurality of parallax images;
    The video processing apparatus according to claim 1, further comprising: an image output unit that superimposes the test pattern on a part of the parallax image of the input video signal and outputs the superimposed image to the display unit.
  10.   The video processing apparatus according to claim 9, wherein the receiving unit receives, selects, and decodes a broadcast wave to obtain the input video signal.
  11. A viewing zone information calculation unit that calculates a control parameter for adjusting the viewing zone according to a viewing zone adjustment signal input from the outside;
    The video processing apparatus according to claim 1, further comprising: a viewing zone control unit that adjusts the viewing zone according to the control parameter.
  12.   The video processing device according to claim 11, wherein the viewing zone information calculation unit calculates the control parameter for adjusting the viewing zone in a horizontal direction and a front-rear direction with respect to the display unit.
  13. According to the control parameter, the viewing zone control unit
    adjusts the viewing zone in the horizontal direction with respect to the display unit by shifting the display positions of the first to sixth parallax images by a uniform amount, and
    adjusts the viewing zone in the front-rear direction with respect to the display unit by shifting the display positions of the first to sixth parallax images by amounts that differ between the third and fourth parallax images and the first and sixth parallax images. The video processing apparatus according to claim 12.
  14. The viewing zone information calculation unit calculates the control parameter for adjusting the viewing zone in a predetermined number of steps,
    The video processing apparatus according to claim 11, wherein the test pattern includes a viewing zone adjustment pattern indicating a current adjustment step.
  15.   The video processing apparatus according to claim 14, wherein the viewing zone adjustment pattern is a slide bar.
  16. A receiving unit for receiving an input video signal input from the outside;
    A parallax image conversion unit that converts the input video signal into a plurality of parallax images,
    The video processing apparatus according to claim 11, wherein the viewing zone control unit adjusts the viewing zone when displaying the parallax images of the input video signal, according to the control parameter reflecting the result of the viewing zone adjustment.
  17. A viewer position detection unit that detects the position of the viewer using video captured by the camera;
    The viewing area information calculation unit calculates a control parameter so that the viewing area is set in an area corresponding to the position of the viewer, and then adjusts the control parameter according to the viewing area adjustment signal. The video processing apparatus according to claim 11.
  18.   The video processing apparatus according to claim 1, further comprising a display unit that displays the test pattern.
  19. Outputting a test pattern for indicating whether or not the viewer is substantially in the center of the viewing area where the video can be viewed stereoscopically to a display unit capable of stereoscopic display;
    Adjusting the viewing zone according to a viewing zone adjustment signal input from the outside, and
    The test pattern includes first to sixth parallax images respectively corresponding to first to sixth viewpoints arranged in order in the horizontal direction,
    The third and fourth parallax images include a first mark,
    The first and sixth parallax images include a second mark different from the first mark,
    The video processing method, wherein the second and fifth parallax images do not include the first mark and the second mark.
  20. A storage medium storing a test pattern for indicating whether or not the viewer is substantially in the center of the viewing area where the image can be viewed stereoscopically,
    The test pattern includes first to sixth parallax images respectively corresponding to first to sixth viewpoints arranged in order in the horizontal direction,
    The third and fourth parallax images include a first mark,
    The first and sixth parallax images include a second mark different from the first mark,
    The storage medium in which the second and fifth parallax images do not include the first mark and the second mark.
JP2011270040A 2011-12-09 2011-12-09 Video processing apparatus, video processing method, and storage medium Expired - Fee Related JP5117613B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011270040A JP5117613B1 (en) 2011-12-09 2011-12-09 Video processing apparatus, video processing method, and storage medium

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2011270040A JP5117613B1 (en) 2011-12-09 2011-12-09 Video processing apparatus, video processing method, and storage medium
US13/547,994 US8558877B2 (en) 2011-12-09 2012-07-12 Video processing device, video processing method and recording medium
CN2012102808413A CN103167311A (en) 2011-12-09 2012-08-08 Video processing device, video processing method and recording medium

Publications (2)

Publication Number Publication Date
JP5117613B1 true JP5117613B1 (en) 2013-01-16
JP2013123094A JP2013123094A (en) 2013-06-20

Family

ID=47692797

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011270040A Expired - Fee Related JP5117613B1 (en) 2011-12-09 2011-12-09 Video processing apparatus, video processing method, and storage medium

Country Status (3)

Country Link
US (1) US8558877B2 (en)
JP (1) JP5117613B1 (en)
CN (1) CN103167311A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103733118B (en) * 2011-08-11 2015-11-25 富士通株式会社 Stereoscopic display device
US9883179B2 (en) * 2014-07-16 2018-01-30 Echostar Technologies L.L.C. Measurement of IR emissions and adjustment of output signal
CN104702939B (en) * 2015-03-17 2017-09-15 京东方科技集团股份有限公司 Image processing system, method, the method for determining position and display system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4468512B2 (en) 1999-04-30 2010-05-26 株式会社ソフタシア display control device
JP2002365593A (en) * 2001-06-08 2002-12-18 Sony Corp Display device, position-adjusting pattern display program, recording medium, polarized spectacles and method for adjusting filter position of the display device
JP4236428B2 (en) * 2001-09-21 2009-03-11 三洋電機株式会社 Stereoscopic image display method and stereoscopic image display apparatus
JP5322264B2 (en) * 2008-04-01 2013-10-23 Necカシオモバイルコミュニケーションズ株式会社 Image display apparatus and program
JP2010233158A (en) * 2009-03-30 2010-10-14 Namco Bandai Games Inc Program, information memory medium, and image generating device
JP5404246B2 (en) * 2009-08-25 2014-01-29 キヤノン株式会社 3D image processing apparatus and control method thereof
JP4758520B1 (en) 2010-03-05 2011-08-31 シャープ株式会社 Stereoscopic image display device and operation method of stereoscopic image display device
CN102449534B (en) * 2010-04-21 2014-07-02 松下电器产业株式会社 Three-dimensional video display device and three-dimensional video display method

Also Published As

Publication number Publication date
JP2013123094A (en) 2013-06-20
CN103167311A (en) 2013-06-19
US20130147929A1 (en) 2013-06-13
US8558877B2 (en) 2013-10-15


Legal Events

Date Code Title Description
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120918

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20151026

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees