CN212256285U - Extended screen display system - Google Patents

Extended screen display system

Info

Publication number
CN212256285U
CN212256285U (application CN202021115205.1U)
Authority
CN
China
Prior art keywords
smart glasses
screen
content
extended
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202021115205.1U
Other languages
Chinese (zh)
Inventor
黄刚刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Yiguangnian Culture Communication Co ltd
Original Assignee
Chengdu Yiguangnian Culture Communication Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Yiguangnian Culture Communication Co ltd filed Critical Chengdu Yiguangnian Culture Communication Co ltd
Priority to CN202021115205.1U
Application granted
Publication of CN212256285U
Legal status: Expired - Fee Related

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The utility model discloses an extended screen display system including a main computing device, a main display, an input device, smart glasses, and a viewing-direction tracking module. The main computing device is communicatively connected to the main display, and the smart glasses are communicatively connected to the main computing device and include a memory, a processor, and an extended display that provides an extended screen. The extended screen display system expands the display area available to the user when the main computing device is used to open and work on multiple application windows at the same time, so that the user can easily handle multiple application windows.

Description

Extended screen display system
Technical Field
The utility model relates to the field of computing, and in particular to an extended screen display system.
Background
The computing world has advanced rapidly in recent years, and as computing power has increased, so has the need for multitasking. As part of multitasking, a user can open multiple application windows simultaneously on a display linked to a computing device and work in different applications. As more and more applications are opened, the display area of the display cannot show additional application windows at a usable size. For example, if a user needs to view several Word and Excel documents side by side for multi-document reading or cross-document editing, the limited display area forces every document window to shrink so that all of them remain visible at the same time. As a result, the user must frequently operate each application (scrolling, enlarging or reducing fonts, and so on) to reach the target content, and application windows that are too small are inconvenient to operate.
Against this background, embodiments of the present invention are presented.
Disclosure of Invention
The utility model aims to provide an extended screen display system that solves the above problem.
To achieve the above object, the present invention provides the following technical solutions:
An extended screen display system includes: a main computing device communicatively connected to a main display, the main display providing a main display screen; at least one input device; smart glasses communicatively connected to the main computing device, the smart glasses including a memory, a processor, and an extended display for providing an extended screen; and a viewing-direction tracking module configured to detect the viewing direction of a user wearing the smart glasses;
the intelligent glasses generate content to be presented according to the content provided by the main computing device, and control the extended display to display the content with presentation on the extended screen when the viewing direction is judged to be within the range of the target viewing direction of the user.
Optionally, the viewing-direction tracking module is an outward-shooting tracking module disposed on the smart glasses.
Optionally, a preset reference point is provided on the main display, and the outward-shooting tracking module determines the viewing direction of the user wearing the smart glasses by recognizing and tracking the preset reference point.
Optionally, the viewing-direction tracking module is an outward-shooting tracking module disposed outside the smart glasses, at any spatial position from which an image of the user wearing the smart glasses can be acquired.
Optionally, a preset reference point is provided on the smart glasses, and the externally disposed tracking module determines the viewing direction of the user wearing the smart glasses by recognizing and tracking the preset reference point.
optionally, look to tracking module still including the module of making a video recording of shooting to the eye, the module of making a video recording is used for shooting the eye pattern of the user who wears intelligent glasses to the eye.
Optionally, the viewing-direction tracking module is communicatively connected to the smart glasses and sends the viewing direction of the user wearing the smart glasses to the smart glasses.
Optionally, the viewing-direction tracking module is communicatively connected to the main computing device, which forwards the viewing direction of the user wearing the smart glasses to the smart glasses.
Optionally, the input device may be communicably connected to the smart glasses for user interaction with content presented on the extended screen.
Optionally, a pre-storage window is provided on the main display screen of the main display, the input device is communicably connected to the main computing device, and the user interacts with content on the extended screen by using the input device to interact with the content in the pre-storage window on the main display.
In the extended screen display system provided by the utility model, the combination of the extended screen provided by the smart glasses and the main display screen of the main display expands the display area available to the user when multiple application windows are opened and processed simultaneously through the main computing device, so that the user can easily handle multiple application windows and multitask on the computer more efficiently.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments will be briefly described below. It is understood that the following drawings depict only some embodiments of the invention and are therefore not to be considered limiting of its scope, for the person skilled in the art will be able to derive from them other related drawings without inventive faculty.
Fig. 1 is a schematic view of an extended screen display system provided by the present invention.
Fig. 2 is a schematic diagram of a display method of an extended screen according to the present invention.
Fig. 3 is a schematic diagram of another extended screen display system provided by the present invention.
Fig. 4 is a schematic diagram of a sub-expansion screen in a position configuration mode according to the present invention.
Fig. 5 is a schematic diagram of an extended screen having a sub extended screen according to the present invention.
Fig. 6 is a schematic diagram of an extended screen having a plurality of sub extended screens provided by the present invention.
Fig. 7 is a schematic diagram of the content in the pre-storing window on the main display provided by the present invention.
Fig. 8 is a schematic diagram of an extended screen having a sub extended screen according to the present invention.
Fig. 9 is a schematic diagram of coordinate setting of a viewing direction calculation method according to the present invention.
Detailed Description
Fig. 1 is a schematic diagram of an extended screen display system provided by the present utility model. The display system includes a main computing device 101, a main display 102, an input device (not shown in the figure), smart glasses 90 and a viewing-direction tracking module 800. The main computing device 101 is configured to select and execute a plurality of interactive applications. The main computing device 101 may be a computing device such as a desktop computer, a laptop computer, a tablet computer, or a smartphone. In this embodiment the main computing device 101 is configured to execute interactive applications locally. In another embodiment, the main computing device 101 accesses interactive applications on a cloud server over a network, such as the internet, through a local application program interface; the cloud server may be reached by the main computing device 101 through an internet access point (e.g., a router or CPE) and may include one or more servers, such as game servers, application servers and content servers, that execute a plurality of applications. The main display 102 is configured to display the content of an application selected by the user on the main computing device 101 for execution and/or presentation, and is communicatively connected to the main computing device 101 in a wired or wireless manner. The main display 102 may be any of various types of display devices, such as a television, a projector, or any other type of display screen that can present interactive application content. The smart glasses 90 include a processor and memory (not shown) and an extended display 901, and are communicatively connected to the main computing device 101. In this embodiment the connection is wireless; in one possible embodiment the smart glasses 90 may instead be connected to the main computing device 101 in a wired manner, and in yet another possible embodiment at least two pairs of smart glasses are communicatively connected to the main computing device 101 at the same time.
The viewing-direction tracking module 800 is configured to detect the viewing direction of the user wearing the smart glasses 90. The module 800 may be communicably connected to the smart glasses 90, in which case it transmits data including the viewing direction of the user wearing the smart glasses directly to the smart glasses 90; alternatively, it may be communicably connected to the main computing device 101, in which case the obtained viewing-direction data are transmitted to the main computing device 101, which then forwards the received viewing direction to the smart glasses 90.
The viewing-direction tracking module 800 may be disposed on the smart glasses 90 or outside the smart glasses 90, and is used to obtain the viewing direction of the user wearing the smart glasses.
The viewing-direction tracking module 800 may be an outward-shooting tracking module disposed on the smart glasses 90, including at least a camera whose shooting direction faces away from the user's eyes, toward the space in which the main display 102 is located. Image processing of the captured environment image yields spatial coordinate data of the smart glasses 90 worn by the user, with the main display as the coordinate reference. As shown in (a) of Fig. 9, the viewing-direction tracking module 800 includes a camera disposed on the smart glasses 90 that captures an image of the main display 102. Using the monocular-camera three-dimensional reconstruction method from machine vision, the coordinates of any point on the main display 102 can be obtained in a world coordinate system OwXwYwZw whose Zw axis is generally parallel to the line of sight in the user's direct-viewing state (indicated by O1G1 in the figure). Based on the structural dimensions of the main display 102, the coordinates of any point on the main display 102 can also be expressed in a spatial coordinate system OXYZ whose origin lies on the main display, with the length direction as the X axis and the width direction as the Y axis. From the coordinates of no fewer than 12 points on the main display 102 in both the world coordinate system OwXwYwZw and the main-display coordinate system OXYZ, the transformation matrix between the two coordinate systems is obtained. The coordinates of the origin Ow of the world coordinate system in the OXYZ coordinate system, derived from the transformation matrix, are recorded as [X0, Y0, Z0], and the orientation of the Zw axis in the OXYZ coordinate system, i.e., the three spatial angles between the Zw axis and the X, Y and Z axes, is recorded as [a1, b1, g1]. When the head of the user wearing the smart glasses 90 rotates, the coordinates [X0, Y0, Z0] of the origin Ow in the OXYZ coordinate system change in real time, and the coordinates [X0, Y0] are used as the viewing direction of the user wearing the smart glasses 90. Alternatively, the orientation [a1, b1, g1] may be used as the sight-line direction vector of the user wearing the smart glasses 90; when the user's head rotates, this vector changes in real time, and the intersection of the sight-line direction vector, for any head orientation, with the OXY plane of the OXYZ coordinate system can be computed, the intersection coordinates being the viewing direction of the user wearing the smart glasses 90.
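The coordinate alignment and gaze-plane intersection described above can be illustrated with a short sketch. This is not the patent's implementation: the patent does not name an algorithm, so the Kabsch method is used here as one standard way to fit the transformation from the 12 or more matched points, and the function names and inputs are invented for illustration.

```python
import numpy as np

def estimate_rigid_transform(pts_world, pts_display):
    """Fit R, t so that pts_display ~= R @ p + t for each world point p.

    pts_world, pts_display: (N, 3) arrays of the same >= 12 matched points,
    expressed in OwXwYwZw and OXYZ respectively (hypothetical inputs)."""
    cw, cd = pts_world.mean(axis=0), pts_display.mean(axis=0)
    H = (pts_world - cw).T @ (pts_display - cd)
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T   # proper rotation (no reflection)
    t = cd - R @ cw
    return R, t

def viewing_point_on_display(R, t, gaze_dir_w=(0.0, 0.0, 1.0)):
    """Intersect the line of sight (the Zw axis by default) with the display plane Z = 0.

    Returns the [X, Y] viewing direction in the OXYZ frame, or None if the gaze
    is parallel to the OXY plane."""
    origin = t                           # Ow expressed in OXYZ, i.e. [X0, Y0, Z0]
    d = R @ np.asarray(gaze_dir_w)       # Zw axis in OXYZ; its direction cosines give [a1, b1, g1]
    if abs(d[2]) < 1e-9:
        return None
    s = -origin[2] / d[2]                # ray parameter where the line of sight meets Z = 0
    return (origin + s * d)[:2]
```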
In one possible embodiment, a preset reference point is disposed on the main display 102, and the viewing-direction tracking module 800 determines the viewing direction of the user wearing the smart glasses 90 by recognizing and tracking the preset reference point on the main display 102. The preset reference point may be a reference mark physically disposed on the main display 102, such as a marker affixed to a non-display area of the main display 102 containing no fewer than 12 groups of specific numbers and/or pattern features, a marker with a specific shape (such as a triangular object), or a light-emitting marker (such as an infrared or visible LED lamp). The preset reference point may also be a reference mark displayed as electronic information on the main display screen of the main display 102, such as a specific number and/or pattern. The camera captures an image containing the preset reference point, and the viewing direction of the user wearing the smart glasses 90 can then be obtained from the preset reference point using a method similar to the one described above.
The viewing-direction tracking module 800 may also be disposed outside the smart glasses 90, at any spatial position from which an image of the user wearing the smart glasses 90 can be acquired. For example, the module 800 shown in Fig. 1 may be attached, by bonding or clipping, to any position on the main display 102 outside the main display area; it may also be placed at any spatial location near the main display 102 from which an image of the user wearing the smart glasses 90 can be obtained, as shown in Fig. 3, where the module 800 is disposed beside the main display 102. The viewing-direction tracking module 800 may also be a camera module built into the main display 102, such as the camera on the display of a notebook computer. For example, the module 800 shown in (b) of Fig. 9 is attached to the main display 102 and includes a camera. A world coordinate system OwXwYwZw is established with the viewing-direction tracking module 800 as the reference, and the coordinates of any point on the smart glasses 90 in this coordinate system can be obtained by machine-vision methods. A glasses coordinate system O9X9Y9Z9 is established with its origin at a point on the structural member of the smart glasses 90 near the user's nose bridge and its Z9 axis parallel to the line of sight in the user's direct-viewing state (indicated by O1G1 in the figure). From the coordinates of no fewer than 12 points on the smart glasses 90 in both the world coordinate system OwXwYwZw and the glasses coordinate system O9X9Y9Z9, the transformation matrix between the two systems can be obtained; from the mounting position of the module 800 on the main display 102 and the specific structure of the main display 102, the transformation matrix between the coordinate system OwXwYwZw of the viewing-direction tracking module 800 and the main-display coordinate system OXYZ can also be computed. The coordinates of the origin O9 of the glasses coordinate system in the OXYZ coordinate system, obtained from the two transformation matrices, are recorded as [X90, Y90, Z90], and the orientation of the Z9 axis in the OXYZ coordinate system, i.e., the three spatial angles between the Z9 axis and the X, Y and Z axes, is recorded as [a90, b90, g90]. The coordinates [X90, Y90] are taken as the viewing direction of the user wearing the smart glasses 90. Alternatively, [X90, Y90, Z90, a90, b90, g90] may be used as the sight-line direction vector of the user wearing the smart glasses 90; when the user's head rotates, this vector changes in real time, and the intersection of the sight-line direction vector, for any head orientation, with the OXY plane of the OXYZ coordinate system can be computed, the intersection coordinates being the viewing direction of the user wearing the smart glasses 90.
In a possible embodiment, a preset reference point may also be disposed on the smart glasses 90, and the viewing-direction tracking module 800 disposed on the main display 102 determines the viewing direction of the user wearing the smart glasses 90 by recognizing and tracking the preset reference point on the smart glasses 90. The preset reference point may be a reference mark disposed on the structural member of the smart glasses 90 near the user's nose bridge, such as a light-source marker (e.g., an infrared or visible LED lamp), a two-dimensional code icon, a number and/or pattern marker, or a marker with a specific shape, and the viewing direction of the user wearing the smart glasses 90 may again be calculated using the machine-vision-based method described above.
In yet another possible implementation, the viewing-direction tracking module 800 further includes an eye-shooting camera module, and the eye-shooting camera module and the outward-shooting tracking module together determine the viewing direction of the user wearing the smart glasses 90. The eye-shooting camera module is specifically an eye-movement tracking module disposed on the smart glasses 90 and used to capture eye images of the user wearing the smart glasses 90. By capturing images of one or both of the user's eyes and performing image-processing calculations, the eye-movement tracking module obtains the gaze direction of the user's eyes in a coordinate system established with the smart glasses 90 as the reference. The outward-shooting tracking module provides the transformation matrix between that glasses-referenced coordinate system and the coordinate system OXYZ of the main display 102; from this matrix the orientation of the gaze direction in the OXYZ coordinate system, recorded as [a91, b91, c91], and the coordinates of the origin of the glasses-referenced coordinate system in the OXYZ coordinate system, recorded as [X01, Y01, Z01], are obtained. This yields the user's sight-line direction vector [X01, Y01, Z01, a91, b91, c91] in the OXYZ coordinate system; its intersection with the OXY plane of the OXYZ coordinate system is computed, and the intersection coordinates are the viewing direction of the user wearing the smart glasses 90.
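The combined eye-tracking variant is, in effect, a composition of two results: the outward tracking module supplies the glasses-to-display transform, and the eye tracker supplies a gaze direction in the glasses frame. The sketch below assumes exactly that (all names are illustrative, not from the patent).

```python
import numpy as np

def viewing_point_with_eye_tracker(R_gd, t_gd, gaze_dir_glasses):
    """R_gd, t_gd: glasses-frame -> OXYZ transform from the outward tracking module.
    gaze_dir_glasses: gaze direction measured by the eye-movement tracking module
    in the glasses frame. Both are hypothetical inputs."""
    origin = t_gd                              # [X01, Y01, Z01]: glasses origin in OXYZ
    d = R_gd @ np.asarray(gaze_dir_glasses)    # gaze in OXYZ; direction cosines ~ [a91, b91, c91]
    if abs(d[2]) < 1e-9:
        return None                            # gaze parallel to the display plane
    s = -origin[2] / d[2]
    return (origin + s * d)[:2]                # viewing point on the OXY plane
```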
Fig. 2 shows a display method of an extended screen provided by the present invention, which specifically includes the following steps:
step S10: the presence of smart glasses 90 in proximity to a primary computing device 101 having a primary display 102, the primary display of the primary computing device 101 configured to present content, is detected.
Step S20: the smart glasses 90 are communicably connected to the main computing device 101, and the extension screen provided by the extension display 901 of the smart glasses 90 is a virtual extension screen of the main display screen 102A of the main display 102 for extending the display area of the main display 102.
Step S30: the user selects, on the main display 102, the content to be presented on the extended screen of the extended display 901; the main computing device 101 transmits data information including at least the content to be presented to the smart glasses 90; the smart glasses 90 receive the data information, generate the content to be presented, and associate the content to be presented with the extended screen of the smart glasses 90.
Step S40: the viewing direction tracking module 800 detects the viewing direction of the smart glasses 90, and when the viewing direction of the smart glasses 90 is within the range of the target viewing direction of the user, the smart glasses 90 present the content to be presented on the extended screen.
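Steps S10 to S40 can be summarized as the following control flow. This is a schematic sketch only: the host, glasses and tracker objects and every method on them are hypothetical interfaces standing in for the main computing device 101, the smart glasses 90 and the viewing-direction tracking module 800, not APIs defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class PresentationRequest:
    content: bytes            # content selected by the user on the main display
    target_sub_screen: str    # e.g. "sub extension screen 1" (illustrative identifier)

def extended_screen_loop(host, glasses, tracker):
    # S10: detect smart glasses in proximity to the main computing device
    if not host.detect(glasses):
        return
    # S20: establish the communication connection; the extended display 901 now
    # provides a virtual extension screen of the main display screen 102A
    host.connect(glasses)
    # S30: the host sends the selected content; the glasses generate the content
    # to be presented and associate it with the extended screen
    for request in host.pending_requests():
        glasses.prepare(request.content, request.target_sub_screen)
    # S40: present only while the detected viewing direction lies inside the
    # user target viewing direction range
    while host.is_connected(glasses):
        x, y = tracker.viewing_direction()
        screen = glasses.sub_screen_for(x, y)   # None when outside every target range
        if screen is not None:
            glasses.present(screen)
        else:
            glasses.stop_presenting()
```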
The extended screen of the smart glasses 90 will be explained below.
The extended display of the smart glasses 90 is an optical virtual-image display, typically comprising a miniature image display device such as an LCOS or OLED panel, imaging optics, a drive controller and connection structures. The image produced by the image display device is magnified into a virtual image by the imaging optics, and the user observes this virtual image at the exit surface of the extended display. The effective display area of the image display device is generally rectangular, and the magnified virtual image of this rectangular area is the virtual screen that the extended display can provide. This virtual screen is fixed relative to the user while the smart glasses 90 are worn: in any head state, for example with the head turned horizontally by some angle or tilted upward by some angle, the user observes a virtual screen. The virtual screens in these different head states are collectively called the extended screen; that is, in the present utility model the extended screen is the set of virtual screens corresponding to one or more head states while the smart glasses 90 are worn. For example, when the extended screen is described as being configured as one sub-extension screen, it means that the virtual screen presented by the extended display is the extended screen only when the user wears the smart glasses 90 and the head state is such that the viewing direction falls within the user's target viewing direction range. When the extended screen is described as being configured as a plurality of sub-extension screens, it means that the virtual screens presented by the extended display in the several head states whose viewing directions fall within the corresponding user target viewing direction ranges, at different times while the user wears the smart glasses 90, are collectively referred to as the extended screen, the virtual screen in each such head state being a sub-extension screen.
For better description of the embodiments of the present invention, two terms, namely "main viewing direction range" and "user target viewing direction range" are defined, and the specific definitions are as follows:
The main viewing direction range is defined as a spatial coordinate range referenced to the main display 102, comprising a horizontal coordinate range and a vertical coordinate range. As shown in Fig. 3, the main display screen 102A of the main display 102 is a rectangular display area with height H1 in the vertical direction and width W1 in the horizontal direction. A spatial coordinate system OXYZ is established with the center of the main display screen 102A as the origin, the direction of H1 as the Y axis and the direction of W1 as the X axis. The vertical coordinate range of the main viewing direction range is [-(H1+e1)/2, (H1+e1)/2] and the horizontal coordinate range is [-(W1+e2)/2, (W1+e2)/2], where e1 is a height-direction margin value that may be H1/2 or another value set by the user, and e2 is a width-direction margin value that may be W1/2 or another value set by the user. The reference point of the spatial coordinate system may be the center of the main display screen 102A, but another reference point, such as any position on the main display 102, may also be used.
The user target viewing direction range is defined as a spatial coordinate range of the extended screen of the smart glasses 90 with respect to the main display 102, the smart glasses 90 configured as one sub extended screen have one user target viewing direction range, and the smart glasses configured as a plurality of sub extended screens have a plurality of user target viewing direction ranges, respectively.
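A minimal check against the main viewing direction range defined above can be written as follows; the function name and the margin defaults (e1 = H1/2, e2 = W1/2, as suggested in the text) are illustrative assumptions.

```python
def in_main_viewing_range(x, y, W1, H1, e1=None, e2=None):
    """True if the viewing point (x, y), expressed in the OXYZ frame centred on
    the main display screen 102A, lies inside the main viewing direction range."""
    e1 = H1 / 2 if e1 is None else e1
    e2 = W1 / 2 if e2 is None else e2
    return abs(y) <= (H1 + e1) / 2 and abs(x) <= (W1 + e2) / 2
```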
When the extended screen of the smart glasses is configured as one sub-extension screen, the user target viewing direction range is determined mainly by the position configuration mode of the sub-extension screen. In one possible embodiment, the position of the sub-extension screen is kept fixed relative to the smart glasses 90; that is, when the position of the smart glasses 90 changes, the sub-extension screen moves with it. As shown in Fig. 4, the smart glasses 90 are configured with one sub-extension screen 901A0: when the user wearing the smart glasses 90 raises the head, the user sees the content presented on the sub-extension screen 901A0 through the smart glasses 90 in the upward direction, and when the user lowers the head, the user sees the content presented on the sub-extension screen 901A0 in the downward direction. In this position configuration mode, the user target viewing direction range is simply any viewing direction outside the main viewing direction range. In another possible embodiment, the position of the sub-extension screen is configured to remain fixed relative to the main display 102; that is, the sub-extension screen stays stationary relative to the main display 102 when the user wearing the smart glasses 90 rotates and/or moves the head. In this mode, the user target viewing direction range is the coordinate range of the configured sub-extension screen in the spatial coordinate system OXYZ. As shown in Fig. 5, the smart glasses 90 are configured with one sub-extension screen 901A1. The sub-extension screen 901A1 shown in (a) of Fig. 5 is placed on the left side of the main display screen, and the corresponding user target viewing direction range is the horizontal coordinate range [(Wc + W1/2 + eh1), W1 + eh2] along the OX direction and the vertical coordinate range [-(Hc/2 + ev1), Hc/2 + ev2] along the OY direction, where Wc is the length of the sub-extension screen in the X direction, Hc is its height in the Y direction, and eh1, eh2, ev1, ev2 are margins in the horizontal and vertical directions, whose values may be eh1 = Wc/2, eh2 = 0, ev1 = 0, ev2 = Hc/2 or may be preset by the user. The sub-extension screen 901A1 shown in (b) of Fig. 5 is placed on the upper side of the main display screen, and the corresponding user target viewing direction range is the horizontal coordinate range [-(Wc/2 + eh1), Wc/2 + eh2] along the OX direction and the vertical coordinate range [H1/2 + ev2, Hc + H1/2 + ev1] along the OY direction. Similarly, the sub-extension screen may be placed on the right side or the lower side of the main display screen.
When the extended screen of the smart glasses is configured as a plurality of sub-extension screens, as shown in (a) of Fig. 6, the extended screen of the smart glasses 90 is configured as three sub-extension screens, denoted 901A1, 901A2 and 901A3, placed respectively on the left side, the right side and the lower side of the main display 102, and the positions of the three sub-extension screens are configured to remain fixed relative to the main display 102. In this case the user target viewing direction range includes a first target viewing direction range AB1 corresponding to the sub-extension screen 901A1, a second target viewing direction range AB2 corresponding to the sub-extension screen 901A2, and a third target viewing direction range AB3 corresponding to the sub-extension screen 901A3. The first target viewing direction range AB1 is the horizontal coordinate range [(Wc + W1/2 + eh1), W1 + eh2] along the OX direction and the vertical coordinate range [-(Hc/2 + ev1), Hc/2 + ev2] along the OY direction. The second target viewing direction range AB2 is the horizontal coordinate range [-(W1 + eh2), -(Wc + W1/2 + eh1)] along the OX direction and the vertical coordinate range [-(Hc/2 + ev1), Hc/2 + ev2] along the OY direction. The third target viewing direction range AB3 is the horizontal coordinate range [-(Wc/2 + eh1), Wc/2 + eh2] along the OX direction and the vertical coordinate range [-(H1/2 + ev2), -(Hc + H1/2 + ev1)] along the OY direction. It should be noted that when a sub-extension screen is placed on the upper, left or right side of the main display 102, its user target viewing direction range is calculated in a rectangular coordinate system, here the spatial coordinate system OXYZ. When the main display 102 is a desktop display or the display part of a notebook computer, as shown in (b) of Fig. 6, the extended screen of the smart glasses 90 is configured as three sub-extension screens, denoted 901A4, 901A5 and 901A6. The user target viewing direction range then includes a fourth target viewing direction range AB4 corresponding to the sub-extension screen 901A4, a fifth target viewing direction range AB5 corresponding to the sub-extension screen 901A5, and a sixth target viewing direction range AB6 corresponding to the sub-extension screen 901A6. The fourth target viewing direction range AB4 is the horizontal coordinate range [-(W1 + eh2), -(Wc + W1/2 + eh1)] along the OX direction and the vertical coordinate range [-(Hc/2 + ev1), Hc/2 + ev2] along the OY direction. The fifth target viewing direction range AB5 is the horizontal coordinate range [(W1/2 + eh2), W1/2 + Wc + eh2] along the OX direction and the vertical coordinate range [-(Hc/2 + ev1), Hc/2 + ev2] along the OY direction. The sixth target viewing direction range AB6 is the horizontal coordinate range [-(Wc/2 + eh1), Wc/2 + eh2] along the OX direction and the vertical coordinate range [-(ev2), -(Hc + ev2)] along the OZ direction.
The plane of a sub-extension screen need not be parallel to the display plane of the main display, i.e., the OXY plane. For example, when the plane of the sub-extension screen 901A4 makes an angle afa with the OXZ plane, the fourth target viewing direction range AB4 corresponding to the sub-extension screen 901A4 is the horizontal coordinate range [-(W1 + eh2), -(Wc + W1/2 + eh1)] along the OX direction and the vertical coordinate range [-(Hc × sin(afa)/2 + ev1), Hc × sin(afa)/2 + ev2].
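The per-sub-screen target viewing direction ranges can be modelled as rectangles in the OXY plane adjacent to the main display screen. The sketch below deliberately uses a simplified, self-consistent sign convention (left of the screen is negative X, up is positive Y) and single margins eh/ev rather than reproducing the exact bounds quoted above, so it is an assumption-laden illustration of the idea, not the patent's formulas.

```python
def sub_screen_range(side, W1, H1, Wc, Hc, eh=0.0, ev=0.0):
    """[xmin, xmax, ymin, ymax] target viewing range for a Wc x Hc sub-extension
    screen placed beside a W1 x H1 main display screen, margins eh/ev included."""
    if side == "left":
        return [-(W1 / 2 + Wc + eh), -W1 / 2, -(Hc / 2 + ev), Hc / 2 + ev]
    if side == "right":
        return [W1 / 2, W1 / 2 + Wc + eh, -(Hc / 2 + ev), Hc / 2 + ev]
    if side == "upper":
        return [-(Wc / 2 + eh), Wc / 2 + eh, H1 / 2, H1 / 2 + Hc + ev]
    if side == "lower":
        return [-(Wc / 2 + eh), Wc / 2 + eh, -(H1 / 2 + Hc + ev), -H1 / 2]
    raise ValueError(f"unknown side: {side}")

def sub_screen_for(x, y, ranges):
    """ranges maps a sub-extension screen identifier (e.g. '901A1') to its
    [xmin, xmax, ymin, ymax] target range; returns the identifier whose range
    contains the viewing point (x, y), or None if it falls outside all of them."""
    for name, (xmin, xmax, ymin, ymax) in ranges.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return None
```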
The step in which the user selects, on the main display 102, the content to be presented on the extended screen of the smart glasses 90 is carried out through an input device communicably connected to the main computing device 101, such as a mouse, a touch input screen, a keyboard or a voice-controlled input device, or through a built-in input device integrated with the main computing device 101, such as the soft keyboard or touch screen of a tablet computer, or the keyboard or operation panel of a notebook computer. The user selects, on the main display 102, one or more items of content to be presented on the smart glasses; such content includes content existing in the form of a content window on the main display 102 as well as part of the content within a content window. The main computing device 101 responds to the user's operation and transmits data information containing at least the content selected by the user to the smart glasses 90; the smart glasses 90 receive the data information, generate the content to be presented, and associate the content to be presented with the extended screen of the smart glasses 90. The content selected by the user on the main display 102 is not removed from the main display screen 102A when it is transmitted to the smart glasses 90: the main computing device 101 can transmit the selected content to the smart glasses 90 as a copy. When the viewing-direction tracking module 800 detects that the viewing direction of the smart glasses 90 is within the user target viewing direction range, the smart glasses 90 present the generated content to be presented on the extended screen.
The host computing device 101 sends data information to the smart glasses 90, the data information at least including content selected by the user and needing to be presented on the smart glasses 90, and the smart glasses 90 receives the data information sent by the host computing device 101, generates content to be presented, and associates the generated content to be presented with the extended screen of the smart glasses 90.
In one embodiment, the extended screen of the smart glasses 90 is configured as one sub-extension screen 901A0. When the content the user selects to present on the smart glasses 90 is a part 102p of the content of a content window on the main display 102, as framed in (a) of Fig. 7, the main computing device 101 configures a content window for that partial content to generate a new content item 102N, shown in (b) of Fig. 7, and, according to the user's attribute operation, transmits data information containing the content 102N to the smart glasses 90; the smart glasses 90 generate the content to be presented from the received data information and associate it with the sub-extension screen 901A0 by setting it to be presented on the sub-extension screen 901A0. In one possible implementation, the main computing device 101 instead sends data information containing at least the partial content 102p selected by the user on the main display 102 to the smart glasses 90, and the smart glasses 90 receive and process the data information, configure a content window for the contained partial content 102p to generate the content to be presented, and set it to be presented on the sub-extension screen 901A0. When the extended screen of the smart glasses 90 is configured with two sub-extension screens, for example the sub-extension screen 901A1 and the sub-extension screen 901A2, the user performs attribute operations on the main display 102 on the first content 1021 and the second content 1022 to be presented on the smart glasses 90. For example, if the attribute operation on the first content 1021 selects the attribute value "send the selected content to sub-extension screen 1", the main computing device 101 sends data information including the first content 1021 and that attribute value to the smart glasses 90; the smart glasses 90 receive the data information, generate the content to be presented 1021n containing the first content 1021, and associate the content to be presented 1021n with the sub-extension screen 901A1 according to the attribute-value information. If the user selects the attribute value "send the selected content to sub-extension screen 2" after operating on the attributes of the second content 1022, the main computing device 101 sends data information including the second content 1022 and that attribute value to the smart glasses 90; after receiving the data information, the smart glasses 90 generate the content to be presented 1022n containing the second content 1022 and associate the content to be presented 1022n with the sub-extension screen 901A2 according to the attribute-value information.
When the viewing-direction tracking module 800 detects that the viewing direction of the smart glasses 90 is within the user target viewing direction range corresponding to the sub-extension screen 901A1, the smart glasses 90 present the sub-extension screen 901A1 with the content to be presented 1021n to the user. When the user rotates and/or moves the head, the viewing-direction tracking module 800 keeps detecting the viewing direction of the smart glasses 90; when the viewing direction is detected to be within the user target viewing direction range corresponding to the sub-extension screen 901A2, the smart glasses 90 present the sub-extension screen 901A2 with the content to be presented 1022n to the user. When the viewing direction of the smart glasses 90 is detected to be within neither the user target viewing direction range corresponding to the sub-extension screen 901A1 nor that corresponding to the sub-extension screen 901A2, the smart glasses 90 do not present, or stop presenting, any content to the user.
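The attribute-value routing and viewing-direction-driven switching just described amount to a small lookup table. The class below is a sketch of that bookkeeping; the class name, method names and the string identifiers are invented for illustration, and the mapping from an attribute value such as "send the selected content to sub extension screen 1" to a screen identifier is assumed to be done by the host beforehand.

```python
class ExtendedScreenRouter:
    """Minimal bookkeeping for which content to be presented is bound to which
    sub-extension screen, and which content (if any) should currently be shown."""

    def __init__(self):
        self.associated = {}   # sub-extension screen id -> content to be presented

    def route(self, content, sub_screen_id):
        # The host has already mapped the chosen attribute value to sub_screen_id.
        self.associated[sub_screen_id] = content

    def visible_content(self, current_sub_screen):
        # current_sub_screen: identifier of the sub-extension screen whose target
        # viewing direction range contains the detected viewing direction, or None.
        if current_sub_screen is None:
            return None                          # outside every range: present nothing
        return self.associated.get(current_sub_screen)

# Illustrative usage mirroring the 1021n / 1022n example above:
# router = ExtendedScreenRouter()
# router.route("1021n", "901A1")
# router.route("1022n", "901A2")
# router.visible_content("901A2")   # -> "1022n"
```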
In one possible implementation, the implementation step of the user selecting the content to be presented on the smart glasses on the main display 102 includes:
the user selects, through the input device, the content to be displayed on the smart glasses 90, obtains through an attribute operation at least one attribute content corresponding to the smart glasses 90, and performs a selection operation on that attribute content; after detecting the selection operation on the attribute, the main computing device 101 sends data information containing at least the content selected by the user to be presented on the smart glasses to the smart glasses 90.
In a specific implementation, the extended screen of the smart glasses 90 may be configured with one sub-extension screen or with two or more sub-extension screens. When the main computing device 101 detects the smart glasses 90, a communication connection is established between them, and the smart glasses 90 send configuration information of the extended screen to the main computing device 101; the configuration information includes the number of sub-extension screens and the identifier corresponding to each sub-extension screen. Alternatively, the user may select the desired extended-screen configuration on the smart glasses 90, and the smart glasses 90 then transmit that configuration information to the main computing device 101. The main computing device 101 configures, according to the configuration information of the extended screen, the attribute content that can be obtained by an attribute operation on the main display 102. For example, in an embodiment in which the extended screen is configured with one sub-extension screen, when an attribute operation such as a right mouse click is performed on the main display 102, the attribute content presented on the main display 102 includes at least an attribute value such as "send content to extended screen 1". If the user selects the icon of a PDF document on the main display 102 with a left click and then clicks the right mouse button, attribute content including the attribute values "open", "new", "delete", "cut", "copy" and "send content to extended screen 1" is presented near the icon; when the user selects the attribute value "send content to extended screen 1", the main computing device 101 sends the data information of the PDF document to the smart glasses 90. When the extended screen is configured with a plurality of sub-extension screens, for example three, the configuration information of the extended screen includes the number of sub-extension screens, "3", and the corresponding identifiers, for example "sub-extension screen 1", "sub-extension screen 2" and "sub-extension screen 3". The attribute content presented on the main display 102 through the attribute operation is then configured to include at least a first attribute value "send content to sub-extension screen 1", a second attribute value "send content to sub-extension screen 2" and a third attribute value "send content to sub-extension screen 3", and the user can choose to send the currently selected content to any one of the sub-extension screens for display. After the user selects the first attribute value, the smart glasses 90 receive the data information sent by the main computing device 101 containing at least the content currently selected by the user, generate the content to be presented, and control it to be displayed on sub-extension screen 1.
It should be noted that the specific name of the identifier corresponding to a sub-extension screen is not limited and may be any other name capable of identifying each sub-extension screen; similarly, the name of the attribute value, in the attribute content obtained by an attribute operation on the main display 102, that corresponds to each sub-extension screen of the smart glasses 90 is not limited. In one possible embodiment, the main computing device 101 is communicatively connected to two pairs of smart glasses at the same time; in this case the configuration information of the extended screen further includes an identifier of the smart glasses, and the name of the corresponding attribute value in the attribute content presented on the main display 102 includes that identifier, for example "send content to sub-extension screen 1 of smart glasses A" and "send content to sub-extension screen 1 of smart glasses B", thereby distinguishing the extended screens of different smart glasses.
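The configuration information exchanged at connection time, and the attribute-value menu entries the host derives from it, might look like the following. The field names, class name and menu strings are illustrative only; the patent fixes neither a data format nor exact wording.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtendedScreenConfig:
    glasses_id: str                                           # e.g. "smart glasses A"
    sub_screen_ids: List[str] = field(default_factory=list)   # e.g. ["sub extension screen 1", ...]

def attribute_menu_entries(configs, include_glasses_id=False):
    """Build the 'send content to ...' attribute values shown on the main display
    for every connected pair of smart glasses."""
    entries = []
    for cfg in configs:
        for sub_id in cfg.sub_screen_ids:
            if include_glasses_id:
                entries.append(f"send content to {sub_id} of {cfg.glasses_id}")
            else:
                entries.append(f"send content to {sub_id}")
    return entries

# Example: one pair of glasses whose extended screen is configured as 3 sub-extension screens
# attribute_menu_entries([ExtendedScreenConfig(
#     "smart glasses A",
#     ["sub extension screen 1", "sub extension screen 2", "sub extension screen 3"])])
```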
One or more items of content presented on the extended screen of the smart glasses 90 may be moved off the extended screen by a removal operation, which may be a deletion or cut performed directly through the input device, or a drag that moves the content off the extended screen. For the removal operation, the user may select how the content is presented afterwards. For example, if the user chooses that content removed from the extended screen is transmitted back to the main computing device 101 and the main display 102 already holds content with the same name, the user may also choose whether the main computing device 101 overwrites the existing content; if the main display 102 holds no content with the same name, the user may choose whether the main computing device 101 presents the content on the main display 102.
In yet another possible implementation, the specific implementation steps of the user selecting the content to be presented on the smart glasses 90 on the main display 102 may be:
the main display 102 is configured to have a pre-storage window smaller than the main display screen, which is denoted as 102B; the user places one or more contents that need to be presented on the smart glasses in the pre-storage window 102B through a placing operation based on an input device communicably connected with the host computing device 101.
Fig. 8 (a) shows the pre-storage window on the main display. The active display area on the main display 102 is the main display screen 102A, and the pre-storage window 102B is a portion of the main display screen 102A.
When the content selected by the user consists of one or more content windows on the main display 102, such as one or more word-processing application windows, or of one or more icons corresponding to specific application content, such as the icon of a PDF document or of a Word document, the user may place the content in the pre-storage window 102B by operating the input device in the conventional drag-copy or drag-move manner. When the content the user selects to present on the smart glasses 90 is part of the content of a content window on the main display 102, the user can place that partial content in the pre-storage window 102B through a paste operation following a conventional copy or cut operation with the input device; when the main computing device 101 detects that the content placed in the pre-storage window 102B is part of the content of a content window, it creates a content window and places the partial content in the created window in the form of text or pictures.
The number of content items the user can place in the pre-storage window in a single placement operation may be preset by the main computing device 101 or configured to be set by the user; it may be a single item or multiple items.
In a possible embodiment, the number of content items the user can place in a single placement operation is one. In this case, if the user selects several items on the main display 102, the placement operation is invalid and none of the selected items is placed in the pre-storage window 102B; the user can only place one selected item in the pre-storage window 102B per placement operation, and can place several items by repeating the operation. After each placement operation, the single item selected by the user is adaptively maximized to fill the pre-storage window 102B and covers the previous content. For example, the pre-storage window 102B may be a window limited to a 1:1 square area or to a 16:9 or other aspect ratio, and the single content item selected by the user may either be adaptively maximized to fill that limited area or be maximized according to the display ratio preset for the selected content window.
When the user sets the number of items for a single placement operation to multiple, the user may select one or more content items on the main display 102; these are presented in the pre-storage window 102B in the form of content icons arranged in the order of the user's selection. When the user again selects one or more items on the main display 102 to place in the pre-storage window 102B, the newly selected items are appended, in selection order, to the end of the existing queue of content icons and presented in the pre-storage window 102B as content icons.
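The single-placement and multi-placement behaviour of the pre-storage window 102B can be sketched as follows. The class is a simplification invented for illustration; in particular, "adaptive maximization" is reduced to simply replacing the current content.

```python
class PreStorageWindow:
    """Simplified model of the pre-storage window 102B. In single-placement mode
    each placement replaces (covers) the current content; in multi-placement mode
    the selected items are appended as content icons in selection order."""

    def __init__(self, single_placement=True):
        self.single_placement = single_placement
        self.contents = []

    def place(self, selected_contents):
        """selected_contents: list of items selected on the main display 102.
        Returns True if the placement operation is accepted."""
        if self.single_placement:
            if len(selected_contents) != 1:
                return False                         # several items at once: invalid
            self.contents = [selected_contents[0]]   # maximized and covers the previous content
            return True
        self.contents.extend(selected_contents)      # appended to the end of the icon queue
        return True
```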
The user may interact with the content on the extended screen through one or more input devices communicably connected to the smart glasses 90, such as a handle, a gesture-tracking recognition device, an eye-interaction device, or an input interface built into the smart glasses 90 (e.g., a touch screen or buttons).
The user may also interact with the content presented on the extended screen by performing interactive operations on the content in the pre-storage window 102B on the main display through an input device communicably connected to the main computing device 101. For example, if one or more content icons are presented in the pre-storage window 102B, the same content icons are presented on the extended screen; by double-clicking or clicking one of the content icons in the pre-storage window 102B, the user causes the corresponding content window to be presented on the extended screen, the content window being shown maximized in the pre-storage window 102B. The user may also adjust the display size of the content window in the pre-storage window (for example, to a size smaller than the maximized size) and then double-click or click another content icon to display its corresponding content window and adjust its display size; at this point the pre-storage window 102B shows several content icons and the two content windows, and the extended screen likewise shows several content icons and the two content windows.
When the extended screen of the smart glasses 90 is configured as a plurality of sub-extension screens, as shown in (a) of Fig. 6 with the sub-extension screens 901A1, 901A2 and 901A3, the pre-storage window 102B on the main display 102 is configured with a corresponding number of pre-storage sub-windows according to the number of sub-extension screens. As shown in (b) of Fig. 8, the pre-storage window 102B includes the pre-storage sub-windows 102B1, 102B2 and 102B3, which correspond one to one with the sub-extension screens. Through placement operations, the user places the content to be displayed on the sub-extension screen 901A1 in the pre-storage sub-window 102B1, the content to be displayed on the sub-extension screen 901A2 in the pre-storage sub-window 102B2, and the content to be displayed on the sub-extension screen 901A3 in the pre-storage sub-window 102B3.
When the main display 102 has the pre-storage window 102B, the user's interactions with content on the extended screen may also be transmitted to the main computing device 101, which changes the content in the pre-storage window 102B according to the result of the interaction. For example, when the extended screen is one sub-extension screen and the user operates a content icon on the extended screen so that the corresponding content window appears on the extended screen at the display size of the sub-extension screen, the main computing device 101 simultaneously displays the corresponding content in the pre-storage window 102B in the form of a content window shown full screen within the pre-storage window 102B.
One or more content items on the extended screen of the smart glasses 90 may be moved off the extended screen by a removal operation, or moved out of the pre-storage window 102B by a removal operation in the pre-storage window 102B; the extended screen and the pre-storage window 102B respond simultaneously to a removal operation performed on either of them. The removal operation may be a deletion or cut performed directly through the input device, or a drag that moves the content off the extended screen or out of the pre-storage window 102B. For the removal operation, the user may choose its effect: for example, the user may choose that when an item is moved out of the pre-storage window 102B or off the extended screen it is presented on the main display 102, and if content with the same name already exists on the main display 102 the user may also choose to replace the existing content; if no content with the same name exists on the main display, the user may choose whether or not the content is presented on the main display.
In one possible implementation, the user's selection on the main display 102 of the content to be presented on the smart glasses can also be carried out through the following step:
the user drags the content to be displayed on the extended screen of the smart glasses 90 out of the main display screen 102A area by a drag on the main display 102.
When the extended screen is configured with one sub-extension screen, the user selects, through the input device, the content to be displayed on the smart glasses and drags it outside the main display screen 102A of the main display 102; the main computing device 101 sends the content to the smart glasses 90, and the smart glasses 90 generate the content to be presented and present it on the sub-extension screen. When the user performs the drag operation again, the smart glasses 90 generate content to be presented for the newly received content, present it on the sub-extension screen, and cover the content to be presented that previously existed there. When the extended screen is configured with a plurality of sub-extension screens, for example two, the content to be presented generated by the smart glasses 90 after the user's first drag operation is configured to be presented on the first sub-extension screen, the content generated after the second drag operation is configured to be presented on the second sub-extension screen, the content generated after the third drag operation is configured to be presented on the first sub-extension screen and to cover its existing display content, and the content generated after the fourth drag operation is configured to be presented on the second sub-extension screen and to cover its existing display content.
When the extended screen of the smart glasses 90 is configured as one sub-extension screen, the content moved out of the main display screen 102A area by the dragging operation described above is sent to the smart glasses 90, the smart glasses 90 generate the content to be presented, and when the viewing-direction tracking module detects that the viewing direction is within the user target viewing direction range corresponding to the sub-extension screen of the smart glasses 90, the content to be presented is displayed on the sub-extension screen.
When the extended screen of the smart glasses 90 is configured as a plurality of sub-extension screens, for example the sub-extension screens 901A1, 901A2 and 901A3 shown in (a) of Fig. 6, the content moved out of the main display screen 102A area by the dragging operation is sent to the smart glasses 90, and the smart glasses 90 generate the content to be presented containing that content and associate it with one of the sub-extension screens according to the association order of the sub-extension screens. For example, denote the content to be presented as content A. When the sub-extension screen 901A1 has already been associated with other content to be presented and the other two sub-extension screens have not, the smart glasses 90 associate content A with the sub-extension screen 901A2, and when the viewing-direction tracking module 800 detects that the viewing direction of the smart glasses 90 is within the user target viewing direction range corresponding to the sub-extension screen 901A2, content A is displayed on the sub-extension screen 901A2. When both sub-extension screens 901A1 and 901A2 have already been associated with other content to be presented, content A is associated with the sub-extension screen 901A3. When the sub-extension screen 901A1 has been associated with two other items of content to be presented and the sub-extension screens 901A2 and 901A3 have each been associated with one, the smart glasses 90 associate content A with the sub-extension screen 901A2; the user can choose to have content A cover the other content to be presented already associated with the sub-extension screen 901A2, or, when the sub-extension screen 901A2 is associated with only one item, can choose to associate content A with the sub-extension screen 901A2 in a juxtaposed manner, in which case the sub-extension screen 901A2 is associated with two items of content to be presented, and when the viewing-direction tracking module 800 detects that the viewing direction of the smart glasses 90 is within the user target viewing direction range corresponding to the sub-extension screen 901A2, the most recently associated content A is displayed on the sub-extension screen 901A2.
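The association order described for dragged-out content (fill empty sub-extension screens first, then the least-loaded one, overlaying or juxtaposing at the user's choice) can be sketched as below. This is a plausible reading of the text rather than a normative algorithm, and the identifiers are illustrative.

```python
def associate_dragged_content(content, associations, overlay=True):
    """associations maps sub-extension screen ids to the list of content items
    already associated with them, in the screens' association order, e.g.
    {"901A1": [...], "901A2": [...], "901A3": [...]}. Returns the chosen screen."""
    # Prefer a sub-extension screen with no associated content yet
    for screen, items in associations.items():
        if not items:
            items.append(content)
            return screen
    # Otherwise take the least-loaded sub-extension screen (first one in order on a tie)
    screen = min(associations, key=lambda s: len(associations[s]))
    if overlay:
        associations[screen] = [content]        # cover the existing content to be presented
    else:
        associations[screen].append(content)    # juxtapose; the most recent item is displayed
    return screen
```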
It should be noted that content dragged outside the area of the main display screen 102A is not removed from the main display 102.
In the extended screen display system provided by the utility model, the combination of the extended screen provided by the smart glasses and the main display screen of the main display enlarges the display area available to the user when simultaneously opening and processing multiple application windows through the main computing device, so that the user can easily handle a plurality of application windows and carry out multitask work on the computer more efficiently.
An embodiment of the utility model further provides a display method for the extended screen, applied to the extended screen display system, which enables the user to carry out multitask work effectively between the main display screen and the extended screen.
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features. In the description of the present invention, it should be further noted that terms such as "upper" and "lower" indicate positions or positional relationships based on those shown in the drawings, or those in which the product of the present invention is usually placed when used. They are used only for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An extended screen display system, the display system comprising a main computing device, a main display, at least one input device, smart glasses, and a viewing-direction tracking module, the main computing device communicatively coupled to the main display, the main display configured to provide a main display screen, the smart glasses communicatively coupled to the main computing device, the smart glasses comprising a memory, a processor, and an extended display, the extended display configured to provide an extended screen;
the smart glasses generate content to be presented according to the content provided by the main computing device, and control the extended display to display the content to be presented on the extended screen when the viewing direction is judged to be within the range of the target viewing direction of the user.
2. The extended-screen display system of claim 1, wherein the viewing-direction tracking module is an outward shooting tracking module and is disposed on the smart glasses.
3. The extended-screen display system of claim 2, wherein a preset reference point is provided on the main display, and the outward shooting tracking module determines the viewing direction of the user wearing the smart glasses by tracking and identifying the preset reference point.
4. The extended-screen display system of claim 1, wherein the viewing-direction tracking module is an outward shooting tracking module, and is disposed outside the smart glasses at any spatial position from which an image of the user wearing the smart glasses can be acquired.
5. The extended-screen display system of claim 4, wherein the smart glasses are provided with a preset reference point.
6. The extended-screen display system of any one of claims 1-5, wherein the viewing direction tracking module further comprises an eye-shooting camera module, and the eye-shooting camera module is used for shooting an eye pattern of a user wearing the smart glasses.
7. The extended-screen display system of any one of claims 1-5, wherein the viewing direction tracking module is communicatively coupled to the smart glasses to send the smart glasses the viewing direction of the user wearing the smart glasses.
8. The extended-screen display system of claim 6, wherein the viewing direction tracking module is communicatively coupled to the smart glasses to send the smart glasses the viewing direction of the user wearing the smart glasses.
9. The extended-screen display system of any one of claims 1-5, wherein the input device is communicatively coupled to the smart glasses for the user to interact with the content presented on the extended screen.
10. The extended-screen display system of any one of claims 1-5, wherein a main display screen of the main display is provided with a pre-stored window, the input device is communicatively connected to the main computing device, and the user performs interactive operations on the extended screen by interacting, via the input device, with the content in the pre-stored window on the main display.
CN202021115205.1U 2020-06-16 2020-06-16 Extended screen display system Expired - Fee Related CN212256285U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202021115205.1U CN212256285U (en) 2020-06-16 2020-06-16 Extended screen display system

Publications (1)

Publication Number Publication Date
CN212256285U (en) 2020-12-29

Family

ID=73986930

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202021115205.1U Expired - Fee Related CN212256285U (en) 2020-06-16 2020-06-16 Extended screen display system

Country Status (1)

Country Link
CN (1) CN212256285U (en)

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201229