CN114115771B - Image processing method - Google Patents

Image processing method

Info

Publication number
CN114115771B
CN114115771B (application CN202110631359.9A)
Authority
CN
China
Prior art keywords
image processing
window
processing method
image
target computer
Prior art date
Legal status
Active
Application number
CN202110631359.9A
Other languages
Chinese (zh)
Other versions
CN114115771A (en)
Inventor
张立人
Current Assignee
Aten International Co Ltd
Original Assignee
Aten International Co Ltd
Priority date
Filing date
Publication date
Application filed by Aten International Co Ltd filed Critical Aten International Co Ltd
Publication of CN114115771A publication Critical patent/CN114115771A/en
Application granted granted Critical
Publication of CN114115771B publication Critical patent/CN114115771B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An image processing method for an image processing device. The image processing device is coupled between a source computer and at least one target computer. The image processing method includes the following steps: receiving a running picture of the source computer; identifying a plurality of windows contained in the running picture; acquiring all or part of the windows to generate a corresponding plurality of window images; outputting the window images to the at least one target computer; and displaying one or more of the window images.

Description

Image processing method
Technical Field
The present disclosure relates to an image processing method, and more particularly, to an image processing method for improving window display.
Background
Generally, an operator in a remote control-center room must switch between the connection terminals of multiple hosts to perform operations. However, when several windows are displayed simultaneously on one display screen, it is difficult for the operator to view the on-screen information clearly; moreover, in the prior art, even when multiple display screens are provided, a window cannot be dragged from its original screen to another screen. Working for long periods in such an environment tends to increase operator fatigue.
Therefore, how to improve the display of window images is one of the important subjects in the art.
Disclosure of Invention
The present disclosure relates to an image processing method for an image processing device. The image processing device is coupled between a source computer and at least one target computer. The image processing method includes the following steps: receiving a running picture of the source computer; identifying a plurality of windows contained in the running picture; acquiring all or part of the windows to generate a corresponding plurality of window images; outputting the window images to the at least one target computer; and displaying one or more of the window images.
In some embodiments, the step of identifying the window includes: shape objects conforming to at least one frame feature are detected inwardly from an outer edge of the running picture.
In some embodiments, the image processing method further includes: when the border of the first shape object and the border of the second shape object are staggered, overlapped or share a border, judging that the first shape object and the second shape object are at least partially overlapped; and when the first shape object and the second shape object are judged to be at least partially overlapped, controlling the display position of the window in the running picture of the source computer according to the preset instruction of the input interface device.
In some embodiments, the step of acquiring to generate a corresponding plurality of window images includes: acquiring the content within the shape object in the running picture as a corresponding one of the window images.
In some embodiments, the image processing method further includes: receiving an operation picture by an image processing device; identifying the window by the image processing device; acquiring by an image processing device to generate a window image; compressing and outputting the window image by the image processing device; receiving the compressed window image by at least one target computer and decompressing; and displaying the decompressed window image by at least one target computer.
In some embodiments, the step of displaying one or more of the window images includes: one or more of the window images are displayed by a plurality of display devices connected to at least one target computer, respectively.
In some embodiments, the area of the window and the area of the running screen are in a first ratio, and the area of one of the window images and the area of a corresponding one of the display devices are in a second ratio that is greater than the first ratio.
In some embodiments, the image processing method further includes: compressing and outputting the running picture by the image processing device; receiving the compressed running picture by at least one target computer and decompressing; identifying the window by at least one target computer according to the decompressed running picture to generate a window image; and displaying the window image by at least one target computer.
In some embodiments, the image processing method further includes: converting the absolute coordinates of the mouse on the display device into the position of the mouse in the running picture of the source computer.
In some embodiments, the image processing method further includes: controlling the display position of the window in the running picture of the source computer according to a preset instruction of the input interface device.
In summary, by the image processing method, the image processing device performs window recognition and acquisition on the running image of the source computer, and transmits the acquired one or more window images to one or more target computers, so that one or more target computers can display the window images corresponding to one or more windows in the source computer through the connected display device.
Drawings
Fig. 1 is a schematic diagram illustrating an image processing system according to some embodiments of the present disclosure.
Fig. 2 is a flowchart illustrating an image processing method according to some embodiments of the present disclosure.
FIG. 3 is a schematic diagram illustrating the identification and acquisition of a window to generate a window image in accordance with some embodiments of the present disclosure.
FIG. 4A is a schematic diagram illustrating an output and display of a window image according to some embodiments of the present disclosure.
FIG. 4B is a schematic diagram illustrating another output and display of a window image according to some embodiments of the present disclosure.
FIG. 5 is a schematic diagram illustrating a window image recognition according to some embodiments of the present disclosure.
Fig. 6A is a schematic diagram illustrating window overlap in accordance with some embodiments of the present disclosure.
Fig. 6B is a schematic diagram illustrating a display segmentation in accordance with some embodiments of the present disclosure.
FIG. 7 is a schematic diagram illustrating a window image display according to some embodiments of the present disclosure.
Fig. 8A is a schematic diagram illustrating a display device displaying a window image according to some embodiments of the present disclosure.
Fig. 8B is a schematic diagram illustrating an operation screen of a source computer according to some embodiments of the present disclosure.
Fig. 9A is a schematic diagram illustrating another display device displaying a window image according to some embodiments of the present disclosure.
Fig. 9B is a schematic diagram illustrating an operation screen of another source computer according to some embodiments of the present disclosure.
Reference numerals illustrate:
100: image processing system
120_1 to 120_n: source computer
140: image processing apparatus
142: image identifier
144: processor and method for controlling the same
150: ethernet network
160_1 to 160_m: target computer
162: controller for controlling a power supply
164a, 164b, 164c, 164d, 164e: Display card
180, 180_11 to 180_mk: Display device
IMD1, IMD3, IMD4: running picture
IMD2: window image
200: image processing method
S210, S220, S230, S240, S250: Operations
f1, f2, f3, f4, f4', f5, f5': window
P1, P2, P3: window image
W1, W2, W3: window image
SC: display area
M1, M1', M2': mouse with mouse body
X1, Y1, X2, Y2: coordinates of
Detailed Description
The following examples are provided to aid understanding of the embodiments of the present disclosure, but they are not intended to limit its scope, and the description of structural operation is not intended to limit the order of execution; any rearrangement of elements that produces an equivalent technical result falls within the disclosure. Moreover, the drawings are not drawn to scale; in accordance with industry practice, the dimensions of various features may be arbitrarily enlarged or reduced for clarity of illustration. Like elements are denoted by like reference numerals throughout the following description for ease of understanding.
Unless otherwise indicated, terms used throughout this specification and the related application documents generally carry the ordinary meaning each term has in this field, in this disclosure, and in its particular context. Certain terms used to describe the disclosure are discussed below, or elsewhere in this specification, to provide additional guidance to those skilled in the art.
Furthermore, the terms "comprising," "including," "having," "containing," and the like, as used herein, are open-ended terms, meaning "including but not limited to." Furthermore, as used herein, "and/or" includes any one or more of the associated listed items and all combinations thereof.
Herein, when an element is referred to as being "connected" or "coupled," it can mean "electrically connected" or "electrically coupled." "Connected" or "coupled" may also mean that two or more elements cooperate or interact with each other.
Furthermore, although the terms "first," "second," etc. may be used herein to describe various elements, these terms merely distinguish elements or operations described with the same technical term. Unless the context clearly indicates otherwise, the terms neither denote nor imply any order or sequence, nor are they intended to limit the invention.
For convenience of explanation, the lower-case indices 1 to n, 1 to m, and 1 to k in the element and signal numbers used in this disclosure and the drawings merely reference individual elements and signals; they are not intended to limit the number of those elements and signals to any specific quantity. When an element or signal number is used without an index, it refers to any unspecified member of the group of elements or signals to which it belongs. For example, the object indicated by the element number 120_1 is the source computer 120_1, and the object indicated by the element number 120 is any unspecified one of the source computers 120_1 to 120_n. Likewise, the object indicated by the element number 160_1 is the target computer 160_1, and the object indicated by the element number 160 is any unspecified one of the target computers 160_1 to 160_m.
Please refer to fig. 1. Fig. 1 is a schematic diagram illustrating an image processing system 100 according to some embodiments of the present disclosure. As shown in fig. 1, the image processing system 100 includes source computers 120_1 to 120_n, an image processing device 140, target computers 160_1 to 160_m, and display devices 180_11 to 180_mk. The image processing device 140 is connected to the source computers 120_1 to 120_n, and is connected to the target computers 160_1 to 160_m through the Ethernet 150. The target computers 160_1 to 160_m are respectively connected to the display devices 180_11 to 180_mk.
Specifically, the image processing device 140 may be a KVM switch. The image processing device 140 can be connected to the source computers 120_1 to 120_n through a High Definition Multimedia Interface (HDMI), a Digital Visual Interface (DVI), a VGA connector (Video Graphics Array connector), or a Universal Serial Bus (USB). The target computers 160_1 to 160_m may likewise be connected to the display devices 180_11 to 180_mk through HDMI, DVI, a VGA connector, or USB. For example, as shown in fig. 1, the target computer 160_1 is connected to the display devices 180_11 to 180_1k, the target computer 160_2 is connected to the display devices 180_21 to 180_2k, and similarly the target computer 160_m is connected to the display devices 180_m1 to 180_mk.
Operationally, one of the source computers (e.g., the source computer 120_1 in fig. 1) is configured to output the running picture IMD1 to the image processing device 140. The running picture IMD1 is the picture generated by the source computer 120 in its current running state; put another way, it may be the image currently displayed by a display device connected to the source computer 120. When the image processing device 140 receives the running picture IMD1, the image processing device 140 is configured to identify a plurality of windows included in the running picture IMD1, acquire the windows to generate a corresponding plurality of window images IMD2, and transmit the window images IMD2 to one or more of the target computers 160_1 to 160_m via the Ethernet 150. The corresponding one of the target computers 160_1 to 160_m (e.g., the target computer 160_1 in fig. 1) is configured to receive the window image IMD2 and output one or more of its window images to the correspondingly connected display devices (e.g., the display devices 180_11 to 180_1k in fig. 1) for display.
It should be noted that the above hardware devices or the connection method is merely for illustration, but the disclosure is not limited thereto. That is, although the image processing system 100 shown in fig. 1 includes n source computers 120_1 to 120_n, m target computers 160_1 to 160_m, and m by k display devices 180_11 to 180_mk, where n, m, k are all positive integers, the number is merely an example for convenience of description and not intended to limit the disclosure. In other words, in other embodiments, the image processing system 100 may include only one source computer or one target computer, and/or the target computer may be connected to only one display device. Alternatively, the image processing system 100 may include a plurality of target computers, and the number of display devices connected to each target computer may be different (i.e., not necessarily k).
Please refer to fig. 2. Fig. 2 is a flow chart illustrating an image processing method 200 according to some embodiments of the present disclosure. For convenience of description, the following description will be given with reference to the embodiment of fig. 1, but not limited thereto, and various changes and modifications may be made by those skilled in the art without departing from the spirit and scope of the present disclosure. As shown in fig. 2, the image processing method 200 includes operations S210, S220, S230, S240, and S250.
First, in operation S210, the running picture IMD1 of the source computer 120 is received. An example running picture IMD1 is shown in fig. 3. The image processing device 140 receives the running picture IMD1.
Next, in operation S220, a plurality of windows included in the running picture IMD1 are identified. Then, in operation S230, all or part of the windows are acquired to generate a corresponding plurality of window images. For example, as shown in fig. 3, the running picture IMD1 includes windows f1 to f3. In some embodiments, the image processing device 140 includes an image identifier 142 and a processor 144. After receiving the running picture IMD1, the image identifier 142 performs image recognition to generate window images P1 to P3 corresponding to the windows f1 to f3, and transmits the window images P1 to P3 to the processor 144. After receiving the window images P1 to P3, the processor 144 compresses them to generate a window image IMD2. In some other embodiments, the processor 144 may compress only a portion of the window images (e.g., only the window image P1) to generate the window image IMD2. Details of image recognition will be described in the following paragraphs in conjunction with fig. 5.
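The capture, compress, and later decompress flow described above (image identifier, then processor-side compression, then controller-side decompression at the target computer) can be sketched as a simple round trip. This is only an illustrative sketch: zlib stands in for whatever codec the device actually uses, and all function names are hypothetical.

```python
import json
import zlib

def pack_window_images(window_images):
    """Compress captured window images (name -> raw bytes) into one payload,
    standing in for the processor 144 generating the window image IMD2."""
    payload = {name: zlib.compress(data).hex()
               for name, data in window_images.items()}
    return json.dumps(payload).encode()

def unpack_window_images(blob):
    """Decompress the payload back into window images, standing in for the
    controller 162 on the target computer."""
    payload = json.loads(blob.decode())
    return {name: zlib.decompress(bytes.fromhex(h))
            for name, h in payload.items()}
```

A round trip recovers the original images exactly, which is what lets the target computer redisplay them unchanged.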
Next, in operation S240, the window image IMD2 is outputted to one or more target computers 160. For example, as shown in fig. 4A, the window image IMD2 output from the image processing device 140 is received by the target computer 160.
Finally, in operation S250, one or more of the window images in IMD2 are displayed by one or more of the display devices 180_11 to 180_1k. In some embodiments, as shown in fig. 4A, the target computer 160_1 includes a controller 162. The controller 162 receives the window image IMD2 and decompresses it to generate one or more of the window images it contains (e.g., the window images W1, W2, W3, and a combination of W1 to W3), and transmits them to the corresponding display cards (e.g., the display cards 164a, 164b, 164c, 164d), respectively, so that the target computer 160_1 outputs the corresponding window images to the corresponding display devices (e.g., the display devices 180_11, 180_12, 180_13, 180_14) for display.
In detail, the controller 162 transmits the window image W1 to the display card 164a, so that the target computer 160_1 outputs the window image W1 to the display device 180_11 for display via the display card 164a. The controller 162 transmits the window image W2 to the display card 164b, so that the target computer 160_1 outputs the window image W2 to the display device 180_12 for display via the display card 164b. The controller 162 transmits the window image W3 to the display card 164c, so that the target computer 160_1 outputs the window image W3 to the display device 180_13 for display via the display card 164c. The controller 162 transmits the window images W1 to W3 to the display card 164d, so that the target computer 160_1 outputs the window images W1 to W3 to the display device 180_14 for display via the display card 164d.
In other words, the controller 162 may send one or more of the window images W1 to W3 included in the window image IMD2 to each display card 164, and different display cards 164 may independently output the same or different content to their connected display devices 180. Thus, one or more window images can be displayed on a single display device, and identical, partially identical, or entirely different window images can be displayed on different display devices.
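The fan-out just described, in which each display card outputs its own subset of the decompressed window images, can be modeled as a small routing table. The card identifiers and image names below are taken from fig. 4A; the table itself is a hypothetical illustration, not part of the patent.

```python
# Routing table mirroring fig. 4A: display card -> window images it outputs.
ROUTING = {
    "164a": ["W1"],
    "164b": ["W2"],
    "164c": ["W3"],
    "164d": ["W1", "W2", "W3"],  # one card may combine several images
}

def images_for_card(routing, card_id):
    """Return the window images a given display card should output;
    an unknown card outputs nothing."""
    return routing.get(card_id, [])
```

Because each entry is independent, the same image (e.g., W1) can appear on several cards at once, which is exactly the "same, partially the same, and completely different" display behavior described above.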
In addition, in other embodiments, as shown in fig. 4B, the image processing system 100 includes a plurality of target computers 160_1 and 160_2. In this embodiment, the image processing device 140 outputs the window image IMD2 to the target computers 160_1 and 160_2, respectively. The controller 162_1 included in the target computer 160_1 receives the window image IMD2, decompresses it to generate the corresponding window images W1 and W2, and transmits them to the corresponding display cards 164a and 164b, so that the target computer 160_1 outputs the window images W1 and W2 to the corresponding display devices 180_11 and 180_12 for display via the display cards 164a and 164b, respectively. The controller 162_2 included in the target computer 160_2 receives the window image IMD2, decompresses it to generate the corresponding window images W2, W3, and a combination of W1 to W3, and transmits them to the corresponding display cards 164c, 164d, and 164e, respectively, so that the target computer 160_2 outputs these window images to the corresponding display devices 180_21, 180_22, and 180_23 for display via the display cards 164c, 164d, and 164e, respectively.
In other words, the different target computers 160_1 and 160_2 receive the same window image IMD2, while the display content output by each display card 164, whether in the same or different target computers, is independent. In this way, identical, partially identical, or entirely different window images may be displayed on the display devices respectively connected to one target computer (e.g., the target computer 160_1) and another (e.g., the target computer 160_2).
In addition, as shown in fig. 4A and 4B, the ratio of the area of a window image to the area of the corresponding display device is greater than the ratio of the area of the window to the area of the running picture. For example, the window image W1 occupies a larger proportion of the display area of the display device 180_11 than the window f1 occupies in the running picture IMD1, and the window image W2 occupies a larger proportion of the display area of the display device 180_12 than the window f2 occupies in the running picture IMD1. Details of displaying the window image are described in the following paragraphs in conjunction with fig. 7.
In this way, by the image processing method 200, the image processing device 140 performs window recognition and acquisition on the running picture IMD1 of the source computer 120 and transmits the acquired window images P1 to P3 to one or more of the target computers 160_1 to 160_m, so that those target computers can display the window images W1 to W3, corresponding to the windows f1 to f3 of the source computer 120, through their connected display devices.
It should be noted that the numbers of windows and window images, target computers, display cards, and display devices above are merely for convenience of illustration, and the disclosure is not limited thereto. In addition, although the image processing method 200 is described above with the image processing device 140 performing window recognition, acquisition, and window image compression, followed by decompression at the target computer 160_1, in other embodiments the image processing device 140 may instead directly compress the running picture IMD1 and transmit it, compressed, to the target computer 160_1. The target computer 160_1 then receives the compressed running picture, decompresses it in software, performs window recognition and acquisition on the decompressed running picture to generate the window images W1 to W3, and finally outputs the window images W1 to W3 to the display device 180 for display.
In addition, when the image processing system 100 includes a plurality of source computers 120, the image processing device 140 receives a switching signal from the target computer 160, so that the target computer 160 controls the switching of source computers 120. Specifically, when the target computer 160 sends a switching signal, the image processing device 140 determines which of the source computers 120_1 to 120_n to switch to according to that signal. The switching signal may be, for example, a switching instruction generated by keyboard input, mouse operation, or other means. In this way, a user at the control center can send an instruction through the input interface device connected to the target computer 160 to switch which source computer the current operation controls, thereby switching between the connection terminals of multiple hosts.
For details of image recognition, please refer to fig. 5. FIG. 5 is a schematic diagram illustrating window image recognition according to some embodiments of the present disclosure. In this embodiment, the running picture of the source computer 120 is shown as the running picture IMD1 in fig. 5. The image processing device 140 (or software in the target computer 160) detects shape objects conforming to the frame features, working inward from the outer edge of the running picture IMD1, to identify the windows f1 to f3, and acquires the content within each shape object in the running picture IMD1 as the window images P1 to P3.
In some embodiments, the shape object is a rectangular object. For a rectangular object, the frame features include, for example, whether horizontal or vertical lines exist, whether their colors are similar, whether the lines connect to one another, whether the connected lines form a rectangle, and whether a frame is shared. As indicated by the arrow directions in fig. 5, the running picture IMD1 is scanned from top to bottom, bottom to top, left to right, and right to left, and whether a window exists and its extent are determined by whether the frame features are met. After the four edges of a rectangular object are captured, the inward detection stops, to avoid mistakenly identifying a table or other rectangular image inside the window as a window. However, the outline of the shape object is not limited to a rectangle; objects of different shapes may be defined according to the user's operational requirements, each with a plurality of frame features conforming to its outline.
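A toy version of this edge-inward window detection can be written over a character grid, where a window frame is a closed rectangle drawn with a single frame character. This is a simplified sketch of the idea only, not the patent's actual detector, which works on pixel data with color-similarity and line-connectivity checks.

```python
def find_rectangles(grid, frame="#"):
    """Detect closed axis-aligned rectangular frames in a character grid.

    A stand-in for the patent's inward scan from the running picture's
    outer edge: a candidate top-left corner is extended right along the
    top edge and down along the left edge, and a window is reported only
    if the bottom and right edges close the rectangle (the "four frames"
    check). Returns (top, left, bottom, right) tuples.
    """
    h, w = len(grid), len(grid[0])
    rects = []
    for top in range(h):
        for left in range(w):
            if grid[top][left] != frame:
                continue
            right = left
            while right + 1 < w and grid[top][right + 1] == frame:
                right += 1
            bottom = top
            while bottom + 1 < h and grid[bottom + 1][left] == frame:
                bottom += 1
            if right == left or bottom == top:
                continue
            if (all(grid[bottom][c] == frame for c in range(left, right + 1))
                    and all(grid[r][right] == frame for r in range(top, bottom + 1))):
                rects.append((top, left, bottom, right))
    return rects
```

Unlike the patent's detector, this sketch does not stop inward detection after the outer frame, so nested rectangles (a table inside a window) would also be reported; the early-stop rule described above is precisely what suppresses those false positives.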
In some embodiments, when more than one rectangular object is detected and the frames of the rectangular objects are partially staggered, overlapped, or share a boundary, the objects conforming to the frame features are determined to overlap. As shown in fig. 6A, in the running picture IMD3 of the target computer 160, the right frame of the rectangular object f4 is partially interlaced with the left frame of the rectangular object f5: because the rectangular object f4 lies in the foreground relative to the rectangular object f5, the right side of f4 covers part of the left side of f5, and the vertical boundaries of the right side of f4 and the left side of f5 overlap and are shared, so the two rectangular objects are determined to overlap. At this time, the user may issue a frame division command to the target computer 160 through an input interface device (such as a keyboard or mouse) connected to it; the command is transmitted to the source computer 120 through the Ethernet 150 and the image processing device 140, so that the source computer 120 divides the display screen into a plurality of display areas according to the frame division command and shrinks the windows of the current running picture IMD3 into different display areas according to their current aspect ratios. As shown in fig. 6B, after receiving the frame division command, the source computer 120 divides the running picture IMD4 into four display areas; the window f4' is reduced, preserving the aspect ratio of the original window f4, until its length or width fits the upper-left display area, and the window f5' is likewise reduced, preserving the aspect ratio of the original window f5, until it fits the lower-right display area.
Therefore, whether the frames of the rectangular objects are staggered, overlapped, or share boundaries is detected to determine whether the windows overlap one another; the display area is then divided by the frame division command, ensuring that the windows do not overlap and avoiding incomplete window images at acquisition time.
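The overlap test described above (frames staggered, overlapped, or sharing a boundary) reduces to an axis-aligned rectangle intersection check in which touching edges also count as overlap. A minimal sketch, with rectangles as (top, left, bottom, right) tuples:

```python
def rects_overlap(a, b):
    """True if two rectangles are staggered, overlapped, or share a boundary.

    Rectangles are (top, left, bottom, right). The comparisons are
    non-strict, so touching edges count as overlap, matching the
    "share a boundary" case in the text.
    """
    at, al, ab, ar = a
    bt, bl, bb, br = b
    return not (ar < bl or br < al or ab < bt or bb < at)
```

Two rectangles fail to overlap only when one lies entirely to the left of, right of, above, or below the other; negating that disjunction gives the test.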
For details of displaying a window image, please refer to fig. 7. FIG. 7 is a schematic diagram illustrating a window image display according to some embodiments of the present disclosure. In this embodiment, the window image W1 is displayed by the display device 180 and corresponds to the window image P1 of the window f1 included in the running picture IMD1 of fig. 5. Specifically, the target computer 160 enlarges the window image W1, at the aspect ratio of the window image P1 in the running picture IMD1 shown in fig. 5, until it equals or is slightly smaller than the display area SC of the display device 180. For example, as shown in fig. 7, the window image W1 is enlarged in equal proportion until its upper and lower edges approximately coincide with the upper and lower inner borders of the display area SC.
In this way, although the window f1 occupies only a small area of the running screen IMD1 of the source computer 120, after being identified and acquired it can be displayed, as the window image W1 on the display device 180 connected to the target computer 160, with the same aspect ratio but a larger display area. In other words, the proportion of the window f1 to the running screen IMD1 is less than or equal to the proportion of the window image W1 to the display area SC. Therefore, compared with displaying multiple windows in the same display screen (e.g., windows P1-P3 in fig. 5), the image processing method 200 of the present disclosure displays each window independently in a single display screen (e.g., window W1 in fig. 7), so that an operator can view the information easily and clearly. Without increasing the difficulty of use, the operator's burden is effectively reduced, which lowers operator fatigue and improves operating efficiency and accuracy.
It should be noted that although the window images in the above embodiments are all shown enlarged to nearly fill the display area SC of the display device, in other embodiments the window images can be enlarged to a specific predetermined ratio, for example, more than two thirds of the display area SC. The ratios shown are for illustration only and are not intended to limit the present disclosure.
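The equal-proportion enlargement described above, scaling a window image until it just fits the display area SC, optionally capped at a predetermined ratio such as two thirds, reduces to taking the smaller of the horizontal and vertical scale factors. A minimal sketch under these assumptions (the function name and signature are illustrative, not the patent's implementation):

```python
def fit_to_display(win_w: int, win_h: int,
                   disp_w: int, disp_h: int,
                   ratio: float = 1.0) -> tuple[int, int]:
    """Enlarge (or shrink) a window image to fit a display area while
    keeping its aspect ratio.  `ratio` caps the target size, e.g. 2/3
    of the display area as mentioned in the text."""
    # The limiting axis determines the scale; using min() guarantees
    # both dimensions stay inside the display area.
    scale = min(disp_w / win_w, disp_h / win_h) * ratio
    return round(win_w * scale), round(win_h * scale)
```

For a 400×300 window on a 1920×1080 display area, the vertical axis limits the scale to 3.6, yielding a 1440×1080 image whose top and bottom edges coincide with the display area, as in fig. 7.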
For the cooperation between the input interface device and the above-mentioned devices, please refer to figs. 8A, 8B, 9A and 9B. Figs. 8A and 9A are schematic diagrams respectively illustrating window images displayed by a display device according to different embodiments of the present disclosure. Figs. 8B and 9B are schematic diagrams illustrating running screens of the source computer according to the embodiments of figs. 8A and 9A, respectively.
The position of the mouse in the running screen of the source computer 120 is obtained by converting the absolute coordinates of the mouse on the display device 180 through an equal-proportion formula. Specifically, the target computer 160 determines the position of the mouse on the display device 180 according to the detection signal generated by the input interface device (e.g., a mouse), and transmits the absolute coordinates of the mouse on the display device 180 to the image processing device 140. The image processing device 140 then calculates and controls the position of the mouse in the running screen of the source computer 120 according to the absolute coordinates of the mouse on the display device 180, the size and absolute coordinates of the window image on the display device 180, and the size and absolute coordinates of the corresponding window in the running screen of the source computer 120.
For example, as shown in figs. 8A and 8B, when the image processing system 100 includes two 1080×1920 display devices 180_11 and 180_12, the display connected to the target computer 160 can be regarded as an extended desktop composed of the two screens, i.e., an extended desktop of size 1080×3840. When the mouse M1 is located in the window image W3, its absolute coordinates fall between (0, 0) and (1080, 1920), so the mouse M1' is located in the window f3 of the running screen of the source computer 120. When the mouse M2 is located in the window image W2, its absolute coordinates fall between (0, 1920) and (1080, 3840), so the mouse M2' is located in the window f2 of the running screen of the source computer 120. In other words, when the mouse moves from M1 to M2, its position jumps from M1' to M2' in the running screen of the source computer 120.
It should be noted that when the size of a window image is smaller than the display range of the display device, mouse operations (e.g., clicking or moving) performed in the display area outside the window image are not carried out in the source computer 120. For example, as shown in fig. 8A, the size of the window image W3 is smaller than the display range of the display device 180_11. When the target computer 160 or the image processing device 140 detects that the mouse is located in the display area outside the window image W3 (i.e., the hatched area), the corresponding mouse operation is not transmitted to the source computer 120. In other words, in the running screen of the source computer 120, the mouse position appears only within the ranges of the windows f1 to f3.
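The equal-proportion coordinate conversion and the "outside the window image, drop the event" rule described above can be combined in one mapping step. The following is an assumed sketch (the function name, tuple layouts, and the linear mapping are illustrative; the patent only states that an equal-proportion formula is used):

```python
def to_source_position(mouse: tuple[float, float],
                       image_rect: tuple[float, float, float, float],
                       window_rect: tuple[float, float, float, float]):
    """Map absolute mouse coordinates on the display device into the
    corresponding window of the source computer's running screen.
    Returns None when the mouse lies outside the window image, in
    which case no operation is forwarded to the source computer."""
    mx, my = mouse
    ix, iy, iw, ih = image_rect      # window image on the display
    wx, wy, ww, wh = window_rect     # window in the running screen
    if not (ix <= mx < ix + iw and iy <= my < iy + ih):
        return None                  # e.g. the hatched area in fig. 8A
    # Equal-proportion conversion of the offset inside the image.
    return (wx + (mx - ix) * ww / iw, wy + (my - iy) * wh / ih)
```

A click at the center of a 100×100 window image thus lands at the center of the corresponding window in the running screen, regardless of how much the image was enlarged.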
In other embodiments, when multiple users operate the same source computer 120 from different target computers, the image processing device 140 records the absolute coordinates of the multiple mice on the corresponding display devices and controls the source computer 120, according to the sequence of the users' mouse clicks or other input-interface operations (e.g., on a keyboard), using the concept of time-division multiplexing. For example, as shown in figs. 9A and 9B, the image processing system 100 includes two target computers 160_1 and 160_2, which are connected to the display devices 180_11 and 180_21, respectively, and both display devices show the same window image W2. The window image W2 corresponds to the window f2 in the running screen of the source computer 120.
Suppose the mouse M1 left-clicks on the right side of the window image W2 and the keyboard then inputs "BOOK", while at the same time the mouse M2 left-clicks on the left side of the window image W2 and the keyboard then inputs "PEN". The image processing device 140 receives the absolute coordinates (X1, Y1) of the mouse M1 and the keyboard input "BOOK" from the target computer 160_1, and receives the absolute coordinates (X2, Y2) of the mouse M2 and the keyboard input "PEN" from the target computer 160_2. Even though the keyboard inputs "BOOK" and "PEN" are interleaved in time, the image processing device 140 can generate, for the source computer 120, an operation command of left-clicking at the absolute coordinates (X1, Y1) of the mouse M1' and inputting "BOOK", because both the coordinates and the keyboard input come from the same target computer 160_1. Similarly, the image processing device 140 can generate an operation command of left-clicking at the absolute coordinates (X2, Y2) of the mouse M2' and inputting "PEN", according to the absolute coordinates of the mouse M2 and the keyboard input "PEN" from the same target computer 160_2.
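The regrouping step implied above, where events interleaved in time are reassembled per originating target computer before being forwarded, can be sketched as follows. This is a hypothetical illustration of the grouping idea only; the event representation and function name are assumptions, and the patent's time-division multiplexing covers scheduling details not shown here:

```python
def serialize_commands(events: list[tuple[str, str]]) -> list[tuple[str, list[str]]]:
    """Regroup input events that arrive interleaved from several target
    computers, so that each user's click and keystrokes reach the
    source computer as one coherent operation command.  Events are
    (target_id, event) pairs in arrival order."""
    grouped: dict[str, list[str]] = {}
    for target_id, event in events:
        # dicts preserve insertion order, so commands are emitted in
        # the order each target computer first acted.
        grouped.setdefault(target_id, []).append(event)
    return list(grouped.items())
```

Applied to the "BOOK"/"PEN" example, the interleaved keystrokes from 160_1 and 160_2 come out as two intact commands, one per target computer.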
While the disclosed method is illustrated and described herein as a series of steps or events, it will be appreciated that the illustrated ordering of such steps or events should not be interpreted in a limiting sense. For example, portions of the steps may occur in different orders and/or concurrently with other steps or events apart from those illustrated and/or described herein. In addition, not all illustrated steps may be required to implement one or more embodiments or examples described herein. Furthermore, one or more steps herein may also be performed in one or more separate steps and/or stages.
In summary, in the above embodiments, the image processing device 140 uses the image processing method 200 to recognize and acquire windows from the running screen IMD1 of the source computer 120 and transmits the acquired one or more window images P1-P3 to one or more of the target computers 160_1-160_m, so that each of those target computers can display, through its connected display device, the window images W1-W3 corresponding to one or more of the windows f1-f3 of the source computer 120.
While the present disclosure has been described with reference to the embodiments above, it should be understood that the disclosure is not limited thereto; those skilled in the art may make various changes and modifications without departing from the spirit and scope of the present disclosure, and thus the scope of the present disclosure is defined by the appended claims.

Claims (9)

1. An image processing method for an image processing device coupled between a source computer and at least one target computer, the image processing method comprising:
the image processing device receives an operation picture of the source computer;
the image processing device identifies a plurality of windows contained in the running picture;
the image processing device obtains all or part of the windows to generate a window image set;
the image processing device outputs the window image set to the at least one target computer, and the at least one target computer processes the window image set to generate a plurality of window images; and
displaying one or more of the window images by a plurality of display devices respectively connected to the at least one target computer.
2. The image processing method of claim 1, wherein the step of identifying the plurality of windows comprises:
a shape object conforming to at least one frame feature is detected inwardly from an outer edge of the running picture.
3. The image processing method according to claim 2, further comprising:
when the border of a first shape object and the border of a second shape object are staggered, overlap, or share a boundary, determining that the first shape object and the second shape object at least partially overlap; and
when it is determined that the first shape object and the second shape object at least partially overlap, the display positions of the windows in the running screen of the source computer are controlled according to a preset instruction of an input interface device.
4. The image processing method according to claim 1, wherein the step of obtaining to generate a corresponding plurality of window images comprises:
acquiring the content in a shape object in the running picture as a corresponding one of the window images.
5. The image processing method according to claim 1, further comprising:
compressing and outputting the plurality of window images by the image processing device;
receiving the compressed window images by the at least one target computer and decompressing the window images; and
displaying the decompressed window images by the at least one target computer.
6. The image processing method according to claim 1, wherein an area of one of the plurality of windows and an area of the running picture are in a first ratio, and an area of a corresponding one of the plurality of window images and an area of a corresponding one of the plurality of display devices are in a second ratio greater than the first ratio.
7. The image processing method according to claim 1, further comprising:
compressing and outputting the running picture by the image processing device;
receiving the compressed running picture by the at least one target computer and decompressing the running picture;
identifying the plurality of windows by the at least one target computer according to the decompressed running picture to generate a plurality of window images; and
the at least one target computer displays the plurality of window images.
8. The image processing method according to claim 1, further comprising:
converting an original position of a mouse in the running picture of the source computer according to an absolute coordinate of the mouse on the display devices.
9. The image processing method according to claim 1, further comprising:
the display positions of the windows in the running picture of the source computer are controlled according to a preset instruction of an input interface device.
CN202110631359.9A 2020-06-11 2021-06-07 Image processing method Active CN114115771B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW109119743A TWI739474B (en) 2020-06-11 2020-06-11 Image processing device, image processing system and image processing method
TW109119743 2020-06-11

Publications (2)

Publication Number Publication Date
CN114115771A CN114115771A (en) 2022-03-01
CN114115771B true CN114115771B (en) 2024-04-02

Family

ID=78778202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110631359.9A Active CN114115771B (en) 2020-06-11 2021-06-07 Image processing method

Country Status (2)

Country Link
CN (1) CN114115771B (en)
TW (1) TWI739474B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622094A (en) * 2010-12-31 2012-08-01 宏正自动科技股份有限公司 Remote management system, multi-computer switcher and remote management method
CN102681690A (en) * 2010-12-31 2012-09-19 宏正自动科技股份有限公司 Remote management system, multi-computer switcher and remote management method
CN105302285A (en) * 2014-08-01 2016-02-03 福州瑞芯微电子股份有限公司 Multi-screen display method, equipment and system

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11327520A (en) * 1998-05-13 1999-11-26 Sony Corp Display control method and display controller
TW582168B (en) * 2002-03-01 2004-04-01 Huper Lab Co Ltd Method for abstracting multiple moving objects
TWI234396B (en) * 2004-05-10 2005-06-11 Jau-Hung Jang Image capturing/storage device to capture a specific targe image and method thereof
TWI297875B (en) * 2004-09-16 2008-06-11 Novatek Microelectronics Corp Image processing method and device using thereof
TWI275070B (en) * 2005-01-25 2007-03-01 Apex Internat Financial Engine Financial information multi-window projection display system and method thereof
TWM325143U (en) * 2007-08-02 2008-01-11 Astro Corp Multi-screen game machine
US8686921B2 (en) * 2008-12-31 2014-04-01 Intel Corporation Dynamic geometry management of virtual frame buffer for appendable logical displays
TWI430257B (en) * 2009-03-06 2014-03-11 Chunghwa Picture Tubes Ltd Image processing method for multi-depth three-dimension display
CN102645970B (en) * 2011-02-22 2015-10-28 鸿富锦精密工业(深圳)有限公司 Motion-vector trigger control method and use its electronic installation
TWM418495U (en) * 2011-04-12 2011-12-11 Shi-Liang Wang Image processing apparatus and system
TW201246175A (en) * 2011-05-09 2012-11-16 Acer Inc Method and device for displaying on a plurality of displayers
CN104092713B (en) * 2013-05-31 2018-06-15 腾讯科技(深圳)有限公司 The download information methods of exhibiting and device of a kind of Internet resources
EP3058509A4 (en) * 2013-10-16 2017-08-09 3M Innovative Properties Company Note recognition for overlapping physical notes
CN104750440B (en) * 2013-12-30 2017-09-29 纬创资通股份有限公司 Window management method, electronic installation and the computer program product of multi-screen
TW201843606A (en) * 2017-05-03 2018-12-16 冠捷投資有限公司 Real-time information searching apparatus when playing image and operation method thereof
TWI684977B (en) * 2017-10-30 2020-02-11 大陸商北京集創北方科技股份有限公司 Screen display method of display module and display module using the method
TWI693564B (en) * 2018-01-05 2020-05-11 竹陞科技股份有限公司 Automatic equipment management system and method thereof
TWI818913B (en) * 2018-06-26 2023-10-21 聯華電子股份有限公司 Device and method for artificial intelligence controlling manufacturing apparatus

Also Published As

Publication number Publication date
TWI739474B (en) 2021-09-11
CN114115771A (en) 2022-03-01
TW202147088A (en) 2021-12-16

Similar Documents

Publication Publication Date Title
US20160202887A1 (en) Method for managing application icon and terminal
CN105874783B (en) Apparatus, method and computer readable medium for up-converting a motion video frame rate
US10698530B2 (en) Touch display device
US20150286395A1 (en) Computer with touch panel, operation method, and recording medium
US20140184547A1 (en) Information processor and display control method
US10222971B2 (en) Display apparatus, method, and storage medium
US10719228B2 (en) Image processing apparatus, image processing system, and image processing method
EP2048651A1 (en) Apparatus, system, and method for displaying
CN110720086A (en) Control device, control method, and program
US9177405B2 (en) Image processing apparatus, computer program product, and image processing system
US7821575B2 (en) Image processing apparatus, receiver, and display device
US20170344248A1 (en) Image processing device, image processing system, and image processing method
CN111309199B (en) Display control method of touch display device and touch display device
EP2645622B1 (en) Image processing apparatus and image processing system
US20220319388A1 (en) Display control method and electronic device
US20070260767A1 (en) Information processing apparatus and information processing method
AU2014280985B2 (en) Image processing apparatus, image processing method, image processing system, and program
JP5728588B2 (en) Display control method, computer program, display control apparatus, and image display system
CN114115771B (en) Image processing method
US10388257B2 (en) Information processing apparatus, method of controlling the same and non-transitory computer-readable storage medium
US10803836B2 (en) Switch device and switch system and the methods thereof
CN115767176A (en) Image processing device and playing control method for display wall system
US20210072884A1 (en) Information processing apparatus and non-transitory computer readable medium
CN110968210A (en) Switching device and switching system and applicable method thereof
JP6161997B2 (en) Medical image display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant