CN114115771A - Image processing method - Google Patents


Info

Publication number
CN114115771A
CN114115771A
Authority
CN
China
Prior art keywords
image processing
window
processing method
window images
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110631359.9A
Other languages
Chinese (zh)
Other versions
CN114115771B (en)
Inventor
张立人
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aten International Co Ltd
Original Assignee
Aten International Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aten International Co Ltd filed Critical Aten International Co Ltd
Publication of CN114115771A publication Critical patent/CN114115771A/en
Application granted granted Critical
Publication of CN114115771B publication Critical patent/CN114115771B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 — Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 — Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An image processing method for an image processing device is provided. The image processing device is coupled between a source computer and at least one target computer. The image processing method includes the following steps: receiving an operation screen of the source computer; identifying a plurality of windows contained in the operation screen; capturing all or some of the windows to generate a plurality of corresponding window images; outputting the window images to the at least one target computer; and displaying one or more of the window images.

Description

Image processing method
Technical Field
The present disclosure relates to an image processing method, and more particularly, to an image processing method for improving window display.
Background
Generally, an operator in the remote machine room of a control center needs to switch among multiple hosts through different connection ends. However, when a plurality of windows are displayed simultaneously on one display screen, it is difficult for the operator to view the information on the screen clearly; moreover, even when a plurality of display screens are provided, a window cannot be dragged from its original display screen to another display screen. Working in such an environment for long periods easily increases operator fatigue.
Therefore, how to improve the display of the window image is one of the important issues in the art.
Disclosure of Invention
The present disclosure relates to an image processing method for an image processing device. The image processing device is coupled between a source computer and at least one target computer. The image processing method includes the following steps: receiving an operation screen of the source computer; identifying a plurality of windows contained in the operation screen; capturing all or some of the windows to generate a plurality of corresponding window images; outputting the window images to the at least one target computer; and displaying one or more of the window images.
In some embodiments, the step of identifying the windows includes: detecting, from the outer edge of the operation screen inward, shape objects that conform to at least one frame feature.
In some embodiments, the image processing method further includes: when the border of a first shape object and the border of a second shape object are partially crossed, overlapped, or share a boundary, determining that the first shape object and the second shape object at least partially overlap; and, when the first shape object and the second shape object at least partially overlap, controlling the display positions of the windows in the operation screen of the source computer according to a predetermined command from an input interface device.
In some embodiments, the step of capturing to generate the corresponding plurality of window images includes: capturing the content located within each shape object in the operation screen as a corresponding one of the window images.
In some embodiments, the image processing method further includes: receiving the operation screen by the image processing device; identifying the windows by the image processing device; obtaining the window images by the image processing device; compressing and outputting the window images by the image processing device; receiving and decompressing the compressed window images by the at least one target computer; and displaying the decompressed window images by the at least one target computer.
In some embodiments, the step of displaying one or more of the window images includes: one or more of the window images are respectively displayed by a plurality of display devices connected with at least one target computer.
In some embodiments, the area of a window relative to the area of the operation screen is a first ratio, and the area of the corresponding window image relative to the display area of the corresponding display device is a second ratio larger than the first ratio.
In some embodiments, the image processing method further includes: compressing and outputting the operation screen by the image processing device; receiving and decompressing the compressed operation screen by the at least one target computer; identifying the windows according to the decompressed operation screen by the at least one target computer to generate the window images; and displaying the window images by the at least one target computer.
In some embodiments, the image processing method further includes: converting the mouse position in the operation screen of the source computer according to the absolute coordinates of the mouse in the display device.
In some embodiments, the image processing method further includes: controlling the display position of a window in the operation screen of the source computer according to a predetermined command from an input interface device.
In summary, according to the image processing method, the image processing device performs window identification and capture on the operation screen of the source computer and transmits the captured one or more window images to one or more of the target computers, so that those target computers can display, through the connected display devices, the window images corresponding to the one or more windows of the source computer.
Drawings
Fig. 1 is a schematic diagram illustrating an image processing system according to some embodiments of the present disclosure.
Fig. 2 is a flow chart illustrating an image processing method according to some embodiments of the present disclosure.
FIG. 3 is a schematic diagram illustrating identifying and capturing a window to generate an image of the window according to some embodiments of the present disclosure.
FIG. 4A is a diagram illustrating an output and display of a window image according to some embodiments of the present disclosure.
FIG. 4B is a schematic diagram illustrating another method for outputting and displaying a window image according to some embodiments of the present disclosure.
FIG. 5 is a diagram illustrating window image recognition, according to some embodiments of the present disclosure.
FIG. 6A is a schematic diagram illustrating a window overlay in accordance with some embodiments of the present disclosure.
Fig. 6B is a schematic diagram illustrating a display screen split according to some embodiments of the present disclosure.
FIG. 7 is a schematic diagram illustrating a window image display in accordance with some embodiments of the present disclosure.
FIG. 8A is a diagram illustrating a display device displaying a window image according to some embodiments of the present disclosure.
FIG. 8B is a diagram illustrating an operation screen of a source computer according to some embodiments of the present disclosure.
FIG. 9A is a diagram illustrating another display device displaying a window image according to some embodiments of the present disclosure.
FIG. 9B is a diagram illustrating another exemplary run-time screen of a source computer, according to some embodiments of the present disclosure.
Description of reference numerals:
100: image processing system
120_1 to 120_ n: source computer
140: image processing device
142: image recognizer
144: processor with a memory having a plurality of memory cells
150: ethernet network
160_1 to 160_ m: target computer
162: controller
164a,164b,164c,164d,164 e: display card
180,180_11 to 180_ mk: display device
IMD1, IMD3, IMD 4: running picture
IMD 2: window image
200: image processing method
S210, S220, S230, S240, S250: operation of
f1, f2, f3, f4, f4', f5, f 5': window
P1, P2, P3: window image
W1, W2, W3: window image
SC: display area
M1, M1', M2, M2': mouse (Saggar)
X1, Y1, X2, Y2: coordinates of the object
Detailed Description
The following detailed description is provided for a thorough understanding of the embodiments of the present disclosure; however, the embodiments are not intended to limit the scope of the disclosure, and the described structural operations do not limit their order of execution. Any structure resulting from a rearrangement of elements that produces a device with equivalent technical effects also falls within the scope of the present disclosure. Moreover, the drawings are for illustrative purposes only; in accordance with industry standards and conventional practice, they are not drawn to scale, and the dimensions of the various features may be arbitrarily increased or decreased for clarity of illustration. In the following description, the same elements are denoted by the same reference numerals for ease of understanding.
Unless otherwise indicated, the terms used throughout this specification and the related applications have their ordinary meanings as commonly understood in the art, within the disclosure herein, and in the specific context where each term is used. Certain terms used to describe the disclosure are discussed below, or elsewhere in this specification, to provide those skilled in the art with additional guidance.
Furthermore, as used herein, the terms "comprising," including, "" having, "" containing, "and the like are open-ended terms that mean" including, but not limited to. Further, as used herein, "and/or" includes any and all combinations of one or more of the associated listed items.
When an element is referred to as being "connected" or "coupled," it may mean "electrically connected" or "electrically coupled." "Connected" or "coupled" may also be used to indicate that two or more elements engage with or interact with each other.
Moreover, although terms such as "first," "second," …, etc., may be used herein to describe various elements, these terms are used merely to distinguish one element or operation from another element or operation described in similar technical terms. Unless the context clearly dictates otherwise, the terms do not specifically refer or imply an order or sequence nor are they intended to limit the invention.
For convenience of description, the lowercase English indices 1 to n, 1 to m, and 1 to k in the element numbers and signal numbers used in this specification and the drawings are only for referring to individual elements and signals, and are not intended to limit the number of those elements and signals to any specific value. In the present disclosure and drawings, if an element number or signal number is used without its index, it refers to any unspecified element or signal in the corresponding group. For example, the object designated by element number 120_1 is the source computer 120_1, while the object designated by element number 120 is any unspecified source computer among the source computers 120_1 to 120_n. Likewise, the object designated by element number 160_1 is the target computer 160_1, while the object designated by element number 160 is any unspecified target computer among the target computers 160_1 to 160_m.
Please refer to fig. 1. Fig. 1 is a schematic diagram illustrating an image processing system 100 according to some embodiments of the present disclosure. As shown in fig. 1, the image processing system 100 includes source computers 120_1 to 120_n, an image processing device 140, target computers 160_1 to 160_m, and display devices 180_11 to 180_mk. The image processing device 140 is connected to the source computers 120_1 to 120_n and is connected to the target computers 160_1 to 160_m through the Ethernet 150. The target computers 160_1 to 160_m are connected to the display devices 180_11 to 180_mk, respectively.
Specifically, the image processing device 140 may be a KVM switch. The image processing device 140 may be connected to the source computers 120_1 to 120_n through a High-Definition Multimedia Interface (HDMI), a Digital Visual Interface (DVI), a VGA connector (Video Graphics Array connector), or a Universal Serial Bus (USB). The target computers 160_1 to 160_m may likewise be connected to the display devices 180_11 to 180_mk through HDMI, DVI, VGA connectors, or USB. For example, as shown in fig. 1, the target computer 160_1 is connected to the display devices 180_11 to 180_1k, the target computer 160_2 is connected to the display devices 180_21 to 180_2k, and so on, up to the target computer 160_m, which is connected to the display devices 180_m1 to 180_mk.
In operation, one of the source computers (e.g., the source computer 120_1 in fig. 1) is configured to output the operation screen IMD1 to the image processing device 140. The operation screen IMD1 is the frame generated by the source computer 120 in its current operating state; in other words, the operation screen IMD1 may be the frame image currently displayed by a display device connected to the source computer 120. When the image processing device 140 receives the operation screen IMD1, the image processing device 140 is configured to identify a plurality of windows included in the operation screen IMD1, capture the windows to generate the corresponding window image IMD2, and transmit the window image IMD2 to one or more of the target computers 160_1 to 160_m via the Ethernet 150. The corresponding one of the target computers 160_1 to 160_m (e.g., the target computer 160_1 in fig. 1) is configured to receive the window image IMD2 and output one or more window images of the window image IMD2 to the correspondingly connected display devices (e.g., the display devices 180_11 to 180_1k in fig. 1) for display.
It should be noted that the numbers of hardware devices and connections are merely for convenience of illustration, and the disclosure is not limited thereto. That is, although fig. 1 shows the image processing system 100 including n source computers 120_1 to 120_n, m target computers 160_1 to 160_m, and m-by-k display devices 180_11 to 180_mk, where n, m, and k are all positive integers, these numbers are not intended to limit the disclosure. In other words, in some embodiments, the image processing system 100 may include only one source computer, only one target computer, and/or only one display device connected to the target computer. Alternatively, the image processing system 100 may include a plurality of target computers, and the number of display devices connected to each target computer may differ (i.e., it need not be k for every target computer).
Please refer to fig. 2. Fig. 2 is a flow chart illustrating an image processing method 200 according to some embodiments of the present disclosure. For convenience of explanation, the following description will be made in conjunction with the embodiment of fig. 1, but not limited thereto, and it will be apparent to those skilled in the art that various changes and modifications may be made therein without departing from the spirit and scope of the disclosure. As shown in fig. 2, the image processing method 200 includes operations S210, S220, S230, S240, and S250.
First, in operation S210, the operation screen IMD1 is received from the source computer 120. For example, the operation screen of the source computer 120 is shown as the operation screen IMD1 in fig. 3. The operation screen IMD1 is received by the image processing device 140.
Next, in operation S220, a plurality of windows included in the operation screen IMD1 are identified. Then, in operation S230, all or some of the windows are captured to generate a plurality of corresponding window images. For example, as shown in fig. 3, the operation screen IMD1 includes windows f1 to f3. In some embodiments, the image processing device 140 includes an image recognizer 142 and a processor 144. The image recognizer 142 receives the operation screen IMD1, performs image recognition to generate window images P1 to P3 corresponding to the windows f1 to f3, and transmits the window images P1 to P3 to the processor 144. The processor 144 receives the window images P1 to P3 and performs image compression on them to generate the window image IMD2. In some other embodiments, the processor 144 may perform image compression on only some of the window images (e.g., only the window image P1) to generate the window image IMD2. Details of the image recognition are described in subsequent paragraphs with reference to fig. 5.
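The capture-compress-decompress round trip of operations S230 to S250 can be sketched as follows. This is a minimal illustration, not the device's actual codec: zlib and pickle stand in for whatever compression the image processing device uses, and the function names and the rectangle format are assumptions.

```python
import pickle
import zlib

import numpy as np


def capture_windows(screen, rects):
    """Crop each identified window rectangle out of the operation
    screen (a NumPy array) to obtain the window images.
    `rects` maps a name to (left, top, right, bottom)."""
    return {name: screen[top:bottom, left:right].copy()
            for name, (left, top, right, bottom) in rects.items()}


def pack_windows(images):
    """Serialize and compress the window images into one payload
    (standing in for the compression step on the device)."""
    return zlib.compress(pickle.dumps(images))


def unpack_windows(payload):
    """Inverse of pack_windows, as run on the target computer."""
    return pickle.loads(zlib.decompress(payload))
```

A round trip through `pack_windows` and `unpack_windows` reproduces the cropped window images exactly, which mirrors the lossless hand-off from the image processing device to the target computer described above.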
Then, in operation S240, the window image IMD2 is output to one or more target computers 160. For example, as shown in FIG. 4A, a window image IMD2 output from the image processing device 140 is received by the target computer 160.
Finally, in operation S250, one or more window images of the window image IMD2 are displayed by one or more of the display devices 180_11 to 180_1k. In some embodiments, as shown in fig. 4A, the target computer 160_1 includes a controller 162. The controller 162 receives the window image IMD2, decompresses it to generate one or more of the window images it contains (e.g., the window images W1 to W3), and transmits those window images to the corresponding display cards (e.g., the display cards 164a, 164b, 164c, and 164d), so that the target computer 160_1 outputs the corresponding window images through the display cards to the corresponding display devices (e.g., the display devices 180_11, 180_12, 180_13, and 180_14) for display.
Specifically, the controller 162 transmits the window image W1 to the display card 164a, so that the target computer 160_1 outputs the window image W1 to the display device 180_11 via the display card 164a for displaying. The controller 162 transmits the window image W2 to the display card 164b, so that the target computer 160_1 outputs the window image W2 to the display device 180_12 via the display card 164b for displaying. The controller 162 transmits the window image W3 to the display card 164c, so that the target computer 160_1 outputs the window image W3 to the display device 180_13 via the display card 164c for displaying. The controller 162 transmits the window images W1-W3 to the display card 164d, so that the target computer 160_1 outputs the window images W1-W3 to the display device 180_14 via the display card 164d for display.
In other words, the controller 162 may deliver one or more of the window images W1 to W3 included in the window image IMD2 to each display card 164, and different display cards 164 may independently output the same or different content to their connected display devices 180 for display. Therefore, one or more window images can be displayed on the same display device, and the window-image sets shown on different display devices may be completely identical, partially identical, or completely different.
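The independent card-by-card assignment just described can be modeled as a simple routing table. The table below is hypothetical; it merely mirrors the fig. 4A example (card 164a shows W1, 164b shows W2, 164c shows W3, 164d shows all three), and the names are illustrative.

```python
# Hypothetical routing table mirroring the fig. 4A example:
# each display card independently renders its assigned set.
ROUTING = {
    "164a": ("W1",),
    "164b": ("W2",),
    "164c": ("W3",),
    "164d": ("W1", "W2", "W3"),
}


def images_for_card(card, decoded):
    """Pick, from the decompressed window images, the subset that
    one display card outputs to its connected display device."""
    return [decoded[name] for name in ROUTING[card]]
```

Because each card looks up its own entry, identical, partially identical, or entirely different sets can coexist across display devices, exactly as the paragraph above describes.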
In addition, in some other embodiments, as shown in fig. 4B, the image processing system 100 includes a plurality of target computers 160_1 and 160_2. In this embodiment, the image processing device 140 outputs the window image IMD2 to the target computers 160_1 and 160_2, respectively. The controller 162_1 of the target computer 160_1 receives the window image IMD2, decompresses it to generate the corresponding window images W1 and W2, and transmits the window images W1 and W2 to the corresponding display cards 164a and 164b, so that the target computer 160_1 outputs the window images W1 and W2 through the display cards 164a and 164b to the corresponding display devices 180_11 and 180_12 for display. The controller 162_2 of the target computer 160_2 receives the window image IMD2, decompresses it to generate the corresponding window images W2, W3, and W1 to W3, and transmits them to the corresponding display cards 164c, 164d, and 164e, respectively, so that the target computer 160_2 outputs the window images W2, W3, and W1 to W3 through the display cards 164c, 164d, and 164e to the corresponding display devices 180_21, 180_22, and 180_23 for display.
In other words, different target computers 160_1 and 160_2 receive the same window image IMD2, while the display content output by each display card 164, whether in the same or in different target computers, is independent. In this way, window-image sets that are completely identical, partially identical, or completely different may be displayed on the display devices respectively connected to one (e.g., the target computer 160_1) and another (e.g., the target computer 160_2) of the plurality of target computers.
In addition, as shown in fig. 4A and 4B, the ratio of the area of a window image to the display area of the corresponding display device is larger than the ratio of the area of the corresponding window to the area of the operation screen. For example, the window image W1 occupies a larger proportion of the display device 180_11 than the window f1 occupies in the operation screen IMD1, and the window image W2 occupies a larger proportion of the display device 180_12 than the window f2 occupies in the operation screen IMD1. Details of displaying the window images are described in the following paragraphs with reference to fig. 7.
In this way, according to the image processing method 200, the image processing device 140 performs window identification and capture on the operation screen IMD1 of the source computer 120 and transmits the captured one or more window images P1 to P3 to one or more of the target computers 160_1 to 160_m, so that those target computers can display, through the connected display devices, the window images W1 to W3 corresponding to the one or more windows f1 to f3 of the source computer 120.
It should be noted that the numbers of windows, window images, target computers, display cards, and display devices are merely examples for convenience of illustration, and the disclosure is not limited thereto. In addition, although in the image processing method 200 the image processing device 140 performs window identification, window capture, and window-image compression while the target computer 160_1 performs decompression, in some other embodiments the image processing device 140 may instead directly compress the entire operation screen IMD1 and transmit the compressed operation screen IMD1 to the target computer 160_1. The target computer 160_1 then receives the compressed operation screen IMD1, decompresses it by software, performs window identification and capture on the decompressed operation screen to generate the window images W1 to W3, and finally outputs the window images W1 to W3 to the display devices 180 for display.
In addition, when the image processing system 100 includes a plurality of source computers 120, the image processing apparatus 140 receives a switching signal from the target computer 160, so that the target computer 160 controls the switching of the source computers 120. Specifically, when the target computer 160 sends a switching signal, the image processing apparatus 140 determines to switch to any of the source computers 120_1 to 120_ n according to the switching signal. For example, the switching signal may be a switching command generated by keyboard input, mouse operation, or other means. Therefore, the user can issue an instruction through the input interface device connected to the target computer 160 to switch the source computer which is currently operated and controlled, so that the user can switch different connection terminals among multiple hosts to operate in the control center.
For details of the image recognition, please refer to fig. 5. Fig. 5 is a diagram illustrating window image recognition according to some embodiments of the present disclosure. In the present embodiment, the operation screen of the source computer 120 is shown as the operation screen IMD1 in fig. 5. The image recognizer 142 in the image processing device 140, or software in the target computer 160, detects shape objects conforming to the frame features from the outer edge of the operation screen IMD1 inward to recognize the windows f1 to f3, and captures the content within those shape objects in the operation screen IMD1 as the window images P1 to P3.
In some embodiments, the shape object is a rectangular object. For a rectangular object, the frame features include whether horizontal or vertical lines are present, whether their colors are similar, whether the lines connect, whether the connected lines form a rectangle, and whether a boundary is shared. As indicated by the arrows in fig. 5, detection over the operation screen IMD1 proceeds from top down, bottom up, left to right, and right to left, inferring the presence of a window and identifying its extent by determining whether the frame features are met. Once the four borders of a rectangular object are captured, detection stops proceeding further inward, which prevents tables or other rectangular images inside the window from being mistakenly identified as windows. However, the outline of the shape object is not limited to a rectangle; objects of different shapes may be defined according to the user's operating requirements, each with a set of frame features matching its outline.
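The frame-feature check for a rectangular object can be sketched as below. This is only a minimal illustration of the idea, not the patent's actual detector: it assumes the operation screen is an H×W×3 NumPy array and tests one frame feature, that each of the four candidate borders is a line of near-uniform color; function names and the tolerance are illustrative.

```python
import numpy as np


def is_uniform(line, tol=8):
    """True if all pixels along a candidate border line have similar
    color (per-channel range at most `tol`)."""
    return np.ptp(line.astype(int), axis=0).max() <= tol


def looks_like_window(img, top, left, bottom, right, tol=8):
    """Check a candidate rectangle against the frame features
    described above: horizontal and vertical lines of similar color
    that connect into a rectangle."""
    return (is_uniform(img[top, left:right + 1], tol) and       # top edge
            is_uniform(img[bottom, left:right + 1], tol) and    # bottom edge
            is_uniform(img[top:bottom + 1, left], tol) and      # left edge
            is_uniform(img[top:bottom + 1, right], tol))        # right edge
```

A scan from the screen edges inward would call `looks_like_window` on candidate border positions and, per the stopping rule above, stop descending into a rectangle once its four borders are found.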
In some embodiments, when more than one rectangular object is detected and the objects' borders are partially crossed, overlapped, or share a boundary, it is determined that the objects matching the frame features overlap. As shown in fig. 6A, in the operation screen IMD3 of the target computer 160, the right border of the rectangular object f4 is partially crossed with the left border of the rectangular object f5. In the example shown in fig. 6A, because the rectangular object f4 is in the foreground relative to the rectangular object f5, the right side of the rectangular object f4 covers part of the left side of the rectangular object f5, and the vertical boundaries of the right side of f4 and the left side of f5 partially overlap and are shared; it can therefore be determined that the two detected rectangular objects overlap. At this time, the user may issue a screen-splitting command to the target computer 160 through an input interface device (e.g., a keyboard or a mouse) connected to the target computer 160; the command is transmitted to the source computer 120 through the Ethernet 150 and the image processing device 140, so that the source computer 120 splits its display screen into a plurality of display areas according to the screen-splitting command and shrinks the windows in the current operation screen IMD3 into different display areas while preserving their current aspect ratios. As shown in fig. 6B, after the source computer 120 receives the screen-splitting command, the operation screen IMD4 is split into four display areas: the window f4' is scaled down, at the aspect ratio of the original window f4, until its length or width matches the size of the upper-left display area, and the window f5' is likewise scaled down, at the aspect ratio of the original window f5, until its length or width matches the size of the lower-right display area.
In this way, whether windows overlap one another is determined by detecting whether the borders of the rectangular objects cross, overlap, or share a boundary, and the display is then divided into display areas by the screen-splitting command. This ensures that the windows no longer overlap, avoiding incomplete window images at capture time.
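The overlap rule above, in which borders that cross, overlap, or merely share a boundary all count as overlap, reduces to a closed-interval intersection test on axis-aligned rectangles. A minimal sketch, with the (left, top, right, bottom) tuple format as an assumption:

```python
def rects_overlap(a, b):
    """Rectangles as (left, top, right, bottom).  Following the rule
    above, borders that merely touch (a shared boundary) already
    count as overlap, so <= is used rather than <."""
    al, at, ar, ab = a
    bl, bt, br, bb = b
    return al <= br and bl <= ar and at <= bb and bt <= ab
```

Note the design choice: a strict `<` would treat two windows that share an edge as non-overlapping, whereas the text explicitly lists a shared boundary as an overlap condition.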
For details of displaying the window image, please refer to fig. 7. Fig. 7 is a schematic diagram illustrating a window image display according to some embodiments of the present disclosure. In the present embodiment, the window image W1 is displayed by the display device 180. The window image W1 corresponds to the window image P1 of the window f1 included in the operation screen IMD1 of fig. 5. Specifically, the target computer 160 enlarges the window image W1 at the aspect ratio of the original image (i.e., the aspect ratio of the window image P1 in the operation screen IMD1 of fig. 5) until it equals, or is slightly smaller than, the display area SC of the display device 180. For example, as shown in fig. 7, the window image W1 is enlarged proportionally until it approximately coincides with the upper and lower inner edges of the display area SC.
In this way, after the window f1, which occupies only a small display area in the running screen IMD1 of the source computer 120, is identified and acquired, the corresponding window image W1 can be displayed on the display device 180 connected to the target computer 160 with the same aspect ratio but a larger display area. In other words, the ratio of the window f1 to the IMD1 is less than or equal to the ratio of the window image W1 to the display area SC. Therefore, compared with displaying a plurality of windows on the same display screen (e.g., windows P1-P3 in fig. 5), the image processing method 200 of the present disclosure displays each window independently on a single display screen (e.g., window W1 in fig. 7), so that the operator can view the information easily and clearly. Without increasing the difficulty of use, the burden on the operator is effectively reduced, the operator's fatigue is lowered, and the operating efficiency and accuracy are improved.
It should be noted that, although the window images in the above embodiments are all enlarged to nearly fill the display area SC of the display device, in other embodiments the window images can be enlarged to a specific predetermined ratio, for example, more than two thirds of the display area SC. The ratio shown is merely an example and is not intended to limit the disclosure.
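The aspect-ratio-preserving enlargement described above can be sketched as a simple fit computation. The function name and rounding choice are illustrative assumptions; the disclosure only requires that the enlarged image keep its original aspect ratio and not exceed the display area SC.

```python
def fit_window(win_w, win_h, disp_w, disp_h):
    """Scale a window of size (win_w, win_h) to the largest size that
    fits inside a display area (disp_w, disp_h) while preserving the
    original aspect ratio, as in the enlargement of W1 to SC above."""
    scale = min(disp_w / win_w, disp_h / win_h)
    return round(win_w * scale), round(win_h * scale)
```

For a 400 × 300 window on a 1920 × 1080 display area, the limiting axis is the height, so the image is enlarged 3.6× to 1440 × 1080 and coincides with the upper and lower edges of the display area.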
For the cooperation between the input interface device and the above devices, please refer to fig. 8A, 8B, 9A and 9B. Fig. 8A and 9A are schematic views respectively illustrating window images displayed by a display device according to various embodiments of the disclosure. Fig. 8B and 9B are schematic views respectively illustrating the operation screen of the source computer according to the embodiments of fig. 8A and 9A.
The position of the mouse in the running screen of the source computer 120 is converted, by a proportional (equal-scale) formula, from the absolute coordinates of the mouse on the display device 180. Specifically, the target computer 160 determines the position of the mouse on the display device 180 according to the detection signal generated by the input interface device (e.g., a mouse). Meanwhile, the target computer 160 transmits the absolute coordinates of the mouse on the display device 180 to the image processing device 140. The image processing device 140 then calculates and controls the position of the mouse in the running screen of the source computer 120 according to the absolute coordinates of the mouse on the display device 180, the size and absolute coordinates of the window image on the display device 180, and the size and absolute coordinates of the corresponding window in the running screen of the source computer 120.
For example, as shown in fig. 8A and 8B, when the image processing system 100 includes two 1080 × 1920 display devices 180_11 and 180_12, the display device connected to the target computer 160 can be regarded as two screens combined into an extended desktop, i.e., the extended desktop has a size of 1080 × 3840. When the mouse M1 is located in the window image W3, its absolute coordinates lie between (0, 0) and (1080, 1920), so the mouse M1' is located in the window f3 in the running screen of the source computer 120. When the mouse M2 is located in the window image W2, its absolute coordinates lie between (0, 1920) and (1080, 3840), so the mouse M2' is located in the window f2 in the running screen of the source computer 120. In other words, when the mouse moves from M1 to M2, its position in the running screen of the source computer 120 jumps from M1' to M2'.
It should be noted that when the size of the window image is smaller than the display range of the display device, mouse operations (e.g., clicking, moving) performed in the display region outside the window image are not executed on the source computer 120. For example, as shown in FIG. 8A, the window image W3 is smaller than the display range of the display device 180_11. When the target computer 160 or the image processing device 140 detects that the mouse is located in the display region outside the window image W3 (i.e., the background area), the corresponding mouse operation is not transmitted to the source computer 120. In other words, in the running screen of the source computer 120, the mouse position appears only within the windows f1-f3.
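The proportional conversion and the out-of-image suppression described above can be sketched together. The rectangle representation `(x, y, width, height)` and the function name are assumptions for illustration; the disclosure specifies only the equal-scale conversion and that events outside the window image are not forwarded.

```python
def map_mouse(abs_x, abs_y, img_rect, win_rect):
    """Map an absolute mouse position on the display device to the
    corresponding position inside the source-computer window.
    Returns None when the pointer is outside the window image,
    in which case the event is not forwarded to the source computer."""
    ix, iy, iw, ih = img_rect   # window image position/size on the display
    wx, wy, ww, wh = win_rect   # window position/size in the source screen
    if not (ix <= abs_x < ix + iw and iy <= abs_y < iy + ih):
        return None  # pointer is in the background area outside the image
    # Proportional (equal-scale) conversion of the offset inside the image.
    return (wx + (abs_x - ix) * ww / iw, wy + (abs_y - iy) * wh / ih)
```

For instance, with a 200 × 200 window image showing a 100 × 100 source window, a pointer at (100, 100) in the image maps to (50, 50) in the source window, while a pointer outside the image yields no forwarded event.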
In some other embodiments, when a plurality of users connect to the same source computer 120 from different target computers, the image processing device 140 records the absolute coordinates of the mice on the corresponding display devices and operates the source computer 120 by time-division multiplexing, in the order in which the users click their mice or operate other input interface devices (e.g., keyboards). For example, as shown in fig. 9A and 9B, the image processing system 100 includes two target computers 160_1 and 160_2, which are respectively connected to the display devices 180_11 and 180_21, and both display devices 180_11 and 180_21 display the same window image W2. The window image W2 corresponds to the window f2 in the running screen of the source computer 120.
When the mouse M1 clicks the left button on the right side of the window image W2 and the keyboard then inputs "BOOK", while at the same time the mouse M2 clicks the left button on the left side of the window image W2 and the keyboard then inputs "PEN", the image processing device 140 receives the absolute coordinates (X1, Y1) of the mouse M1 and the keyboard input command "BOOK" from the target computer 160_1, and receives the absolute coordinates (X2, Y2) of the mouse M2 and the keyboard input command "PEN" from the target computer 160_2. Therefore, even if the keyboard input commands "BOOK" and "PEN" are interleaved in time, the image processing device 140 can, according to the absolute coordinates (X1, Y1) of the mouse M1 and the keyboard input command "BOOK" that both come from the same target computer 160_1, generate for the source computer 120 an operation command of clicking the left button at the absolute coordinates (X1, Y1) of the mouse M1' and inputting the command "BOOK". Similarly, according to the absolute coordinates (X2, Y2) of the mouse M2 and the keyboard input command "PEN" that both come from the same target computer 160_2, the image processing device 140 generates for the source computer 120 an operation command of clicking the left button at the absolute coordinates (X2, Y2) of the mouse M2' and inputting the command "PEN".
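The grouping behavior described above can be sketched as follows: interleaved timestamped events are collected per originating target computer, so that one user's click-and-type sequence is forwarded as a whole rather than mixed with another user's. The event tuple shape and function name are illustrative assumptions.

```python
from collections import defaultdict

def serialize_inputs(events):
    """Given interleaved (timestamp, target_id, payload) events,
    group them by the originating target computer and return each
    group's payloads in time order; groups are ordered by the time
    of their first event (time-division multiplexing sketch)."""
    groups = defaultdict(list)
    first_ts = {}
    for ts, target, payload in sorted(events):
        groups[target].append(payload)
        first_ts.setdefault(target, ts)
    return [(t, groups[t]) for t in sorted(groups, key=first_ts.get)]
```

With the interleaved events of fig. 9A and 9B, the commands from target computer 160_1 ("click", then "BOOK") stay together and are issued before those from 160_2 ("click", then "PEN"), matching the behavior described above.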
While the disclosed methods are illustrated and described herein as a series of steps or events, it will be appreciated that the order of the steps or events shown is not to be interpreted in a limiting sense. For example, some steps may occur in different orders and/or concurrently with other steps or events apart from those illustrated and/or described herein. In addition, not all illustrated steps may be required to implement one or more of the implementations or embodiments described herein. Furthermore, one or more steps herein may also be performed in one or more separate steps and/or stages.
In summary, by employing the image processing method 200 of the above embodiments, the image processing device 140 performs window identification and acquisition on the running screen IMD1 of the source computer 120 and transmits the one or more acquired window images P1-P3 to one or more of the target computers 160_1-160_m, so that the one or more target computers 160_1-160_m can display, through their connected display devices, the window images W1-W3 corresponding to one or more of the windows f1-f3 of the source computer 120.
Although the present disclosure has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the present disclosure, and therefore, the scope of the present disclosure should be determined by that of the appended claims.

Claims (10)

1. An image processing method for an image processing apparatus coupled between a source computer and at least one target computer, the image processing method comprising:
receiving an operation picture of the source computer;
identifying a plurality of windows contained in the running picture;
acquiring all or part of the plurality of windows to generate a corresponding plurality of window images;
outputting the plurality of window images to the at least one target computer; and
displaying one or more of the plurality of window images.
2. The image processing method of claim 1, wherein the step of identifying the plurality of windows comprises:
detecting a shape object which accords with at least one frame characteristic from an outer edge of the running picture inwards.
3. The image processing method as claimed in claim 2, further comprising:
determining that a first shape object and a second shape object at least partially overlap when the border of the first shape object and the border of the second shape object partially cross, overlap, or share a boundary; and
controlling, when the first shape object and the second shape object at least partially overlap, the display positions of the plurality of windows in the running picture of the source computer according to a preset instruction of an input interface device.
4. The image processing method of claim 1, wherein the step of acquiring to generate the corresponding plurality of window images comprises:
acquiring the content of a shape object in the running picture as a corresponding one of the plurality of window images.
5. The image processing method as claimed in claim 1, further comprising:
receiving the operation picture by the image processing device;
identifying the plurality of windows by the image processing device;
acquiring, by the image processing device, the plurality of windows to generate the plurality of window images;
compressing and outputting the plurality of window images by the image processing device;
receiving and decompressing the compressed window images by the at least one target computer; and
displaying the decompressed plurality of window images by the at least one target computer.
6. The image processing method of claim 1, wherein displaying one or more of the plurality of window images comprises:
respectively displaying one or more of the plurality of window images by a plurality of display devices connected with the at least one target computer.
7. The image processing method as claimed in claim 1, wherein an area of one of the plurality of windows and an area of the running picture are in a first ratio, and an area of a corresponding one of the plurality of window images and a display area of a corresponding display device are in a second ratio greater than the first ratio.
8. The image processing method as claimed in claim 1, further comprising:
the image processing device compresses and outputs the running picture;
receiving and decompressing, by the at least one target computer, the compressed running picture;
identifying the plurality of windows by the at least one target computer according to the decompressed running picture to generate a plurality of window images; and
displaying the plurality of window images by the at least one target computer.
9. The image processing method as claimed in claim 1, further comprising:
converting an original position of a mouse in the running picture of the source computer according to an absolute coordinate of the mouse on a plurality of display devices.
10. The image processing method as claimed in claim 1, further comprising:
controlling display positions of the plurality of windows in the running picture of the source computer according to a preset instruction of an input interface device.
CN202110631359.9A 2020-06-11 2021-06-07 Image processing method Active CN114115771B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW109119743A TWI739474B (en) 2020-06-11 2020-06-11 Image processing device, image processing system and image processing method
TW109119743 2020-06-11

Publications (2)

Publication Number Publication Date
CN114115771A true CN114115771A (en) 2022-03-01
CN114115771B CN114115771B (en) 2024-04-02

Family

ID=78778202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110631359.9A Active CN114115771B (en) 2020-06-11 2021-06-07 Image processing method

Country Status (2)

Country Link
CN (1) CN114115771B (en)
TW (1) TWI739474B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622094A (en) * 2010-12-31 2012-08-01 宏正自动科技股份有限公司 Remote management system, multi-computer switcher and remote management method
CN102681690A (en) * 2010-12-31 2012-09-19 宏正自动科技股份有限公司 Remote management system, multi-computer switcher and remote management method
CN105302285A (en) * 2014-08-01 2016-02-03 福州瑞芯微电子股份有限公司 Multi-screen display method, equipment and system

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11327520A (en) * 1998-05-13 1999-11-26 Sony Corp Display control method and display controller
TW582168B (en) * 2002-03-01 2004-04-01 Huper Lab Co Ltd Method for abstracting multiple moving objects
TWI234396B (en) * 2004-05-10 2005-06-11 Jau-Hung Jang Image capturing/storage device to capture a specific targe image and method thereof
TWI297875B (en) * 2004-09-16 2008-06-11 Novatek Microelectronics Corp Image processing method and device using thereof
TWI275070B (en) * 2005-01-25 2007-03-01 Apex Internat Financial Engine Financial information multi-window projection display system and method thereof
TWM325143U (en) * 2007-08-02 2008-01-11 Astro Corp Multi-screen game machine
US8686921B2 (en) * 2008-12-31 2014-04-01 Intel Corporation Dynamic geometry management of virtual frame buffer for appendable logical displays
TWI430257B (en) * 2009-03-06 2014-03-11 Chunghwa Picture Tubes Ltd Image processing method for multi-depth three-dimension display
CN102645970B (en) * 2011-02-22 2015-10-28 鸿富锦精密工业(深圳)有限公司 Motion-vector trigger control method and use its electronic installation
TWM418495U (en) * 2011-04-12 2011-12-11 Shi-Liang Wang Image processing apparatus and system
TW201246175A (en) * 2011-05-09 2012-11-16 Acer Inc Method and device for displaying on a plurality of displayers
CN104092713B (en) * 2013-05-31 2018-06-15 腾讯科技(深圳)有限公司 The download information methods of exhibiting and device of a kind of Internet resources
EP3058509A4 (en) * 2013-10-16 2017-08-09 3M Innovative Properties Company Note recognition for overlapping physical notes
CN104750440B (en) * 2013-12-30 2017-09-29 纬创资通股份有限公司 Window management method, electronic installation and the computer program product of multi-screen
TW201843606A (en) * 2017-05-03 2018-12-16 冠捷投資有限公司 Real-time information searching apparatus when playing image and operation method thereof
TWI684977B (en) * 2017-10-30 2020-02-11 大陸商北京集創北方科技股份有限公司 Screen display method of display module and display module using the method
TWI693564B (en) * 2018-01-05 2020-05-11 竹陞科技股份有限公司 Automatic equipment management system and method thereof
TWI818913B (en) * 2018-06-26 2023-10-21 聯華電子股份有限公司 Device and method for artificial intelligence controlling manufacturing apparatus


Also Published As

Publication number Publication date
TWI739474B (en) 2021-09-11
CN114115771B (en) 2024-04-02
TW202147088A (en) 2021-12-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant