WO2021216037A1 - Image rendering - Google Patents

Image rendering

Info

Publication number
WO2021216037A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
processed
processing device
display screen
input data
Prior art date
Application number
PCT/US2020/028940
Other languages
French (fr)
Inventor
Simon Wong
Thong Thai
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2020/028940 priority Critical patent/WO2021216037A1/en
Publication of WO2021216037A1 publication Critical patent/WO2021216037A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • An input article may be communicatively coupled to a computing device to control aspects of the computing device (e.g., computers, tablets, etc.).
  • An input article may include a stylus, a keyboard, a pointer, a pointing device, a touchpad, and/or other articles for accepting user interaction.
  • an input article may create images on a display of a computing device, make selections on a computing device, control a position of a cursor on a display of a computing device, and/or otherwise facilitate interaction with the display of a computing device.
  • Figure 1 illustrates an example of a computing device consistent with the disclosure.
  • Figure 2 illustrates an example of an apparatus suitable with a computing device consistent with the disclosure.
  • Figure 3 illustrates an example of a computing device consistent with the disclosure.
  • Figure 4 illustrates an example of a flow diagram of a computing device consistent with the disclosure.
  • Figure 5 illustrates an example diagram of a non-transitory machine readable medium suitable with a computing device consistent with the disclosure.
  • Figures 6A & 6B illustrate examples of computing devices consistent with the disclosure.

Detailed Description
  • Input articles may be communicatively coupled to a computing device to provide input data to the computing device.
  • an input article may create images on a display screen of a computing device and/or otherwise facilitate interaction with the display screen of a computing device.
  • an input article may contact a display screen of a computing device and may be moved in a given direction by a user to cause an image to appear on a display screen of a computing device.
  • there may be a delay between the time the image was created and the time the image appears on the display screen of the computing device. For instance, after the image is created by the user with the input article, several seconds may pass before the image is visible on the display screen.
  • Some computing devices include added software and/or hardware to reduce the latency of an image. However, adding software and/or hardware to a computing device may increase the cost of making the computing device and/or the size of the computing device.
  • latency refers to the delay between the command to create the image and the appearance of the image on a display screen.
  • computing devices may reduce the latency of an image without additional hardware and/or software.
  • a computing device may comprise a graphics processing device to receive input data from a touchscreen controller, render a temporary image from the input data, and send the temporary image to a display screen for display.
  • such computing devices may provide a more enjoyable experience and may reduce the cost of the computing device, as compared to computing devices that include added software and/or hardware to reduce the latency of images, as detailed herein.
  • FIG. 1 illustrates an example of a computing device 100 consistent with the disclosure.
  • the computing device 100 may be a variety of computing-related devices, such as desktop computers, portable computers, tablets, etc.
  • the computing device 100 may be a mobile computing device.
  • the computing device 100 may include touch screen capabilities.
  • a user may be able to make selections, create images (e.g., markings, drawings, etc.), or facilitate interaction with the computing device 100 by touching a display screen 108 of a computing device 100.
  • a user may utilize an input article 102 to activate the touch screen capabilities of the computing device 100.
  • an “input article” refers to an article used to facilitate an interaction with a computing device.
  • an input article 102 is illustrated as a stylus in Figure 1, this disclosure is not so limited.
  • an input article could be a stylus (as illustrated in Figure 1), a finger of a user, a computer mouse, or an instrument used to contact a computing device to produce a mark and provide input.
  • the input article 102 may be used to produce an image 106 on the display screen 108 of the computing device 100.
  • an input article 102 may come in contact with and be moved along the surface of the display screen 108 of the computing device 100 to create input data to produce an image 106 on the display screen 108.
  • an “image” refers to a visual representation.
  • an image may be a visual representation of a line, visual representation of a letter, visual representation of a number, visual representation of a person, etc.
  • a touchscreen controller may determine input coordinates in response to the input article 102 coming in contact with the display screen 108. That is, the touchscreen controller may determine the locations on the display screen 108 that the input article 102 comes in contact with to determine the input coordinates.
  • the touchscreen controller may send the input coordinates to the graphics processing device 104.
  • the touchscreen controller may receive the input data created by the input article 102 and send the input data directly to a graphics processing device 104.
  • the touchscreen controller may send the input data to the graphics processing device 104 once it is received by the touchscreen controller. For instance, the touchscreen controller may receive the input data in portions and send each portion of the input data to the graphics processing device as each portion is received.
  • a “touchscreen controller” refers to the element of the computing device that detects positional information from an input article in the immediate vicinity of a display screen, detects positional contact pressure on a display screen of a computing device, and receives input data from the detected positional contact pressure and detected positional information from the input article.
  • a “graphics processing device” refers to a device used to process, analyze, and/or alter data to render an image for display on a display screen.
  • the graphics processing device 104 may receive input data and input coordinates from a touchscreen controller. The graphics processing device 104 may then process the input data to render a temporary image (e.g., image 106) from the input data. That is, the graphics processing device 104 may determine the shape (e.g., frame) and render the temporary image based on preset attributes. In some examples, the graphics processing device may process and render a temporary image as each portion of the input data is received. The input data sent by the touchscreen controller to the graphics processing device 104 may provide the shape of the temporary image, but may not be able to produce a temporary image including the attributes provided by the input data. As used herein, “render” refers to the process of taking information and converting it into an image.
  • an “attribute” refers to the features of an image and/or shape.
  • an attribute of an image can include the color of the image, the shading of the image, the thickness of an image, the brush type used to create the image, the amount of pressure applied to create the image, etc.
  • a “preset attribute” refers to a determined attribute for input data that has not yet been received by the computing device.
  • a preset attribute may be an attribute that is selected by a user before input data is received by the touchscreen controller and/or default attributes of an application.
  • the graphics processing device 104 may send the temporary image to the display screen 108 and cause the display screen 108 to display the temporary image at the input coordinates. That is, the display screen 108 may display the temporary image based on the input coordinates determined by the touchscreen controller.
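The fast path described above (touchscreen controller → graphics processing device → display screen) can be sketched as a minimal Python model. This is an illustrative sketch only; the class and function names (`PresetAttributes`, `render_temporary`, etc.) are hypothetical and do not appear in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class PresetAttributes:
    """Attributes chosen before any input data arrives (e.g., app defaults)."""
    color: str = "black"
    thickness: int = 2
    brush: str = "round"

@dataclass
class GraphicsProcessingDevice:
    preset: PresetAttributes = field(default_factory=PresetAttributes)

    def render_temporary(self, input_data, input_coords):
        # The shape comes from the raw input data; the appearance comes
        # from preset attributes, since the data has not been processed yet.
        return {
            "shape": input_data,
            "attributes": self.preset,
            "coords": input_coords,
            "temporary": True,
        }

def display(screen, image):
    # The display screen draws the image at the supplied coordinates.
    screen.append(image)

# Usage: the touchscreen controller forwards raw contact points directly.
screen = []
gpu = GraphicsProcessingDevice()
stroke = [(0, 0), (1, 1), (2, 2)]  # raw contact points from the stylus
temp = gpu.render_temporary(stroke, stroke[0])
display(screen, temp)
```

The key design point modeled here is that no processing application sits between the controller and the renderer, which is what keeps the latency of the temporary image low.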
  • the touchscreen controller may send the input data and input coordinates to a processing application.
  • the processing application may process the input data to produce processed data.
  • the processing application may process the input coordinates to produce processed coordinates.
  • the processed coordinates may determine the display position and location of a processed image.
  • the processing application may send the processed data and the processed coordinates to the graphics processing device 104.
  • the processed data sent by the processing application to the graphics processing device 104 may provide the shape (e.g., frame) of the image and the attributes determined by the processed data.
  • a “processing application” refers to an element of the computing device that processes information relating to input data. For example, a processing application may process input data to determine the attributes of the input data to apply to an image, the shape of the image, the location of the image, etc.
  • the graphics processing device 104 may receive the processed data and the processed coordinates from the processing application.
  • the graphics processing device 104 may render a processed image (e.g., image 106) from the received processed data.
  • the processed image may be the same as or different from the temporary image.
  • the graphics processing device 104 may send the processed image to the display screen 108 and cause the display screen 108 to display the processed image at the processed coordinates.
  • the processed image may then replace the temporary image providing a more detailed image.
  • the display screen 108 may display the processed image including attributes determined from the processed data to provide a more detailed image and remove the less detailed temporary image including preset attributes determined before input data was received by the computing device.
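The slow path (processing application → graphics processing device → display screen), and the replacement of the temporary image, can be sketched the same way. The function names and the example attribute values are assumptions for illustration, not taken from the patent.

```python
def process_input(input_data, input_coords):
    """Hypothetical processing application: derives the real attributes
    (color, thickness, pressure, etc.) from the raw input data."""
    processed_data = {
        "shape": input_data,
        "attributes": {"color": "blue", "thickness": 3},
    }
    processed_coords = input_coords  # possibly refined by processing
    return processed_data, processed_coords

def replace_temporary(screen, processed_image):
    # Remove any temporary image, then display the processed image
    # at the processed coordinates.
    screen[:] = [img for img in screen if not img.get("temporary")]
    screen.append(processed_image)

# Usage: a temporary image is already on screen; the processed image
# arrives later and replaces it.
screen = [{"temporary": True, "shape": [(0, 0)]}]
pdata, pcoords = process_input([(0, 0)], (0, 0))
replace_temporary(screen, {"temporary": False, **pdata, "coords": pcoords})
```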
  • sending input data from a touchscreen controller directly to the graphics processing device may cause an image to appear on the display screen faster, as compared to sending the input data to a processing application and then to the graphics processing device. That is, by sending the input data from the touchscreen controller directly to the graphics processing device, a temporary image may be generated in 7 to 9 milliseconds (ms) from receipt of input data by the touchscreen controller.
  • examples described herein may be able to reduce the latency of an image.
  • replacing a temporary image with a processed image may ensure the accuracy of the displayed image. That is, examples described herein may be able to reduce the latency of the image, as compared to sending the input data to a processing application and then to the graphics processing device, and maintain the accuracy of the displayed image.
  • Figure 2 illustrates an example of an apparatus 220 suitable with a computing device consistent with the disclosure.
  • the apparatus 220 includes a processor 221 and a memory resource 222.
  • the processor 221 may be a hardware processing device such as a microprocessor, application specific instruction set processor, coprocessor, network processor, or similar hardware circuitry that may cause machine-readable instructions to be executed.
  • the processor 221 may be a plurality of hardware processing devices that may cause machine-readable instructions to be executed.
  • the processor 221 may include central processing devices (CPUs) among other types of processing devices.
  • the processor 221 may also include dedicated circuits and/or state machines, such as in an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Array (FPGA) or similar design-specific hardware.
  • the memory resource 222 may be any type of volatile or non-volatile memory or storage, such as random-access memory (RAM), flash memory, read-only memory (ROM), storage volumes, a hard disk, or a combination thereof.
  • the memory resource 222 may store instructions thereon, such as instructions 223, 224, 225, 226, 227, and 228. When executed by the processor 221, the instructions may cause the apparatus 220 to perform specific tasks and/or functions. For example, the memory resource 222 may store instructions 223 to send the input data to a graphics processing device.
  • An input article may come in contact with a display screen of a computing device. The display screen may have touch screen capabilities.
  • the processor 221 may cause a touchscreen controller to analyze the movements of the input article to determine the input data. That is, the touchscreen controller may receive input data from an input article by analyzing the movements of the input article and determining the input data.
  • the processor 221 may cause the touchscreen controller to determine input coordinates based on the movements of the input article.
  • the input coordinates may provide the location of the image that will appear on the display screen. That is, the touchscreen controller may receive input coordinates from an input article by determining the location of the input article as the input article contacts the surface of the display screen.
  • the processor 221 may cause the input data to be sent to the graphics processing device of the computing device.
  • the graphics processing device may also receive input coordinates. That is, the processor 221 may cause the touchscreen controller to send the input data and the input coordinates directly to the graphics processing device. While the processor 221 and the touchscreen controller are described as separate elements, this disclosure is not so limited. It should be understood that the processor 221 and the touchscreen controller may be the same element or separate elements.
  • the memory resource 222 may store instructions 224 to render a temporary image based on the input data, wherein the temporary image includes a preset attribute.
  • the processor 221 may cause the graphics processing device to render a temporary image.
  • the graphics processing device may process the input data to determine the shape of the temporary image.
  • the appearance of the temporary image may be based on preset attributes. That is, the color, brush type, thickness, shading, amount of pressure applied, etc. (e.g., attributes) of the temporary image may be determined before the input data is received by the touchscreen controller.
  • the graphics processing device may process the input data to determine the shape of the temporary image and apply preset attributes to the shape of the temporary image rendered.
  • the preset attributes may be attributes that take less time to process and/or may be directly processed by the graphics processing device.
  • the graphics processing device may process the input coordinates to determine the location on the display screen where the temporary image is to appear. That is, the graphics processing device may process input data and input coordinates to produce a temporary image including preset attributes to display on the display screen.
  • the memory resource 222 may store instructions 225 to display the temporary image on the display screen of the computing device.
  • the processor 221 may cause the graphics processing device to send an image to the display screen. For instance, the graphics processing device may render a temporary image utilizing preset attributes and not attributes provided by the input data. The graphics processing device may then send the temporary image to the display screen.
  • the graphics processing device may send a display location of the temporary image to the display screen. Once the temporary image and the display location are received, the display screen may then display the temporary image on the display screen at the specified display location.
  • the memory resource 222 may store instructions 226 to send processed data to the graphics processing device.
  • the processor 221 may cause a touchscreen controller to send input data and input coordinates to a processing application.
  • the touchscreen controller may send the input data and the input coordinates directly to the processing application or there may be intervening elements.
  • the processor 221 may cause the processing application to process the input data and the input coordinates.
  • the processing application may determine the different attributes of the input data. That is, the processing application may determine the color, brush type, thickness, shading, amount of pressure applied, etc. from the input data.
  • the processing application may determine the display location from the input coordinates.
  • the processing application may process the input data and the input coordinates to provide information to produce a detailed image and a display location.
  • the processor 221 may cause the processing application to send the processed data and the processed coordinates to the graphics processing device to render a processed image.
  • the memory resource 222 may store instructions 227 to render a processed image provided by the processed data, wherein the processed image includes an attribute based on the processed data.
  • the processor 221 may cause the graphics processing device to render the processed image from the processed data.
  • the graphics processing device may analyze the processed data to determine the shape of the processed image and the attributes provided by the processed data. The graphics processing device may then apply the attributes provided by the processed data to the shape of the processed image. That is, the processed image may have the color, brush type, thickness, shading, amount of pressure applied, etc. (e.g., attributes) provided by the processed data.
  • the graphics processing device may analyze the processed coordinates to determine the location on the display screen where the processed image is to appear. That is, the graphics processing device may utilize the processed data and processed coordinates to produce a processed image including the attributes provided by the processed data and display the processed image at the location specified by the processed coordinates.
  • the memory resource 222 may store instructions 228 to replace the temporary image on the display screen with the processed image.
  • processor 221 may cause the graphics processing device to send a processed image to the display screen of the computing device.
  • the graphics processing device may render a processed image including attributes determined by the processed data.
  • the graphics processing device may send the processed image to the display screen.
  • sending the processed image to the display screen may cause the temporary image to be removed. That is, the temporary image may be removed and replaced with the processed image.
  • the processed image may provide a more detailed image, in comparison to the temporary image.
  • the processed image may include the attributes provided by the processed data.
  • the display screen may receive the processed image, remove the temporary image, and display the processed image based on the location determined by the processed coordinates.
  • the graphics processing device may compare the temporary image with the processed image and determine the differences between the images before the processed image is sent to the display screen.
  • the graphics processing device may use the differences between the rendered images to determine the shape of the temporary image to remove.
  • the graphics processing device may use the differences between the images to determine which portions of the temporary image to replace.
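The comparison step described above — diffing the temporary and processed images to find which portions to replace — can be sketched with a simple tile-based diff. This is a minimal illustration under assumed data structures (images as 2D lists of pixel values); the patent does not specify how the comparison is performed.

```python
def diff_regions(temp_img, proc_img, tile=2):
    """Return (row, col) indices of tiles where the two images differ;
    only these regions of the temporary image need to be replaced."""
    h, w = len(temp_img), len(temp_img[0])
    dirty = []
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            same = all(
                temp_img[y][x] == proc_img[y][x]
                for y in range(ty, min(ty + tile, h))
                for x in range(tx, min(tx + tile, w))
            )
            if not same:
                dirty.append((ty, tx))
    return dirty

# Usage: 4x4 images that differ only in the bottom-right 2x2 tile,
# so only that tile of the temporary image needs replacing.
temp = [[0] * 4 for _ in range(4)]
proc = [[0] * 4 for _ in range(4)]
proc[3][3] = 1
dirty = diff_regions(temp, proc)
```

Replacing only the differing tiles, rather than the whole temporary image, would limit how much of the display must be redrawn.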
  • Figure 3 illustrates an example of a computing device 300 consistent with the disclosure.
  • Computing device 300 may be a variety of computer related devices, such as desktop computers, portable computers, tablets, etc.
  • computing device 300 may be a portable computing device.
  • Figure 3 may include analogous or similar elements as Figure 1.
  • Figure 3 may include a computing device 300, a graphics processing device 304, and a display screen 308.
  • the computing device 300 may include a touchscreen controller 314.
  • the touchscreen controller 314 may detect an active input article in the vicinity of the computing device 300. As the input article comes in contact with the display screen 308, the touchscreen controller 314 may detect the contact of the input article with the display screen 308.
  • the touchscreen controller 314 may detect the movement of the input article on the display screen 308.
  • the movement of the input article on the display screen 308 may provide input data to the touchscreen controller 314. That is, the touchscreen controller 314 may receive input data based on the contact of the input article with the display screen 308.
  • the movement of the input article on the display screen 308 may correspond to an image rendered by the graphics processing device 304.
  • the touchscreen controller 314 may detect the coordinates of an input article in the vicinity of the computing device 300. As the input article comes in contact with the display screen 308, the touchscreen controller 314 may receive input coordinates related to the input data. That is, the touchscreen controller 314 may receive input coordinates of the input data to determine the display location of an image rendered by the graphics processing device 304.
  • the graphics processing device 304 may include a first sub-processing device 310 and a second sub-processing device 312.
  • the first sub-processing device 310 and the second sub-processing device 312 may perform different and/or similar tasks within the graphics processing device 304.
  • the first sub-processing device 310 of the graphics processing device 304 may utilize preset attributes to process input data received from a touchscreen controller 314 and render a temporary image. For instance, the temporary image may be rendered with preset attributes determined before the input data is received by the computing device 300.
  • a “first sub-processing device” refers to a component of the graphics processing device used to process, analyze, and/or alter input data to render a temporary image.
  • a “second sub-processing device” refers to a component of the graphics processing device used to process, analyze, and/or alter processed data to render a processed image.
  • the second sub-processing device 312 of the graphics processing device 304 may use processed data received from a processing application 316 to render a processed image.
  • the second sub-processing device 312 may render the processed image utilizing attributes provided by the processed data.
  • this disclosure is not so limited.
  • the graphics processing device 304 may not include a first sub-processing device and a second sub-processing device.
  • the graphics processing device 304 may process input data received from a touchscreen controller 314 and render a temporary image without a first sub-processing device and utilize processed data received from a processing application 316 to render a processed image without a second sub-processing device.
  • the first sub-processing device 310 and the second sub-processing device 312 may be combined into a sub-processing device. That is, the graphics processing device 304 may include a sub-processing device to render a temporary image including preset attributes by processing input data received from a touchscreen controller 314 and render a processed image including attributes provided by processed data received from a processing application 316.
  • the touchscreen controller 314 may send input data and input coordinates to the first sub-processing device 310 of the graphics processing device 304.
  • the touchscreen controller 314 may send the input data and the input coordinates to the graphics processing device 304 and the graphics processing device 304 may send the input data and the input coordinates to the first sub-processing device 310.
  • the touchscreen controller 314 may send the input data and the input coordinates directly to the first sub-processing device 310 of the graphics processing device 304.
  • the first sub-processing device 310 may process the input data and input coordinates to produce a temporary image. That is, the first sub-processing device 310 may analyze the input data to determine the shape of the image and produce a temporary image based on the determination of the shape. The first sub-processing device 310 may also analyze the input coordinates to determine the display location of the image. The first sub-processing device 310 may render the temporary image based on preset attributes. For example, the color, brush type, thickness, shading, amount of pressure applied, etc. of the temporary image may be determined before input data is received by the touchscreen controller. In some examples, the first sub-processing device 310 may cause the display screen 308 to display the temporary image at the display location.
  • the first sub-processing device 310 may send the temporary image to the display screen 308 to cause the display screen 308 to display the temporary image.
  • the first sub-processing device 310 may send the temporary image to the second sub-processing device 312.
  • the touchscreen controller 314 may send the input data and the input coordinates to a processing application 316.
  • the touchscreen controller 314 may send the input data and the input coordinates directly to the processing application 316.
  • the touchscreen controller 314 may send the input data and the input coordinates to an intervening element and the intervening element may send the input data and the input coordinates to the processing application 316.
  • the processing application 316 may process the input data and determine the attributes of the input data to produce processed data.
  • the attributes of the input data may provide a detailed image.
  • the processing application 316 may process the input coordinates to determine the display location of an image.
  • the processing application may then send the processed data and the processed coordinates to the second sub-processing device 312 of the graphics processing device 304.
  • the processing application 316 may update the preset attributes.
  • the processing application 316 may update the preset attributes with the attributes provided by the processed data. This may allow temporary images rendered from input data newly received by the computing device (e.g., input data that is received after the processing application 316 updated the preset attributes) to have the attributes provided by the processed data.
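The preset-attribute update described above can be sketched as a small store that adopts the attributes of the most recently processed stroke, so subsequent temporary images already resemble the final rendering. The class name and attribute keys are illustrative assumptions.

```python
class PresetStore:
    """Holds the preset attributes used for future temporary images."""

    def __init__(self):
        # Defaults used before any stroke has been fully processed.
        self.attributes = {"color": "black", "thickness": 2}

    def update_from_processed(self, processed_attributes):
        # After a stroke is fully processed, adopt its attributes so the
        # next temporary image is rendered with them from the start.
        self.attributes.update(processed_attributes)

# Usage: the processing application pushes the attributes it derived.
store = PresetStore()
store.update_from_processed({"color": "red", "thickness": 5})
```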
  • the second sub-processing device 312 may analyze the processed data and processed coordinates to produce a processed image. That is, the second sub-processing device 312 may determine the shape of the image from the processed data and produce a processed image based on the determination of the shape. The second sub-processing device 312 may render the processed image based on the attributes provided by the processed data. That is, the color, brush type, thickness, shading, amount of pressure applied, etc. of the processed image may be determined by the processed data sent by the processing application 316.
  • the second sub-processing device 312 may receive a temporary image from the first sub-processing device 310.
  • the second sub-processing device 312 may compare the received temporary image with the processed image and determine the differences between the images.
  • the second sub-processing device 312 may then cause the temporary image to be removed from the display screen 308 and replaced with the processed image.
  • the processed image may be displayed on the display screen 308 based on the processed coordinates. That is, the second sub-processing device 312 may cause the display screen 308 to display the processed image at the location determined by the processing application 316.
  • the display location determined by the processed coordinates may be the same as or substantially similar to the location of the temporary image.
  • “substantially” intends that the characteristic does not have to be absolute, but is close enough to achieve the characteristic. For example, “substantially similar” is not limited to absolutely similar.
  • the touchscreen controller 314 sending the input data to the graphics processing device 304 and the processing application 316 may provide a faster processing time of an image while maintaining the accuracy of the image. That is, the computing device 300 is able to use existing software and hardware to process and render a temporary image faster by sending the input data from the touchscreen controller 314 directly to the graphics processing device 304 for processing and rendering, as compared to processing the input data before the input data is sent to the graphics processing device 304. In addition, the computing device 300 is able to ensure the accuracy of the image by sending the input data from the touchscreen controller 314 to the processing application 316 for processing and then sending the processed data to the graphics processing device 304 to render a processed image.
  • the input data sent to the processing application 316 from the touchscreen controller 314 may be the same as the input data sent to the graphics processing device 304 directly from the touchscreen controller 314.
  • this disclosure is not so limited.
  • the input data sent to the processing application 316 may be different from the input data sent to the graphics processing device 304 directly from the touchscreen controller 314. That is, the input data sent from the touchscreen controller 314 to the processing application 316 may include more and/or less information than input data sent from the touchscreen controller 314 directly to the graphics processing device 304.
  • Figure 4 illustrates an example of a flow diagram 440 of a computing device 400 consistent with the disclosure.
  • flow diagram 440 may illustrate how information is transferred through a computing device 400.
  • Figure 4 may include analogous or similar elements as Figure 1 and Figure 3.
  • Figure 4 may include an input article 402, a graphics processing device 404, a first sub-processing device 410, a second sub-processing device 412, a display screen 408, a touchscreen controller 414, and a processing application 416.
  • the computing device 400 may receive input data and input coordinates from an input article 402.
  • the input data may be communicated to the touchscreen controller 414. That is, the touchscreen controller 414 may receive input data and input coordinates from an input article 402, through communication 430. It should be understood that the arrows illustrated in Figure 4 are used to represent one element communicating with another element.
  • the touchscreen controller 414 may send the input data and the input coordinates to the graphics processing device 404.
  • the touchscreen controller 414 may send input data and input coordinates directly to the first sub-processing device 410 of the graphics processing device 404, through communication 432.
  • the first sub-processing device 410 may utilize the input data sent by the touchscreen controller 414, through communication 432, to produce a temporary image.
  • the graphics processing device 404 may send the temporary image to the display screen 408.
  • the first sub-processing device 410 may send the temporary image and display location instructions to the display screen 408 through communication 434.
  • the display screen 408 may then display the temporary image at the display location.
  • the first sub-processing device 410 may also send the temporary image and the input coordinates to the second sub-processing device 412, through communication 436.
  • the second sub-processing device 412 may store the temporary image and input coordinates to compare the temporary image to a processed image and processed coordinates. For instance, once the second sub-processing device 412 prepares the processed image, the second sub-processing device 412 may compare the stored temporary image and input coordinates to the processed image and processed coordinates.
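The store-and-compare step above can be sketched as follows. The class and method names are hypothetical; they illustrate the idea of holding the temporary image and input coordinates until the processed result arrives.

```python
# Sketch of the second sub-processing device's store-and-compare role:
# keep the temporary image and input coordinates, then compare them to
# the processed image and processed coordinates once those are ready.

class ComparisonStore:
    def __init__(self):
        self.stored_image = None
        self.stored_coords = None

    def store_temporary(self, image, coords):
        self.stored_image = image
        self.stored_coords = coords

    def compare(self, processed_image, processed_coords):
        # Report whether the image and/or the coordinates changed; the
        # differences guide how the temporary image is replaced.
        return {
            "image_differs": self.stored_image != processed_image,
            "coords_differ": self.stored_coords != processed_coords,
        }
```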
  • the touchscreen controller 414 may also send the input data and input coordinates to a processing application 416.
  • the touchscreen controller 414 may send the input data and input coordinates indirectly to the processing application 416, through communication 438.
  • the touchscreen controller 414 may send the input data and input coordinates directly to the processing application 416, through communication 438.
  • the processing application 416 may then process the input data and input coordinates, received through communication 438, to produce a processed data and processed coordinates.
  • the processing application 416 may send the processed data and processed coordinates to the graphics processing device 404.
  • the processing application 416 may send the processed data and processed coordinates to the second sub-processing device 412 of the graphics processing device 404, through communication 442.
  • the second sub-processing device 412 may utilize the processed data and processed coordinates sent by the processing application 416, through communication 442, to produce a processed image and determine a display location for the processed image.
  • the second sub-processing device 412 may then compare the processed image and the processed coordinates to a stored temporary image and input coordinates.
  • the graphics processing device 404 may cause the temporary image to be replaced with the processed image.
  • the second sub-processing device 412 may send the processed image, a display location for the processed image, and instructions to replace the temporary image with the processed image to the display screen 408, through communication 444.
  • the display screen 408 may then display the processed image at the display location for the processed image instead of the temporary image.
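The final replace step of the flow above can be sketched as follows. The names are hypothetical; the point is that the display first shows the temporary image at the input coordinates and later swaps it for the processed image at the processed coordinates.

```python
# Sketch of the replace step: show the temporary image immediately, then
# replace it with the processed image, possibly at a slightly different
# (processed) display location. Names are illustrative assumptions.

class DisplayScreen:
    def __init__(self):
        self.image = None
        self.location = None

    def show_temporary(self, image, coords):
        self.image = image
        self.location = coords

    def replace_with_processed(self, image, coords):
        # The temporary image is removed and the processed image is shown
        # at the display location provided by the processed coordinates.
        self.image = image
        self.location = coords
```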
  • the communication between the touchscreen controller 414, processing application 416, graphics processing device 404, and display screen 408 may be conducted without the use of the first sub-processing device 410 and a second sub-processing device 412.
  • the touchscreen controller 414 may send the input data and input coordinates directly to the graphics processing device 404, through communication 432.
  • the processing application 416 may send the processed data and processed coordinates to the graphics processing device 404, through communication 442.
  • the graphics processing device 404 may send the temporary image to the display screen 408 through communication 434 and send the processed image to the display screen 408 through communication 444.
  • the approach described herein provides a computing device 400 that reduces the latency of the image, as compared to sending the input data to a processing application 416 and then to the graphics processing device 404. That is, the approach described herein may decrease the latency of the appearance of the image without increasing the cost of the computing device 400, as compared to computing devices that include added software and/or hardware to reduce the latency of the appearance of an image. In addition, the computing device 400 described herein may reduce the latency of the appearance of the image while maintaining the accuracy of the image. For example, the image may be generated in 7 to 9 ms from receipt of input data by the touchscreen controller in a 60 hertz display screen.
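The 7 to 9 ms figure can be put in context against a 60 hertz display's frame budget, a small arithmetic check rather than anything from the disclosure itself:

```python
# At 60 Hz a frame lasts about 16.67 ms, so a temporary image rendered in
# 7 to 9 ms is ready well within a single refresh interval.

refresh_hz = 60
frame_budget_ms = 1000 / refresh_hz      # ~16.67 ms per frame
temporary_render_ms = (7, 9)             # range stated in the disclosure

fits_in_one_frame = all(t < frame_budget_ms for t in temporary_render_ms)
```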
  • Figure 5 illustrates an example diagram of a non-transitory machine readable medium 550 suitable with a computing device consistent with the disclosure.
  • the non-transitory machine-readable medium 550 may be any type of volatile or non-volatile memory or storage, such as random-access memory (RAM), flash memory, read-only memory (ROM), storage volumes, a hard disk, or a combination thereof.
  • the medium 550 stores instructions 551 executable by a processor to send the input data to a graphics processing device and render a temporary image.
  • the processor may execute transfer instructions 551 to send, by a touchscreen controller, input data to a graphics processing device.
  • the graphics processing device may receive the input data directly from the touchscreen controller.
  • the graphics processing device may process the input data to determine the shape of the image provided by the input data.
  • the graphics processing device may then render an image from the input data.
  • the input data may provide information that may allow a graphics processing device to produce an image.
  • the graphics processing device may receive input coordinates from the touchscreen controller.
  • the input coordinates may determine the display location of the temporary image.
  • the graphics processing device may render a temporary image for display on a display screen of a computing device.
  • the medium 550 stores instructions 552 executable by a processor to display the rendered temporary image based on a preset attribute.
  • the processor may execute display instructions 552 to cause an image rendered by the graphics processing device to display on the display screen at a display location.
  • the temporary image displayed on the display screen may have attributes that were determined before input data was received by the touchscreen controller. Said differently, the attributes of the temporary image may not be able to be determined by the information provided in the input data.
  • the attributes of the temporary image may be default attributes programmed into the computing device.
  • the attributes of the temporary image may be attributes set by a user before the input data is received.
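The preset-attribute resolution described above can be sketched as follows. The specific attribute names and defaults are illustrative assumptions, not values from the disclosure.

```python
# Sketch of preset attributes: fixed before any input data arrives, either
# as defaults programmed into the device or as values set by the user.

DEFAULT_ATTRIBUTES = {
    "color": "black",
    "brush_type": "pen",
    "thickness": 1,
    "shading": "none",
}

def preset_attributes(user_settings=None):
    # Preset attributes are determined before input data is received, so
    # they cannot reflect anything carried in the input data itself.
    attrs = dict(DEFAULT_ATTRIBUTES)
    if user_settings:
        attrs.update(user_settings)
    return attrs
```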
  • the medium 550 stores instructions 553 executable by a processor to send the processed data to a graphics processing device and render a processed image.
  • the processor may execute transfer instructions 553 to send, by a processing application, processed data to a graphics processing device.
  • the instructions 553 may cause the graphics processing device to receive processed data.
  • the graphics processing device may receive the processed data directly from the processing application.
  • the graphics processing device may receive the processed data indirectly from the processing application.
  • the graphics processing device may analyze the processed data to determine the shape of the image provided by the processed data.
  • the processed data may provide information that may allow a graphics processing device to produce an image based on the attributes provided in the processed data.
  • the graphics processing device may then render a processed image from the processed data.
  • the processed image may have attributes based on the information provided by the processed data. Said differently, the attributes of the processed image may be determined after input data is received and processed.
  • the medium 550 stores instructions 554 executable by a processor to analyze processed coordinates.
  • the processor may execute analyze instructions 554 to analyze processed coordinates received from the processing application.
  • the instructions 554 may cause the graphics processing device to determine a display location for the processed image. That is, the graphics processing device may cause a display screen to display the processed image at the display location provided by the processed coordinates.
  • the medium 550 stores instructions 555 executable by a processor to compare the temporary image to the processed image.
  • the processor may execute compare instructions 555 to compare the temporary image to the processed image to determine the difference between the images. That is, the processor may cause the graphics processing device to determine the differences between the temporary image and the processed image. Likewise, the processor may cause the graphics processing device to compare the input coordinates and the processed coordinates to determine the difference between the coordinates. In some examples, the graphics processing device may utilize the difference between the coordinates to determine the placement of the processed image.
  • the medium 550 stores instructions 556 executable by a processor to replace the temporary image on the display screen with the processed image.
  • the processor may execute display instructions 556 to cause the temporary image to be removed from the display screen and the processed image to display on the display screen. That is, the temporary image including preset attributes may be replaced by the processed image including attributes provided by the processed data.
  • the processed image may be displayed on the display screen at the display location provided by the processed coordinates and not the display location of the temporary image. That is, the computing device described herein may reduce the latency of the appearance of the image while maintaining the accuracy of the image without added hardware and/or software to do so.
  • the graphics processing device may utilize the comparison of the images and the comparison of the coordinates to determine the location and shape of the temporary image to remove. That is, the processor may cause the graphics processing device to determine, based on the compared images, which portion of the temporary image should be removed. For example, the graphics processing device may determine the location of the temporary image, remove the temporary image from the display screen, and then replace the temporary image with the processed image. In some examples, the graphics processing device may utilize the difference between the images to determine which portions of the temporary image to replace. That is, the portion of the temporary image that is different from the processed image may be replaced while the portion of the temporary image that is the same as the processed image may remain.
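The partial-replacement logic above can be sketched as follows. Images are modeled as flat lists of region values purely for illustration; only regions that differ between the temporary and processed images are repainted.

```python
# Sketch of diff-based replacement: compare the temporary and processed
# images region by region and replace only the regions that differ, so
# portions that already match can remain on screen untouched.

def regions_to_replace(temporary, processed):
    # Indices of regions where the two images differ.
    return [i for i, (t, p) in enumerate(zip(temporary, processed)) if t != p]

def apply_replacement(displayed, processed):
    # Overwrite only the differing regions of the displayed image.
    for i in regions_to_replace(displayed, processed):
        displayed[i] = processed[i]
    return displayed
```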
  • Figures 6A & 6B illustrate examples of computing devices 600A and 600B consistent with the disclosure.
  • Figures 6A and 6B may include analogous or similar elements as Figure 1, Figure 3, and Figure 4.
  • Figures 6A and 6B may include computing devices 600A and 600B (collectively referred to as computing device 600), graphics processing devices 604A and 604B (collectively referred to as graphics processing device 604), first sub-processing devices 610A and 610B, second sub-processing devices 612A and 612B, and display screens 608A and 608B (collectively referred to as display screen 608).
  • the computing device 600 may include a touchscreen controller to receive input data and input coordinates from an input article.
  • the touchscreen controller may send the input coordinates and the input data directly to a graphics processing device 604.
  • the graphics processing device 604A may then process the input data and input coordinates to render a temporary image 606A.
  • the graphics processing device 604A may cause the first sub-processing device 610A to process the input data and render a temporary image 606A. That is, the first sub-processing device 610A may analyze the input data to determine the shape of the temporary image 606A.
  • the temporary image 606A rendered by the first sub-processing device 610A may include preset attributes.
  • the temporary image 606A may include standard features determined before input data was received by the touchscreen controller. That is, the temporary image 606A may have a standard color, brush type, thickness, shading, applied pressure, etc. processed by the first sub-processing device 610A and determined before the input data was received by the touchscreen controller.
  • the graphics processing device 604A may cause the first sub-processing device 610A to analyze the input coordinates to determine the display location of the temporary image 606A.
  • the first sub-processing device 610A may then send the temporary image 606A to the display screen 608A.
  • the display screen 608A may display the temporary image 606A at the display location determined by the input coordinates.
  • the touchscreen controller may send the input data and the input coordinates to a processing application.
  • the processing application may then process the input data and input coordinates to produce a processed data and processed coordinates.
  • the processing application may determine the different attributes provided by the input data. That is, the processing application may determine information relating to the color, brush type, thickness, shading, applied pressure, etc. when processing the input data.
  • the processing application may then send the processed data and processed coordinates to the graphics processing device 604B.
  • the graphics processing device 604B may cause the second sub-processing device 612B to analyze the processed data and render a processed image 606B. That is, the second sub-processing device 612B may analyze the processed data to determine the shape of the processed image 606B.
  • the processed image 606B rendered by the second sub-processing device 612B may include attributes provided by the processed data.
  • the processed image 606B may include the color, brush type, thickness, shading, applied pressure, etc. implemented by the user when creating the input data. That is, the processed image 606B may have the attributes provided by the user when creating the input data.
  • the graphics processing device 604B may cause the second sub-processing device 612B to analyze the processed coordinates to determine the display location of the processed image 606B.
  • the second sub-processing device 612B may then send the processed image 606B to the display screen 608B.
  • the display screen 608B may remove the temporary image 606A and replace the temporary image 606A with the processed image 606B at the display location determined by the processed coordinates.
  • the processed image 606B may provide a more detailed image compared to the temporary image 606A.
  • the processed image 606B may have a slightly different display location compared to the temporary image 606A.
  • the processed image 606B including the attributes provided by the processed data may have a different shape, compared to the shape of the temporary image 606A.
  • the different shapes of the processed image 606B and the temporary image 606A may cause the processed image 606B to have a slightly different display location than the temporary image 606A.
  • the attributes provided by the processed data may cause the processed image 606B to appear darker and/or have a different angle, thickness, and/or color compared to the temporary image 606A. That is, the processed image 606B may have a slightly different appearance compared to the temporary image 606A, but may display the same information, as illustrated by Figures 6A and 6B.
  • Replacing the temporary image 606A with the processed image 606B may reduce the latency of the rendered image without added software and/or hardware.
  • the graphics processing device 604 may take less time to process and render the temporary image 606A. This may allow the temporary image 606A to appear on the display screen 608 in less time than the processed image 606B, reducing the latency of the image.
  • replacing the temporary image 606A with the processed image 606B may maintain the accuracy of the rendered image. That is, the computing device 600 may start the process of rendering a processed image 606B at substantially the same time the computing device 600 starts the process of rendering the temporary image 606A. The computing device 600 may maintain the accuracy of the image by rendering a processed image 606B including the attributes implemented by the user to replace the temporary image 606A.


Abstract

Examples described herein relate to computing devices and inputs consistent with the disclosure. For instance, the computing device may comprise a touchscreen controller, a display screen, and a graphics processing device to receive an input data from the touchscreen controller, render a temporary image based on the input data, send the temporary image to the display screen, receive processed data from a processing application, where the processing application is to process the input data received by the touchscreen controller, render a processed image based on the processed data, and send the processed image to the display screen to replace the temporary image.

Description

IMAGE RENDERING
BACKGROUND
[0001] An input article may be communicatively coupled to a computing device to control aspects of the computing device (e.g., computers, tablets, etc.). An input article may include a stylus, keyboards, pointers, a pointing device, touchpads, and/or other articles for accepting user interaction. For instance, an input article may create images on a display of a computing device, make selections on a computing device, control a position of a cursor on a display of a computing device, and/or otherwise facilitate interaction with the display of a computing device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Figure 1 illustrates an example of a computing device consistent with the disclosure.
[0003] Figure 2 illustrates an example of an apparatus suitable with a computing device consistent with the disclosure.
[0004] Figure 3 illustrates an example of a computing device consistent with the disclosure.
[0005] Figure 4 illustrates an example of a flow diagram of a computing device consistent with the disclosure.
[0006] Figure 5 illustrates an example diagram of a non-transitory machine readable medium suitable with a computing device consistent with the disclosure.
[0007] Figures 6A & 6B illustrate examples of computing devices consistent with the disclosure.
DETAILED DESCRIPTION
[0008] Input articles may be communicatively coupled to a computing device to provide input data to the computing device. As mentioned, an input article may create images on a display screen of a computing device and/or otherwise facilitate interaction with the display screen of a computing device. For instance, an input article may contact a display screen of a computing device and may be moved in a given direction by a user to cause an image to appear on a display screen of a computing device.
[0009] However, in some computing devices there may be a delay between the time the image was created and the time the image appears on the display screen of the computing device. For instance, after the image is created by the user with the input article, several seconds may pass before the image is visible on the display screen. Some computing devices include added software and/or hardware to reduce the latency of an image. However, adding software and/or hardware to a computing device may increase the cost of making the computing device and/or the size of the computing device. As used herein, “latency” refers to the delay between the command to create the image and the appearance of the image on a display screen.
[0010] As such, computing devices as described herein may reduce the latency of an image without additional hardware and/or software. For example, a computing device may comprise a graphics processing device to receive input data from a touchscreen controller, render a temporary image from the input data, and send the temporary image to a display screen for display. Notably, such computing devices may provide a more enjoyable experience and may reduce the cost of the computing device, as compared to computing devices that include added software and/or hardware to reduce the latency of images, as detailed herein.
[0011] Figure 1 illustrates an example of a computing device 100 consistent with the disclosure. The computing device 100 may be a variety of computing related devices, such as desktop computers, portable computers, tablets, etc. In some examples, the computing device 100 may be a mobile computing device. The computing device 100 may include touch screen capabilities. For example, a user may be able to make selections, create images (e.g., markings, drawings, etc.), or facilitate interaction with the computing device 100 by touching a display screen 108 of a computing device 100. In some examples, a user may utilize an input article 102 to activate the touch screen capabilities of the computing device 100. As used herein, an “input article” refers to an article used to facilitate an interaction with a computing device. While the input article 102 is illustrated as a stylus in Figure 1, this disclosure is not so limited. For example, an input article could be a stylus (as illustrated in Figure 1), a finger of a user, a computer mouse, or an instrument used to contact a computing device to produce a mark and provide input.
[0012] In some examples, the input article 102 may be used to produce an image 106 on the display screen 108 of the computing device 100. For example, an input article 102 may come in contact with and be moved along the surface of the display screen 108 of the computing device 100 to create input data to produce an image 106 on the display screen 108. As used herein, an “image” refers to a visual representation. For example, an image may be a visual representation of a line, visual representation of a letter, visual representation of a number, visual representation of a person, etc.
[0013] In some examples, a touchscreen controller may determine input coordinates in response to the input article 102 coming in contact with the display screen 108. That is, the touchscreen controller may determine the locations on the display screen 108 that the input article 102 comes in contact with to determine the input coordinates. The touchscreen controller may send the input coordinates to the graphics processing device 104. In addition, the touchscreen controller may receive the input data created by the input article 102 and send the input data directly to a graphics processing device 104. In some examples, the touchscreen controller may send the input data to the graphics processing device 104 once it is received by the touchscreen controller. For instance, the touchscreen controller may receive the input data in portions and send each portion of the input data to the graphics processing device as each portion is received. That is, portions of input from the input data may be received by the graphics processing device 104 at different times. As used herein, a “touchscreen controller” refers to the element of the computing device that detects positional information from an input article in the immediate vicinity of a display screen, detects positional contact pressure on a display screen of a computing device, and receives input data from the detected positional contact pressure and detected positional information from the input article. As used herein, a “graphics processing device" refers to a device used to process, analyze, and/or alter data to render an image for display on a display screen.
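The portion-by-portion forwarding described in the paragraph above can be sketched as follows. The function name and the shape of a "portion" are hypothetical; the point is that each portion is forwarded as soon as it is received rather than after the whole input is complete.

```python
# Sketch of streaming input data in portions: the touchscreen controller
# sends each portion on to the graphics processing device as it arrives,
# so the device receives portions of the input at different times and can
# begin rendering before the stroke is complete.

def forward_as_received(portions, send):
    for portion in portions:
        # Forward immediately; do not wait for the remaining portions.
        send(portion)

# Example: collect what the graphics processing device would receive.
received = []
forward_as_received(["p1", "p2", "p3"], received.append)
```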
[0014] In some examples, the graphics processing device 104 may receive input data and input coordinates from a touchscreen controller. The graphics processing device 104 may then process the input data to render a temporary image (e.g., image 106) from the input data. That is, the graphics processing device 104 may determine the shape (e.g., frame) and render the temporary image based on preset attributes. In some examples, the graphics processing device may process and render a temporary image as each portion of the input data is received. The input data sent by the touchscreen controller to the graphics processing device 104 may provide the shape of the temporary image, but may not be able to produce a temporary image including the attributes provided by the input data. As used herein, “render" refers to the process of taking information and converting it into an image. As used herein, an “attribute” refers to the features of an image and/or shape. For example, an attribute of an image can include the color of the image, the shading of the image, the thickness of an image, the brush type used to create the image, the amount of pressure applied to create the image, etc. As used herein, a “preset attribute” refers to a determined attribute for input data that has not yet been received by the computing device. For example, a preset attribute may be an attribute that is selected by a user before input data is received by the touchscreen controller and/or default attributes of an application.
[0015] Once the temporary image is rendered, the graphics processing device 104 may send the temporary image to the display screen 108 and cause the display screen 108 to display the temporary image at the input coordinates. That is, the display screen 108 may display the temporary image based on the input coordinates determined by the touchscreen controller.
[0016] In addition, the touchscreen controller may send the input data and input coordinates to a processing application. The processing application may process the input data to produce processed data. Similarly, the processing application may process the input coordinates to produce processed coordinates. The processed coordinates may determine the display position and location of a processed image. In some examples, the processing application may send the processed data and the processed coordinates to the graphics processing device 104. The processed data sent by the processing application to the graphics processing device 104 may provide the shape (e.g., frame) of the image and the attributes determined by the processed data.
It should be understood that when an object is referred to as being “sent to” or “received by” another element, it may be directly sent to or received by the other element or intervening elements may be present. In contrast, when an object is “directly sent” or “received directly from” another element, it is understood that there are no intervening elements. As used herein, a “processing application” refers to an element of the computing device that processes information relating to input data. For example, a processing application may process input data to determine the attributes of the input data to apply to an image, the shape of the image, the location of the image, etc.
[0017] The graphics processing device 104 may receive the processed data and the processed coordinates from the processing application. The graphics processing device 104 may render a processed image (e.g., image 106) from the received processed data. The processed image may be different from or the same as the temporary image. Once the processed image is rendered, the graphics processing device 104 may send the processed image to the display screen 108 and cause the display screen 108 to display the processed image at the processed coordinates. The processed image may then replace the temporary image, providing a more detailed image. For example, the display screen 108 may display the processed image including attributes determined from the processed data to provide a more detailed image and remove the less detailed temporary image including preset attributes determined before input data was received by the computing device.
[0018] In some examples, sending input data from a touchscreen controller directly to the graphics processing device may cause an image to appear on the display screen faster, as compared to sending the input data to a processing application and then to the graphics processing device. That is, by sending the input data from the touchscreen controller directly to the graphics processing device, a temporary image may be generated in 7 to 9 milliseconds (ms) from receipt of input data by the touchscreen controller. Hence, examples described herein may be able to reduce the latency of an image. In addition, replacing a temporary image with a processed image may ensure the accuracy of the displayed image. That is, examples described herein may be able to reduce the latency of the image, as compared to sending the input data to a processing application and then to the graphics processing device, and maintain the accuracy of the displayed image.
[0019] Figure 2 illustrates an example of an apparatus 220 suitable with a computing device consistent with the disclosure. As illustrated in Figure 2, the apparatus 220 includes a processor 221 and a memory resource 222. The processor 221 may be a hardware processing device such as a microprocessor, application specific instruction set processor, coprocessor, network processor, or similar hardware circuitry that may cause machine-readable instructions to be executed. In some examples, the processor 221 may be a plurality of hardware processing devices that may cause machine-readable instructions to be executed. The processor 221 may include central processing devices (CPUs) among other types of processing devices. The processor 221 may also include dedicated circuits and/or state machines, such as in an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Array (FPGA) or similar design-specific hardware. The memory resource 222 may be any type of volatile or non-volatile memory or storage, such as random-access memory (RAM), flash memory, read-only memory (ROM), storage volumes, a hard disk, or a combination thereof.
[0020] The memory resource 222 may store instructions thereon, such as instructions 223, 224, 225, 226, 227, and 228. When executed by the processor 221, the instructions may cause the apparatus 220 to perform specific tasks and/or functions. For example, the memory resource 222 may store instructions 223 to send the input data to a graphics processing device. An input article may come in contact with a display screen of a computing device. The display screen may have touch screen capabilities. In some examples, as the input article moves across the display screen the processor 221 may cause a touchscreen controller to analyze the movements of the input article to determine the input data. That is, the touchscreen controller may receive input data from an input article by analyzing the movements of the input article and determining the input data. In addition, the processor 221 may cause the touchscreen controller to determine input coordinates based on the movements of the input article. The input coordinates may provide the location of the image that will appear on the display screen. That is, the touchscreen controller may receive input coordinates from an input article by determining the location of the input article as the input article contacts the surface of the display screen.
[0021] In some examples, the processor 221 may cause the input data to be sent to the graphics processing device of the computing device. In addition, the graphics processing device may also receive input coordinates. That is, the processor 221 may cause the touchscreen controller to send the input data and the input coordinates directly to the graphics processing device. While the processor 221 and the touchscreen controller are described as separate elements, this disclosure is not so limited. It should be understood that the processor 221 and the touchscreen controller may be the same element or separate elements.
[0022] The memory resource 222 may store instructions 224 to render a temporary image based on the input data, wherein the temporary image includes a preset attribute. The processor 221 may cause the graphics processing device to render a temporary image. For example, the graphics processing device may process the input data to determine the shape of the temporary image. However, the appearance of the temporary image may be based on preset attributes. That is, the color, brush type, thickness, shading, amount of pressure applied, etc. (e.g., attributes) of the temporary image may be determined before the input data is received by the touchscreen controller. For instance, the graphics processing device may process the input data to determine the shape of the temporary image and apply preset attributes to the shape of the temporary image rendered. The preset attributes may be attributes that take less time to process and/or may be directly processed by the graphics processing device. In some examples, the graphics processing device may process the input coordinates to determine the location on the display screen where the temporary image is to appear. That is, the graphics processing device may process input data and input coordinates to produce a temporary image including preset attributes to display on the display screen.

[0023] The memory resource 222 may store instructions 225 to display the temporary image on the display screen of the computing device. In some examples, the processor 221 may cause the graphics processing device to send an image to the display screen. For instance, the graphics processing device may render a temporary image utilizing preset attributes and not attributes provided by the input data. The graphics processing device may then send the temporary image to the display screen.
In addition, the graphics processing device may send a display location of the temporary image to the display screen. Once the temporary image and the display location are received, the display screen may then display the temporary image on the display screen at the specified display location.
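The fast path described in paragraphs [0020]–[0023] can be sketched as follows. This is a minimal illustrative sketch, not part of the disclosure: all function, class, and attribute names are hypothetical assumptions.

```python
from dataclasses import dataclass

# Preset attributes chosen before any input arrives (see paragraph [0022]);
# the specific keys and values here are illustrative assumptions.
PRESET_ATTRIBUTES = {"color": "black", "brush": "pen", "thickness": 2}

@dataclass
class TemporaryImage:
    points: list       # shape of the stroke, determined from the input data
    attributes: dict   # always the presets, never derived from the input
    location: tuple    # display location, determined from input coordinates

def render_temporary_image(input_points, input_coordinates):
    """Graphics-device fast path: apply preset attributes to the stroke shape."""
    return TemporaryImage(
        points=list(input_points),
        attributes=dict(PRESET_ATTRIBUTES),
        location=input_coordinates,
    )

# The touchscreen controller sends input data directly to the graphics
# processing device, bypassing the processing application entirely.
temp = render_temporary_image([(0, 0), (1, 1), (2, 3)], (10, 20))
```

Note that the rendered temporary image carries the presets regardless of, for example, how much pressure the input article applied; the pressure-derived appearance only arrives later with the processed image.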
[0024] The memory resource 222 may store instructions 226 to send processed data to the graphics processing device. In some examples, the processor 221 may cause a touchscreen controller to send input data and input coordinates to a processing application. The touchscreen controller may send the input data and the input coordinates directly to the processing application or there may be intervening elements. The processor 221 may cause the processing application to process the input data and the input coordinates. For example, the processing application may determine the different attributes of the input data. That is, the processing application may determine the color, brush type, thickness, shading, amount of pressure applied, etc. from the input data. In addition, the processing application may determine the display location from the input coordinates. The processing application may process the input data and the input coordinates to provide information to produce a detailed image and a display location. In some examples, the processor 221 may cause the processing application to send the processed data and the processed coordinates to the graphics processing device to render a processed image.
[0025] The memory resource 222 may store instructions 227 to render a processed image provided by the processed data, wherein the processed image includes an attribute based on the processed data. The processor 221 may cause the graphics processing device to render the processed image from the processed data.
For example, the graphics processing device may analyze the processed data to determine the shape of the processed image and the attributes provided by the processed data. The graphics processing device may then apply the attributes provided by the processed data to the shape of the processed image. That is, the processed image may have the color, brush type, thickness, shading, amount of pressure applied, etc. (e.g., attributes) provided by the processed data. In addition, the graphics processing device may analyze the processed coordinates to determine the location on the display screen where the processed image is to appear. That is, the graphics processing device may utilize the processed data and processed coordinates to produce a processed image including the attributes provided by the processed data and display the processed image at the location specified by the processed coordinates.

[0026] The memory resource 222 may store instructions 228 to replace the temporary image on the display screen with the processed image. In some examples, the processor 221 may cause the graphics processing device to send a processed image to the display screen of the computing device. For example, the graphics processing device may render a processed image including attributes determined by the processed data. The graphics processing device may send the processed image to the display screen. In some examples, sending the processed image to the display screen may cause the temporary image to be removed. That is, the temporary image may be removed and replaced with the processed image. The processed image may provide a more detailed image, in comparison to the temporary image. For instance, the processed image may include the attributes provided by the processed data. The display screen may receive the processed image, remove the temporary image, and display the processed image based on the location determined by the processed coordinates.
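The accuracy path of instructions 226–228 can be sketched in the same illustrative style. The function names, dictionary layout, and the pressure-to-thickness rule below are all hypothetical assumptions made for the sketch, not details taken from the disclosure.

```python
def process_input(points, coordinates, pressure):
    """Processing-application step: attributes are derived from the input itself."""
    processed_data = {
        "points": list(points),
        "attributes": {
            "color": "blue",
            "brush": "ink",
            # Illustrative rule: heavier pressure yields a thicker line.
            "thickness": 1 + int(pressure * 4),
        },
    }
    return processed_data, coordinates  # processed data, processed coordinates

def replace_temporary(screen, processed_image, location):
    """Display-side step: remove the temporary image, show the processed one."""
    screen.pop("temporary", None)
    screen["processed"] = (processed_image, location)
    return screen

# A temporary image is already on screen; the processed image replaces it.
screen = {"temporary": ("preset stroke", (10, 20))}
data, coords = process_input([(0, 0), (1, 1)], (10, 20), pressure=0.5)
screen = replace_temporary(screen, data, coords)
```

The point of the sketch is the division of labor: the shape is available immediately, while pressure-dependent attributes only exist after the processing application runs.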
[0027] In some examples, the graphics processing device may compare the temporary image with the processed image and determine the differences between the images before the processed image is sent to the display screen. The graphics processing device may use the differences between the rendered images to determine the shape of the temporary image to remove. In some examples, the graphics processing device may use the differences between the images to determine which portions of the temporary image to replace.

[0028] Figure 3 illustrates an example of a computing device 300 consistent with the disclosure. Computing device 300 may be a variety of computer-related devices, such as desktop computers, portable computers, tablets, etc. For example, computing device 300 may be a portable computing device. Figure 3 may include analogous or similar elements as Figure 1. For example, Figure 3 may include a computing device 300, a graphics processing device 304, and a display screen 308.
[0029] In some examples, the computing device 300 may include a touchscreen controller 314. The touchscreen controller 314 may detect an active input article in the vicinity of the computing device 300. As the input article comes in contact with the display screen 308, the touchscreen controller 314 may detect the contact of the input article with the display screen 308. The touchscreen controller 314 may detect the movement of the input article on the display screen 308. The movement of the input article on the display screen 308 may provide input data to the touchscreen controller 314. That is, the touchscreen controller 314 may receive input data based on the contact of the input article with the display screen 308. In some examples, the movement of the input article on the display screen 308 may correspond to an image rendered by the graphics processing device 304.
[0030] In addition, the touchscreen controller 314 may detect the coordinates of an input article in the vicinity of the computing device 300. As the input article comes in contact with the display screen 308, the touchscreen controller 314 may receive input coordinates related to the input data. That is, the touchscreen controller 314 may receive input coordinates of the input data to determine the display location of an image rendered by the graphics processing device 304.
[0031] In some examples, the graphics processing device 304 may include a first sub-processing device 310 and a second sub-processing device 312. The first sub-processing device 310 and the second sub-processing device 312 may perform different and/or similar tasks within the graphics processing device 304. The first sub-processing device 310 of the graphics processing device 304 may utilize preset attributes to process input data received from a touchscreen controller 314 and render a temporary image. For instance, the temporary image may be rendered with preset attributes determined before the input data is received by the computing device 300. As used herein, a “first sub-processing device” refers to a component of the graphics processing device used to process, analyze, and/or alter input data to render a temporary image. As used herein, a “second sub-processing device” refers to a component of the graphics processing device used to process, analyze, and/or alter processed data to render a processed image.
[0032] The second sub-processing device 312 of the graphics processing device 304 may use processed data received from a processing application 316 to render a processed image. The second sub-processing device 312 may render the processed image utilizing attributes provided by the processed data. However, this disclosure is not so limited. For example, the graphics processing device 304 may not include a first sub-processing device and a second sub-processing device. In addition, the graphics processing device 304 may process input data received from a touchscreen controller 314 and render a temporary image without a first sub-processing device and utilize processed data received from a processing application 316 to render a processed image without a second sub-processing device. Further, in some examples, the first sub-processing device 310 and the second sub-processing device 312 may be combined into a sub-processing device. That is, the graphics processing device 304 may include a sub-processing device to render a temporary image including preset attributes by processing input data received from a touchscreen controller 314 and render a processed image including attributes provided by processed data received from a processing application 316.
[0033] In some examples, the touchscreen controller 314 may send input data and input coordinates to the first sub-processing device 310 of the graphics processing device 304. For example, the touchscreen controller 314 may send the input data and the input coordinates to the graphics processing device 304 and the graphics processing device 304 may send the input data and the input coordinates to the first sub-processing device 310. In some examples, the touchscreen controller 314 may send the input data and the input coordinates directly to the first sub-processing device 310 of the graphics processing device 304.
[0034] The first sub-processing device 310 may process the input data and input coordinates to produce a temporary image. That is, the first sub-processing device 310 may analyze the input data to determine the shape of the image and produce a temporary image based on the determination of the shape. The first sub-processing device 310 may also analyze the input coordinates to determine the display location of the image. The first sub-processing device 310 may render the temporary image based on preset attributes. For example, the color, brush type, thickness, shading, amount of pressure applied, etc. of the temporary image may be determined before input data is received by the touchscreen controller. In some examples, the first sub-processing device 310 may cause the display screen 308 to display the temporary image at the display location. Said differently, the first sub-processing device 310 may send the temporary image to the display screen 308 to cause the display screen 308 to display the temporary image. In some examples, the first sub-processing device 310 may send the temporary image to the second sub-processing device 312.
[0035] In addition, the touchscreen controller 314 may send the input data and the input coordinates to a processing application 316. In some examples, the touchscreen controller 314 may send the input data and the input coordinates directly to the processing application 316. Alternatively, the touchscreen controller 314 may send the input data and the input coordinates to an intervening element and the intervening element may send the input data and the input coordinates to the processing application 316.
[0036] The processing application 316 may process the input data and determine the attributes of the input data to produce a processed data. The attributes of the input data may provide a detailed image. In some examples, the processing application 316 may process the input coordinates to determine the display location of an image. The processing application may then send the processed data and the processed coordinates to the second sub-processing device 312 of the graphics processing device 304.
[0037] In some examples, the processing application 316 may update the preset attributes. For example, the processing application 316 may update the preset attributes with the attributes provided by the processed data. This may allow temporary images rendered from input data newly received by the computing device (e.g., input data that is received after the processing application 316 updated the preset attributes) to have the attributes provided by the processed data.
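The preset-update behavior described above can be sketched briefly. The dictionary representation and function name are illustrative assumptions; the disclosure does not prescribe how the presets are stored.

```python
# Presets as configured before any input arrives (illustrative values).
preset_attributes = {"color": "black", "brush": "pen", "thickness": 2}

def update_presets(presets, processed_attributes):
    """Fold attributes derived from processed data back into the presets,
    so later temporary images already resemble the final rendering."""
    presets.update(processed_attributes)
    return presets

# A first stroke was processed as a thick blue line...
update_presets(preset_attributes, {"color": "blue", "thickness": 4})
# ...so the next temporary image is rendered thick and blue from the start.
```

The design benefit is that, after the first stroke, the low-latency temporary image and the eventual processed image tend to match, making the replacement less visible.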
[0038] The second sub-processing device 312 may analyze the processed data and processed coordinates to produce a processed image. That is, the second sub-processing device 312 may determine the shape of the image from the processed data and produce a processed image based on the determination of the shape. The second sub-processing device 312 may render the processed image based on the attributes provided by the processed data. That is, the color, brush type, thickness, shading, amount of pressure applied, etc. of the processed image may be determined by the processed data sent by the processing application 316.
[0039] In some examples, the second sub-processing device 312 may receive a temporary image from the first sub-processing device 310. The second sub-processing device 312 may compare the received temporary image with the processed image and determine the differences between the images. The second sub-processing device 312 may then cause the temporary image to be removed from the display screen 308 and replaced with the processed image. The processed image may be displayed on the display screen 308 based on the processed coordinates. That is, the second sub-processing device 312 may cause the display screen 308 to display the processed image at the location determined by the processing application 316. In some examples, the display location determined by the processed coordinates is the same or substantially similar to the location of the temporary image. As used herein, “substantially” intends that the characteristic does not have to be absolute but is close enough so as to achieve the characteristic. For example, “substantially similar” is not limited to absolutely similar.
[0040] In some examples, the touchscreen controller 314 sending the input data to the graphics processing device 304 and the processing application 316 may provide a faster processing time of an image while maintaining the accuracy of the image. That is, the computing device 300 is able to use existing software and hardware to process and render a temporary image faster by sending the input data from the touchscreen controller 314 directly to the graphics processing device 304 for processing and rendering, as compared to processing the input data before the input data is sent to the graphics processing device 304. In addition, the computing device 300 is able to ensure the accuracy of the image by sending the input data from the touchscreen controller 314 to the processing application 316 for processing and then sending the processed data to the graphics processing device 304 to render a processed image.
[0041] In some examples, the input data sent to the processing application 316 from the touchscreen controller 314 may be the same as the input data sent to the graphics processing device 304 directly from the touchscreen controller 314. However, this disclosure is not so limited. For example, the input data sent to the processing application 316 may be different from the input data sent to the graphics processing device 304 directly from the touchscreen controller 314. That is, the input data sent from the touchscreen controller 314 to the processing application 316 may include more and/or less information than input data sent from the touchscreen controller 314 directly to the graphics processing device 304.
[0042] Figure 4 illustrates an example of a flow diagram 440 of a computing device 400 consistent with the disclosure. In some examples, flow diagram 440 may illustrate how information is transferred through a computing device 400. Figure 4 may include analogous or similar elements as Figure 1 and Figure 3. For example, Figure 4 may include an input article 402, a graphics processing device 404, a first sub-processing device 410, a second sub-processing device 412, a display screen 408, a touchscreen controller 414, and a processing application 416.
[0043] In some examples, the computing device 400 may receive input data and input coordinates from an input article 402. The input data may be communicated to the touchscreen controller 414. That is, the touchscreen controller 414 may receive input data and input coordinates from an input article 402, through communication 430. It should be understood that the arrows illustrated in Figure 4 are used to represent one element communicating with another element. The touchscreen controller 414 may send the input data and the input coordinates to the graphics processing device 404.
For instance, the touchscreen controller 414 may send input data and input coordinates directly to the first sub-processing device 410 of the graphics processing device 404, through communication 432. The first sub-processing device 410 may utilize the input data sent by the touchscreen controller 414, through communication 432, to produce a temporary image. The graphics processing device 404 may send the temporary image to the display screen 408. For instance, the first sub-processing device 410 may send the temporary image and display location instructions to the display screen 408 through communication 434. The display screen 408 may then display the temporary image at the display location.
[0044] In some examples, the first sub-processing device 410 may also send the temporary image and the input coordinates to the second sub-processing device 412, through communication 436. The second sub-processing device 412 may store the temporary image and input coordinates to compare the temporary image to a processed image and processed coordinates. For instance, once the second sub-processing device 412 prepares the processed image, the second sub-processing device 412 may compare the stored temporary image and input coordinates to the processed image and processed coordinates.
[0045] The touchscreen controller 414 may also send the input data and input coordinates to a processing application 416. In some examples, the touchscreen controller 414 may send the input data and input coordinates indirectly to the processing application 416, through communication 438. Alternatively, the touchscreen controller 414 may send the input data and input coordinates directly to the processing application 416, through communication 438. The processing application 416 may then process the input data and input coordinates, received through communication 438, to produce processed data and processed coordinates.
[0046] The processing application 416 may send the processed data and processed coordinates to the graphics processing device 404. For instance, the processing application 416 may send the processed data and processed coordinates to the second sub-processing device 412 of the graphics processing device 404, through communication 442. The second sub-processing device 412 may utilize the processed data and processed coordinates sent by the processing application 416, through communication 442, to produce a processed image and determine a display location for the processed image. In some examples, the second sub-processing device 412 may then compare the processed image and the processed coordinates to the stored temporary image and input coordinates.

[0047] In various examples, the graphics processing device 404 may cause the temporary image to be replaced with the processed image. For instance, the second sub-processing device 412 may send the processed image, a display location for the processed image, and instructions to replace the temporary image with the processed image to the display screen 408, through communication 444. The display screen 408 may then display the processed image at the display location for the processed image instead of the temporary image.
[0048] It should be understood that the communication between the touchscreen controller 414, processing application 416, graphics processing device 404, and display screen 408 may be conducted without the use of the first sub-processing device 410 and the second sub-processing device 412. For example, the touchscreen controller 414 may send the input data and input coordinates directly to the graphics processing device 404, through communication 432. Similarly, the processing application 416 may send the processed data and processed coordinates to the graphics processing device 404, through communication 442. In addition, the graphics processing device 404 may send the temporary image to the display screen 408 through communication 434 and send the processed image to the display screen 408 through communication 444.
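The Figure 4 data flow can be summarized as a minimal simulation, with the numbered communications recorded as log entries. This sketch is purely illustrative; the disclosure defines the communications, not any code.

```python
def simulate_flow(input_data, input_coords):
    """Trace the numbered communications of flow diagram 440."""
    log = ["430: input article -> touchscreen controller"]
    log.append("432: controller -> first sub-processing device (input data)")
    # First sub-processing device renders a temporary image with presets.
    temporary = {"shape": input_data, "attributes": "preset", "loc": input_coords}
    log.append("434: first sub-processing device -> display (temporary image)")
    log.append("436: first -> second sub-processing device (temporary image)")
    log.append("438: controller -> processing application (input data)")
    # Processing application derives attributes; second device renders.
    processed = {"shape": input_data, "attributes": "derived", "loc": input_coords}
    log.append("442: processing application -> second sub-processing device")
    log.append("444: second sub-processing device -> display (replace)")
    return log, temporary, processed

log, temp, proc = simulate_flow([(0, 0), (1, 1)], (5, 5))
```

The key ordering property is that communication 434 (temporary image to display) occurs before communications 438–444 complete, which is what makes the temporary image appear with lower latency.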
[0049] The approach described herein provides a computing device 400 that reduces the latency of the image, as compared to sending the input data to a processing application 416 and then to the graphics processing device 404. That is, the approach described herein may decrease the latency of the appearance of the image without increasing the cost of the computing device 400, as compared to computing devices that include added software and/or hardware to reduce the latency of the appearance of an image. In addition, the computing device 400 described herein may reduce the latency of the appearance of the image while maintaining the accuracy of the image. For example, the image may be generated in 7 to 9 ms from receipt of input data by the touchscreen controller in a 60 hertz display screen.
[0050] Figure 5 illustrates an example diagram of a non-transitory machine-readable medium 550 suitable with a computing device consistent with the disclosure. The non-transitory machine-readable medium 550 may be any type of volatile or non-volatile memory or storage, such as random-access memory (RAM), flash memory, read-only memory (ROM), storage volumes, a hard disk, or a combination thereof.
[0051] The medium 550 stores instruction 551 executable by a processor to send the input data to a graphics processing device and render a temporary image. In various examples, the processor may execute transfer instructions 551 to send, by a touchscreen controller, input data to a graphics processing device. The graphics processing device may receive the input data directly from the touchscreen controller. The graphics processing device may process the input data to determine the shape of the image provided by the input data. The graphics processing device may then render an image from the input data. For instance, the input data may provide information that may allow a graphics processing device to produce an image. In some examples, the graphics processing device may receive input coordinates from the touchscreen controller. The input coordinates may determine the display location of the temporary image. The graphics processing device may render a temporary image for display on a display screen of a computing device.
[0052] The medium 550 stores instructions 552 executable by a processor to display the rendered temporary image based on a preset attribute. In various examples, the processor may execute display instructions 552 to cause an image rendered by the graphics processing device to display on the display screen at a display location. The temporary image displayed on the display screen may have attributes that were determined before input data was received by the touchscreen controller. Said differently, the attributes of the temporary image may not be able to be determined by the information provided in the input data. For example, the attributes of the temporary image may be default attributes programmed into the computing device. In some examples, the attributes of the temporary image may be attributes set by a user before the input data is received.
[0053] The medium 550 stores instruction 553 executable by a processor to send the processed data to a graphics processing device and render a processed image. In various examples, the processor may execute transfer instructions 553 to send, by a processing application, processed data to a graphics processing device. For instance, the instruction 553 may cause the graphics processing device to receive processed data. The graphics processing device may receive the processed data directly from the processing application. In some examples, the graphics processing device may receive the processed data indirectly from the processing application. The graphics processing device may analyze the processed data to determine the shape of the image provided by the processed data. For instance, the processed data may provide information that may allow a graphics processing device to produce an image based on the attributes provided in the processed data. The graphics processing device may then render a processed image from the processed data. The processed image may have attributes based on the information provided by the processed data. Said differently, the attributes of the processed image may be determined after input data is received and processed.
[0054] The medium 550 stores instruction 554 executable by a processor to analyze processed coordinates. In various examples, the processor may execute analyze instructions 554 to analyze processed coordinates received from the processing application. For instance, the instructions 554 may cause the graphics processing device to determine a display location for the processed image. That is, the graphics processing device may cause a display screen to display the processed image at the display location provided by the processed coordinates.
[0055] The medium 550 stores instruction 555 executable by a processor to compare the temporary image to the processed image. In various examples, the processor may execute compare instructions 555 to compare the temporary image to the processed image to determine the difference between the images. That is, the processor may cause the graphics processing device to determine the differences between the temporary image and the processed image. Likewise, the processor may cause the graphics processing device to compare the input coordinates and the processed coordinates to determine the difference between the coordinates. In some examples, the graphics processing device may utilize the difference between the coordinates to determine the placement of the processed image.
[0056] The medium 550 stores instruction 556 executable by a processor to replace the temporary image on the display screen with the processed image. In various examples, the processor may execute display instructions 556 to cause the temporary image to be removed from the display screen and the processed image to display on the display screen. That is, the temporary image including preset attributes may be replaced by the processed image including attributes provided by the processed data. In some examples, the processed image may be displayed on the display screen at the display location provided by the processed coordinates and not the display location of the temporary image. That is, the computing device described herein may reduce the latency of the appearance of the image while maintaining the accuracy of the image without added hardware and/or software to do so.
[0057] In some examples, the graphics processing device may utilize the comparison of the images and the comparison of the coordinates to determine the location and shape of the temporary image to remove. That is, the processor may cause the graphics processing device to determine, based on the compared images, which portion of the temporary image should be removed. For example, the graphics processing device may determine the location of the temporary image, remove the temporary image from the display screen, and then replace the temporary image with the processed image. In some examples, the graphics processing device may utilize the difference between the images to determine which portions of the temporary image to replace. That is, the portion of the temporary image that is different from the processed image may be replaced while the portion of the temporary image that is the same as the processed image may remain.
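The partial-replacement idea of paragraph [0057] can be sketched as a simple image comparison. Representing an image as a flat list of pixel values is an illustrative simplification made for this sketch.

```python
def regions_to_replace(temporary, processed):
    """Indices where the temporary image differs from the processed image;
    only these portions of the temporary image need to be replaced."""
    return [i for i, (t, p) in enumerate(zip(temporary, processed)) if t != p]

# Preset rendering drew everything black; processing recolored two regions.
temporary = ["black", "black", "black", "black"]
processed = ["blue", "black", "blue", "black"]
# Only positions 0 and 2 changed; positions 1 and 3 can remain on screen.
```

Replacing only the differing portions would avoid redrawing regions where the temporary image already matches the processed image, which is consistent with the comparison performed before the processed image is sent to the display screen.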
[0058] Figures 6A & 6B illustrate examples of computing devices 600A and 600B consistent with the disclosure. Figures 6A and 6B may include analogous or similar elements as Figure 1, Figure 3, and Figure 4. For example, Figures 6A and 6B may include computing devices 600A and 600B (collectively referred to as computing device 600), graphics processing devices 604A and 604B (collectively referred to as graphics processing device 604), first sub-processing devices 610A and 610B, second sub-processing devices 612A and 612B, and display screens 608A and 608B (collectively referred to as display screen 608).
[0059] In some examples, the computing device 600 may include a touchscreen controller to receive input data and input coordinates from an input article. The touchscreen controller may send the input coordinates and the input data directly to a graphics processing device 604.
[0060] The graphics processing device 604A may then process the input data and input coordinates to render a temporary image 606A. In some examples, the graphics processing device 604A may cause the first sub-processing device 610A to process the input data and render the temporary image 606A. That is, the first sub-processing device 610A may analyze the input data to determine the shape of the temporary image 606A.
[0061] The temporary image 606A rendered by the first sub-processing device 610A may include preset attributes. For instance, the temporary image 606A may include standard features determined before input data was received by the touchscreen controller. That is, the temporary image 606A may have a standard color, brush type, thickness, shading, applied pressure, etc. processed by the first sub-processing device 610A and determined before the input data was received by the touchscreen controller. Similarly, the graphics processing device 604A may cause the first sub-processing device 610A to analyze the input coordinates to determine the display location of the temporary image 606A. The first sub-processing device 610A may then send the temporary image 606A to the display screen 608A. The display screen 608A may display the temporary image 606A at the display location determined by the input coordinates.
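The fast path of paragraphs [0060] and [0061] can be sketched as follows. This is a hypothetical illustration: the attribute names and the dictionary representation are assumptions for the example, and the point is only that no per-point attribute analysis happens before the temporary image is produced.

```python
# Hypothetical sketch of the fast path: the first sub-processing device
# pairs raw input coordinates with attributes fixed *before* any input
# arrives, so the temporary image can reach the display quickly.

PRESET_ATTRIBUTES = {
    "color": "gray",
    "brush": "round",
    "thickness": 2,
    "shading": "flat",
}

def render_temporary(input_coordinates, preset=PRESET_ATTRIBUTES):
    """Attach the preset attributes to each raw input coordinate.

    No analysis of applied pressure, brush settings, etc. is performed;
    that work is deferred to the processing application's slow path.
    """
    return [{"pos": point, **preset} for point in input_coordinates]
```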
[0062] In some examples, the touchscreen controller may send the input data and the input coordinates to a processing application. The processing application may then process the input data and input coordinates to produce processed data and processed coordinates. For example, the processing application may determine the different attributes provided by the input data. That is, the processing application may determine information relating to the color, brush type, thickness, shading, applied pressure, etc. when processing the input data. The processing application may then send the processed data and processed coordinates to the graphics processing device 604B. In some examples, the graphics processing device 604B may cause the second sub-processing device 612B to analyze the processed data and render a processed image 606B. That is, the second sub-processing device 612B may analyze the processed data to determine the shape of the processed image 606B.
[0063] The processed image 606B rendered by the second sub-processing device 612B may include attributes provided by the processed data. For instance, the processed image 606B may include the color, brush type, thickness, shading, applied pressure, etc. implemented by the user when creating the input data. That is, the processed image 606B may have the attributes provided by the user when creating the input data. Similarly, the graphics processing device 604B may cause the second sub-processing device 612B to analyze the processed coordinates to determine the display location of the processed image 606B. The second sub-processing device 612B may then send the processed image 606B to the display screen 608B. The display screen 608B may remove the temporary image 606A and replace the temporary image 606A with the processed image 606B at the display location determined by the processed coordinates. The processed image 606B may provide a more detailed image compared to the temporary image 606A.
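The end-to-end flow of paragraphs [0059] through [0063] can be sketched as a two-path pipeline. All names and data shapes here are illustrative assumptions; the sketch shows only the ordering: temporary pixels appear first with preset attributes, then the slow path's processed pixels replace them.

```python
# Hypothetical sketch of the two-path flow: input fans out to a fast
# path (temporary image, preset attributes) and a slow path (processing
# application derives the user's actual attributes), and the display
# finally swaps the temporary image for the processed one.

def process_input(input_data):
    """Slow path: derive attributes the user actually applied."""
    return [{"pos": point,
             "color": stroke["color"],
             "thickness": stroke["pressure"] * 2}  # illustrative rule
            for point, stroke in input_data]

def display_pipeline(input_data):
    screen = {}
    # Fast path: temporary image with preset attributes appears first.
    for point, _stroke in input_data:
        screen[point] = {"color": "gray", "thickness": 2}
    # Slow path completes: processed image replaces the temporary one.
    for pixel in process_input(input_data):
        screen[pixel["pos"]] = {"color": pixel["color"],
                                "thickness": pixel["thickness"]}
    return screen
```

In a real device the two paths run concurrently; the sketch serializes them only to make the replacement step explicit.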
[0064] For example, the processed image 606B may have a slightly different display location compared to the temporary image 606A. For instance, the processed image 606B including the attributes provided by the processed data may have a different shape compared to the shape of the temporary image 606A. The different shapes of the processed image 606B and the temporary image 606A may cause the processed image 606B to have a slightly different display location than the temporary image 606A. For example, the attributes provided by the processed data may cause the processed image 606B to appear darker and/or have a different angle, thickness, and/or color compared to the temporary image 606A. That is, the processed image 606B may have a slightly different appearance compared to the temporary image 606A, but may display the same information, as illustrated by Figures 6A and 6B.
[0065] Replacing the temporary image 606A with the processed image 606B may reduce the latency of the rendered image without added software and/or hardware. That is, the graphics processing device 604 may take less time to process and render the temporary image 606A. This may allow the temporary image 606A to appear on the display screen 608 in less time than the processed image 606B, reducing the latency of the image.
[0066] In addition, replacing the temporary image 606A with the processed image 606B may maintain the accuracy of the rendered image. That is, the computing device 600 may start the process of rendering a processed image 606B at substantially the same time the computing device 600 starts the process of rendering the temporary image 606A. The computing device 600 may maintain the accuracy of the image by rendering a processed image 606B including the attributes implemented by the user to replace the temporary image 606A.
[0067] The figures herein follow a numbering convention in which the first digit corresponds to the drawing figure number and the remaining digits identify an element or component in the drawing. Similar elements or components between different figures may be identified by the use of similar digits. For example, 108 may reference element “08” in Figure 1, and a similar element may be referenced as 308 in Figure 3.
[0068] Elements shown in the various figures herein may be capable of being added, exchanged, and/or eliminated so as to provide a number of additional examples of the disclosure. In addition, the proportion and the relative scale of the elements provided in the figures are intended to illustrate the examples of the disclosure and should not be taken in a limiting sense.
[0069] The above specification and examples provide a description of the method and applications and use of the system and method of the present disclosure. Since many examples can be made without departing from the scope of the system and method, this specification merely sets forth some of the many possible example configurations and implementations.
[0070] It should be understood that the descriptions of various examples may not be drawn to scale and thus, the descriptions may have a different size and/or configuration other than as shown therein.

Claims

What is claimed:
1. A computing device comprising: a touchscreen controller; a display screen; and a graphics processing device to: receive an input data from the touchscreen controller; render a temporary image based on the input data; send the temporary image to the display screen; receive processed data from a processing application, wherein the processing application is to process the input data; render a processed image based on the processed data; and send the processed image to the display screen to replace the temporary image.
2. The computing device of claim 1, wherein the graphics processing device comprises: a first sub-processing device to render the temporary image; and a second sub-processing device to render the processed image.
3. The computing device of claim 2, wherein the first sub-processing device is to send the temporary image to the second sub-processing device.
4. The computing device of claim 1, wherein the touchscreen controller is to determine input coordinates responsive to an input article contacting the computing device.
5. The computing device of claim 4, wherein the touchscreen controller is to send the input coordinates to the graphics processing device.
6. The computing device of claim 5, wherein the display screen is to display the temporary image on the display screen based on the input coordinates.
7. The computing device of claim 1, wherein the processed image is displayed based on an attribute provided by the processed data.
8. A non-transitory machine-readable medium storing instructions executable by a processor of a computing device to: send an input data to a graphics processing device of the computing device; render a temporary image based on the input data, wherein the temporary image includes a preset attribute; display the temporary image on a display screen of the computing device; send a processed data to the graphics processing device; render a processed image based on the processed data, wherein the processed image includes an attribute provided by the processed data; and replace the temporary image on the display screen with the processed image.
9. The non-transitory machine-readable medium of claim 8, further including instructions to compare the temporary image to the processed image before replacing the temporary image on the display screen with the processed image.
10. The non-transitory machine-readable medium of claim 8, wherein the preset attribute is different from the attribute provided by the processed data.
11. The non-transitory machine-readable medium of claim 8, further including instructions to analyze processed coordinates and display the processed image based on the processed coordinates.
12. A computing device comprising: a display screen; a touchscreen controller to send input data to a graphics processing device and a processing application; the processing application to send processed data to the graphics processing device; the graphics processing device to: render a temporary image based on the input data; send the temporary image to the display screen; render a processed image based on the processed data; compare a preset attribute of the temporary image to an attribute of the processed image to determine a difference between the temporary image and the processed image; and replace the temporary image on the display screen with the processed image based on the comparison.
13. The computing device of claim 12, wherein the touchscreen controller is to receive the input data in portions and send each portion of the input data to the graphics processing device as each portion is received.
14. The computing device of claim 12, wherein the processing application is to update the preset attribute.
15. The computing device of claim 14, wherein the preset attribute includes color, brush type, thickness, shading, amount of pressure applied, or a combination thereof.
PCT/US2020/028940 2020-04-20 2020-04-20 Image rendering WO2021216037A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2020/028940 WO2021216037A1 (en) 2020-04-20 2020-04-20 Image rendering


Publications (1)

Publication Number Publication Date
WO2021216037A1 true WO2021216037A1 (en) 2021-10-28

Family

ID=78269794


Country Status (1)

Country Link
WO (1) WO2021216037A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9772771B2 (en) * 2012-09-18 2017-09-26 Facebook, Inc. Image processing for introducing blurring effects to an image
US20190379842A1 (en) * 2018-06-07 2019-12-12 Eys3D Microelectronics, Co. Image processing device



Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 20932702; Country of ref document: EP; Kind code of ref document: A1.
NENP Non-entry into the national phase. Ref country code: DE.
122 Ep: PCT application non-entry in European phase. Ref document number: 20932702; Country of ref document: EP; Kind code of ref document: A1.