CN112882563A - Touch projection system and method thereof - Google Patents

Touch projection system and method thereof

Info

Publication number
CN112882563A
Authority
CN
China
Prior art keywords
projection
touch
touch operation
gesture
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911198989.0A
Other languages
Chinese (zh)
Inventor
许书豪
陈鸿钦
吴震亮
赖勇均
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Coretronic Corp
Original Assignee
Coretronic Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Coretronic Corp filed Critical Coretronic Corp
Priority to CN201911198989.0A
Priority to TW108148471A
Publication of CN112882563A
Legal status: Pending


Classifications

    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G03B 21/56: Projection screens
    • G06F 3/0425: Digitisers, e.g. for touch screens or touch pads, characterised by opto-electronic transducing means, using a single imaging device such as a video camera for tracking the absolute position of one or more objects with respect to an imaged reference surface, e.g. a display or a projection screen on which a computer-generated image is displayed or projected
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a touch projection system and a method thereof. The touch projection system includes a touch projection screen and a projection device. The touch projection screen is used for displaying a projection picture and detecting a touch operation applied to the touch projection screen. The projection device is electrically connected to the touch projection screen, projects the projection picture onto the touch projection screen, and determines whether the touch operation conforms to a specific gesture. In response to the touch operation conforming to the specific gesture, the projection device executes a projection control function corresponding to the specific gesture. The touch projection system and method of the invention can improve the convenience of using the touch projection system.

Description

Touch projection system and method thereof
[Technical Field]
The present invention relates to projection technology, and more particularly, to a touch projection system and a method thereof.
[Background of the Invention]
With the development of electronic technology, people often rely on electronic devices to perform a variety of tasks in daily life. A touch projection system allows a user to perform touch operations on the projection screen, and a touch operation area of the projection screen receives the touch operations performed by the user. In a projection system with a touch interaction function, the user can start a corresponding function of the projector by contacting a control option presented on the projection screen. Generally, the user needs to touch an icon at a fixed position on the projection screen to call up a control menu, the control menu is displayed at a default position, and the user then performs a touch operation on the control menu to carry out further control.
However, with a larger touch projection system, the user has to move around to reach every corner of the projection screen. The user therefore often needs to change position just to touch the icon located at the fixed position on the projection screen in order to call up the control menu or issue a command, which is inconvenient. For example, if the user stands at the right side of the projection screen while the control menu is displayed at the left side, the user has to walk over to operate the control menu, which is not user-friendly.
This background section is provided merely to aid in understanding the present disclosure; therefore, the content disclosed in this section may include technology that does not constitute prior art known to those of ordinary skill in the art. Furthermore, the content disclosed in this section does not represent content that was known or recognized by those of ordinary skill in the art before the filing of the present application, nor does it necessarily represent the problems to be solved by one or more embodiments of the invention.
[Summary of the Invention]
The invention provides a touch projection system and a method thereof, which can improve the convenience of using the touch projection system.
An embodiment of the invention provides a touch projection system. The touch projection system includes a touch projection screen and a projection device. The touch projection screen is used for displaying a projection picture and detecting a touch operation applied to the touch projection screen. The projection device is electrically connected to the touch projection screen, and is used for projecting the projection picture onto the touch projection screen and determining whether the touch operation conforms to a specific gesture. In response to the touch operation conforming to the specific gesture, the projection device executes a projection control function corresponding to the specific gesture.
An embodiment of the invention provides a touch projection method. The touch projection method is adapted to a touch projection system including a touch projection screen and a projection device, in which the projection device is electrically connected to the touch projection screen. The touch projection method includes the following steps: the projection device projects a projection picture onto the touch projection screen; the touch projection screen detects a touch operation applied to the touch projection screen; the projection device determines whether the touch operation conforms to a specific gesture; and in response to the touch operation conforming to the specific gesture, the projection device executes a projection control function corresponding to the specific gesture.
In an embodiment of the invention, the projection device determines whether the touch operation of the user conforms to the specific gesture according to the contact information detected by the touch projection screen, so as to start a corresponding function of the projection device. In the touch projection system and method of the invention, by means of the touch projection screen, the user can use a plurality of different gestures to activate a plurality of functions of the projection device, so that the user can operate the touch projection system more intuitively, simply, and flexibly, thereby improving the convenience and versatility of the touch projection system in use.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
[Description of the Drawings]
Fig. 1 is a schematic diagram of a touch projection system according to an embodiment of the invention.
Fig. 2 is a flowchart of a touch projection method according to an embodiment of the invention.
Fig. 3A to 3C are schematic diagrams of a situation in which a control menu is operated by a specific gesture according to an embodiment of the present invention.
Fig. 4A to 4C are schematic diagrams illustrating a situation in which a control menu is operated by a specific gesture during execution of an application according to an embodiment of the present invention.
Fig. 5A and 5B are schematic diagrams of a situation in which a screenshot is performed by a specific gesture according to an embodiment of the present invention.
Fig. 6A and 6B are schematic diagrams of a situation in which system mode switching is performed by a specific gesture according to an embodiment of the present invention.
Fig. 7A to 7C are schematic views illustrating a situation in which a screen brush program is executed by a specific gesture according to an embodiment of the present invention.
[Description of Reference Numerals]
10: touch projection system
110: touch projection screen
120: projection device
121: storage device
122: processor
AF1: application program screen
PI: projection picture
G1-G6: touch operations
M1: control menu
M2: sub-menu
M3, M4: notifications
M5: brush function menu
T1: brush trace
F1: finger
S201-S204: steps of the touch projection method
[Detailed Description of the Embodiments]
The foregoing and other technical and scientific aspects, features and advantages of the present invention will be apparent from the following detailed description of preferred embodiments, which is to be read in connection with the accompanying drawings. Directional terms as referred to in the following examples, for example: up, down, left, right, front or rear, etc., are simply directions with reference to the drawings. Accordingly, the directional terminology used is intended to be in the nature of words of description rather than of limitation.
Fig. 1 is a schematic diagram of a touch projection system according to an embodiment of the invention. Referring to Fig. 1, the touch projection system 10 includes a touch projection screen 110 and a projection device 120. The components and arrangement of the touch projection system 10 in Fig. 1 are described below, and its operation is described later with reference to Fig. 2.
In an embodiment, the touch screen 110 can be used to display the projection image PI and detect a touch operation applied to the touch screen 110. In other words, the touch screen 110 has a touch display surface, and can display the projection image PI according to the image beam projected by the projection device 120, and can detect the touch operation of the user on the touch display surface. The touch screen 110 includes, for example, a projected capacitive touch screen, an electromagnetic touch screen, a resistive touch screen, or other suitable touch screens, which is not limited in the present invention.
The projection device 120 is electrically connected to the touch screen 110, and the projection device 120 can project the projection image PI on the touch screen 110. In some embodiments, the projection device 120 may include a Liquid Crystal Projector (LCP), a Digital Light Processing (DLP) Projector, or a Liquid Crystal On Silicon (LCOS) projection display device, among others. In this embodiment, the projection device 120 may include a storage device 121 and a processor 122, and may also include a light source module, an optical-mechanical module, a light valve, a lens module, and related optical and circuit control components, which are not limited in the present invention.
In an embodiment, the storage device 121 may include any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, a hard disk, other similar devices, or a combination thereof, for recording a plurality of modules executable by the processor 122, and the modules may be loaded by the processor 122 to perform the touch projection method of the embodiments of the invention; the invention is not limited in this respect.
In one embodiment, the processor 122 may include a field programmable gate array (FPGA), a programmable logic device (PLD), an application specific integrated circuit (ASIC), other similar devices, or a combination thereof. The processor 122 may also include a central processing unit (CPU) or other programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), graphics processing unit (GPU), other similar devices, or a combination thereof; the invention is not limited in this respect. The processor 122 may be coupled to the storage device 121 to access and execute the module instructions recorded in the storage device 121. The processor 122 may be further coupled to the touch screen 110, so as to execute the touch projection method of the embodiments of the invention according to the touch signal fed back by the touch screen 110.
Fig. 2 is a flowchart of a touch projection method according to an embodiment of the invention, and the method flow of fig. 2 can be implemented by the touch projection system 10 of fig. 1. Referring to fig. 1 and fig. 2, the steps of the touch projection method of the present embodiment are described below with reference to the touch projection system 10 in fig. 1.
In the embodiment shown in fig. 2, in step S201, the projection device 120 projects the projection image PI onto the touch screen 110. The projection device 120 projects the image beam toward the touch screen 110, so that the projection image PI is displayed on the touch screen 110.
In this embodiment, in step S202, the touch screen 110 detects a touch operation applied to the touch screen 110. In some embodiments, the touch screen 110 may include at least one touch sensing device, which may be arranged in an array and used to detect the touch operation. For example, the touch sensing device may include, but is not limited to, a capacitive touch sensing device, a surface acoustic wave touch sensing device, an electromagnetic touch sensing device, a near-field imaging touch sensing device, and the like. When a touch sensing device is triggered, a sensing value is generated according to the touch operation, and the touch operation may be characterized by at least one parameter selected from the following: contact point position, number of contact points, contact point spacing, contact duration, and combinations thereof. The invention is not limited in this regard.
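As an illustration only (not part of the patent disclosure), the parameters listed above can be grouped into a small record; the names, the normalized coordinate convention, and the derived properties in the following Python sketch are assumptions made for clarity.
```python
# A minimal sketch of a touch-operation record, assuming screen coordinates
# normalized to [0, 1]; every name here is illustrative, not from the patent.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class TouchOperation:
    contact_points: List[Tuple[float, float]] = field(default_factory=list)  # contact point positions
    duration_s: float = 0.0                                                   # contact duration in seconds

    @property
    def contact_count(self) -> int:
        """Number of contact points."""
        return len(self.contact_points)

    @property
    def max_spacing(self) -> float:
        """Largest distance between any two contact points (0 for a single contact)."""
        pts = self.contact_points
        pairs = (
            ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            for i, (x1, y1) in enumerate(pts)
            for (x2, y2) in pts[i + 1:]
        )
        return max(pairs, default=0.0)
```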
In this embodiment, in step S203, the projection device 120 determines whether the touch operation conforms to a specific gesture. In some embodiments, the projection device 120 may determine whether the touch operation conforms to a specific gesture according to the at least one parameter of the touch operation, where the specific gesture may be a static gesture or a dynamic gesture. For example, the projection device 120 may determine whether the touch operation is a click operation or a long-press operation according to the contact duration. Alternatively, the projection device 120 may determine whether the touch operation is a single-touch operation or a multi-touch operation according to the number of contact points. Alternatively, the projection device 120 may determine whether the touch operation clicks a display object in the projection image PI according to the number of contact points and the contact point positions.
In an embodiment of the invention, the specific gesture includes, for example, two fingers pressing the touch screen 110 for more than 1 second. Correspondingly, the projection device 120 can determine, according to the number of contact points, the spacing between the contact points, and the contact duration, whether the touch operation of the user is a two-finger press lasting more than 1 second, so as to determine whether the touch operation detected by the touch screen 110 conforms to the specific gesture. In another embodiment of the invention, the specific gesture may also include five fingers touching and clicking the touch screen 110. Correspondingly, the projection device 120 can determine, according to the number of contact points and the spacing between the contact points, whether the touch operation of the user is five fingers contacting the touch screen 110, so as to determine whether the touch operation detected by the touch screen 110 conforms to the specific gesture. However, the specific gesture can be designed according to actual requirements or set by the user; the invention is not limited in this respect.
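For illustration, the two example gestures above (a two-finger press held for more than 1 second and a five-finger tap) could be checked as follows; this reuses the TouchOperation sketch introduced earlier, and the spacing and tap-duration thresholds are assumed values, since the patent gives no numeric limits for them.
```python
# Assumed thresholds (not specified by the patent).
ONE_HAND_MAX_SPACING = 0.25   # five fingertips of one hand span at most this fraction of the screen
TAP_MAX_DURATION_S = 0.3      # contacts shorter than this are treated as taps


def matches_two_finger_long_press(op: TouchOperation) -> bool:
    """Example specific gesture: two fingers pressing the screen for more than 1 second."""
    return op.contact_count == 2 and op.duration_s > 1.0


def matches_five_finger_tap(op: TouchOperation) -> bool:
    """Example specific gesture: five fingers of one hand tapping the screen together."""
    return (
        op.contact_count == 5
        and op.max_spacing <= ONE_HAND_MAX_SPACING
        and op.duration_s < TAP_MAX_DURATION_S
    )
```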
In the embodiment of the invention, in step S204, in response to the touch operation conforming to the specific gesture, the projection device 120 executes the projection control function corresponding to the specific gesture. That is, the user can control the projection device 120 to perform the corresponding projection control function through a specific gesture. In the embodiment of the invention, the projection control function corresponding to the specific gesture includes, for example, a screenshot function, a control menu display function, a system mode switching function, or a tool application execution function; the invention is not limited in this respect. Further, more than one specific gesture may be defined. For example, when the touch operation is determined to conform to a first gesture among the specific gestures, the projection device 120 may perform a first projection control function; when the touch operation is determined to conform to a second gesture among the specific gestures, the projection device 120 may perform a second projection control function different from the first projection control function.
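One common way to map several recognized gestures to different projection control functions, as step S204 requires, is a dispatch table. The sketch below is an assumption for illustration: the handler bodies are placeholders, and it reuses the matcher functions from the previous sketch.
```python
# Placeholder projection control functions; a real projector would hook these
# into its OSD, capture, power-management, and drawing subsystems.
def show_control_menu(op: TouchOperation) -> None:
    print("display control menu near", op.contact_points[0])


def take_screenshot(op: TouchOperation) -> None:
    print("capture and store the current projection image")


# Ordered (matcher, handler) pairs: the first matcher that accepts the touch
# operation selects the projection control function to execute.
GESTURE_TABLE = [
    (matches_two_finger_long_press, show_control_menu),
    (matches_five_finger_tap, take_screenshot),
]


def handle_touch(op: TouchOperation) -> bool:
    """Steps S203/S204: return True if a specific gesture was recognized and handled."""
    for matches, handler in GESTURE_TABLE:
        if matches(op):
            handler(op)
            return True
    return False  # not a specific gesture; treat as ordinary touch input
```
With such a table, supporting an additional gesture only requires registering another matcher and handler pair, and the same table can be consulted whether or not an application program is currently running.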
It should be noted that, in the embodiment of the invention, the projection device 120 determines whether to execute the projection control function according to whether the touch operation conforms to the specific gesture. Since the user can touch any position of the touch screen 110 to perform the touch operation, the user does not need to change position to click a display object disposed at a fixed position on the touch screen 110 in order to control the projection device 120 to perform a specific function. The invention can therefore greatly improve the convenience and fluency of operation when the touch projection system is in use. In addition, during execution of an application program by the projection device 120, the projection device 120 may also execute an additional projection control function in response to the touch operation conforming to the specific gesture. In other words, if the projection device 120 determines that the touch operation conforms to the specific gesture, the projection device 120 may perform an additional projection control function regardless of whether an application is being executed by the projection device 120. Specifically, when the projection device 120 executes the application program, the projection image PI displayed by the touch screen 110 is an application screen. When the touch screen 110 detects a touch operation and the projection device 120 determines that the touch operation conforms to a specific gesture, the projection device 120 may execute the projection control function corresponding to the specific gesture while displaying the application screen.
Fig. 3A to 3C are schematic diagrams of a situation in which a control menu is operated by a specific gesture according to an embodiment of the present invention. In the present embodiment, the specific gesture includes a first gesture, wherein the first gesture is two fingers simultaneously contacting the touch screen 110 for more than 1 second, but the invention is not limited thereto. Referring to Fig. 3A, when the touch screen 110 detects a touch operation G1 applied to the touch screen 110 by a user, the projection device 120 (shown in Fig. 1) determines that the touch operation G1 conforms to the first gesture according to the number of contact points of the touch operation G1 being equal to 2 and the contact duration exceeding 1 second. Referring to Fig. 3B, in response to the touch operation G1 conforming to the first gesture, the projection device 120 executes a control menu display function corresponding to the first gesture, so that a control menu M1 is displayed for further operation by the user. In the present embodiment, the control menu M1 can be displayed overlapping the projection image PI, and the control menu M1 can include a plurality of control options (e.g., the option "menu", option "C", option "APP", and option "S" shown in Fig. 3B). For example, the control options may respectively correspond to an OSD setting menu or an application menu of the projection device 120.
It should be noted that the position of the control menu M1 on the touch screen 110 corresponds to the contact point position of the touch operation G1. Specifically, the position of the control menu M1 on the projection image PI can be determined based on the contact point position of the touch operation G1, so as to ensure that the user can reach and operate the control menu M1 without moving. In other embodiments, the projection device 120 may also determine a touchable range according to the contact point position of the touch operation G1, where the touchable range may be defined as a range within a predetermined distance from the contact point position of the touch operation G1, and the control menu M1 can be displayed within the touchable range to ensure that the display position of the control menu M1 can be easily reached by the user.
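A simple way to realize the placement rule described above, keeping the menu next to the contact point while staying entirely inside the projection image, is sketched below; the offset and the touchable-range value are assumptions, not values from the patent.
```python
def place_menu(contact_xy, menu_size, screen_size=(1.0, 1.0), reach=0.15):
    """Return a top-left corner for menu M1 that is near the touch point and on screen.

    `reach` is an assumed touchable range: the menu corner stays within this
    fraction of the screen size from the contact point.
    """
    cx, cy = contact_xy
    mw, mh = menu_size
    sw, sh = screen_size
    # Offset slightly from the fingertip, but no farther than the touchable range.
    x = cx + min(0.02 * sw, reach * sw)
    y = cy + min(0.02 * sh, reach * sh)
    # Clamp so the whole menu remains inside the projection image.
    x = max(0.0, min(x, sw - mw))
    y = max(0.0, min(y, sh - mh))
    return x, y


# Example: a touch near the right edge still yields an on-screen menu position.
print(place_menu((0.95, 0.5), menu_size=(0.2, 0.3)))  # -> (0.8, 0.52)
```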
Referring to Fig. 3C, after the control menu M1 is displayed, the touch screen 110 can detect another touch operation G2 applied within the range of the control menu M1. In response to the touch screen 110 detecting the another touch operation G2 within the range of the control menu M1, the projection device 120 can adjust the setting parameters of the projection device 120. For example, the setting parameters may include, but are not limited to, volume, projection brightness, projection resolution, and keystone correction parameters. In the present embodiment, the another touch operation G2 can be the user clicking the option "menu" within the range of the control menu M1, in which case the touch screen 110 can display another sub-menu M2. In other embodiments, the touch operation G2 may be the user clicking the option "S" within the control menu M1, in which case the user may set or adjust the relevant setting parameters of the projection device 120. In another embodiment, the control menu M1 may also directly provide a parameter setting interface; for example, the control menu M1 may include options for volume, projection brightness, projection resolution, and keystone correction, so as to provide more convenient parameter setting for the user. Referring to Fig. 3A to 3C, the user can call up the control menu M1 with a specific gesture without moving, and further perform projector settings or operations by means of the control menu M1.
Fig. 4A to 4C are schematic diagrams illustrating a situation in which a control menu is operated by a specific gesture during execution of an application according to an embodiment of the present invention. In this embodiment, the projection device 120 executes a projection control function corresponding to a specific gesture while an application program is being executed. For convenience of illustration, the specific gesture is similar to that of the embodiment of Fig. 3A to 3C, for example, a first gesture in which two fingers simultaneously press the touch screen 110 for more than 1 second, but the invention is not limited thereto.
Referring to Fig. 4A, while the application program is being executed by the projection device 120, the touch screen 110 displays an application screen AF1, and in this embodiment, the application screen AF1 can be superimposed on the projection image PI. The way the user controls or operates the touch projection system through the touch operation G3 is similar to the embodiment shown in Fig. 3A to 3C and is not repeated herein. Referring to Fig. 4B, the control menu M1 is overlaid on the application screen AF1. In other words, during execution of the application program by the projection device 120, the user can call up the control menu M1 with the specific gesture without closing the application program. Similarly, the position of the control menu M1 on the application screen AF1 can be determined based on the contact point position of the touch operation G3, so as to ensure that the user can click and operate the control menu M1 nearby.
Fig. 5A and 5B are schematic diagrams of a situation in which a screenshot is performed by a specific gesture according to an embodiment of the present invention. In the present embodiment, the specific gesture includes a second gesture, wherein the second gesture is five fingers simultaneously tapping the touch screen 110, but the invention is not limited thereto. Referring to Fig. 5A, when the touch screen 110 detects a touch operation G4 applied to the touch screen 110 by a user, the projection device 120 (shown in Fig. 1) determines that the touch operation G4 conforms to the second gesture according to the number of contact points of the touch operation G4 being equal to 5 and the spacing between the contact points of the touch operation G4 satisfying the one-hand touch condition. Referring to Fig. 5B, in response to the touch operation G4 conforming to the second gesture, the projection device 120 executes a corresponding screenshot function to capture and store the currently displayed projection image PI. In this embodiment, the projection device 120 can store the captured projection image PI in the storage device 121 or another external storage device. After completing the screenshot, the projection device 120 may display a notification M3 to inform the user that the screenshot function was performed. For example, the notification M3 may include content such as "picture captured" or "captured picture stored". In another embodiment, the projection image PI may flash one or more times when the projection image PI is captured, to notify the user that the screenshot function was performed. Similarly, in an embodiment, during execution of an application program by the projection device 120, if the touch screen 110 detects the touch operation G4 and the projection device 120 determines that the touch operation G4 conforms to the second gesture, the projection device 120 may also capture and store the projection image PI including the application screen.
Fig. 6A and 6B are schematic diagrams illustrating system mode switching by a specific gesture according to an embodiment of the present invention. In the present embodiment, the specific gesture includes a third gesture, wherein the third gesture is two consecutive taps of two fingers on the touch screen 110, but the invention is not limited thereto. Referring to Fig. 6A, when the touch screen 110 detects a touch operation G5 applied to the touch screen 110 by a user, the projection device 120 (shown in Fig. 1) determines that the touch operation G5 conforms to the third gesture according to the number of contact points of the touch operation G5 being equal to 2 and the interval between contacts of the touch operation G5 satisfying the consecutive-tapping condition. Referring to Fig. 6B, in response to the touch operation G5 conforming to the third gesture, the projection device 120 performs system mode switching to enter a system sleep mode. In this embodiment, the projection device 120 may display a sleep confirmation notification M4 to allow the user to confirm again whether to enter the system sleep mode. For example, the sleep confirmation notification M4 may include a plurality of control options (e.g., option "Y" and option "N" shown in Fig. 6B), and when the user clicks "Y", the system enters the system sleep mode. In another embodiment, in response to the touch operation G5 conforming to the third gesture, the projection device 120 may directly enter the system sleep mode.
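The third gesture is dynamic: two two-finger taps must follow each other quickly. A small stateful detector, sketched below with an assumed maximum interval between taps and reusing the TouchOperation record and tap threshold from the earlier sketches, shows one way the consecutive-tapping condition could be checked.
```python
DOUBLE_TAP_MAX_INTERVAL_S = 0.5  # assumed upper bound between the two taps


class TwoFingerDoubleTapDetector:
    """Reports True when two quick two-finger taps arrive in succession."""

    def __init__(self):
        self._last_tap_time = None  # time of the previous two-finger tap, if any

    def feed(self, op: TouchOperation, now_s: float) -> bool:
        is_two_finger_tap = op.contact_count == 2 and op.duration_s < TAP_MAX_DURATION_S
        if not is_two_finger_tap:
            self._last_tap_time = None
            return False
        if self._last_tap_time is not None and now_s - self._last_tap_time <= DOUBLE_TAP_MAX_INTERVAL_S:
            self._last_tap_time = None
            return True  # gesture recognized: show notification M4 or enter sleep mode
        self._last_tap_time = now_s
        return False
```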
Fig. 7A to 7C are schematic views illustrating a situation in which a screen brush program is executed by a specific gesture according to an embodiment of the present invention. In the present embodiment, the specific gesture includes a fourth gesture, wherein the fourth gesture is a long press of the touch screen 110 for more than 5 seconds by a single finger, but the invention is not limited thereto. Referring to Fig. 7A, when the touch screen 110 detects a touch operation G6 applied to the touch screen 110 by a user, the projection device 120 (shown in Fig. 1) determines that the touch operation G6 conforms to the fourth gesture according to the number of contact points of the touch operation G6 being equal to 1 and the contact duration of the touch operation G6 exceeding 5 seconds. Referring to Fig. 7B, in response to the touch operation G6 conforming to the fourth gesture, the projection device 120 executes a screen brush program to display a brush function menu M5 on the touch screen 110. In this embodiment, the brush function menu M5 may be displayed overlapping the projection image PI, and the brush function menu M5 may include a plurality of drawing options (e.g., the option "color" and the option "thickness" shown in Fig. 7B). In other embodiments, the brush function menu M5 may also include a palette option. In this embodiment, the position of the brush function menu M5 on the touch screen 110 corresponds to the contact point position of the touch operation G6. The user can then set the color of the line drawn by the brush through the brush function menu M5. Referring to Fig. 7C, according to the detected movement track of the user's finger F1, the projection device 120 can display a corresponding brush trace T1 on the projection image PI.
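As a sketch of the brush behavior only (the patent does not describe the drawing implementation), the movement track of finger F1 can be accumulated into a polyline that the projector overlays on the projection image PI as trace T1; the class and attribute names below are assumptions.
```python
class BrushTrace:
    """Accumulates finger positions and exposes the line segments to draw."""

    def __init__(self, color="red", thickness=3):
        self.color = color          # set through the "color" option of menu M5
        self.thickness = thickness  # set through the "thickness" option of menu M5
        self.points = []

    def on_finger_move(self, xy):
        """Called for each detected position of finger F1 on the touch screen."""
        self.points.append(xy)

    def segments(self):
        """Consecutive point pairs forming trace T1, to be overlaid on image PI."""
        return list(zip(self.points, self.points[1:]))
```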
In the above embodiments, when the touch screen 110 detects a touch operation applied to the touch screen 110 by a user and the projection device 120 (shown in Fig. 1) determines that the touch operation does not conform to any specific gesture, the projection device 120 may determine the specific gesture most similar to the touch operation according to the parameters of the touch operation, for example, determine that the touch operation is most similar to the fourth gesture. In this case, the projection device 120 may execute the screen brush program corresponding to the fourth gesture. Alternatively, in other cases, the projection device 120 may display a confirmation menu on the touch screen 110 to confirm whether the touch operation is intended to correspond to the fourth gesture.
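The most-similar-gesture fallback described above can be illustrated with a simple distance over the touch parameters; the prototype table, weights, and example below are assumptions rather than details taken from the patent, and the TouchOperation record is the earlier sketch.
```python
# Prototype parameters per gesture: (expected contact count, expected duration in seconds).
GESTURE_PROTOTYPES = {
    "first (two-finger long press)": (2, 1.0),
    "second (five-finger tap)": (5, 0.2),
    "fourth (one-finger long press)": (1, 5.0),
}


def most_similar_gesture(op: TouchOperation) -> str:
    """Return the name of the prototype gesture whose parameters are closest to the operation."""
    def distance(proto):
        count, duration = proto
        return abs(op.contact_count - count) + 0.5 * abs(op.duration_s - duration)

    return min(GESTURE_PROTOTYPES, key=lambda name: distance(GESTURE_PROTOTYPES[name]))


# Example: a one-finger press held for 6 seconds is closest to the fourth gesture.
print(most_similar_gesture(TouchOperation([(0.5, 0.5)], 6.0)))
```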
In summary, in the embodiments of the invention, the projection device determines whether the touch operation of the user conforms to a specific gesture according to the contact information detected by the touch projection screen, so as to start the corresponding function of the projection device. In this way, since the user can touch any position of the touch projection screen to perform the touch operation, the user does not need to change position to click a display object disposed at a fixed position on the touch projection screen in order to control the projection device to perform a specific function, and the projection device can perform additional projection control functions regardless of whether an application program is being executed. Therefore, in the touch projection system and method of the invention, by means of the touch projection screen, the user can use a plurality of different gestures to activate a plurality of functions of the projection device, so that the user can operate the touch projection system more intuitively, simply, and flexibly, thereby improving the convenience and versatility of the touch projection system in use.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.
The above description covers only preferred embodiments of the present invention and should not be taken as limiting the scope of the invention; all equivalent changes and modifications made in accordance with the claims and the specification of the present invention still fall within the scope of the invention. Moreover, any embodiment or claim of the invention need not achieve all of the objects, advantages, or features disclosed herein. In addition, the abstract and the title are provided only to assist patent document searching and are not intended to limit the scope of the invention. Furthermore, the terms "first", "second", and the like used herein or in the appended claims are used merely to name elements or to distinguish between different embodiments or ranges, and are not used to limit the upper or lower bound of the number of elements.

Claims (17)

1. A touch projection system, the touch projection system comprising:
the touch control projection screen is used for displaying a projection picture and detecting touch control operation applied to the touch control projection screen; and
the projection device is electrically connected to the touch projection screen, projects the projection picture onto the touch projection screen, determines whether the touch operation conforms to a specific gesture, and, in response to the touch operation conforming to the specific gesture, executes a projection control function corresponding to the specific gesture.
2. The touch projection system of claim 1, wherein the touch operation includes at least one parameter selected from: a contact point position, a number of contact points, a contact point spacing, a contact duration, and combinations thereof.
3. The touch projection system of claim 2, wherein the projection device determines whether the touch operation matches the specific gesture according to the parameter of the touch operation.
4. The touch projection system of claim 1, wherein the projection device executes the projection control function corresponding to the specific gesture during execution of an application, the projection picture includes an application screen of the application, and the touch operation is applied to the touch projection screen when the application screen is displayed on the touch projection screen.
5. The touch projection system of claim 2, wherein the specific gesture comprises a first gesture, and in response to the touch operation conforming to the first gesture, the projection device executes a control menu display function corresponding to the first gesture to display the projection picture and a control menu on the touch projection screen, wherein a position of the control menu on the touch projection screen corresponds to the contact point position of the touch operation.
6. The touch projection system of claim 5, wherein the touch projection screen detects another touch operation applied within the range of the control menu, and the projection device adjusts a setting parameter of the projection device in response to the touch projection screen detecting the another touch operation within the range of the control menu.
7. The touch projection system of claim 1, wherein the specific gesture comprises a second gesture, and in response to the touch operation conforming to the second gesture, the projection device executes a screenshot function corresponding to the second gesture to capture and store the projection picture.
8. The touch projection system of claim 1, wherein the specific gesture comprises a third gesture, and the projection device enters a system sleep mode in response to the touch operation conforming to the third gesture.
9. The touch projection system of claim 2, wherein the specific gesture includes a fourth gesture, and in response to the touch operation conforming to the fourth gesture, the projection device executes a screen brush program corresponding to the fourth gesture to display the projection picture and a brush function menu on the touch projection screen, wherein a position of the brush function menu on the touch projection screen corresponds to the contact point position of the touch operation.
10. A touch operation method is suitable for a touch projection system comprising a touch projection screen and a projection device, wherein the projection device is electrically connected with the touch projection screen, and the touch operation method comprises the following steps:
the projection device projects a projection picture onto the touch projection screen;
the touch projection screen detects a touch operation applied to the touch projection screen;
the projection device judges whether the touch operation conforms to a specific gesture; and
in response to the touch operation conforming to the specific gesture, the projection device executes a projection control function corresponding to the specific gesture.
11. The touch operation method according to claim 10, wherein the touch operation includes at least one parameter selected from: a contact point position, a number of contact points, a contact point spacing, a contact duration, and combinations thereof, and the step of the projection device determining whether the touch operation conforms to the specific gesture comprises:
determining whether the touch operation conforms to the specific gesture according to the contact point position, the number of contact points, the contact point spacing, the contact duration, and/or a combination thereof of the touch operation.
12. The touch operation method according to claim 10, wherein in response to the touch operation conforming to the specific gesture, the step of the projection device performing the projection control function corresponding to the specific gesture includes:
executing the projection control function corresponding to the specific gesture during execution of an application program by the projection device, wherein the projection picture includes an application screen of the application program, and the touch operation is applied to the touch projection screen when the touch projection screen displays the application screen.
13. The touch operation method according to claim 11, wherein the specific gesture includes a first gesture, and in response to the touch operation conforming to the specific gesture, the step of the projection device performing the projection control function corresponding to the specific gesture includes:
in response to the touch operation conforming to the first gesture, the projection device executes a control menu display function corresponding to the first gesture to display the projection picture and a control menu on the touch projection screen, wherein a position of the control menu on the touch projection screen corresponds to the contact point position of the touch operation.
14. The touch operation method according to claim 13, further comprising:
the touch projection screen detects another touch operation applied in the control menu range, and in response to the touch projection screen detecting the another touch operation in the control menu range, the projection device adjusts the setting parameters of the projection device.
15. The touch operation method according to claim 10, wherein the specific gesture includes a second gesture, and in response to the touch operation conforming to the specific gesture, the step of the projection device performing the projection control function corresponding to the specific gesture includes:
in response to the touch operation conforming to the second gesture, the projection device executes a screenshot function corresponding to the second gesture to capture and store the projection picture.
16. The touch operation method according to claim 10, wherein the specific gesture includes a third gesture, and in response to the touch operation conforming to the specific gesture, the step of the projection device performing the projection control function corresponding to the specific gesture includes:
in response to the touch operation conforming to the third gesture, the projection device enters a system sleep mode.
17. The touch operation method according to claim 11, wherein the specific gesture includes a fourth gesture, and in response to the touch operation conforming to the specific gesture, the step of the projection device performing the projection control function corresponding to the specific gesture includes:
in response to the touch operation conforming to the fourth gesture, the projection device executes a screen brush program corresponding to the fourth gesture to display the projection picture and a brush function menu on the touch projection screen, wherein a position of the brush function menu on the touch projection screen corresponds to the contact point position of the touch operation.
CN201911198989.0A 2019-11-29 2019-11-29 Touch projection system and method thereof Pending CN112882563A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911198989.0A CN112882563A (en) 2019-11-29 2019-11-29 Touch projection system and method thereof
TW108148471A TW202121146A (en) 2019-11-29 2019-12-30 Touch control projection system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911198989.0A CN112882563A (en) 2019-11-29 2019-11-29 Touch projection system and method thereof

Publications (1)

Publication Number Publication Date
CN112882563A true CN112882563A (en) 2021-06-01

Family

ID=76038415

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911198989.0A Pending CN112882563A (en) 2019-11-29 2019-11-29 Touch projection system and method thereof

Country Status (2)

Country Link
CN (1) CN112882563A (en)
TW (1) TW202121146A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201009650A (en) * 2008-08-28 2010-03-01 Acer Inc Gesture guide system and method for controlling computer system by gesture
TW201122842A (en) * 2009-12-22 2011-07-01 Foxconn Comm Technology Corp System and method for operating a powerpoint file
CN105183143A (en) * 2014-06-13 2015-12-23 洪水和 Gesture Identification System In Tablet Projector And Gesture Identification Method Thereof
CN105260022A (en) * 2015-10-15 2016-01-20 广东欧珀移动通信有限公司 Gesture-based screen capture control method and apparatus
US20160216771A1 (en) * 2015-01-26 2016-07-28 National Tsing Hua University Image projecting device having wireless controller and image projecting method thereof
WO2017041425A1 (en) * 2015-09-08 2017-03-16 京东方科技集团股份有限公司 Projector screen, and touch control screen projection display method and system
CN107357422A (en) * 2017-06-28 2017-11-17 深圳先进技术研究院 Video camera projection interaction touch control method, device and computer-readable recording medium
CN107493495A (en) * 2017-08-14 2017-12-19 深圳市国华识别科技开发有限公司 Interaction locations determine method, system, storage medium and intelligent terminal
CN109144598A (en) * 2017-06-19 2019-01-04 天津锋时互动科技有限公司深圳分公司 Electronics mask man-machine interaction method and system based on gesture
CN110413184A (en) * 2019-06-20 2019-11-05 视联动力信息技术股份有限公司 A kind of method for controlling projection and device

Also Published As

Publication number Publication date
TW202121146A (en) 2021-06-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination