WO2023029822A1 - Display device and intelligent touch method thereof

Display device and intelligent touch method thereof

Info

Publication number
WO2023029822A1
Authority
WO
WIPO (PCT)
Prior art keywords
control window
touch
display
content
user
Prior art date
Application number
PCT/CN2022/108148
Other languages
English (en)
French (fr)
Inventor
王子锋
揭育顺
Original Assignee
京东方科技集团股份有限公司
Application filed by 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Publication of WO2023029822A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros

Definitions

  • the present disclosure relates to the technical field of human-computer interaction, and in particular to a display device and an intelligent touch method thereof.
  • As the most important interface of human-computer interaction, the display screen is applied to more and more aspects of life. With the development of technology, the display screen itself has gained many interactive functions; for example, users can interact with the display screen through voice control, touch control and so on, where touch methods include infrared, capacitive and other technologies. As the functions of display screens become more comprehensive and the application scenarios more varied, the sizes available in different scenarios are also more diverse.
  • In one existing approach, a small window is opened through an operation menu, and the user performs touch operations in the small window, which are synchronized to the display screen. However, multiple selections must be made through the operation menu before switching to the small window, and the operation is performed in a small window at a fixed position, so the efficiency of human-computer interaction is low. The other approach is to connect another device to the display device for online touch so that the two operate synchronously, but this requires additional devices, is not convenient for direct human-computer interaction, and the operation is cumbersome and the experience relatively poor.
  • In view of this, the present disclosure provides a display device and an intelligent touch method thereof, which are used to reduce the steps of calling up a small window for touch control through a menu, provide more intelligent and convenient human-computer interaction, and improve the efficiency of human-computer interaction.
  • an embodiment of the present disclosure provides a display device, including a display screen, a touch component, and a controller, wherein:
  • the display screen is used to display content
  • the touch component is used to receive a touch signal input by a user
  • the controller is configured to execute:
  • triggering, according to a detected user position, the display of the control window on the display screen of the display device, and controlling the control window to be in a first state, where the first state represents a transition state in which it is judged, according to a received touch signal input by the user, whether to switch to a second state;
  • controlling the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • determining that the control window is in the second state, and controlling the display screen through the control window.
  • The display device provided in this embodiment can predict the user's next behavior according to the user's behavior, and gradually determines whether the user needs to use the control window according to the user's position and the touch position of the touch signal, thereby providing an imperceptible way of opening the control window and improving user interaction efficiency.
  • the processor is specifically configured to execute:
  • the distance between the user and the display device is determined according to the detected user position; if the distance is less than a distance threshold, the display of the control window is triggered on the display screen of the display device.
  • the processor is specifically configured to execute:
  • the abscissa of the center of the control window is determined according to the abscissa of the user position, the ordinate of the center is determined according to a preset ordinate, and the display position of the control window on the display screen is determined according to the abscissa and the ordinate of the center of the control window.
  • the processor is specifically configured to execute:
  • if the touch position of the touch signal input by the user is located within the area where the control window is located, the control window is controlled to be in the second state.
  • the processor is specifically configured to execute:
  • the processor is specifically further configured to execute:
  • if the touch position is located in the area where the control window is located, the display position corresponding to the touch position on the display screen is determined; if the touch content generated within a first range centered on the touch position matches the display content displayed within a second range centered on the display position, the control window is controlled to be in the second state; or,
  • if the touch position is located in the area where the control window is located, the display content displayed on the display screen within a third range centered on the touch position is obtained; if the touch content generated within the first range centered on the touch position matches that display content, the control window is closed, and the content displayed at the display position corresponding to the touch position is deleted.
  • the processor is specifically configured to determine that the touch content matches the display content in the following manner:
  • it is determined, according to the size and data type of the touch content and the display content, that the touch content matches the display content; or,
  • the touch content matches the display content.
  • the processor is specifically configured to perform:
  • after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, it is determined, according to the sizes of the touch content and the display content, whether the touch content matches the display content.
  • the display of the control window is triggered on the display screen of the display device according to the detected user position, and the processor is specifically further configured to execute:
  • At least part of the content on the display screen is mapped to the control window for display.
  • the processor is specifically further configured to execute:
  • the transparency of the content mapped in the control window decreases step by step.
  • the processor is specifically configured to execute:
  • if the touch position is located in the area where the control window is located and the touch content matches the display content, the mapped content is displayed in a non-transparent manner.
  • the processor is specifically further configured to execute:
  • if no operation on the control window is detected for more than a preset period of time, the control window is closed; or, if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, the control window is closed.
  • the processor is specifically configured to determine the user location in the following manner:
  • Scanning is performed using the digital radar array of the display device, and the location of the user is determined according to the scanning results.
  • the processor is specifically further configured to:
  • if the control window is in the first state, content is displayed through the control window, and the touch signal is received through the display screen;
  • if the control window is in the second state, a touch signal is received through the control window, and content is displayed through the area other than the control window.
  • an intelligent touch method provided by an embodiment of the present disclosure includes:
  • triggering, according to a detected user position, the display of the control window on the display screen of the display device, and controlling the control window to be in a first state, where the first state represents a transition state in which it is judged, according to a received touch signal input by the user, whether to switch to a second state;
  • controlling the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • determining that the control window is in the second state, and controlling the display screen through the control window.
  • the triggering display of the control window on the display screen of the display device according to the detected user position includes:
  • the display of the control window is triggered on the display screen of the display device.
  • the triggering display of the control window on the display screen of the display device includes:
  • the display position of the control window on the display screen is determined according to the abscissa of the center of the control window.
  • controlling the control window to be in the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window includes:
  • the control window is controlled to be in the second state.
  • Optionally, controlling the control window to be in the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window, and the touch content generated within a first range centered on the touch position, includes:
  • if the touch position is located in the area where the control window is located, the display position corresponding to the touch position on the display screen is determined; if the touch content generated within the first range centered on the touch position matches the display content displayed within a second range centered on the display position, the control window is controlled to be in the second state; or,
  • if the touch position is located in the area where the control window is located, the display content displayed on the display screen within a third range centered on the touch position is obtained; if the touch content generated within the first range centered on the touch position matches that display content, the control window is closed, and the content displayed at the display position corresponding to the touch position is deleted.
  • content matching is determined in the following manner:
  • the size and data type of the touch content and the display content determine that the touch content matches the display content; or,
  • the touch content matches the display content.
  • content matching is determined in the following manner:
  • after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, it is determined, according to the sizes of the touch content and the display content, whether the touch content matches the display content.
  • the triggering the display of the control window on the display screen of the display device according to the detected user position also includes:
  • At least part of the content on the display screen is mapped to the control window for display.
  • the zooming and mapping at least part of the content on the display screen to the control window for display further includes:
  • the transparency of the content mapped in the control window decreases step by step.
  • controlling the transparency of the content mapped in the control window is gradually reduced, including:
  • the touch position is located in the area where the control window is located, and the touch content matches the display content, then display the mapped content in a non-transparent manner.
  • Optionally, after the display of the control window is triggered on the display screen of the display device, the method further includes:
  • the control window is closed.
  • the user location is determined in the following manner:
  • Scanning is performed using the digital radar array of the display device, and the location of the user is determined according to the scanning results.
  • it also includes:
  • if the control window is in the first state, content is displayed through the control window, and the touch signal is received through the display screen;
  • if the control window is in the second state, a touch signal is received through the control window, and content is displayed through the area other than the control window.
  • the embodiment of the present disclosure also provides a smart touch device, including:
  • the display unit is configured to trigger the display of the control window on the display screen of the display device according to the detected user position, and to control the control window to be in a first state, where the first state represents a transition state in which it is judged, according to the received touch signal input by the user, whether to switch to a second state;
  • a conversion unit configured to control the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • a control unit configured to determine that the control window is in the second state, and control the display screen through the control window.
  • the display unit is specifically used for:
  • the display of the control window is triggered on the display screen of the display device.
  • the display unit is specifically used for:
  • the display position of the control window on the display screen is determined according to the abscissa of the center of the control window.
  • the conversion unit is specifically used for:
  • the control window is controlled to be in the second state.
  • the conversion unit is specifically used for:
  • the conversion unit is specifically used for:
  • if the touch position is located in the area where the control window is located, the display position corresponding to the touch position on the display screen is determined; if the touch content generated within a first range centered on the touch position matches the display content displayed within a second range centered on the display position, the control window is controlled to be in the second state; or,
  • if the touch position is located in the area where the control window is located, the display content displayed on the display screen within a third range centered on the touch position is obtained; if the touch content generated within the first range centered on the touch position matches that display content, the control window is closed, and the content displayed at the display position corresponding to the touch position is deleted.
  • the conversion unit is specifically used for:
  • the size and data type of the touch content and the display content determine that the touch content matches the display content; or,
  • the touch content matches the display content.
  • the conversion unit is specifically used for:
  • after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, it is determined, according to the sizes of the touch content and the display content, whether the touch content matches the display content.
  • the display unit is specifically further used for:
  • At least part of the content on the display screen is mapped to the control window for display.
  • the display unit is specifically further used for:
  • the transparency of the content mapped in the control window decreases step by step.
  • the display unit is specifically further used for:
  • the touch position is located in the area where the control window is located, and the touch content matches the display content, then display the mapped content in a non-transparent manner.
  • a closing unit is further included:
  • the control window is closed.
  • the display unit is specifically configured to determine the user location in the following manner:
  • Scanning is performed using the digital radar array of the display device, and the location of the user is determined according to the scanning results.
  • the judging unit is also specifically configured to:
  • if the control window is in the first state, content is displayed through the control window, and the touch signal is received through the display screen;
  • if the control window is in the second state, a touch signal is received through the control window, and content is displayed through the area other than the control window.
  • an embodiment of the present disclosure further provides a computer storage medium, on which a computer program is stored, and when the program is executed by a processor, the steps of the method described in the above-mentioned first aspect are implemented.
  • FIG. 1 is a schematic diagram of an existing touch control method provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a display device provided by an embodiment of the present disclosure
  • FIG. 3A is a schematic diagram of displaying a first control window provided by an embodiment of the present disclosure
  • FIG. 3B is a schematic diagram of displaying a second control window provided by an embodiment of the present disclosure.
  • FIG. 4A is a schematic diagram of detecting a user location provided by an embodiment of the present disclosure.
  • FIG. 4B is a schematic diagram of detecting a user location provided by an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of a display position of a control window provided by an embodiment of the present disclosure
  • FIG. 6A is a schematic diagram of a first control window display provided by an embodiment of the present disclosure.
  • FIG. 6B is a schematic diagram of a second control window display provided by an embodiment of the present disclosure.
  • FIG. 6C is a schematic diagram of a third control window display provided by an embodiment of the present disclosure.
  • FIG. 7A is a schematic diagram of the first type of touch content matching provided by an embodiment of the present disclosure.
  • FIG. 7B is a schematic diagram of a second touch content matching provided by an embodiment of the present disclosure.
  • FIG. 7C is a schematic diagram of a third touch content matching provided by an embodiment of the present disclosure.
  • FIG. 7D is a schematic diagram of a fourth touch content matching provided by an embodiment of the present disclosure.
  • FIG. 8A is a schematic diagram of a fifth touch content matching provided by an embodiment of the present disclosure.
  • FIG. 8B is a schematic diagram of a sixth touch content matching provided by an embodiment of the present disclosure.
  • FIG. 8C is a schematic diagram of a seventh touch content matching provided by an embodiment of the present disclosure.
  • FIG. 8D is a schematic diagram of an eighth touch content matching provided by an embodiment of the present disclosure.
  • FIG. 9 is a detailed flow chart of implementing a smart touch method provided by an embodiment of the present disclosure.
  • FIG. 10 is a flow chart of implementing a smart touch method provided by an embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram of a smart touch device provided by an embodiment of the present disclosure.
  • As the most important interface of human-computer interaction, the display screen is applied to more and more aspects of life. With the development of technology, the display screen itself has gained many interactive functions; for example, users can interact with the display screen through voice control, touch control and so on, where touch methods include infrared, capacitive and other technologies. As the functions of display screens become more comprehensive and the application scenarios more varied, the sizes available in different scenarios are also more diverse. For current large-size display screens, such as conference whiteboards exceeding 86 inches, it is not convenient for users to perform full-screen touch operations because of the large screen size. The currently provided touch control methods are shown in Figure 1.
  • In one method, the small window 101 is opened through the operation menu 100 on the display device, and the user performs touch operations in the small window 101, which are synchronized to the display device; multiple selections must be made through the operation menu before switching to the small window, the position of the small window 101 is fixed, and the touch operation is performed in that fixed small window, so the efficiency of human-computer interaction is low. In the other method, another device, such as a terminal or tablet, is connected to the display device for online touch so that the two operate synchronously, but this requires additional devices, is not convenient for direct human-computer interaction, and the operation is cumbersome and the experience poor.
  • In view of this, this embodiment provides, for large-size display devices, a method that automatically judges whether it is necessary to use the small window to receive touch signals and control the display screen and performs the corresponding operations, together with a display device implementing it.
  • the display devices provided in this embodiment include but are not limited to display devices such as large-size display screens, large-size smart tablets, and large-size electronic whiteboards.
  • the display device in this embodiment can also be equipped with a camera component for taking pictures of the user, and can also be equipped with a digital array radar for scanning the user and confirming the user's location.
  • the camera component and/or the digital array radar may be integrated on the display device, or may be communicated with the display device as a separate component, which is not limited in this embodiment.
  • The core idea of the smart touch method provided in this embodiment is to use detection of the user to judge whether the user is about to perform a touch operation, and to further determine the user's behavior according to the received touch signal, for example, whether the touch operation input by the user is performed on the control window to control the corresponding display on the display screen, or is performed directly on the display screen, so as to accurately distinguish whether the user needs to use the control window to control the display screen. In this way the user does not need to do anything extra: the display device automatically confirms the user's behavior, and when it is determined that the user is interacting with the display device through the control window, it provides the user with a control window configured with a touch function, so that the user can control the display screen through the control window.
  • It should be noted that the control window in this embodiment and the small window mentioned above are the same concept; both refer to a window used to receive touch signals and control the display screen.
  • The default control mode is set to receive touch signals through the display screen of the display device and to display accordingly; in this mode the display screen receives touch signals and the corresponding content is displayed in response to them. When the control mode is switched to the control window mode, although the display screen can still receive touch signals, touch signals are mainly received through the control window, and the control window and the display screen display synchronously.
  • the control window is located on the display screen at a position corresponding to the user, and the size of the control window is smaller than the size of the display screen, and can be set to a size convenient for the user to control the full screen without moving.
  • a display device provided in this embodiment includes a display screen 200, a touch component 201, and a controller 202, wherein:
  • the display screen 200 is used to display content
  • the touch component 201 is used to receive a touch signal input by a user
  • the controller 202 is configured to perform:
  • According to a detected user position, the display of the control window is triggered on the display screen 200 of the display device, and the control window is controlled to be in a first state, where the first state represents a transition state in which it is judged whether to switch to a second state; it should be noted that the first state of the control window is used to indicate that the control window has a display function but does not yet have a control function.
  • the area of the control window in this embodiment is shown in FIG. 3A and FIG. 3B .
  • the shape, size, and lines of the area of the control window are not limited too much.
  • controlling the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • it is determined that the control window is in the second state, and the display screen 200 is controlled through the control window.
  • the second state in this embodiment is used to indicate that the control window has a control function and a display function, and is used to realize the control display of the full screen.
  • If the control window is in the first state, content is displayed through the control window, and the touch signal is received through the display screen; at this time the device is still in the full-screen touch state, the full screen receives touch signals, and the control window only has the display function. If the control window is in the second state, the touch signal is received through the control window, and content is displayed through the area other than the control window; at this time the control window has a control function, can receive touch signals, and content is displayed simultaneously in the control window and in the area other than the control window.
  • If the control window is in the first state, content is displayed through the control window; in this state, the control window is only used to display content, including the content mapped after full-screen scaling and the content written by the user in the control window, while the touch signal for control display is received through the display screen.
  • If the control window is in the second state, content is displayed through the control window and a touch signal is received through it; in this state, the control window can not only display content but also receive touch signals input by the user and perform full-screen control display on the display screen.
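  • To make the two states concrete, the following is a minimal illustrative sketch (not taken from the disclosure; all names such as WindowState and route_touch are hypothetical) of how touch routing could depend on the state of the control window:

```python
from enum import Enum, auto

class WindowState(Enum):
    FIRST = auto()   # display-only transition state: window only shows mapped content
    SECOND = auto()  # control state: window receives touch and drives the full screen

class ControlWindow:
    def __init__(self):
        self.state = WindowState.FIRST

    def route_touch(self):
        """Decide which surface handles an incoming touch signal."""
        if self.state == WindowState.FIRST:
            # Full screen still receives touch; the window only displays mapped content.
            return "display_screen"
        # In the second state the window receives touch and the display mirrors it.
        return "control_window"
```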
  • The display device in this embodiment can detect the user's position and judge, through the user's position, the actual distance between the user and the display screen, thereby predicting the user's behavior and judging whether the user is about to perform a touch operation on the display screen. Therefore, when it is determined, according to the user's position, that the user is about to perform a touch operation on the display screen, a control window is displayed on the display screen and the content on the display screen is scaled and displayed in the control window; that is, the control window is displayed on the display screen and the full-screen content is mapped into it.
  • control window is pre-displayed on the display screen, so as to judge whether the user needs to realize full-screen control through the control window according to the actual behavior of the user's touch operation.
  • a specific way of judging is based on the positional relationship between the touch position of the touch signal input by the user and the control window.
  • If the touch position is located within the area where the control window is located, it is determined that the control window is in the second state, a touch function is configured for the control window, a touch signal is received through the control window, and the display screen is controlled using the touch signal. It is easy to understand that if the user needs to use the control window, the touch position of the touch signal input by the user must be within the control window; otherwise, it is considered that the user does not need to use the control window.
  • Specifically, this embodiment determines the distance between the user and the display device through the user's position, and finally determines whether the user is about to perform a touch operation.
  • the distance between the user and the display device is determined according to the detected user position; if the distance is less than a distance threshold, the display of the control window is triggered on the display screen of the display device.
  • the display device in this embodiment may detect the user's position through a camera component and/or a digital radar array, and specifically determine the user's position through any or more of the following methods:
  • Mode 1 Determine the user's position according to the depth information in the user's depth image captured by the display device;
  • The camera component includes but is not limited to a binocular camera, which is installed on the display device and uses the depth information in the captured depth image containing the user to determine the position of the user and the distance from the user to the display device (that is, to the binocular camera), providing an accurate basis for judging user behavior.
  • Mode 2 Use the digital radar array of the display device to scan, and determine the location of the user according to the scanning result.
  • According to the fluctuation information between the transmitting beam and the receiving beam of the digital radar array, it is determined whether the user has been scanned, the position of the user is determined, and the distance between the user and the display device is further determined.
  • As shown in FIG. 4A, a digital radar array 400 is installed on the display device for scanning the user.
  • The fluctuation information between the transmitted and received beams is used to determine the location of the user, and then the distance between the user and the display device is determined to be X. If the distance X is less than the distance threshold L1, it means that the user is about to perform a touch operation on the display device, and display of the control window is triggered on the display screen of the display device. If the distance X is not less than the distance threshold L1, it means that the user is not about to perform a touch operation, and at this time the display screen still defaults to full-screen touch display.
  • Method 3 Determine the first position of the user according to the depth information in the depth image of the user captured by the display device, scan with the digital radar array of the display device, and determine the second position of the user according to the scanning result, According to the weights respectively corresponding to the user's first position and the user's second position, the weighted sum of the user's first position and the user's second position is performed to determine the final user position.
  • the display device determines the user position respectively according to the camera component and the digital radar array, and finally performs weighted summation on the user positions respectively determined by the camera component and the digital radar array according to the corresponding weights, thereby determining the final user position.
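  • As a rough illustration of Mode 3, the two position estimates could be fused by a weighted sum, for example as sketched below; the weight values and function names are assumptions, not values given in the disclosure:

```python
def fuse_user_position(camera_pos, radar_pos, w_camera=0.5, w_radar=0.5):
    """Weighted sum of the camera-based and radar-based user positions (Mode 3)."""
    assert abs(w_camera + w_radar - 1.0) < 1e-9  # weights should sum to 1
    x = w_camera * camera_pos[0] + w_radar * radar_pos[0]
    y = w_camera * camera_pos[1] + w_radar * radar_pos[1]
    return (x, y)
```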
  • In addition, this embodiment not only relies on the positional relationship between the user's position, the touch position and the control window, but also combines the time at which touch signals are received to further determine whether the user is about to perform a touch operation.
  • If no operation on the control window is detected for more than a preset period of time, the control window is closed; or, if the difference between the detection time of the touch signal and the detection time of the user position is greater than the time threshold, the control window is closed.
  • The detection time is recorded by the clock in the display device, and the time threshold can be customized.
  • the detection time is introduced as the judgment basis, thereby more accurately judging the behavior of the user operation, providing more basis for configuring and closing the touch function of the control window, and improving the user's senseless experience.
  • In that case, the control window and the touch function configured for it are closed. This prevents the control window from remaining displayed, and failing to be closed automatically, after the user has not operated it for a long time.
  • Through the time threshold, it can be judged that after the user has not used the control window for touch operations for a long time, the control window is automatically closed and control returns to the display screen.
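  • A sketch of the two time-based closing rules described above might look as follows; the threshold values and names are placeholders chosen for illustration:

```python
def should_close_window(now, last_window_op_time, touch_time, user_pos_time,
                        idle_timeout=30.0, time_threshold=5.0):
    """Close the control window if it has been idle too long, or if the touch
    signal was detected too long after the user position was detected."""
    idle_too_long = (now - last_window_op_time) > idle_timeout
    stale_detection = abs(touch_time - user_pos_time) > time_threshold
    return idle_too_long or stale_detection
```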
  • the position of the control window in this embodiment is not fixed.
  • the position of the control window set in this embodiment is determined according to the position of the user, and when the position of the user moves, the position of the control window also moves accordingly.
  • the abscissa of the center of the control window is determined according to the abscissa of the user position; and the position of the control window displayed on the display screen is determined according to the abscissa of the center of the control window.
  • the position displayed on the display screen of the control window is determined through the following steps:
  • Step 1) determine the abscissa of the center of the control window according to the abscissa of the user position
  • Step 2) Determine the ordinate of the center of the control window according to the preset ordinate
  • Step 3 Determine the display position of the control window on the display screen according to the abscissa and the ordinate.
  • the central abscissa X of the central position of the control window is aligned with the abscissa of the user's position
  • the central ordinate Y of the central position of the control window may be the ordinate preset according to the average height of the user, or preset according to the user's needs.
  • the vertical axis is not too limited here.
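  • The three steps above could be implemented roughly as follows; the clamping to the screen edge and all names are illustrative assumptions:

```python
def control_window_position(user_x, preset_y, window_w, window_h, screen_w):
    """Center the control window on the user's abscissa, at a preset ordinate,
    clamped so the window stays entirely on the screen."""
    cx = min(max(user_x, window_w / 2), screen_w - window_w / 2)  # step 1
    cy = preset_y                                                 # step 2
    # Step 3: top-left corner of the window on the display screen.
    return (cx - window_w / 2, cy - window_h / 2)
```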
  • The control window in this embodiment can move as the user moves.
  • In order to reduce the number of times the user has to move, the control window is always placed in front of the user, which is convenient for operation.
  • In addition, in order to prompt the user whether to perform a touch operation in the control window, after the user's position information is determined, if the distance between the user and the display screen is less than the distance threshold, at least part of the content on the display screen is scaled and then mapped to the control window for display.
  • the transparency of the content mapped in the control window is gradually reduced.
  • the transparency of the control window can be changed step by step based on the following methods:
  • if the touch position is located in the area where the control window is located and the touch content matches the display content, the mapped content is displayed in a non-transparent manner.
  • the content mapped in the control window can be displayed in a transparent manner, so that the content of the display screen itself is not completely blocked. In this case, remind the user that the control window can be used for touch display.
  • the control window itself may also be displayed in a step-by-step transparent display manner. It should be noted that, at this time, the displayed control window is in the first state, which only has a display function and does not have a function of receiving a touch signal. If the user needs to use the control window, the user must perform touch operations in the transparently displayed control window; if the user performs touch operations outside the control window, it is considered that the user does not need to use the control window, and the transparently displayed control window is closed .
  • For example, when it is detected that the distance between the user and the display device is less than the distance threshold, the content of the control window is displayed according to a first-level transparency. If the user continues to move closer and inputs a touch signal, and the difference between the detection time of the touch signal and the detection time of the user position is less than the time threshold, and the touch position of the touch signal is located in the area where the control window is located, the mapped content is displayed according to a set second-level transparency. After it is determined that the display screen is to be controlled through the control window, the mapped content is displayed in the control window in a non-transparent manner. Through the different transparency display modes, the user is reminded and the content of the control window is emphasized, thereby improving the user experience.
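  • The stepwise transparency behaviour could be sketched as follows; the concrete alpha values are placeholders, since the disclosure does not specify them:

```python
def window_alpha(distance_ok, recent_touch_in_window, content_matches):
    """Step the mapped content from first-level transparency to opaque as
    evidence accumulates that the user wants to use the control window."""
    if content_matches:          # second state confirmed
        return 1.0               # non-transparent display
    if recent_touch_in_window:   # touch inside the window within the time threshold
        return 0.7               # second-level transparency (less transparent)
    if distance_ok:              # user close enough: window just triggered
        return 0.4               # first-level transparency
    return 0.0                   # fully transparent: no window shown
```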
  • Further, this embodiment controls the control window according to the detected positional relationship between the touch position operated by the user and the control window, together with the touch content generated within the first range centered on the touch position; when the control window is in the second state, it is determined that the touch function is configured for the control window.
  • That is, this embodiment further judges, according to the specific touch content input by the user, whether the user is using the control window to perform touch operations, improving the accuracy of the judgment.
  • the touch content corresponding to the touch position in this embodiment includes: the touch content input by the user within the first range centered on the touch position; the display content corresponding to the display position includes: The display position is the center, and the display content on the display screen is within the second range.
  • the touch content corresponding to the touch position in the implementation includes the All touch content input in the control window, that is, the display content corresponding to the touch content displayed in the preset area.
  • the touch content specifically refers to the content input by the user in the control window
  • the display content specifically refers to the content displayed on the display screen.
  • the displayed content includes the content displayed on the display screen after being enlarged according to the content input by the user in the touch window.
  • Alternatively, it may be determined to configure the touch function for the control window directly based on the positional relationship between the detected touch position of the touch signal input by the user and the control window. In this case it is not necessary to match the touch content generated at the touch position with the display content corresponding to the display position; it is only necessary to determine whether the touch position of the touch signal input by the user is within the area where the control window is located.
  • the touch content matches the display content.
  • Since the content in the control window in this embodiment is obtained by proportionally scaling the content of the display screen, a more accurate basis can be provided when performing content matching, and equal-proportion size matching can be performed.
  • For example, if the ratio of the content on the display screen to the content in the control window is 5:1, then after the size of the touch content is enlarged 5 times, it is judged whether it is equal to the size of the display content.
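  • Using the 5:1 example above, a size comparison after scaling might be sketched as follows; the tolerance is an assumed parameter, not something specified in the disclosure:

```python
def sizes_match(touch_size, display_size, scale=5.0, tolerance=0.1):
    """Enlarge the touch content by the screen-to-window ratio (e.g. 5:1) and
    compare widths and heights within a relative tolerance."""
    scaled_w = touch_size[0] * scale
    scaled_h = touch_size[1] * scale
    return (abs(scaled_w - display_size[0]) <= tolerance * display_size[0] and
            abs(scaled_h - display_size[1]) <= tolerance * display_size[1])
```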
  • the data types in this embodiment include but are not limited to: at least one of text, graphics, and tables.
  • That is, whether the touch content matches the display content is determined according to the sizes of the touch content and the display content.
  • the specific implementation is as follows:
  • Step 3-1) If the touch position is located in the area where the control window is located, obtain the display content displayed on the display screen within a third range centered on the touch position;
  • the touch content generated within the first range of the touch position includes: the touch content input by the user within the first range centered on the touch position; the display content includes: Display content on the display screen within a third range centered on the touch position.
  • Here the display content refers to the original display content on the display screen, that is, it includes the original display content on the display screen that is located under, and occluded by, the control window.
  • Step 3-2) If the touch content matches the display content, close the control window and the touch function configured for the control window, and delete the display content at the display position corresponding to the touch position.
  • Optionally, whether the touch content matches the above display content is determined according to the sizes of the touch content and the display content.
  • In this way, the touch content input by the user in the control window is matched against the content in different areas. If the touch content input by the user in the control window matches the display content at the display position corresponding to the touch position, it means that the user needs to use the control window for touch control, and the touch function is configured for the control window. If instead the touch content matches the display content originally located where the control window is displayed, it means that the touch operation was intended for the display screen itself, so the display content at the display position corresponding to the touch position is deleted and the control window is closed.
  • In this way, misjudgment can be further avoided, the accuracy of the judgment improved, and an accurate and imperceptible interactive experience provided.
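  • The two branches of steps 3-1 and 3-2 can be summarized in a small decision helper; the flag names are hypothetical and the matching itself is assumed to be computed elsewhere:

```python
def decide(touch_in_window, matches_mapped_position, matches_under_touch):
    """Branch on the matching results of steps 3-1 / 3-2 (flags computed elsewhere)."""
    if not touch_in_window:
        return "keep_full_screen_control"
    if matches_mapped_position:
        # User is operating through the window: switch it to the second state.
        return "enable_window_control"
    if matches_under_touch:
        # User was writing directly on the screen where the window happens to be:
        # close the window and delete the content at the mapped display position.
        return "close_window_and_delete_mapped_content"
    return "keep_full_screen_control"
```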
  • This embodiment also provides a detailed smart touch method.
  • the core idea is to use the detected user position to judge the distance between the user and the display device.
  • If the distance is less than the distance threshold, the control window is displayed at the position corresponding to the user's position, and the content on the display screen is scaled and mapped to the control window for display. If a touch signal input by the user is then detected, and the difference between the detection time of the touch signal and the detection time of the user position is less than the time threshold, the positional relationship between the touch position of the touch signal and the control window is judged. If the touch position is within the area where the control window is located, the matching relationship between the touch content corresponding to the touch position and the display content is further determined. If the touch content matches the display content at the display position corresponding to the touch position, it means that the user needs to use the control window at this time;
  • in that case the touch function is configured for the control window and the mapped content is displayed non-transparently. If instead the touch content matches the display content located at the touch position itself, it means that the user is performing a touch operation directly on the display screen; the control window and the touch function configured for it are then closed, and the display content at the display position corresponding to the touch position is deleted. It should be noted that if there is no content input by the user on the display screen before the touch signal is received, it is only necessary to determine the positional relationship between the touch position of the touch signal input by the user and the control window in order to decide whether to configure the touch function for the control window and perform non-transparent display, without matching the touch content.
  • In this way, the control window can be automatically opened, displayed, configured with the touch function, closed, and so on, providing users with an efficient and convenient experience.
  • Step 900 after booting, the default setting is that the display screen receives touch signals and controls the display;
  • Step 901 detecting the user's location
  • Step 902 judging whether the distance between the detected user position and the display device is less than the distance threshold, if so, execute step 903, otherwise execute step 911;
  • Step 903 triggering the display of the control window on the display screen, and displaying the content on the display screen in the control window after being proportionally scaled according to the first level of transparency.
  • Step 904 detecting a touch signal input by the user
  • Step 905 judging whether the difference between the detection time of the touch signal and the detection time of the user position is less than the time threshold, if so, perform step 906, otherwise, perform step 911;
  • Step 906 judging whether the touch position of the touch signal is located in the area where the control window is located, if so, execute step 907, otherwise, execute step 911;
  • Step 907 Display the content in the control window according to the set secondary transparency, wherein the secondary transparency is lower than the primary transparency.
  • Step 908 judging whether the touch content corresponding to the touch position matches the display content corresponding to the display position or the display content centered on the touch position; if it matches the display content corresponding to the display position, perform step 909; otherwise, perform step 911.
  • the display content includes: display content on the display screen within a second range centered on the display position.
  • the displayed content includes: displayed content on the display screen within a third range centered on the touch position.
  • Step 909 determining to configure the touch function for the control window, instead of transparently displaying the content in the control window;
  • Step 910 judging whether the time interval between two adjacent detected touch signals input by the user is greater than a preset time, if so, execute step 911 , otherwise, execute step 906 .
  • Step 911 close the control window, delete the display content at the display position corresponding to the touch position, and return to the state in which the display screen receives the touch signal and controls the display.
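  • The flow of steps 900 to 911 could be sketched as the following loop; every device.* and window.* call is an assumed placeholder, and step 910 (the interval check between adjacent touch signals) is omitted for brevity:

```python
def smart_touch_loop(device):
    """Illustrative control loop for steps 900-911; all device/window methods are assumed."""
    device.set_full_screen_touch()                                   # step 900
    while True:
        user_pos, pos_time = device.detect_user()                    # step 901
        if device.distance_to(user_pos) >= device.DISTANCE_THRESHOLD:  # step 902
            device.close_control_window()                            # step 911
            continue
        window = device.show_control_window(user_pos, transparency=1)  # step 903
        touch, touch_time = device.detect_touch()                    # step 904
        if touch_time - pos_time >= device.TIME_THRESHOLD:           # step 905
            device.close_control_window()                            # step 911
            continue
        if not window.contains(touch.position):                      # step 906
            device.close_control_window()                            # step 911
            continue
        window.set_transparency(2)                                   # step 907
        if device.matches_mapped_display(touch):                     # step 908
            window.enable_touch()                                    # step 909
            window.set_opaque()
        else:
            device.close_control_window()                            # step 911
            device.delete_mapped_content(touch)
```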
  • The embodiment of the present disclosure also provides an intelligent touch method. Since this method corresponds to the device in the embodiment of the present disclosure and solves the problem on a similar principle, its implementation may refer to the implementation of the device, and repeated descriptions are omitted.
  • the implementation process of a smart touch method provided in this embodiment is as follows:
  • Step 1000 According to the detected user position, trigger the display of the control window on the display screen of the display device, and control the control window to be in the first state, where the first state represents a transition state in which it is judged, according to the received touch signal input by the user, whether to switch to the second state;
  • Step 1001 Control the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • Step 1002. Determine that the control window is in the second state, and control the display screen through the control window.
  • the triggering display of the control window on the display screen of the display device according to the detected user position includes:
  • the display of the control window is triggered on the display screen of the display device.
  • the triggering display of the control window on the display screen of the display device includes:
  • the display position of the control window on the display screen is determined according to the abscissa of the center of the control window.
  • controlling the control window to be in the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window includes:
  • the control window is controlled to be in the second state.
  • Optionally, controlling the control window to be in the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window, and the touch content generated within a first range centered on the touch position, includes:
  • if the touch position is located in the area where the control window is located, the display position corresponding to the touch position on the display screen is determined; if the touch content generated within the first range centered on the touch position matches the display content displayed within a second range centered on the display position, the control window is controlled to be in the second state; or,
  • if the touch position is located in the area where the control window is located, the display content displayed on the display screen within a third range centered on the touch position is obtained; if the touch content generated within the first range centered on the touch position matches that display content, the control window is closed, and the content displayed at the display position corresponding to the touch position is deleted.
  • content matching is determined in the following manner:
  • the size and data type of the touch content and the display content determine that the touch content matches the display content; or,
  • the touch content matches the display content.
  • content matching is determined in the following manner:
  • after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, it is determined, according to the sizes of the touch content and the display content, whether the touch content matches the display content.
  • the triggering display of the control window on the display screen of the display device according to the detected user position further includes:
  • At least part of the content on the display screen is mapped to the control window for display.
  • the zooming and mapping at least part of the content on the display screen to the control window for display further includes:
  • the transparency of the content mapped in the control window decreases step by step.
  • controlling the transparency of the content mapped in the control window is gradually reduced, including:
  • the touch position is located in the area where the control window is located, and the touch content matches the display content, then display the mapped content in a non-transparent manner.
  • Optionally, after the display of the control window is triggered on the display screen of the display device, the method further includes:
  • the control window is closed.
  • the user location is determined in the following manner:
  • Scanning is performed using the digital radar array of the display device, and the location of the user is determined according to the scanning results.
  • it also includes:
  • if the control window is in the first state, content is displayed through the control window, and the touch signal is received through the display screen;
  • if the control window is in the second state, a touch signal is received through the control window, and content is displayed through the area other than the control window.
  • The embodiment of the present disclosure also provides an intelligent touch device. Since this device corresponds to the method in the embodiment of the present disclosure and solves the problem on a similar principle, its implementation may refer to the implementation of the method, and repeated descriptions are omitted.
  • the device includes:
  • the display unit 1100 is configured to trigger the display of a control window on the display screen of the display device according to the detected user position, and to control the control window to be in a first state, where the first state represents a transition state in which it is judged, according to the received touch signal input by the user, whether to switch to a second state;
  • the conversion unit 1101 is configured to control the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • the control unit 1102 is configured to determine that the control window is in the second state, and control the display screen through the control window.
  • the display unit 1100 is specifically used for:
  • the display of the control window is triggered on the display screen of the display device.
  • the display unit 1100 is specifically used for:
  • the converting unit 1101 is specifically configured to:
  • the control window is controlled to be in the second state.
  • the converting unit 1101 is specifically configured to:
  • the converting unit 1101 is specifically configured to:
  • if the touch position is located in the area where the control window is located, the display position corresponding to the touch position on the display screen is determined; if the touch content generated within a first range centered on the touch position matches the display content displayed within a second range centered on the display position, the control window is controlled to be in the second state; or,
  • if the touch position is located in the area where the control window is located, the display content displayed on the display screen within a third range centered on the touch position is obtained; if the touch content generated within the first range centered on the touch position matches that display content, the control window is closed, and the content displayed at the display position corresponding to the touch position is deleted.
  • the converting unit 1101 is specifically configured to:
  • determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or,
  • determining that the touch content matches the presented content according to at least one of the size and the data type of the touch content and the presented content.
  • the converting unit 1101 is specifically configured to:
  • after enlarging the touch content according to a preset rule or reducing the display content according to a preset rule, determining that the touch content matches the display content according to the sizes of the touch content and the display content; or, after enlarging the touch content or reducing the presented content according to a preset rule, determining that the touch content matches the presented content according to the sizes of the touch content and the presented content.
  • the display unit is specifically further used for:
  • at least part of the content on the display screen is scaled and mapped to the control window for display.
  • the display unit 1100 is specifically further configured to:
  • during the transition of the control window from the first state to the second state, the transparency of the content mapped in the control window is reduced step by step.
  • the display unit 1100 is specifically further configured to:
  • if at least part of the content on the display screen is scaled and mapped to the control window according to the detected user position, the mapped content is displayed at a first-level transparency; or, if the touch position is within the area where the control window is located, the mapped content is displayed at a second-level transparency lower than the first-level transparency; or, if the touch position is within the area where the control window is located and the touch content matches the display content, the mapped content is displayed in a non-transparent manner.
  • a closing unit is further included, specifically configured to:
  • close the control window if no touch signal is detected within a preset period; or, close the control window if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold.
  • the display unit 1100 is specifically configured to determine the user position in one of the following manners:
  • determining the user position according to depth information in a depth image of the user captured by the display device; or, performing a scan with the digital radar array of the display device and determining the user position according to the scan result.
  • a judging unit is further included, specifically configured to:
  • display content through the control window and receive the touch signal through the display screen if the control window is in the first state;
  • receive the touch signal through the control window and display content through the area outside the control window if the control window is in the second state.
  • an embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored, and when the program is executed by a processor, the following steps are implemented:
  • the display of a control window is triggered on the display screen of the display device according to the detected user position, and the control window is controlled to be in a first state, where the first state represents a transition state in which whether to switch to a second state is determined according to the received touch signal input by the user;
  • controlling the control window to switch from the first state to the second state according to the detected positional relationship between the touch position of the touch signal input by the user and the control window;
  • it is determined that the control window is in the second state, and the display screen is controlled through the control window.
  • the embodiments of the present disclosure may be provided as methods, systems, or computer program products. Accordingly, the present disclosure can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means, where the instruction means implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a display device and an intelligent touch method therefor, which are used to reduce the steps of invoking a small window through a menu for touch switching and to improve the efficiency of human-computer interaction. The display device comprises a display screen, a touch assembly and a controller, wherein the display screen is used for displaying content; the touch assembly is used for receiving a touch signal input by a user; and the controller is configured to: trigger the display of a control window on the display screen of the display device according to a detected user position, and control the control window to be in a first state, the first state representing a transition state in which whether to switch to a second state is determined according to the received touch signal input by the user; control the control window to switch from the first state to the second state according to the positional relationship between the touch position of the detected touch signal input by the user and the control window; and, upon determining that the control window is in the second state, control the display screen through the control window.

Description

一种显示设备及其智能触控的方法
相关申请的交叉引用
本申请要求在2021年08月31日提交中国专利局、申请号为202111016337.8、申请名称为“一种显示设备及其智能触控的方法”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及人机交互技术领域,特别涉及一种显示设备及其智能触控的方法。
背景技术
显示屏作为人机交互最重要的界面,越来越多的应用到生活中的各个方面,随着技术的发展,显示屏本身增加了很多交互功能,例如用户可以通过音控、触控等方式与显示屏进行人机交互,其中触控方式包括红外、电容等多种方式。随着显示屏的功能更加全面,应用的场景更加多变,在不同场景中能够选择的显示屏的尺寸也更多样化。
而针对目前大尺寸显示屏,例如超过86寸的会议白板等,由于显示屏的尺寸较大,不便于用户进行全屏的触控操作,目前提供的触控方式,一种是通过显示屏上的操作菜单打开小窗口,用户在小窗口中进行触控操作并同步到显示屏,但是该方式中需要通过操作菜单进行多项选择后才能切换到小窗口,在固定的小窗口内进行小屏操作,人机交互效率较低;另一种是通过其他设备和显示设备进行联机触控,实现两者的同步操作,但是该方式需要其他设备,不便于直接进行人机交互,操作繁琐、体验较差。
发明内容
本公开提供一种显示设备及其智能触控的方法,用于减少通过菜单调用 小窗口进行触控切换的步骤,提供更智能便携的人机交互,同时提升人机交互的效率。
第一方面,本公开实施例提供的一种显示设备,包括显示屏、触控组件、控制器,其中:
所述显示屏用于进行内容的显示;
所述触控组件用于接收用户输入的触控信号;
所述控制器被配置为执行:
根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
本实施例提供的显示设备能够根据用户的行为预估用户的下一行为,从而根据用户位置、触控信号的触控位置逐步判断出用户是否具有使用控制窗口的需求,从而提供一种无感的打开控制窗口的方式,提高用户的交互效率。
作为一种可选的实施方式,所述处理器具体被配置为执行:
根据检测的用户位置,确定所述用户和所述显示设备的距离;
若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
作为一种可选的实施方式,所述处理器具体被配置为执行:
根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
作为一种可选的实施方式,所述处理器具体被配置为执行:
若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述处理器具体被配置为执行:
根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及,以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述处理器具体还被配置为执行:
若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与所述触控位置对应的显示位置,若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态;或,
若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心的第三范围内所述显示屏上显示的展示内容,若以所述触控位置为中心的范围内生成的触控内容,与所述展示内容匹配,则关闭所述控制窗口,并删除与所述触控位置对应的显示位置显示的内容。
作为一种可选的实施方式,所述处理器具体被配置为通过如下方式确定所述触控内容与所述显示内容匹配:
根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配;或,
作为一种可选的实施方式,根据所述触控内容和所述展示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述展示内容匹配。
所述处理器具体被配置为执行:
按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配;或,
按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹 配。
作为一种可选的实施方式,所述根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,所述处理器具体还被配置为执行:
将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
作为一种可选的实施方式,所述处理器具体还被配置为执行:
所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
作为一种可选的实施方式,所述处理器具体被配置为执行:
若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
作为一种可选的实施方式,在所述显示设备的显示屏上触发控制窗口的显示之后,所述处理器具体还被配置为执行:
若超过预设时段未检测到触控信号,则关闭所述控制窗口;或,
若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
作为一种可选的实施方式,所述处理器具体被配置为通过如下方式确定所述用户位置:
根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户位置;或,
利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
作为一种可选的实施方式,所述处理器具体还被配置为:
若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示, 通过所述显示屏接收触控信号;
若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示。
第二方面,本公开实施例提供的一种智能触控的方法,包括:
根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
作为一种可选的实施方式,所述根据检测的用户位置,在显示设备的显示屏上触发显示控制窗口,包括:
根据检测的用户位置,确定所述用户和所述显示设备的距离;
若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
作为一种可选的实施方式,所述在所述显示设备的显示屏上触发显示控制窗口,包括:
根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
作为一种可选的实施方式,所述根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口处于第二状态,包括:
若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
作为一种可选的实施方式,根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口处于第二状态,包括:
根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态,包括:
若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与所述触控位置对应的显示位置,若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态;或,
若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心的第三范围内所述显示屏上显示的展示内容,若以所述触控位置为中心的范围内生成的触控内容,与所述展示内容匹配,则关闭所述控制窗口,并删除与所述触控位置对应的显示位置显示的内容。
作为一种可选的实施方式,通过如下方式确定内容匹配:
根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配;或,
根据所述触控内容和所述展示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,通过如下方式确定内容匹配:
按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配;或,
按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,所述根据检测的用户位置,在所述显示设备 的显示屏上触发控制窗口的显示,还包括:
将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
作为一种可选的实施方式,所述将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,还包括:
所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
作为一种可选的实施方式,所述控制所述控制窗口中映射的内容的透明度逐级降低,包括:
若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
作为一种可选的实施方式,所述在所述显示设备的显示屏上触发控制窗口的显示之后,还包括:
若超过预设时段未检测到触控信号,则关闭所述控制窗口;或,
若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
作为一种可选的实施方式,通过如下方式确定所述用户位置:
根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户位置;或,
利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
作为一种可选的实施方式,还包括:
若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示,通过所述显示屏接收触控信号;
若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示。
第三方面,本公开实施例还提供一种智能触控的装置,包括:
显示单元,用于根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
转换单元,用于根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
控制单元,用于确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
作为一种可选的实施方式,所述显示单元具体用于:
根据检测的用户位置,确定所述用户和所述显示设备的距离;
若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
作为一种可选的实施方式,所述显示单元具体用于:
根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
作为一种可选的实施方式,所述转换单元具体用于:
若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述转换单元具体用于:
根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述转换单元具体用于:
若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与 所述触控位置对应的显示位置,若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态;或,
若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心的第三范围内所述显示屏上显示的展示内容,若以所述触控位置为中心的范围内生成的触控内容,与所述展示内容匹配,则关闭所述控制窗口,并删除与所述触控位置对应的显示位置显示的内容。
作为一种可选的实施方式,所述转换单元具体用于:
根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配;或,
根据所述触控内容和所述展示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,所述转换单元具体用于:
按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配;或,
按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,所述显示单元具体还用于:
将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
作为一种可选的实施方式,所述显示单元具体还用于:
所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
作为一种可选的实施方式,所述显示单元具体还用于:
若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
作为一种可选的实施方式,所述在所述显示设备的显示屏上触发控制窗口的显示之后,还包括关闭单元具体用于:
若超过预设时段未检测到触控信号,则关闭所述控制窗口;或,
若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
作为一种可选的实施方式,所述显示单元具体用于通过如下方式确定所述用户位置:
根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户位置;或,
利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
作为一种可选的实施方式,还包括判断单元具体用于:
若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示,通过所述显示屏接收触控信号;
若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示。
第四方面,本公开实施例还提供计算机存储介质,其上存储有计算机程序,该程序被处理器执行时用于实现上述第一方面所述方法的步骤。
本公开的这些方面或其他方面在以下的实施例的描述中会更加简明易懂。
附图说明
为了更清楚地说明本公开实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简要介绍,显而易见地,下面描述中的附图仅仅是本公 开的一些实施例,对于本领域的普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为本公开实施例提供的一种现有的触控方式示意图;
图2为本公开实施例提供的一种显示设备示意图;
图3A为本公开实施例提供的第一种控制窗口显示示意图;
图3B为本公开实施例提供的第二种控制窗口显示示意图;
图4A为本公开实施例提供的一种检测用户位置的示意图;
图4B为本公开实施例提供的一种检测用户位置的示意图;
图5为本公开实施例提供的一种控制窗口显示位置的示意图;
图6A为本公开实施例提供的第一种控制窗口显示的示意图;
图6B为本公开实施例提供的第二种控制窗口显示的示意图;
图6C为本公开实施例提供的第三种控制窗口显示的示意图;
图7A为本公开实施例提供的第一种触控内容匹配的示意图;
图7B为本公开实施例提供的第二种触控内容匹配的示意图;
图7C为本公开实施例提供的第三种触控内容匹配的示意图;
图7D为本公开实施例提供的第四种触控内容匹配的示意图;
图8A为本公开实施例提供的第五种触控内容匹配的示意图;
图8B为本公开实施例提供的第六种触控内容匹配的示意图;
图8C为本公开实施例提供的第七种触控内容匹配的示意图;
图8D为本公开实施例提供的第八种触控内容匹配的示意图;
图9为本公开实施例提供的一种详细的智能触控的方法实施流程图;
图10为本公开实施例提供的一种智能触控的方法实施流程图;
图11为本公开实施例提供的一种智能触控的装置示意图。
具体实施方式
为了使本公开的目的、技术方案和优点更加清楚,下面将结合附图对本公开作进一步地详细描述,显然,所描述的实施例仅仅是本公开一部分实施 例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其它实施例,都属于本公开保护的范围。
本公开实施例中术语“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
本公开实施例描述的应用场景是为了更加清楚的说明本公开实施例的技术方案,并不构成对于本公开实施例提供的技术方案的限定,本领域普通技术人员可知,随着新应用场景的出现,本公开实施例提供的技术方案对于类似的技术问题,同样适用。其中,在本公开的描述中,除非另有说明,“多个”的含义是两个或两个以上。
显示屏作为人机交互最重要的界面,越来越多的应用到生活中的各个方面,随着技术的发展,显示屏本身增加了很多交互功能,例如用户可以通过音控、触控等方式与显示屏进行人机交互,其中触控方式包括红外、电容等多种方式。随着显示屏的功能更加全面,应用的场景更加多变,在不同场景中能够选择的显示屏的尺寸也更多样化。而针对目前大尺寸显示屏,例如超过86寸的会议白板等,由于显示屏的尺寸较大,不便于用户进行全屏的触控操作,目前提供的触控方式,如图1所示,一种是通过显示设备上的操作菜单100打开小窗口101,用户在小窗口101中进行触控操作并同步到显示设备,但是该方式中需要通过操作菜单100进行多项选择后才能切换到小窗口101,并且小窗口101的位置是固定的,在固定的小窗口101内进行触控操作,人机交互效率较低;另一种是通过其他设备102和显示设备进行联机触控,实现两者的同步操作,但是该方式需要其他设备,例如终端、平板等,不便于直接进行人机交互,操作繁琐、体验较差。为了减小用户在操作显示设备的过程中频繁通过操作菜单打开小窗口的使用步骤,本实施例针对大尺寸显示设备,提供一种自动判断是否需要使用小窗口接收触控信号并控制显示屏进行对应操作的方法及其显示设备。
本实施例提供的显示设备包括但不限于大尺寸显示屏、大尺寸智能平板、大尺寸电子白板等显示设备。本实施例中的显示设备还可以配置摄像组件,用于对用户进行拍摄,也可以配置数字阵列雷达,用于对用户进行扫描并确认用户的位置。其中,摄像组件和/或数字阵列雷达可集成于显示设备上,也可以作为单独的组件与显示设备进行通信连接,对此本实施例不作过多限定。
本实施例提供的智能触控的方法的核心思想是利用对用户的检测判断用户是否即将进行触控操作,并且根据接收到的触控信号来具体判断用户的行为,例如用户此时输入的触控操作是对控制窗口进行触控操作从而控制显示屏进行相应的显示,还是对显示屏进行的触控操作,准确地分辨出用户是否需要利用控制窗口实现对显示屏的控制,从而在用户无需进行主动操作菜单打开控制窗口的情况下,显示设备自动确认用户的行为,在确定用户使用控制窗口和显示设备进行交互的情况下,为用户提供配置触控功能的控制窗口,以用于使用该控制窗口对显示屏进行控制。其中,本实施例中的控制窗口和上述内容中的小窗口在本实施例中是同一概念,都是指用于接收触控信号并实现对显示屏进行控制的功能。
需要说明的是,本实施例中的显示设备在开机启动后初始状态下,设置默认的控制方式为通过显示设备的显示屏接收触控信号,并进行显示,此时的显示屏即可以接收触控信号,也可以进行与该触控信号对应的显示。如果控制方式切换至控制窗口控制的方式,则当前显示屏虽然也可以接收触控信号,但主要通过控制窗口接收触控信号,在控制窗口和显示屏上进行同步显示。其中,控制窗口位于显示屏上与用户对应的位置上,并且该控制窗口的尺寸小于显示屏的尺寸,可设置为便于用户在不移动的情况下实现对全屏进行控制的尺寸。
如图2所示,本实施例提供的一种显示设备,包括显示屏200、触控组件201、控制器202,其中:
所述显示屏200用于进行内容的显示;
所述触控组件201用于接收用户输入的触控信号;
所述控制器202被配置为执行:
根据检测的用户位置,在所述显示设备的显示屏200上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;需要说明的是,控制窗口的第一状态用于表示控制窗口具备显示功能但还不具备控制功能。
其中,本实施例中控制窗口的区域如图3A、图3B所示。本实施例对控制窗口的区域的形状、大小、线条不作过多限定。
根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏200进行控制。需要说明的是,本实施例中的第二状态用于表征控制窗口具备控制功能和显示功能,用于实现对全屏的控制显示。
在一些实施例中,若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示,通过所述显示屏接收触控信号;此时,仍处于全屏触控状态,通过全屏接收触控信号,而控制窗口只具备显示功能。若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示,此时,控制窗口具备控制功能,能接收触控信号,并同步在控制窗口和控制窗口以外的区域进行内容的显示。
若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示;该状态下,控制窗口仅用于显示内容,包括全屏缩放后映射的内容以及用户在该控制窗口内书写的内容;通过显示屏接收控制信号进行控制显示。
若所述控制窗口处于第二状态,则通过所述控制窗口进行内容的显示,并接收触控信号。该状态下,控制窗口不仅可以显示内容,还可以接收用户输入的触控信号并对显示屏进行全屏的控制显示。
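A minimal Python sketch of how touch input could be routed depending on the two window states described above; the class, function and return-value names are illustrative assumptions rather than part of the disclosed implementation:

```python
from enum import Enum, auto

class WindowState(Enum):
    FIRST = auto()   # transition state: the window only displays, the full screen receives touch
    SECOND = auto()  # control state: the window receives touch and drives the full screen

def route_touch(state: WindowState, touch_in_window: bool) -> str:
    """Decide which surface handles an incoming touch signal."""
    if state is WindowState.FIRST:
        # In the first state the display screen itself keeps receiving touch input.
        return "display_screen"
    # In the second state touch input is taken from the control window;
    # touches outside it are simply ignored in this simplified sketch.
    return "control_window" if touch_in_window else "ignored"

if __name__ == "__main__":
    print(route_touch(WindowState.FIRST, touch_in_window=True))   # display_screen
    print(route_touch(WindowState.SECOND, touch_in_window=True))  # control_window
```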
实施中,本实施例中的显示设备能检测到用户位置,通过用户位置来判断用户实际与显示屏之间的距离,从而预估用户的行为,判断用户是否即将进行显示屏的触控操作,从而根据用户位置确定用户即将进行显示屏的触控 操作时,在显示屏上显示控制窗口,且将显示屏上的内容进行缩放后在所述控制窗口进行显示,此时在显示屏上显示控制窗口,并且在控制窗口内显示全屏映射的内容。在实际应用的过程中,当用户接近显示屏且进行触控操作之前,在显示屏上预先显示控制窗口,从而根据用户进行触控操作的实际行为,判断用户是否需要通过控制窗口实现对全屏的控制。具体的判断方式是根据用户输入的触控信号的触控位置与所述控制窗口的位置关系。
在一些示例中,若所述触控位置位于所述控制窗口所在区域内,则确定控制窗口处于第二状态,为所述控制窗口配置触控功能,通过所述控制窗口接收触控信号,并利用所述触控信号对所述显示屏进行控制。容易理解的是,如果用户需要使用控制窗口,那么用户输入的触控信号的触控位置必然位于控制窗口内,否则认为用户不需要使用控制窗口。
在一些示例中,为了准确地判断出用户的行为,即用户是否即将使用显示设备进行触控操作,本实施例通过用户位置来确定用户和显示设备的距离,最终判断用户是否具有即将触控操作的行为。实施中,根据检测的用户位置,确定所述用户和所述显示设备的距离;若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
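A minimal Python sketch of the distance check in the preceding paragraph; the threshold value and function name are illustrative assumptions, since the disclosure only requires that a distance threshold exist:

```python
DISTANCE_THRESHOLD_M = 1.0  # illustrative value; the disclosure only requires "a distance threshold"

def should_show_control_window(user_distance_m: float) -> bool:
    """Trigger the control window (in its first, display-only state) when the
    detected user is closer to the screen than the threshold."""
    return user_distance_m < DISTANCE_THRESHOLD_M

# Example: a user detected 0.8 m away triggers the window; 2.5 m does not.
assert should_show_control_window(0.8) is True
assert should_show_control_window(2.5) is False
```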
在一些示例中,本实施例中的显示设备可以通过摄像组件,和/或,数字雷达阵列来检测用户位置,具体通过如下任一或任多方式确定所述用户位置:
方式1、根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户位置;
实施中,摄像组件包括但不限于双目摄像头,该双目摄像头安装于显示设备上,利用拍摄的包含用户的深度图像中的深度信息,确定用户位置以及用户到显示设备(双目摄像头)的距离。为判断用户的行为提供准确地判断依据。
方式2、利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
实施中,根据数字雷达阵列的发射波束和接收波束之间的波动信息,确 定是否扫描到用户,并确定用户位置,进一步确定出用户和显示设备之间的距离。
如图4A所示，显示设备上安装有数字雷达阵列400，用于对用户进行扫描；如图4B所示，L2表示数字雷达阵列400相对于用户的位置，显示设备通过发射波束和接收波束之间的波动信息，确定用户的位置，进而确定用户和显示设备之间的距离为X。若所述距离小于距离阈值L1，则说明用户即将对显示设备进行触控操作，在显示设备的显示屏上触发显示控制窗口；若距离X大于距离阈值L1，则说明用户还没有即将进行触控操作的行为，此时显示屏仍默认为显示屏触控。
方式3、根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户第一位置,利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户第二位置,根据与用户第一位置和用户第二位置分别对应的权重,对所述用户第一位置和用户第二位置进行加权求和,确定最终的用户位置。
该方式下,显示设备根据摄像组件和数字雷达阵列分别确定用户位置,最终根据对应的权重,对摄像组件和数字雷达阵列分别确定的用户位置进行加权求和,从而确定最终的用户位置。
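A brief Python sketch of the weighted fusion of the camera-based and radar-based position estimates described above; the weight values and function name are illustrative assumptions:

```python
def fuse_user_position(camera_pos: tuple[float, float],
                       radar_pos: tuple[float, float],
                       camera_weight: float = 0.6,
                       radar_weight: float = 0.4) -> tuple[float, float]:
    """Weighted sum of the position estimated from the depth image and the
    position estimated by the digital radar array; the weights are illustrative
    and must sum to 1."""
    assert abs(camera_weight + radar_weight - 1.0) < 1e-9
    return (camera_weight * camera_pos[0] + radar_weight * radar_pos[0],
            camera_weight * camera_pos[1] + radar_weight * radar_pos[1])

# Example: camera estimate (1.2 m, 0.9 m), radar estimate (1.0 m, 1.1 m) -> about (1.12, 0.98).
print(fuse_user_position((1.2, 0.9), (1.0, 1.1)))
```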
在一些示例中,为了提高判断用户行为的准确率,本实施例不仅基于用户位置、触控位置和控制窗口的位置关系,还结合接收触控信号的时间,进一步判断用户是否具有即将进行触控操作的行为。
在一些示例中,若超过预设时段未检测到对所述控制窗口的操作,则关闭所述控制窗口;或,若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
实施中,接收到用户的触控信号之后,确定用户输入的触控信号的触控位置与所述控制窗口的位置关系之前,先确定所述触控信号的检测时间,与所述用户位置的检测时间的差值小于时间阈值。实施中,通过显示设备中的时钟记录检测时间,并自定义时间阈值。本实施例中引入检测时间作为判断 依据,从而更加准确地判断出用户操作的行为,为控制窗口的触控功能的配置和关闭提供更多的依据,提高用户的无感体验。
在一些示例中,若相邻两次检测的用户输入的触控信号的时间间隔大于预设时间,则关闭所述控制窗口以及为所述控制窗口配置的触控功能。这样能够避免用户长时间未操作后仍然显示控制窗口,导致控制窗口无法在用户不操作后进行自动关闭。通过时间阈值的设定,能够判断用户长时间不使用控制窗口进行触控操作后,自动关闭控制窗口,返回显示屏控制的状态。
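A small Python sketch of the automatic-closing conditions described above (idle timeout, and a too-large gap between user detection and touch detection); the timeout values and parameter names are assumptions:

```python
IDLE_TIMEOUT_S = 10.0        # assumed: close after this long without a touch on the window
TIME_DIFF_THRESHOLD_S = 5.0  # assumed: max gap between detecting the user and the first touch

def should_close_window(now_s: float,
                        last_touch_time_s: float | None,
                        user_detect_time_s: float) -> bool:
    """Return True when the control window should be closed automatically."""
    if last_touch_time_s is None:
        # No touch yet: close if the user was detected too long ago.
        return now_s - user_detect_time_s > TIME_DIFF_THRESHOLD_S
    # Touches have occurred: close after a long idle interval between touches.
    return now_s - last_touch_time_s > IDLE_TIMEOUT_S

print(should_close_window(now_s=20.0, last_touch_time_s=None, user_detect_time_s=10.0))  # True
```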
在一些示例中,如图5所示,本实施例中的控制窗口的位置并不是固定的,为了提高用户的使用体验,方便用户在进行大尺寸显示屏的触控操作时,能够减少移动,本实施例设置的控制窗口的位置是依据用户的位置确定的,当用户位置移动时,控制窗口的位置也随之移动。
在一些实施例中,根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
实施中,通过如下步骤确定控制窗口的在显示屏上显示的位置:
步骤1)根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
步骤2)根据预设纵坐标,确定所述控制窗口中心的纵坐标;
步骤3)根据所述横坐标和所述纵坐标,确定所述控制窗口在所述显示屏上显示的位置。
其中,控制窗口的中心位置的中心横坐标X与用户位置的横坐标对齐,控制窗口的中心位置的中心纵坐标Y,可以根据用户的平均身高预设的纵坐标,或者根据用户需求预设的纵坐标,此处不作过多限定。
容易理解的是,本实施例中的控制窗口的位置可随着用户的移动而移动,为了方便用户减少移动的次数,设置控制窗口始终位于用户的正前方,便于用户进行操作。
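A minimal Python sketch of the window-placement rule above (horizontal centre follows the user, vertical centre preset); the window size, default vertical centre and edge clamping are illustrative assumptions:

```python
def control_window_rect(user_x: float,
                        screen_w: float, screen_h: float,
                        win_w: float = 600.0, win_h: float = 400.0,
                        preset_center_y: float | None = None) -> tuple[float, float, float, float]:
    """Place the control window so its horizontal centre follows the user's
    x-coordinate while the vertical centre is preset; the window size and the
    default vertical centre are illustrative assumptions."""
    center_y = preset_center_y if preset_center_y is not None else screen_h * 0.6
    # Clamp so the window never runs off the left or right edge of the screen.
    center_x = min(max(user_x, win_w / 2), screen_w - win_w / 2)
    return center_x - win_w / 2, center_y - win_h / 2, win_w, win_h

# A user standing near x = 300 on a 3840 x 2160 screen gets a window clamped at the left edge.
print(control_window_rect(300.0, 3840.0, 2160.0))  # (0.0, 1096.0, 600.0, 400.0)
```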
在一些实施例中,为了提示用户是否需要进行控制窗口的触控操作,在确定用户的位置信息后,若用户和显示屏的距离小于距离阈值,则将所述显 示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。可选的,将所述显示屏上的至少部分内容进行等比例缩放后映射到所述控制窗口进行显示。
在一些实施例中,所述控制窗口从所述第一状态转换到所述第二状态的过程中,所述控制窗口中映射的内容的透明度是逐级降低的。
在一些实施例中,控制窗口具体可基于如下方式逐级变化透明度:
若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
具体实施中,由于仅通过用户位置预估用户即将对显示屏进行触控操作,因此可以通过透明显示的方式将控制窗口中映射的内容进行显示,这样能够在显示屏本身内容不被完全遮挡的情况下,提醒用户可以使用控制窗口进行触控显示。可选的,还可以通过逐级透明显示的方式显示控制窗口本身。需要说明的是,此时显示的控制窗口处于第一状态,只具备显示功能,并不具备接收触控信号的功能。如果用户需要使用控制窗口,则用户必然会在透明显示的控制窗口内进行触控操作,如果用户在控制窗口外进行触控操作,则认为用户不需要使用控制窗口,则关闭透明显示的控制窗口。
更进一步地,如图6A所示,如果检测到用户位置和显示设备的距离L小于距离阈值,则按照一级透明度显示控制窗口的内容,如图6B所示,如果检测到用户进一步向显示设备移动,并检测到输入触控信号位于控制窗口所在区域内,则按照二级透明度显示控制窗口的内容,其中所述二级透明度低于所述一级透明度,容易理解的是,当用户行为满足的判断条件越多,说明此时判断结果越准确,则调整控制窗口的内容的显示的透明度越低,从而确定最终确定用户使用控制窗口时,如图6C所示,将控制窗口非透明的完全显示出来。
在一些实施例中,如果检测到用户位置和显示设备的距离小于距离阈值,则按照一级透明度显示控制窗口的内容,若用户继续向前移动,并输入触控信号,且此时触控信号的检测时间和用户位置的检测时间的差值小于时间阈值,触控信号的触控位置位于所述控制窗口所在区域内,则按照设置的二级透明度显示所述映射的内容。确定通过所述控制窗口对所述显示屏进行控制之后,按照非透明方式,在所述控制窗口显示所述映射的内容。通过不同透明度的显示方式,提醒用户并强化控制窗口的内容,从而提高用户的使用体验。
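A short Python sketch of the stepped transparency described above; the numeric opacity values and function name are assumptions, only the ordering (first level more transparent than second, fully opaque once confirmed) follows the text:

```python
def mapped_content_alpha(window_shown: bool,
                         touch_in_window: bool,
                         content_matches: bool) -> float:
    """Opacity (0.0 fully transparent .. 1.0 opaque) of the content mapped into
    the control window; the three steps mirror the first-level, second-level and
    non-transparent display described above, and the numeric values are assumptions."""
    if not window_shown:
        return 0.0
    if touch_in_window and content_matches:
        return 1.0   # confirmed: the window becomes the opaque control surface
    if touch_in_window:
        return 0.6   # second-level transparency (less transparent than level one)
    return 0.3       # first-level transparency, shown on proximity alone
```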
在一些示例中,本实施例还根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态,确定为所述控制窗口配置触控功能。当确定用户的触控信号的触控位置位于控制窗口所在区域内后,为了避免误判,本实施例还根据用户输入的具体的触控内容,进一步判断用户是否使用控制窗口进行触控操作,提高判断的准确性。
在一些示例中,通过将触控位置生成的触控内容与显示屏上显示内容进行匹配,确定此时通过控制窗口接收触控信号,并将该触控信号显示的内容同步显示在显示屏上,具体通过如下步骤结合触控位置和触控内容进行判断:
步骤2-1)若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与所述触控位置对应的显示位置;
步骤2-2)若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态。
需要说明的是,本实施例中触控位置对应的触控内容包括:以所述触控位置为中心,第一范围内用户输入的触控内容;显示位置对应的显示内容包括:以所述显示位置为中心,第二范围内所述显示屏上的显示内容。
实施中,在还未确定用户是否使用控制窗口时,当在控制窗口内接收到用户输入的触控内容时,仍会将该触控内容同步在显示屏上进行显示。并且进一步根据触控位置对应的触控内容和显示内容是否匹配,来判断是否为控 制窗口配置触控功能,为了判断的准确性,实施中触控位置对应的触控内容包括用户在预设区域内输入的所有的触控内容,即预设区域内显示的与该触控内容对应的显示内容,触控内容具体是指用户在控制窗口中输入的内容,显示内容具体是指在显示屏上显示的内容,该显示的内容包括根据用户在触控窗口中输入的内容进行放大后,在显示屏上进行显示的内容。例如,如图7A所示,如果用户在触控位置上输入“吗”,并且在显示位置显示“吗”,进一步,判断触控位置对应的触控内容“您好吗”与显示位置对应的显示内容“吗”不匹配,则如图7B所示,确定此时用户并不需要使用控制窗口,用户实际上是在显示屏上进行触控操作,关闭控制窗口以及显示位置上显示的“吗”;如图7C所示,如果用户在触控位置上输入“吗”,并且在显示位置显示“吗”,进一步,判断触控位置对应的触控内容“您好吗”与显示位置对应的显示内容“您好吗”匹配,则如图7D所示,确定此时用户需要使用控制窗口,为所述控制窗口配置触控功能,还可以将控制窗口非透明的进行显示。
需要说明的是,如果检测到用户输入的触控信号之前,显示屏上没有用户输入的触控内容,则直接根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,确定为所述控制窗口配置触控功能,此时不需要进行触控位置生成的触控内容与显示位置对应的显示内容的匹配,只需确定用户输入的触控信号的触控位置是否在控制窗口所在区域内。
在一些实施例中,通过如下方式确定所述触控内容与所述显示内容匹配:
根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配。需要说明的是,由于本实施例中的控制窗口中的内容是将显示屏的内容进行等比例缩放得到的,则在进行内容匹配时,能够提供更准确地匹配依据,可以进行尺寸的等比例匹配。例如显示屏内容和控制窗口中内容的比例为5:1,则将触控内容的尺寸放大5倍后,判断是否等于显示内容的尺寸。
可选的,本实施例中的数据类型包括但不限于:文字、图形、表格中的至少一种。
在一些实施例中,按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配。
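A small Python sketch of size matching after proportional scaling (using the 5:1 ratio from the example above); the tolerance and parameter names are assumptions added for illustration:

```python
def sizes_match(touch_size: tuple[float, float],
                display_size: tuple[float, float],
                scale: float = 5.0,
                tolerance: float = 0.15) -> bool:
    """Enlarge the touch content by the window-to-screen ratio (5:1 in the
    example above) and compare it with the size of the displayed content; the
    relative tolerance is an assumption added to absorb handwriting jitter."""
    for t, d in zip(touch_size, display_size):
        if d <= 0 or abs(t * scale - d) / d > tolerance:
            return False
    return True

# A 40 x 20 stroke in the window should correspond to roughly 200 x 100 on screen.
print(sizes_match((40.0, 20.0), (198.0, 103.0)))  # True
```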
在一些实施例中,本实施例除了提供触控内容与显示内容的匹配,为了更准确地判断用户的触控行为,还根据触控内容与触控位置周围显示屏上显示的展示内容是否匹配,进一步确定触控行为,具体实施如下:
步骤3-1)若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心,第三范围内所述显示屏上显示的展示内容;
需要说明的是,本实施例中触控位置的第一范围内生成的触控内容包括:以所述触控位置为中心的第一范围内用户输入的触控内容;展示内容包括:以所述触控位置为中心的第三范围内所述显示屏上的展示内容。展示内容表示显示屏上原有的显示的内容,即展示内容包括显示屏上原本显示的内容且恰好该展示内容位于控制窗口内,也就是说,展示内容是显示屏上在该控制窗口后面没有被遮挡的显示的内容。
步骤3-2)若所述触控内容与所述展示内容匹配,则关闭所述控制窗口以及为所述控制窗口配置的触控功能,并删除与所述触控位置对应的显示位置的所述显示内容。
实施中,如图8A所示,如果用户在触控位置上输入“吗”,并且在显示位置显示“吗”,进一步,判断触控位置对应的触控内容“您好吗”与展示内容“您好吗”匹配,则如图8B所示,确定此时用户并不需要使用控制窗口,用户实际上是在显示屏上进行触控操作,关闭控制窗口以及显示位置上显示的“吗”;如图8C所示,如果用户在触控位置上输入“吗”,并且在显示位置显示“吗”,进一步,判断触控位置对应的触控内容“您好吗”与显示位置对应的显示内容不匹配,此时,显示内容为显示屏在控制窗口内显示的内容,实际上是没有任何内容只有显示屏的背景,则如图8D所示,确定此时用户需要使用控制窗口,为所述控制窗口配置触控功能,还可以将控制窗口非透明的进行显示,其中包括将控制窗口的内容和/或背景按照非透明的方式显示。
在一些实施例中,按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹配。
本实施例通过用户在控制窗口内输入的触控内容与不同区域(包括显示区域和展示区域)的内容进行匹配,若用户在控制窗口输入的触控内容与显示区域的显示内容匹配,则说明此时用户需要使用控制窗口进行触控显示,则为所述控制窗口配置触控功能;若用户在控制窗口输入的触控内容与展示内容匹配,则说明此时用户只是在显示屏上进行触控操作,而触控操作恰好位于控制窗口内,因此删除触控位置对应的显示位置的所述显示内容,并关闭控制窗口。通过增加判断依据,进一步避免误判,提高判断的准确性,提供一种准确无感的交互体验。
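A compact Python sketch of the decision just described, distinguishing a match against the mapped display content from a match against the presented content behind the window; the action names returned are illustrative assumptions:

```python
def resolve_touch_intent(matches_mapped_display: bool,
                         matches_underlying_content: bool) -> str:
    """Interpret a touch that landed inside the control window region.

    matches_mapped_display: the strokes line up with the scaled-down copy of the
        screen shown inside the control window (display content).
    matches_underlying_content: the strokes line up with what the screen itself
        already shows behind the window region (presented content).
    The returned action names are illustrative."""
    if matches_mapped_display:
        return "enable_window_touch"            # the user wants the small control window
    if matches_underlying_content:
        return "close_window_and_delete_echo"   # the user was writing on the screen itself
    return "keep_observing"                     # not enough evidence yet
```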
本实施例还提供一种详细的智能触控的方法,核心思想是利用检测的用户位置,判断用户和显示设备的距离,在确定该距离小于距离阈值时,按照一级透明度在显示屏的该用户位置对应的位置上显示控制窗口,并且将显示屏上的内容等比例缩放后映射到控制窗口显示,若进一步检测到用户输入的触控信号,且触控信号的检测时间和用户位置的检测时间的差值小于时间阈值,则判断该触控信号的触控位置和控制窗口的位置关系,若触控位置位于控制窗口所在区域内,则进一步判断触控位置对应的触控内容和显示内容、展示内容的匹配关系,若触控内容和显示内容匹配,则说明此时用户需要使用控制窗口,则为控制窗口配置触控功能并进行非透明显示,若触控内容和展示内容匹配,则说明此时用户使用显示屏进行触控操作,则关闭控制窗口和控制窗口配置的触控功能,并删除与该触控位置对应的显示位置的显示内容。需要说明的是,如果在接收用户输入的触控信号之前,显示屏上没有用户输入的触控内容,则只需判断用户输入的触控信号的触控位置与所述控制窗口的位置关系,来确定是否为控制窗口配置触控功能并进行非透明显示,不需要进行触控内容的匹配。通过上述流程,对于用户操作显示屏的操作过 程而言,能够实现对控制窗口打开、关闭的无感操作,在用户操作显示屏的过程中,不需要多余的步骤对控制窗口进行打开关闭,能够根据用户的行为、触控位置、触控内容实现对控制窗口的打开、显示、配置触控功能、关闭等。为用户提供一种高效、便捷的使用体验。
如图9所示,上述方法的实施流程具体如下所示:
步骤900、开机启动后默认设置为显示屏接收触控信号并控制显示;
步骤901、检测到用户位置;
步骤902、判断检测的用户位置和显示设备的距离是否小于距离阈值,若是执行步骤903,否则执行步骤911;
步骤903、在显示屏上触发显示控制窗口,并按照一级透明度将显示屏上的内容等比例缩放后在控制窗口进行显示。
步骤904、检测到用户输入的触控信号;
步骤905、判断触控信号的检测时间和用户位置的检测时间的差值是否小于时间阈值,若是执行步骤906,否则执行步骤911;
步骤906、判断触控信号的触控位置是否位于控制窗口所在区域内,若是执行步骤907,否则执行步骤911;
步骤907、按照设置的二级透明度显示控制窗口中的内容,其中所述二级透明度低于所述一级透明度。
步骤908、判断显示屏上触控位置对应的触控内容,与显示位置对应的显示内容匹配还是与展示内容匹配,若与显示内容匹配则执行步骤909,否则执行步骤911。
其中,显示内容包括:以所述显示位置为中心的第二范围内所述显示屏上的显示内容。
展示内容包括:以所述触控位置为中心的第三范围内所述显示屏上的展示内容。
步骤909、确定为所述控制窗口配置触控功能,并非透明显示所述控制窗口中的内容;
步骤910、判断相邻两次检测的用户输入的触控信号的时间间隔是否大于预设时间,若是执行步骤911,否则执行步骤906。
步骤911、关闭所述控制窗口并删除与所述触控位置对应的显示位置的所述显示内容,返回显示屏接收触控信号并控制显示。
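A compact Python sketch of the decision flow of steps 900 to 911 above, reduced to a single touch event for clarity; the threshold values, field names and returned mode strings are illustrative assumptions:

```python
from dataclasses import dataclass

DIST_THRESHOLD = 1.0  # illustrative thresholds; the disclosure only requires that they exist
TIME_THRESHOLD = 5.0

@dataclass
class TouchEvent:
    t: float                 # detection time of the touch signal
    in_window: bool          # touch position lies inside the control window region
    matches_display: bool    # touch content matches the content mapped into the window
    matches_presented: bool  # touch content matches what the screen shows behind the window

def run_flow(user_distance: float, user_detect_time: float, touch: TouchEvent) -> str:
    """One pass through the decision flow, reduced to a single touch event."""
    # Steps 901-903: proximity opens the window at first-level transparency.
    if user_distance >= DIST_THRESHOLD:
        return "fullscreen_touch"            # step 911: stay in (or return to) screen mode
    # Steps 904-906: the touch must be recent and land inside the window.
    if touch.t - user_detect_time > TIME_THRESHOLD or not touch.in_window:
        return "fullscreen_touch"            # step 911: close the window
    # Step 907 would lower the transparency to the second level here.
    # Steps 908-909/911: content matching decides whether the window takes over.
    if touch.matches_display:
        return "window_touch"                # step 909: the window receives touch control
    if touch.matches_presented:
        return "fullscreen_touch"            # step 911: close the window, delete the echo
    return "window_pending"                  # keep observing further input

if __name__ == "__main__":
    e = TouchEvent(t=12.0, in_window=True, matches_display=True, matches_presented=False)
    print(run_flow(user_distance=0.7, user_detect_time=10.0, touch=e))  # window_touch
```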
基于相同的发明构思,本公开实施例还提供了一种智能触控的方法,由于该方法即是本公开实施例中的设备对应的方法,并且该方法解决问题的原理与该设备相似,因此该方法的实施可以参见设备的实施,重复之处不再赘述。
如图10所示,本实施例提供的一种智能触控的方法的实施流程如下所示:
步骤1000、根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
步骤1001、根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
步骤1002、确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
作为一种可选的实施方式,所述根据检测的用户位置,在显示设备的显示屏上触发显示控制窗口,包括:
根据检测的用户位置,确定所述用户和所述显示设备的距离;
若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
作为一种可选的实施方式,所述在所述显示设备的显示屏上触发显示控制窗口,包括:
根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
作为一种可选的实施方式,所述根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口处于第二状态,包括:
若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
作为一种可选的实施方式,根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口处于第二状态,包括:
根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态,包括:
若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与所述触控位置对应的显示位置,若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态;或,
若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心的第三范围内所述显示屏上显示的展示内容,若以所述触控位置为中心的范围内生成的触控内容,与所述展示内容匹配,则关闭所述控制窗口,并删除与所述触控位置对应的显示位置显示的内容。
作为一种可选的实施方式,通过如下方式确定内容匹配:
根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配;或,
根据所述触控内容和所述展示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,通过如下方式确定内容匹配:
按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据 所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配;或,
按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,所述根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,还包括:
将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
作为一种可选的实施方式,所述将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,还包括:
所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
作为一种可选的实施方式,所述控制所述控制窗口中映射的内容的透明度逐级降低,包括:
若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
作为一种可选的实施方式,所述在所述显示设备的显示屏上触发控制窗口的显示之后,还包括:
若超过预设时段未检测到触控信号,则关闭所述控制窗口;或,
若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
作为一种可选的实施方式,通过如下方式确定所述用户位置:
根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户 位置;或,
利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
作为一种可选的实施方式,还包括:
若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示,通过所述显示屏接收触控信号;
若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示。
基于相同的发明构思,本公开实施例还提供了一种智能触控的装置,由于该设备即是本公开实施例中的方法中的设备,并且该设备解决问题的原理与该方法相似,因此该设备的实施可以参见方法的实施,重复之处不再赘述。
如图11所示,该装置包括:
显示单元1100,用于根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
转换单元1101,用于根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
控制单元1102、用于确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
作为一种可选的实施方式,所述显示单元1100具体用于:
根据检测的用户位置,确定所述用户和所述显示设备的距离;
若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
作为一种可选的实施方式,所述显示单元1100具体用于:
根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显 示的位置。
作为一种可选的实施方式,所述转换单元1101具体用于:
若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述转换单元1101具体用于:
根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
作为一种可选的实施方式,所述转换单元1101具体用于:
若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与所述触控位置对应的显示位置,若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态;或,
若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心的第三范围内所述显示屏上显示的展示内容,若以所述触控位置为中心的范围内生成的触控内容,与所述展示内容匹配,则关闭所述控制窗口,并删除与所述触控位置对应的显示位置显示的内容。
作为一种可选的实施方式,所述转换单元1101具体用于:
根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配;或,
根据所述触控内容和所述展示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,所述转换单元1101具体用于:
按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配;或,
按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据 所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹配。
作为一种可选的实施方式,所述显示单元具体还用于:
将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
作为一种可选的实施方式,所述显示单元1100具体还用于:
所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
作为一种可选的实施方式,所述显示单元1100具体还用于:
若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
作为一种可选的实施方式,所述在所述显示设备的显示屏上触发控制窗口的显示之后,还包括关闭单元具体用于:
若超过预设时段未检测到触控信号,则关闭所述控制窗口;或,
若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
作为一种可选的实施方式,所述显示单元1100具体用于通过如下方式确定所述用户位置:
根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户位置;或,
利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
作为一种可选的实施方式,还包括判断单元具体用于:
若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示, 通过所述显示屏接收触控信号;
若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示。
基于相同的发明构思,本公开实施例还提供了一种计算机存储介质,其上存储有计算机程序,该程序被处理器执行时实现如下步骤:
根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
本领域内的技术人员应明白,本公开的实施例可提供为方法、系统、或计算机程序产品。因此,本公开可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本公开可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本公开是参照根据本公开实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设 备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
尽管已描述了本公开的优选实施例,但本领域内的技术人员一旦得知了基本创造性概念,则可对这些实施例作出另外的变更和修改。所以,所附权利要求意欲解释为包括优选实施例以及落入本公开范围的所有变更和修改。
显然,本领域的技术人员可以对本公开实施例进行各种改动和变型而不脱离本公开实施例的精神和范围。这样,倘若本公开实施例的这些修改和变型属于本公开权利要求及其等同技术的范围之内,则本公开也意图包含这些改动和变型在内。

Claims (22)

  1. 一种显示设备,其中,包括显示屏、触控组件、控制器,其中:
    所述显示屏用于进行内容的显示;
    所述触控组件用于接收用户输入的触控信号;
    所述控制器被配置为执行:
    根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
    根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
    确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
  2. 根据权利要求1所述的显示设备,其中,所述处理器具体被配置为执行:
    根据检测的用户位置,确定所述用户和所述显示设备的距离;
    若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
  3. 根据权利要求1所述的显示设备,其中,所述处理器具体被配置为执行:
    根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
    根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
  4. 根据权利要求1所述的显示设备,其中,所述处理器具体被配置为执行:
    若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
  5. 根据权利要求1所述的显示设备,其中,所述处理器具体被配置为执行:
    根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及,以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
  6. 根据权利要求5所述的显示设备,其中,所述处理器具体被配置为执行:
    若所述触控位置位于所述控制窗口所在区域内,则确定所述显示屏上与所述触控位置对应的显示位置,若所述触控内容与所述显示屏上,以所述显示位置为中心的第二范围内显示的显示内容匹配,则控制所述控制窗口处于第二状态;或,
    若所述触控位置位于所述控制窗口所在区域内,则获取以所述触控位置为中心的第三范围内所述显示屏上显示的展示内容,若以所述触控位置为中心的范围内生成的触控内容,与所述展示内容匹配,则关闭所述控制窗口,并删除与所述触控位置对应的显示位置显示的内容。
  7. 根据权利要求6所述的显示设备,其中,所述处理器具体被配置为:
    根据所述触控内容和所述显示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述显示内容匹配;或,
    根据所述触控内容和所述展示内容的尺寸、数据类型中的至少一种,确定所述触控内容与所述展示内容匹配。
  8. 根据权利要求6所述的显示设备,其中,所述处理器具体被配置为执行:
    按预设规则放大所述触控内容或按预设规则缩小所述显示内容后,根据所述触控内容和所述显示内容的尺寸,确定所述触控内容与所述显示内容匹配;或,
    按预设规则放大所述触控内容或按预设规则缩小所述展示内容后,根据所述触控内容和所述展示内容的尺寸,确定所述触控内容与所述展示内容匹 配。
  9. 根据权利要求1~8任一所述的显示设备,其中,所述根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,所述处理器具体还被配置为执行:
    将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
  10. 根据权利要求9所述的显示设备,其中,所述处理器还被配置为执行:
    所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
  11. 根据权利要求10所述的显示设备,其中,所述处理器具体被配置为执行:
    若根据检测的用户位置,将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,则按照一级透明度显示所述映射的内容;或,
    若所述触控位置位于所述控制窗口所在区域内,则按照二级透明度显示所述映射的内容,其中所述二级透明度低于所述一级透明度;或,
    若所述触控位置位于所述控制窗口所在区域内,且所述触控内容与所述显示内容匹配,则按照非透明方式显示所述映射的内容。
  12. 根据权利要求1~8、10-11任一所述的显示设备,其中,所述在所述显示设备的显示屏上触发控制窗口的显示之后,所述处理器具体还被配置为执行:
    若超过预设时段未检测到触控信号,则关闭所述控制窗口;或,
    若所述触控信号的检测时间,与所述用户位置的检测时间的差值大于时间阈值,则关闭所述控制窗口。
  13. 根据权利要求1~8、10-11任一所述的显示设备,其中,所述处理器具体被配置为通过如下方式确定所述用户位置:
    根据所述显示设备拍摄的用户的深度图像中的深度信息,确定所述用户位置;或,
    利用所述显示设备的数字雷达阵列进行扫描,根据扫描结果确定所述用户位置。
  14. 根据权利要求1~8、10-11任一所述的显示设备,其中,所述处理器具体还被配置为:
    若所述控制窗口处于第一状态,则通过所述控制窗口进行内容的显示,通过所述显示屏接收触控信号;
    若所述控制窗口处于第二状态,则通过所述控制窗口接收触控信号,通过所述控制窗口以外的区域进行内容的显示。
  15. 一种智能触控的方法,其中,该方法包括:
    根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,并控制所述控制窗口处于第一状态,所述第一状态表征根据接收的用户输入的触控信号判断是否向第二状态转换的过渡状态;
    根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口由第一状态向第二状态转换;
    确定所述控制窗口处于第二状态,通过所述控制窗口对所述显示屏进行控制。
  16. 根据权利要求15所述的方法,其中,所述根据检测的用户位置,在所述显示设备的显示屏上触发控制窗口的显示,包括:
    根据检测的用户位置,确定所述用户和所述显示设备的距离;
    若所述距离小于距离阈值,则在显示设备的显示屏上触发控制窗口的显示。
  17. 根据权利要求15所述的方法,其中,通过如下方式确定所述控制窗口的位置:
    根据所述用户位置的横坐标,确定所述控制窗口中心的横坐标;
    根据所述控制窗口中心的横坐标,确定所述控制窗口在所述显示屏上显示的位置。
  18. 根据权利要求15所述的方法,其中,所述根据检测的用户输入的触 控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口处于第二状态,包括:
    若所述触控位置位于所述控制窗口所在区域内,则控制所述控制窗口处于第二状态。
  19. 根据权利要求15所述的方法,其中,所述根据检测的用户输入的触控信号的触控位置与所述控制窗口的位置关系,控制所述控制窗口处于第二状态,包括:
    根据检测的用户操作的触控位置与所述控制窗口的位置关系,以及,以所述触控位置为中心的第一范围内生成的触控内容,控制所述控制窗口处于第二状态。
  20. 根据权利要求15~19任一所述的方法，其中，所述根据检测的用户位置，在所述显示设备的显示屏上触发控制窗口的显示，还包括：
    将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示。
  21. 根据权利要求20所述的方法,其中,所述将所述显示屏上的至少部分内容缩放后映射到所述控制窗口进行显示,还包括:
    所述控制窗口从所述第一状态转换到所述第二状态的过程中,控制所述控制窗口中映射的内容的透明度逐级降低。
  22. 一种计算机存储介质,其上存储有计算机程序,其中,该程序被处理器执行时实现如权利要求15~21任一所述方法的步骤。
PCT/CN2022/108148 2021-08-31 2022-07-27 一种显示设备及其智能触控的方法 WO2023029822A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111016337.8 2021-08-31
CN202111016337.8A CN113703640A (zh) 2021-08-31 2021-08-31 一种显示设备及其智能触控的方法

Publications (1)

Publication Number Publication Date
WO2023029822A1 true WO2023029822A1 (zh) 2023-03-09

Family

ID=78658329

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/108148 WO2023029822A1 (zh) 2021-08-31 2022-07-27 一种显示设备及其智能触控的方法

Country Status (2)

Country Link
CN (1) CN113703640A (zh)
WO (1) WO2023029822A1 (zh)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5606281B2 (ja) * 2010-11-08 2014-10-15 シャープ株式会社 表示装置
CN111722781A (zh) * 2020-06-22 2020-09-29 京东方科技集团股份有限公司 智能交互方法及设备、存储介质
CN111913621B (zh) * 2020-07-29 2022-04-19 海信视像科技股份有限公司 屏幕界面交互显示方法及显示设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130293490A1 (en) * 2012-02-03 2013-11-07 Eldon Technology Limited Display zoom controlled by proximity detection
CN111897463A (zh) * 2020-07-29 2020-11-06 海信视像科技股份有限公司 屏幕界面交互显示方法及显示设备
CN111913622A (zh) * 2020-07-29 2020-11-10 海信视像科技股份有限公司 屏幕界面交互显示方法及显示设备
CN112346639A (zh) * 2020-11-04 2021-02-09 北京小米移动软件有限公司 一种显示应用界面的方法、装置、设备及存储介质
CN112923653A (zh) * 2021-03-01 2021-06-08 合肥美菱物联科技有限公司 一种基于位置和距离分析的冰箱智能控制系统及方法
CN113703640A (zh) * 2021-08-31 2021-11-26 京东方科技集团股份有限公司 一种显示设备及其智能触控的方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113703640A (zh) * 2021-08-31 2021-11-26 京东方科技集团股份有限公司 一种显示设备及其智能触控的方法

Also Published As

Publication number Publication date
CN113703640A (zh) 2021-11-26

Similar Documents

Publication Publication Date Title
JP7035233B2 (ja) ユーザインタフェース間をナビゲートし、制御オブジェクトと対話するためのデバイス、方法及びグラフィカルユーザインタフェース
JP6970265B2 (ja) アフォーダンスを背景に表示するためのデバイス、方法、及びグラフィカルユーザインターフェース
US11567644B2 (en) Cursor integration with a touch screen user interface
US10282081B2 (en) Input and output method in touch screen terminal and apparatus therefor
US9086800B2 (en) Apparatus and method for controlling screen displays in touch screen terminal
JP4790847B2 (ja) タッチ・スクリーン操作インターフェース
EP2529292B1 (en) Device, method, and graphical user interface for resizing objects
US9678659B2 (en) Text entry for a touch screen
EP3693843B1 (en) Method and apparatus for designating entire area using partial area touch in a portable equipment
US20140173498A1 (en) Multiple screen mode in mobile terminal
JP2015167018A (ja) タッチスクリーンディスプレイを有する多機能デバイスでの編集の方法およびグラフィカルユーザインターフェース
KR101929316B1 (ko) 터치스크린을 구비한 단말기에서 키패드 표시 방법 및 장치
US20180165007A1 (en) Touchscreen keyboard
CN109753215B (zh) 一种窗口分屏显示方法、装置及设备
JP7337975B2 (ja) ユーザインタフェース間でのナビゲーション、ドックの表示、及びシステムユーザインタフェース要素の表示のためのデバイス、方法、及びグラフィカルユーザインタフェース
EP3961364A1 (en) Page operation method and apparatus, and terminal and storage medium
WO2023029822A1 (zh) 一种显示设备及其智能触控的方法
US20140195935A1 (en) Information processing device, information processing method, and information processing program
EP3977250A1 (en) User interface for managing input techniques
KR20160041898A (ko) 규정된 크로스 컨트롤 거동을 이용한 저감된 제어 응답 레이턴시
US10019127B2 (en) Remote display area including input lenses each depicting a region of a graphical user interface
JP6979817B2 (ja) グラフィック・ユーザー・インターフェースのコンポーネントのビューポート配列
WO2015167531A2 (en) Cursor grip
JP2015200977A (ja) 情報処理装置、コンピュータプログラムおよび記録媒体
WO2022059386A1 (ja) 操作領域を移動する情報処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22862963

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.06.2024)