WO2021004413A1 - Handheld input device and blanking control method and apparatus for an indicator icon of a handheld input device - Google Patents

Handheld input device and blanking control method and apparatus for an indicator icon of a handheld input device

Info

Publication number
WO2021004413A1
Authority
WO
WIPO (PCT)
Prior art keywords
input device
handheld input
pointing
screen
image
Prior art date
Application number
PCT/CN2020/100299
Other languages
English (en)
Chinese (zh)
Inventor
杨超峰
Original Assignee
深圳市格上格创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市格上格创新科技有限公司
Publication of WO2021004413A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • This application belongs to the field of handheld input devices, and in particular relates to a handheld input device and a blanking control method and apparatus for its indicator icon.
  • The handheld input device is operated by the operator; by controlling its pointing in three-dimensional space, an indicator icon can be moved on the graphical user interface (GUI) of the display screen.
  • A specific product in the category of handheld input devices is the air mouse; its indicator icon is the mouse cursor.
  • The air mouse does not require a support plane, and the operator can hold it to control the cursor's movement on the graphical user interface.
  • Another application of handheld input devices is the digital wireless presenter.
  • The wireless presenter is an important piece of equipment in the conference room, used by the lecturer to highlight key content when explaining PPT slides and other materials.
  • The main category of wireless presenters is the laser presenter, also known as the laser pointer.
  • Meanwhile, the price of large LCD/LED screens has become lower and lower.
  • The indicator icon controlled by the handheld input device is designed to be more eye-catching than the traditional mouse cursor; for example, it can be designed as a bright red round spot to achieve an effect similar to a laser spot, so that the key content of the speech is highlighted.
  • the indicator icon can be fixed on a position on the screen.
  • The hand-held input device is operated by the operator, and once activated, the indicator icon moves with the operator's every movement. Therefore, the hand-held input device needs a blanking control method for the indicator icon.
  • Existing blanking methods for indicator icons are usually as follows. 1. When the moving speed of the indicator icon stays below a speed threshold for a period of time, the indicator icon is blanked. However, the indicator icon must remain displayed while the handheld input device performs certain operations (such as indicating key content), during which the indicator icon is basically static, so the time threshold for triggering blanking has to be set very long, such as 5 seconds. This method is not suitable for application scenarios that require quick blanking of the indicator icon: for example, when giving a speech with a digital wireless presenter, the speaker needs to activate the indicator icon quickly to indicate key content, and after explaining a passage of key content often needs to blank it quickly.
  • 2. A key is set on the handheld input device; the indicator icon is activated when the key is pressed and blanked when the key is released. The problem with this method is that the operator must hold the key down for a long time, which easily causes finger fatigue; moreover, because a finger is holding the activation key, other key functions cannot be triggered while the indicator icon is being moved.
  • 3. A key is set on the handheld input device; clicking the key activates the indicator icon, and clicking it again blanks it. The problem with this method is that the operator must actively click the key to blank the indicator icon, which is cumbersome. In particular, when the handheld input device is used as a digital wireless presenter, the speaker may forget to blank the indicator icon and turn his back to the screen to explain; the indicator icon then shakes on the screen with the speaker's gestures, which affects the speech.
  • The embodiments of the present application provide a handheld input device and a blanking control method and apparatus for its indicator icon, so as to solve the problems of prior-art blanking control methods, which require pressing a key for a long time or actively clicking a key to control the indicator icon.
  • A first aspect of the embodiments of the present application provides a blanking control method for the indicator icon of a handheld input device, including:
  • establishing a mapping model, the mapping model being used to map the pointing of the handheld input device to the display position of the indicator icon, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range, and when the pointing of the handheld input device changes along any trajectory within the first pointing range, the display position of the indicator icon stays within a preset screen edge range;
  • determining the display position of the indicator icon according to the mapping model; and
  • when the indicator icon moves into the screen edge range, detecting whether the duration for which the display position of the indicator icon stays within the screen edge range is greater than a preset first threshold, and if the duration is greater than the first threshold, blanking the indicator icon.
  • the parameters of the mapping model include a first orientation, a mapping origin, and a mapping coefficient.
  • The first orientation is the orientation of the handheld input device when the mapping model is established, and the orientation of the handheld input device includes the yaw angle and pitch angle of the handheld input device;
  • the mapping origin is a preset screen position
  • the mapping coefficient is a preset value or determined according to the distance between the handheld input device and the screen.
  • the step of establishing the mapping model includes:
  • Collecting the first image in the direction pointed by the handheld input device; extracting the image features of the first image, matching the image features of the first image with the prefetched screen features, and identifying the screen imaging area in the first image.
  • The parameters of the mapping model are determined according to the screen imaging area, wherein the first orientation is the orientation of the handheld input device when the first image is collected, the orientation of the handheld input device includes its yaw angle and pitch angle, the mapping origin is determined according to the position of the screen imaging area in the first image, and the mapping coefficient is determined according to the screen imaging area and the field of view of the camera module that collects the first image.
  • the step of establishing a mapping model further includes:
  • Collecting the second image displayed on the screen and extracting the image features of the second image, where the image features of the second image are the prefetched screen features.
  • The step of determining the display position of the indicator icon according to the mapping model includes: determining the current pointing of the handheld input device; determining, according to the angular offset of the current pointing relative to the first orientation and the mapping coefficient, the position offset of the pointing position corresponding to the current pointing relative to the pointing position corresponding to the first orientation; determining the pointing position corresponding to the current pointing according to the position offset and the mapping origin; and determining the display position of the indicator icon according to that pointing position.
  • a second aspect of the embodiments of the present application provides an apparatus for controlling blanking of an indicator icon of a handheld input device.
  • the apparatus for controlling blanking of an indicator icon of the handheld input device includes:
  • The mapping model establishment unit is configured to establish a mapping model for mapping the pointing of the handheld input device to the display position of the indicator icon, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range, and when the pointing of the handheld input device changes along any trajectory within the first pointing range, the display position of the indicator icon stays within a preset screen edge range;
  • a display position determining unit configured to determine the display position of the indicator icon according to the mapping model
  • The blanking control unit is configured to detect, when the indicator icon moves into the screen edge range, whether the duration for which the display position of the indicator icon stays within the screen edge range is greater than a preset first threshold, and to blank the indicator icon if the duration is greater than the first threshold.
  • the parameters of the mapping model include a first orientation, a mapping origin, and a mapping coefficient.
  • the mapping model establishment unit includes:
  • The first orientation setting subunit is configured to set the first orientation to the orientation of the handheld input device when the mapping model is established, and the orientation of the handheld input device includes the yaw angle and pitch angle of the handheld input device;
  • the mapping origin setting subunit is used to set the mapping origin to a preset screen position
  • the mapping coefficient setting subunit is used to set the mapping coefficient to a preset value or determine the mapping coefficient according to the distance between the handheld input device and the screen.
  • the mapping model establishment unit includes:
  • The screen imaging area recognition subunit is used to collect the first image in the direction pointed by the handheld input device, extract the image features of the first image, match the image features of the first image with the prefetched screen features, and identify the screen imaging area in the first image.
  • The parameter determination subunit is configured to determine the parameters of the mapping model according to the screen imaging area, wherein the first orientation is the orientation of the handheld input device when the first image is collected, the orientation of the handheld input device includes its yaw angle and pitch angle, the mapping origin is determined according to the position of the screen imaging area in the first image, and the mapping coefficient is determined according to the screen imaging area and the field of view of the camera module that collects the first image.
  • mapping model establishment unit further includes:
  • the second image collection subunit is used to collect the second image displayed on the screen
  • the screen feature extraction subunit is used to extract the image feature of the second image, wherein the image feature of the second image is the prefetched screen feature.
  • the display position determining unit includes:
  • the current pointing determination subunit is used to determine the current pointing of the handheld input device
  • a position offset determination subunit configured to determine, according to the angular offset of the current pointing relative to the first orientation and the mapping coefficient, the position offset of the pointing position corresponding to the current pointing relative to the pointing position corresponding to the first orientation;
  • a pointing position determining subunit configured to determine a pointing position corresponding to the current pointing of the handheld input device according to the position offset and the mapping origin;
  • The display position determination subunit is configured to determine the display position of the indicator icon according to the pointing position corresponding to the current pointing, wherein, if the current pointing of the handheld input device is within the first pointing range, the display position is within the screen edge range.
  • A third aspect of the embodiments of the present application provides a handheld input device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the blanking control method for the indicator icon of a handheld input device according to any one of the first aspect.
  • A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the blanking control method for the indicator icon of a handheld input device according to any one of the first aspect.
  • In the mapping model, the pointing range of the handheld input device includes the first pointing range; for any pointing within the first pointing range, the display position of the indicator icon is within the screen edge range.
  • the first pointing range usually includes the pointing of the handheld input device when the speaker's hands are hanging down or when he turns his back to the screen.
  • The speaker does not need to hold a key down to keep the indicator icon displayed; a single click of the activation key triggers the display.
  • The indicator icon can then be blanked by dropping the hand, or by turning around to face the audience.
  • The blanking control is simple and natural, the indicator icon does not shake randomly due to misoperation, and no long key press is needed; this conforms to the operating habits of handheld input devices and helps improve the user experience.
  • FIG. 1 is a schematic diagram of the implementation process of a method for controlling blanking of an indicator icon of a handheld input device according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a pointing control indicator icon through a handheld input device provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of the implementation process of a method for establishing a mapping model provided by an embodiment of the present application
  • FIG. 4 is a schematic diagram of a first image collected by a handheld input device according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an implementation process of determining the display position of an indicator icon provided by an embodiment of the present application
  • FIG. 6 is a schematic structural diagram of a blanking control apparatus for an indicator icon of a handheld input device according to an embodiment of the application;
  • Fig. 7 is a schematic diagram of a handheld input device provided by an embodiment of the present application.
  • FIG. 1 is a schematic diagram of the implementation process of a blanking control method for an indicator icon of a handheld input device provided by an embodiment of the application, and the details are as follows:
  • In step S101, a mapping model is established, and the mapping model is used to map the pointing of the handheld input device to the display position of the indicator icon.
  • the handheld input device in this application can be an air mouse, a digital wireless presenter, or other types of input devices.
  • the handheld input device is provided with buttons, and the operator can activate the indicator icon or trigger other operations through button commands.
  • The handheld input device is operated by the operator; during use, the operator controls an indicator icon to move on the screen of the display device by controlling the movement of the handheld input device in three-dimensional space, wherein the indicator icon is a graphic object on the computer graphical user interface and can be a pointer-shaped icon, such as a mouse cursor.
  • the corresponding operation can be triggered by the function keys on the handheld input device.
  • the display device can be any display device such as a projector, an LCD/LED display, etc.
  • The blanking control method described in this application can be triggered to run when the operator activates the indicator icon or when the indicator icon is in the blanked state; for example, it can be triggered to run immediately after the handheld input device is powered on, or when the pitch angle of the handheld input device is greater than a preset angle.
  • FIG. 2 shows an application scenario of a handheld input device.
  • the handheld input device is held by an operator.
  • the gray area in the figure is a screen 201, which includes but is not limited to an electronic wall screen, a display screen, or a projection screen.
  • the handheld input device is pointed at the screen 201 for operation.
  • the movement of the handheld input device in the three-dimensional space can be represented by the change of the pointing D of the handheld input device.
  • A mapping model is established, and the mapping model is used to map the pointing of the handheld input device to the display position of the indicator icon.
  • In this embodiment, the pointing D of the handheld input device is first mapped to a pointing position on the screen reference surface 202, and the display position of the indicator icon is determined according to that pointing position.
  • In this way, the operator can control the indicator icon to move on the screen 201 by manipulating the pointing D of the handheld input device.
  • The screen reference surface 202 may be determined according to the surface of the screen. If the screen surface of the display device is flat, preferably, as shown in FIG. 2, the screen reference surface 202 is set to the plane where the screen lies, and the dark gray area is the screen 201. If the screen surface is curved, a screen projection surface can be used to approximate the screen 201, with the screen reference surface 202 being the plane of that projection surface; alternatively, the screen reference surface 202 can itself be a curved surface. It should be noted that the screen 201 shown in FIG. 2 is rectangular; in fact, the screen 201 can be any other shape, or can be composed of several separate sub-screens. For a special-shaped screen, the principle of the mapping model is the same as for a rectangular screen. Without loss of generality, this application uses a rectangular screen as an example to discuss the process of establishing the mapping model.
  • The pointing position on the screen reference surface mapped from the pointing D of the handheld input device is a different concept from the position the handheld input device actually points to; the mapped pointing position may coincide with the actual pointing position or deviate from it. For example, as shown in FIG. 2, if the pointing position mapped from the pointing D is the intersection P of the pointing line 204 of the handheld input device with the screen reference surface 202, then the mapped pointing position coincides with the actual pointing position; if the mapped pointing position is P1, it deviates from the actual pointing position.
  • In summary, the pointing of the handheld input device is first mapped to a pointing position on the screen reference surface 202, and then the display position of the indicator icon is determined according to that pointing position.
  • In the mapping model, the pointing range of the hand-held input device includes a first pointing range; when the pointing of the handheld input device changes along any trajectory within the first pointing range, the display position of the indicator icon stays within the preset screen edge range.
  • There are multiple implementations of the mapping model.
  • the parameters of the mapping model in this embodiment include the first orientation, the mapping origin, and the mapping coefficient.
  • the method for determining the parameters of the mapping model includes:
  • Step S1: The first orientation is the orientation of the handheld input device when the mapping model is established, and the orientation of the handheld input device includes the yaw angle and the pitch angle of the handheld input device;
  • the yaw angle and pitch angle of the handheld input device are used to model the pointing of the handheld input device.
  • the RY axis is the pitch axis
  • the RZ axis is the yaw axis.
  • An inertial motion sensor is provided in the handheld input device, including an accelerometer and a gyroscope, and the inertial motion sensor collects acceleration data and gyroscope data of the handheld input device.
  • From the acceleration data and gyroscope data, the pitch angle and yaw angle of the handheld input device can be calculated, that is, the orientation of the handheld input device in the mapping model of this embodiment. Calculating the attitude angles of a handheld input device from acceleration data and gyroscope data is a known method, such as the extended Kalman filter algorithm, and is not repeated in this application.
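  • As an illustration only (not taken from this application), the sketch below shows one simple way to derive pitch and yaw from accelerometer and gyroscope samples with a complementary filter; the application itself refers to known algorithms such as the extended Kalman filter, and all names in the sketch are hypothetical.

    import math

    def update_attitude(pitch, yaw, accel, gyro, dt, alpha=0.98):
        # One complementary-filter step: a simplified stand-in for the
        # extended Kalman filter mentioned above (illustrative only).
        # accel = (ax, ay, az) in g units; gyro = (gx, gy, gz) in deg/s,
        # gy being the pitch rate (RY axis) and gz the yaw rate (RZ axis).
        ax, ay, az = accel
        gx, gy, gz = gyro            # gx (roll rate) is unused in this 2-angle model
        pitch_gyro = pitch + gy * dt           # propagate pitch with the gyro
        yaw = (yaw + gz * dt) % 360.0          # yaw has no gravity reference
        # Gravity gives an absolute, drift-free pitch observation.
        pitch_accel = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        # Trust the gyro short-term and the accelerometer long-term;
        # pitch is kept in (-180, 180] as given by atan2.
        pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
        return pitch, yaw

  • Note that without an absolute reference the yaw angle drifts over time; a full attitude filter such as the extended Kalman filter handles this more robustly.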
  • the pitch angle and the yaw angle of the handheld input device when the mapping model is established are determined to obtain the first orientation.
  • Step S2: The mapping origin is set to a preset screen position.
  • the physical meaning of the mapping origin is the pointing position of the screen reference surface mapped by the first pointing.
  • The screen reference surface 202 is a plane, and a Cartesian coordinate system is set on it. Without loss of generality, the coordinate system takes the top-left vertex O of the screen 201 as the origin, the horizontal direction of the screen as the X axis, and the vertical direction of the screen as the Y axis.
  • Any point on the screen reference surface 202 can be set as the mapping origin.
  • the mapping origin is set to a preset screen position, such as the center position of the screen 201 or the position where the indicator icon was blanked last time.
  • Step S3: The mapping coefficient is set to a preset value or determined according to the distance between the handheld input device and the screen.
  • The physical meaning of the mapping coefficient is the number of X-axis image points of the screen reference surface corresponding to each degree of the yaw angle of the handheld input device, and the number of Y-axis image points of the screen reference surface corresponding to each degree of the pitch angle of the handheld input device.
  • In this implementation, the mapping origin is a preset screen position, which is usually not the intersection P of the pointing line 204 of the handheld input device with the screen reference surface 202; that is, the pointing position mapped from the first orientation is inconsistent with the actual pointing position of the handheld input device. Consequently, the pointing positions mapped from subsequent pointings of the handheld input device are usually also inconsistent with its actual pointing positions, which means that the display position of the indicator icon determined from the pointing position is inconsistent with the position the handheld input device actually points to.
  • the parameters of the mapping model are determined based on image analysis technology to make the display position of the indicator icon consistent with the position actually pointed by the handheld input device.
  • the steps are shown in Fig. 3 and include:
  • In step S301, the first image in the direction pointed by the handheld input device is collected, the image features of the first image are extracted, the image features of the first image are matched with the prefetched screen features, and the screen imaging area in the first image is identified.
  • a camera module 203 is provided in the handheld input device, and the camera module 203 captures an image pointed by the handheld input device, and this image is the first image 401 shown in FIG. 4.
  • the installation method of the camera module 203 can be as follows: the pointing line 204 of the handheld input device is perpendicular to the photosensitive sensor array plane of the camera module 203 and passes through the center point thereof.
  • the imaging point of the intersection point P of the pointing line 204 of the handheld input device and the screen reference surface 202 in the first image 401 is the center point IP of the first image 401, which facilitates the calculation of the parameters of the mapping model.
  • The identified area is the screen imaging area 402 in the first image 401.
  • The prefetched screen features are the image features used to identify the screen 201.
  • The analysis and matching algorithms for image features are known technologies and are not repeated here. It should be noted that the actual imaging area of the screen 201 in the first image is usually deformed; it can be corrected according to the imaging principle of the camera to obtain a regular screen imaging area 402 as shown in FIG. 4.
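  • As one plausible realization of this feature extraction, matching, and deformation correction (an assumption for illustration, not necessarily the algorithm of this application), the following OpenCV sketch matches ORB features of a prefetched screen image against the first image and recovers the screen's corner positions through a RANSAC homography:

    import cv2
    import numpy as np

    def find_screen_imaging_area(first_image, screen_feature_image):
        # Detect and match ORB features between the prefetched screen
        # image and the first image captured by the camera module.
        orb = cv2.ORB_create(nfeatures=1000)
        kp1, des1 = orb.detectAndCompute(screen_feature_image, None)
        kp2, des2 = orb.detectAndCompute(first_image, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        # The homography models the perspective deformation of the screen
        # in the first image; correcting the deformation amounts to inverting it.
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        h, w = screen_feature_image.shape[:2]
        corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
        # Corner positions of the screen imaging area in the first image.
        return cv2.perspectiveTransform(corners, H)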
  • Method 1: A hardware identification device is set up on the display device; for example, multiple infrared LEDs are arranged around the screen 201, and the number of LEDs and the LED spacing are prefetched as screen features. The bright spots in the first image 401 are extracted and matched against the LED count and spacing features, from which the screen imaging area 402 can be determined.
  • Method 2: When the first image 401 is collected, one or more special graphic objects are displayed at specific positions on the screen 201 (such as the center of the screen), and the image features of these graphic objects are extracted in advance as screen features.
  • Method 3: At the same time as, or before, the first image is collected, the second image displayed on the screen 201 is collected, and the image features of the second image are extracted as the screen features.
  • One implementation for capturing the second image uses an independent hardware device inserted into the image data transmission channel between the smart device and the display device, such as an HDMI cable; the device intercepts the second image output by the smart device.
  • Another implementation for capturing the second image is the software screenshot mode: software installed on the smart device that outputs the second image intercepts the second image the smart device outputs to the display device.
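  • A minimal sketch of the software screenshot mode, assuming the cross-platform mss package (the library choice is ours; the application does not name one):

    import mss
    import numpy as np

    def capture_second_image(monitor_index=1):
        # Intercept the image the smart device outputs to the display
        # device by grabbing the contents of that display.
        with mss.mss() as sct:
            shot = sct.grab(sct.monitors[monitor_index])
            return np.asarray(shot)   # BGRA pixels, ready for feature extraction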
  • The parameters of the mapping model of this embodiment are determined according to the screen imaging area 402, so that the display position of the indicator icon is consistent with the position the handheld input device actually points to.
  • In step S302, the first orientation is determined, where the first orientation is the orientation of the handheld input device when the first image is collected, and the orientation of the handheld input device includes its yaw angle and pitch angle.
  • The orientation of the handheld input device here is exactly the same concept as in step S1; the orientation at the moment the first image is collected is determined from the acceleration data and gyroscope data of the handheld input device to obtain the first orientation.
  • In step S303, the mapping origin is determined according to the position of the screen imaging area in the first image.
  • the top left vertex IO of the screen imaging area 402 is the imaging point of the top left vertex O of the screen 201
  • the point IP is the imaging point at the position P pointed by the handheld input device.
  • the point IP is located at the center point of the first image 401.
  • The screen reference surface 202 is a plane, and a Cartesian coordinate system is set on it; the coordinate system takes the top-left vertex O of the screen 201 as the origin, the horizontal direction of the screen as the X axis, and the vertical direction of the screen as the Y axis.
  • From this, the coordinate offset of the position P actually pointed to by the handheld input device relative to the top-left vertex O of the screen 201 can be calculated, giving the coordinates of the position P on the screen reference surface 202, which are the mapping origin (VX0, VY0), calculated as follows:
  • VX0 = (CX / SW) * MaxX
  • VY0 = (CY / SH) * MaxY    (1)
  • where CX and CY are the numbers of horizontal and vertical image points of the point IP relative to the point IO; SW and SH are the numbers of horizontal and vertical image points of the screen imaging area 402, respectively; and MaxX and MaxY are the numbers of horizontal and vertical image points of the screen 201, respectively. For example, if the screen resolution of the screen 201 is 1920x1080, then MaxX is 1920 and MaxY is 1080.
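  • Formula (1) translates directly into code; a sketch with variable names of our choosing:

    def mapping_origin(cx, cy, sw, sh, max_x, max_y):
        # Formula (1): scale the image-space offset of point IP relative
        # to point IO into screen coordinates, giving (VX0, VY0).
        vx0 = (cx / sw) * max_x
        vy0 = (cy / sh) * max_y
        return vx0, vy0

    # Example: IP lies 480 image points right of IO, the screen imaging
    # area is 960x540 image points, and the screen is 1920x1080:
    # mapping_origin(480, 270, 960, 540, 1920, 1080) -> (960.0, 540.0),
    # i.e. the mapping origin is the screen center.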
  • The mapping origin is the pointing position on the screen reference surface 202 corresponding to the first orientation. If the handheld input device points at a position inside the screen when the first image 401 is captured, the mapping origin is located inside the screen 201; otherwise, the mapping origin is located outside the screen 201.
  • In step S304, the mapping coefficient is determined according to the screen imaging area and the field of view of the camera module that collects the first image.
  • The mapping coefficient is the number of X-axis image points of the screen reference surface corresponding to each degree of the yaw angle of the handheld input device, and the number of Y-axis image points of the screen reference surface corresponding to each degree of the pitch angle.
  • The mapping coefficient S is:
  • S = (MaxX / SW) * (PicX / FOVx)    (2)
  • where FOVx denotes the horizontal field of view of the camera module, MaxX is the number of horizontal image points of the screen 201, SW is the number of horizontal image points of the screen imaging area 402, and PicX is the number of horizontal image points of the first image 401.
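  • A sketch of formula (2) in code (the symbol FOVx and the function name are our labels):

    def mapping_coefficient(max_x, sw, pic_x, fov_x_deg):
        # Screen pixels per degree of yaw: (MaxX / SW) screen pixels per
        # image point, times (PicX / FOVx) image points per degree.
        return (max_x / sw) * (pic_x / fov_x_deg)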
  • The mapping coefficient determined by formula (2) is a simplification, which makes the calculation simpler.
  • In fact, for different pointings, the image-point offset of the screen reference surface corresponding to the same change in the yaw angle or pitch angle of the handheld input device is different.
  • More precise mapping coefficients can therefore be determined for different pointings of the handheld input device according to the known imaging principle of the camera.
  • The mapping coefficient so determined can be a function whose input parameter is the orientation of the handheld input device, or a discrete table, where each entry is the mapping coefficient corresponding to a section of the pointing range of the handheld input device.
  • the step of mapping the current pointing of the handheld input device to the display position of the indicator icon is shown in FIG. 5 and includes:
  • In step S501, the current pointing of the handheld input device is determined.
  • The current pointing can be determined from the acceleration data and gyroscope data of the handheld input device, and includes the current pitch angle PAn and the current yaw angle YAn of the handheld input device.
  • The calculation method is the same as that used for the first orientation.
  • In step S502, the position offset of the pointing position corresponding to the current pointing relative to the pointing position corresponding to the first orientation is determined according to the angular offset of the current pointing relative to the first orientation and the mapping coefficient:
  • d_VXn = (YAn - YA0) * S
  • d_VYn = (PAn - PA0) * S
  • where (d_VXn, d_VYn) is the position offset; PAn and YAn are the pitch angle and yaw angle of the current pointing, respectively; PA0 and YA0 are the pitch angle and yaw angle of the first orientation, respectively; and S is the mapping coefficient.
  • Here the mapping coefficient S is a constant obtained by formula (2); if different pointings of the handheld input device have different mapping coefficients, the above formulas become integral or cumulative formulas.
  • In step S503, the pointing position corresponding to the current pointing of the handheld input device is determined from the position offset and the mapping origin:
  • VXn = VX0 + d_VXn
  • VYn = VY0 + d_VYn
  • where (VXn, VYn) is the pointing position, (VX0, VY0) is the mapping origin, and (d_VXn, d_VYn) is the position offset.
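  • Steps S502 and S503 combine into a few lines; a sketch assuming a constant mapping coefficient S:

    def pointing_position(pa_n, ya_n, pa_0, ya_0, vx0, vy0, s):
        # Step S502: angular offsets scaled by the mapping coefficient.
        d_vx = (ya_n - ya_0) * s      # yaw offset -> X offset
        d_vy = (pa_n - pa_0) * s      # pitch offset -> Y offset
        # Step S503: add the offsets to the mapping origin.
        return vx0 + d_vx, vy0 + d_vy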
  • In step S504, the display position of the indicator icon is determined according to the pointing position, wherein, if the current pointing of the handheld input device is within the first pointing range, the display position is within the screen edge range.
  • The display position of the indicator icon is the position of the reference point of the indicator icon; usually the geometric center of the indicator icon can be chosen as the reference point. If the pointing position (VXn, VYn) satisfies the condition 0 ≤ VXn < MaxX and 0 ≤ VYn < MaxY, the pointing position is inside the screen 201, and there are multiple ways to determine the display position of the indicator icon from the pointing position. One implementation is to set the display position of the indicator icon to the pointing position, that is, to display the indicator icon directly at the pointing position.
  • Another implementation is to set corresponding anti-shake rules, anti-misoperation rules, error range limitation rules, and so on, and to determine the display position of the indicator icon by combining the current pointing position with one or more previous pointing positions, so as to realize anti-jitter, misoperation prevention, and similar behavior.
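  • One common anti-shake rule of the kind mentioned above is exponential smoothing of the displayed position over recent pointing positions; the specific rule below is an assumption for illustration:

    def smooth_position(prev_xy, new_xy, beta=0.7):
        # Blend the new pointing position with the previously displayed
        # position to suppress hand jitter (one possible anti-shake rule).
        px, py = prev_xy
        nx, ny = new_xy
        return (beta * px + (1 - beta) * nx,
                beta * py + (1 - beta) * ny)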
  • If the pointing position (VXn, VYn) satisfies the condition VXn < 0 or VXn ≥ MaxX or VYn < 0 or VYn ≥ MaxY, the pointing position is outside the screen 201, and the display position of the indicator icon is set within the preset screen edge range.
  • For example, the indicator icon can be kept stationary at the screen edge, or can slide along the screen edge following the pointing of the handheld input device.
  • A screen area within a preset number of pixels from the screen edge is taken as the screen edge range.
  • For example, the screen area within 10 pixels of the screen edge is set as the screen edge range, that is, the screen area with coordinates X ∈ [0, 9] or X ∈ [MaxX-10, MaxX-1] or Y ∈ [0, 9] or Y ∈ [MaxY-10, MaxY-1].
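  • Determining the display position from the pointing position, including the off-screen case, can then be sketched as follows (clamping realizes the variant in which the icon slides along the screen edge with the pointing):

    def display_position(vx, vy, max_x, max_y):
        # Step S504: an on-screen pointing position is displayed directly;
        # an off-screen one is clamped to the nearest screen point, which
        # by construction lies within the screen edge range.
        on_screen = 0 <= vx < max_x and 0 <= vy < max_y
        x = min(max(vx, 0), max_x - 1)
        y = min(max(vy, 0), max_y - 1)
        return x, y, on_screen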
  • Within the screen edge range, the display position of the indicator icon is set close to the screen edge, and the indicator icon can still move within the screen edge range, reminding the operator whether the pointing of the handheld input device is moving away from or toward the screen.
  • The display position of the indicator icon can be transmitted to the operating system of the smart device, which displays the indicator icon at that position; it can also be transmitted to application software on the smart device, which displays the indicator icon at that position; other methods may also be used, which this application does not limit.
  • The ranges of the yaw angle and pitch angle of the handheld input device are 0-360 degrees. The calculation of the pointing position in steps S501-S503 shown in FIG. 5 shows that if the pointing of the handheld input device at two moments is the same, the corresponding pointing positions are the same, regardless of the movement trajectory of the handheld input device between those two moments. According to this feature, the pointing range of the handheld input device can be divided.
  • the pointing range of the handheld input device includes a first pointing range, and when the pointing of the handheld input device changes with any track within the first pointing range, the display position of the indicator icon is in a preset screen Within the edge range.
  • If the mapping coefficient S is a constant obtained by formula (2), a pointing of the handheld input device with yaw angle YA and pitch angle PA belongs to the first pointing range when it satisfies the following condition:
  • VX0 + (YA - YA0) * S < 0 or VX0 + (YA - YA0) * S ≥ MaxX or VY0 + (PA - PA0) * S < 0 or VY0 + (PA - PA0) * S ≥ MaxY    (3)
  • The ranges of the yaw angle YA and the pitch angle PA of the handheld input device are 0-360 degrees; to keep the expression concise, the condition above does not handle the boundary cases at 0° and 360°.
  • From steps S501-S503 shown in FIG. 5, the pointing positions mapped from pointings in the first pointing range defined by formula (3) satisfy VX < 0 or VX ≥ MaxX or VY < 0 or VY ≥ MaxY; that is, they all lie outside the screen.
  • In step S504, the display positions of the indicator icon corresponding to the first pointing range are all set within the preset screen edge range. Therefore, when the pointing of the handheld input device changes along an arbitrary trajectory within the first pointing range, the display position of the indicator icon does not move out of the preset screen edge range.
  • The first pointing range usually covers most pointings of the handheld input device, including the pointing when the handheld input device points at the ground and the pointing when the operator turns his back to the screen.
  • the mapping model in this embodiment is a three-dimensional model.
  • the position of the screen in the three-dimensional space and the orientation of the handheld input device in the three-dimensional space are subjected to three-dimensional modeling, and the three-dimensional model is the mapping model of this embodiment.
  • The data used to build the three-dimensional model can be the first image, captured by the camera module of the handheld input device in the pointed direction; alternatively, a camera device can be set up in the environment around the handheld input device to capture its orientation.
  • There are a variety of known methods for building the three-dimensional model, such as monocular vision 3D modeling, binocular vision 3D modeling, or depth-camera 3D modeling, which are not repeated here.
  • In this implementation, the screen reference surface 202 can be a plane or a curved surface. From the three-dimensional model, the intersection of the pointing line 204 of the handheld input device with the screen reference surface 202 can be obtained; this intersection is the pointing position on the screen reference surface corresponding to the pointing, and a pointing whose intersection with the screen reference surface lies outside the screen 201 belongs to the first pointing range.
  • The mapping model of this implementation can model curved screens, but it places higher demands on the computing performance of the system and is suitable for handheld input devices with strong computing power and large battery capacity.
  • step S102 the display position of the indicator icon is determined according to the mapping model.
  • the current pointing of the handheld input device can be mapped to the display position of the indicator icon.
  • the pointing range of the handheld input device includes the first pointing range. If the current pointing of the handheld input device belongs to the first pointing range, the display position of the indicator icon corresponding to the current pointing is within the preset screen edge range.
  • In step S103, when the indicator icon moves into the screen edge range, it is detected whether the duration for which the display position of the indicator icon stays within the screen edge range is greater than a preset first threshold; when the duration is greater than the first threshold, the indicator icon is blanked.
  • According to the mapping model, if the pointing of the handheld input device moves from outside the first pointing range into the first pointing range, the indicator icon moves into the preset screen edge range; while the pointing changes along any trajectory within the first pointing range, the indicator icon remains displayed within the screen edge range.
  • If this duration exceeds the preset first threshold, such as 1.5 seconds, the indicator icon is blanked.
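  • The detection in step S103 reduces to a small dwell timer; a sketch with an assumed monotonic clock and the 1.5-second threshold used as an example above:

    import time

    FIRST_THRESHOLD = 1.5   # seconds; the preset first threshold

    class BlankingController:
        # Blank the indicator icon once its display position has stayed
        # within the screen edge range for longer than the first threshold.
        def __init__(self):
            self.entered_edge_at = None

        def should_blank(self, in_edge_range, now=None):
            now = time.monotonic() if now is None else now
            if not in_edge_range:
                self.entered_edge_at = None    # left the edge: reset timer
                return False                   # keep the icon displayed
            if self.entered_edge_at is None:
                self.entered_edge_at = now     # just entered the edge range
            return now - self.entered_edge_at > FIRST_THRESHOLD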
  • In this application, a mapping model is established, and the pointing of the handheld input device is mapped to the display position of the indicator icon according to the mapping model.
  • In the mapping model, the pointing range of the handheld input device includes the first pointing range, and the display positions mapped from pointings within the first pointing range are all within the preset screen edge range; that is, as the pointing of the handheld input device changes along any trajectory within the first pointing range, the indicator icon stays displayed within the preset screen edge range.
  • By contrast, the prior-art method for determining the display position of a mouse cursor is: moving the mouse in one direction moves the cursor to the screen edge, and continuing to move in that direction keeps the cursor displayed at the edge, but as soon as the mouse moves in the opposite direction, the cursor immediately moves back into the screen. Therefore, if the traditional mouse-cursor method were adopted, every pointing of the handheld input device could move the indicator icon within the screen, and no pointing range could keep the indicator icon within the screen edge range.
  • In this application, the pointing range of the handheld input device includes the first pointing range, which usually covers the pointing of the handheld input device when the speaker's hands hang down or when the speaker turns his back to the screen.
  • The speaker only needs to click the activation key to trigger the display of the indicator icon.
  • When the indicator icon is no longer needed, the speaker simply drops his hand or turns to face the audience; the indicator icon remains displayed within the screen edge range, and after a preset time, such as 1.5 seconds, it is blanked.
  • The blanking control method of this application is particularly suitable for digital wireless presenters: during a lecture, the speaker does not need to hold down a key to keep the indicator icon active, and the blanking control of the indicator icon is simple and natural, conforms to the speaker's habitual movements during a speech, and helps improve the user experience.
  • FIG. 6 is a schematic structural diagram of a blanking control apparatus for an indicator icon of a handheld input device provided by an embodiment of the application.
  • the blanking control device of the indicator icon of the handheld input device includes:
  • The mapping model establishing unit 601 is configured to establish a mapping model, the mapping model being used to map the pointing of the handheld input device to the display position of the indicator icon, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range, and when the pointing of the handheld input device changes along an arbitrary trajectory within the first pointing range, the display position of the indicator icon stays within a preset screen edge range;
  • the display position determining unit 602 is configured to determine the display position of the indicator icon according to the mapping model
  • The blanking control unit 603 is configured to detect, when the indicator icon moves into the screen edge range, whether the duration for which the display position of the indicator icon stays within the screen edge range is greater than a preset first threshold, and to blank the indicator icon if the duration is greater than the first threshold.
  • the parameters of the mapping model include a first orientation, a mapping origin, and a mapping coefficient.
  • mapping model establishing unit 601 includes:
  • The first orientation setting subunit is configured to set the first orientation to the orientation of the handheld input device when the mapping model is established, and the orientation of the handheld input device includes the yaw angle and pitch angle of the handheld input device;
  • the mapping origin setting subunit is used to set the mapping origin to a preset screen position
  • the mapping coefficient setting subunit is used to set the mapping coefficient to a preset value or determine the mapping coefficient according to the distance between the handheld input device and the screen.
  • mapping model establishing unit 601 includes:
  • The screen imaging area recognition subunit is used to collect the first image in the direction pointed by the handheld input device, extract the image features of the first image, match the image features of the first image with the prefetched screen features, and identify the screen imaging area in the first image.
  • The parameter determination subunit is configured to determine the parameters of the mapping model according to the screen imaging area, wherein the first orientation is the orientation of the handheld input device when the first image is collected, the orientation of the handheld input device includes its yaw angle and pitch angle, the mapping origin is determined according to the position of the screen imaging area in the first image, and the mapping coefficient is determined according to the screen imaging area and the field of view of the camera module that collects the first image.
  • mapping model establishing unit 601 further includes:
  • the second image collection subunit is used to collect the second image displayed on the screen
  • the screen feature extraction subunit is used to extract the image feature of the second image, where the image feature of the second image is the prefetched screen feature.
  • the display position determining unit 602 includes:
  • the current pointing determination subunit is used to determine the current pointing of the handheld input device
  • a position offset determination subunit configured to determine, according to the angular offset of the current pointing relative to the first orientation and the mapping coefficient, the position offset of the pointing position corresponding to the current pointing relative to the pointing position corresponding to the first orientation;
  • a pointing position determining subunit configured to determine a pointing position corresponding to the current pointing of the handheld input device according to the position offset and the mapping origin;
  • The display position determination subunit is configured to determine the display position of the indicator icon according to the pointing position corresponding to the current pointing, wherein, if the current pointing of the handheld input device is within the first pointing range, the display position is within the screen edge range.
  • the blanking control device of the indicator icon of the handheld input device described in FIG. 6 corresponds to the blanking control method of the indicator icon of the handheld input device described in FIG. 1.
  • the electronic devices involved in the blanking control method described in this application include handheld input devices, smart devices, and display devices, and possibly other electronic devices, such as data processing devices between the smart device and the display device.
  • The blanking control method described in this application can be implemented by the operating system or application programs of the above-mentioned electronic devices, or the program of the control method can be stored in a portable storage device; when the portable storage device is connected to one of the above-mentioned devices, that device can run the stored program.
  • the above-mentioned smart devices may be electronic devices such as desktop computers, mobile phones, tablet computers, embedded devices, and cloud computing devices.
  • How the modules of the blanking control device described in this application are distributed among the aforementioned handheld input device, smart device, display device, and data processing device is not limited by this application.
  • the above-mentioned smart device, display device and data processing device may be independent devices or integrated integrated devices.
  • the connection between the above-mentioned devices can be wired connection or wireless connection.
  • Fig. 7 is a schematic diagram of a handheld input device provided by an embodiment of the present application.
  • The handheld input device 7 of this embodiment includes: a processor 70, a memory 71, and a computer program 72 stored in the memory 71 and executable on the processor 70, such as a blanking control program for the indicator icon of the handheld input device.
  • When the processor 70 executes the computer program 72, the steps in the above embodiments of the blanking control method for the indicator icon of a handheld input device are implemented; alternatively, when the processor 70 executes the computer program 72, the functions of the modules/units in the above device embodiments are implemented.
  • The computer program 72 may be divided into one or more modules/units, which are stored in the memory 71 and executed by the processor 70 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, and the instruction segments are used to describe the execution process of the computer program 72 in the handheld input device 7.
  • the computer program 72 can be divided into:
  • The mapping model establishment unit is configured to establish a mapping model for mapping the pointing of the handheld input device to the display position of the indicator icon, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range, and when the pointing of the handheld input device changes along any trajectory within the first pointing range, the display position of the indicator icon stays within a preset screen edge range;
  • a display position determining unit configured to determine the display position of the indicator icon according to the mapping model
  • the blanking control unit is used to detect whether the display position of the indicator icon is within the edge of the screen for a duration greater than a preset first threshold when the indicator icon moves within the edge of the screen. If the duration is greater than the first threshold, the indication icon is blanked.
  • the handheld input device may include, but is not limited to, a processor 70 and a memory 71.
  • FIG. 7 is only an example of the handheld input device 7 and does not constitute a limitation on it; the device may include more or fewer components than shown in the figure, a combination of certain components, or a different arrangement of components.
  • the handheld input device may also include input and output devices, network access devices, buses, etc.
  • the handheld input device can be one device, or it can be composed of more than one separate sub-devices.
  • For example, the handheld input device includes two sub-devices: one is operated by the operator while the other performs the analysis of the first image, and the two sub-devices are connected by wireless communication.
  • The so-called processor 70 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory 71 may be an internal storage unit of the handheld input device 7, such as a hard disk or memory of the handheld input device 7.
  • The memory 71 may also be an external storage device of the handheld input device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the handheld input device 7.
  • the memory 71 may also include both an internal storage unit of the handheld input device 7 and an external storage device.
  • the memory 71 is used to store the computer program and other programs and data required by the handheld input device.
  • the memory 71 can also be used to temporarily store data that has been output or will be output.
  • the disclosed apparatus/terminal device and method may be implemented in other ways.
  • the device/terminal device embodiments described above are only illustrative.
  • The division into modules or units is only a division by logical function; in actual implementation there may be other divisions, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
• the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
• the above-mentioned integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
• if the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
• this application may implement all or part of the processes in the above-mentioned method embodiments by instructing the relevant hardware through a computer program.
• the computer program can be stored in a computer-readable storage medium; when the program is executed by the processor, the steps of the foregoing method embodiments can be implemented.
• the computer program includes computer program code, and the computer program code may be in the form of source code, object code, an executable file, or some intermediate form.
• the computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive (U disk), a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, etc.
• the content contained in the computer-readable medium may be appropriately added or deleted in accordance with the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a blanking control method for an indicator icon of a handheld input device, comprising: establishing a mapping model, the mapping model being used to map the pointing of the handheld input device to a display position of the indicator icon, wherein, in the mapping model, the pointing range of the handheld input device comprises a first pointing range, and when the pointing of the handheld input device changes along any track within the first pointing range, the display position of the indicator icon falls within a preset screen edge range; determining the display position of the indicator icon according to the mapping model; and, when the indicator icon moves within the screen edge range, detecting that the duration for which the display position of the indicator icon stays within the screen edge range is greater than a preset first threshold, and blanking the indicator icon.
PCT/CN2020/100299 2019-07-05 2020-07-04 Handheld input device and blanking control method and apparatus for an indicator icon of a handheld input device WO2021004413A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910603013.0A CN110489026A (zh) 2019-07-05 2019-07-05 Handheld input device and blanking control method and apparatus for an indicator icon thereof
CN201910603013.0 2019-07-05

Publications (1)

Publication Number Publication Date
WO2021004413A1 (fr) 2021-01-14

Family

ID=68546629

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/100299 WO2021004413A1 (fr) 2019-07-05 2020-07-04 Handheld input device and blanking control method and apparatus for an indicator icon of a handheld input device

Country Status (2)

Country Link
CN (1) CN110489026A (fr)
WO (1) WO2021004413A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110489026A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 Handheld input device and blanking control method and apparatus for an indicator icon thereof
CN110489027B (zh) * 2019-07-05 2021-07-23 深圳市格上格创新科技有限公司 Handheld input device and method and apparatus for controlling the display position of an indicator icon thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398721A (zh) * 2007-09-26 2009-04-01 昆盈企业股份有限公司 Cursor displacement speed control method for an air mouse
CN102292690A (zh) * 2009-01-22 2011-12-21 阿尔卡特朗讯美国公司 Electronic data input system
CN103902061A (zh) * 2012-12-25 2014-07-02 华为技术有限公司 Cursor display method, device, and system for an air mouse
CN106774868A (zh) * 2016-12-06 2017-05-31 杨超峰 Wireless presentation device
US20190121449A1 (en) * 2009-06-04 2019-04-25 Sony Corporation Control device, input device, control system, handheld device, and control method
CN110489027A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 Handheld input device and method and apparatus for controlling the display position of an indicator icon thereof
CN110489026A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 Handheld input device and blanking control method and apparatus for an indicator icon thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100472420C (zh) * 2006-09-14 2009-03-25 腾讯科技(深圳)有限公司 Display device and display method
WO2009072504A1 (fr) * 2007-12-07 2009-06-11 Sony Corporation Control device, input device, control system, control method, and handheld device
CN103425408A (zh) * 2012-05-23 2013-12-04 深圳富泰宏精密工业有限公司 Mouse and keyboard sharing method and system
CN108848374B (zh) * 2018-08-08 2020-08-04 京东方科技集团股份有限公司 Display parameter measurement method and apparatus, storage medium, and measurement system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398721A (zh) * 2007-09-26 2009-04-01 昆盈企业股份有限公司 Cursor displacement speed control method for an air mouse
CN102292690A (zh) * 2009-01-22 2011-12-21 阿尔卡特朗讯美国公司 Electronic data input system
US20190121449A1 (en) * 2009-06-04 2019-04-25 Sony Corporation Control device, input device, control system, handheld device, and control method
CN103902061A (zh) * 2012-12-25 2014-07-02 华为技术有限公司 Cursor display method, device, and system for an air mouse
CN106774868A (zh) * 2016-12-06 2017-05-31 杨超峰 Wireless presentation device
CN110489027A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 Handheld input device and method and apparatus for controlling the display position of an indicator icon thereof
CN110489026A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 Handheld input device and blanking control method and apparatus for an indicator icon thereof

Also Published As

Publication number Publication date
CN110489026A (zh) 2019-11-22

Similar Documents

Publication Publication Date Title
US11231845B2 (en) Display adaptation method and apparatus for application, and storage medium
US10705619B2 (en) System and method for gesture based data and command input via a wearable device
US9600078B2 (en) Method and system enabling natural user interface gestures with an electronic system
WO2021004412A1 (fr) Handheld input device, and method and apparatus for controlling the display position of an indicator icon thereof
EP2864932B1 (fr) Positionnement d'extrémité de doigt pour une entrée de geste
US9268410B2 (en) Image processing device, image processing method, and program
WO2021097600A1 (fr) Mid-air interaction method and apparatus, and device
US10979700B2 (en) Display control apparatus and control method
US10296096B2 (en) Operation recognition device and operation recognition method
US11886643B2 (en) Information processing apparatus and information processing method
US9400575B1 (en) Finger detection for element selection
WO2021004413A1 (fr) Handheld input device and blanking control method and apparatus for an indicator icon of a handheld input device
JP2012238293A (ja) Input device
CN112150560A (zh) Method and apparatus for determining a vanishing point, and computer storage medium
CN109964202B (zh) Display control apparatus, display control method, and computer-readable storage medium
CN106569716B (zh) One-handed control method and control system
CN106598422B (zh) Hybrid control method, control system, and electronic device
US9665232B2 (en) Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device
KR102118421B1 (ko) Camera cursor system
WO2021244650A1 (fr) Control method and device, terminal, and storage medium
CN114360047A (zh) Hand-raising gesture recognition method and apparatus, electronic device, and storage medium
JP6075193B2 (ja) Mobile terminal device
CN114201028B (zh) Augmented reality system and method for anchoring and displaying virtual objects thereof
TW201913298A (zh) Virtual reality system capable of displaying a real-time image of a physical input device and control method thereof
US11054941B2 (en) Information processing system, information processing method, and program for correcting operation direction and operation amount

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20837423

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20837423

Country of ref document: EP

Kind code of ref document: A1