WO2021004412A1 - Handheld input device and method and apparatus for controlling the display position of its indicator icon - Google Patents

Handheld input device and method and apparatus for controlling the display position of its indicator icon

Info

Publication number
WO2021004412A1
Authority
WO
WIPO (PCT)
Prior art keywords
input device
pointing
image
handheld input
screen
Prior art date
Application number
PCT/CN2020/100298
Other languages
English (en)
French (fr)
Inventor
杨超峰 (Yang Chaofeng)
Original Assignee
深圳市格上格创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市格上格创新科技有限公司 filed Critical 深圳市格上格创新科技有限公司
Publication of WO2021004412A1 publication Critical patent/WO2021004412A1/zh

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on such properties, using icons

Definitions

  • This application belongs to the field of handheld input devices, and in particular relates to a handheld input device and a method and apparatus for controlling the display position of its indicator icon.
  • The handheld input device is operated by an operator, who controls an indicator icon moving on the computer graphical user interface (GUI) by moving the device in three-dimensional space.
  • One specific handheld input device product is the air mouse, whose indicator icon is the mouse cursor.
  • Unlike a desktop mouse, an air mouse does not require a support plane; the operator holds it in the air to move the mouse cursor across the graphical user interface.
  • Another specific handheld input device product is the digital wireless presenter, an important piece of conference-room equipment that lecturers use to highlight key content when presenting PPT slides and other material.
  • The main category of wireless presenter is the laser presenter, also known as the laser pointer.
  • Meanwhile, the price of large LCD/LED screens has fallen steadily.
  • The indicator icon controlled by a digital wireless presenter is designed to be more eye-catching than the traditional mouse cursor: for example, it is rendered as a bright red round spot, achieving an effect similar to a laser spot and highlighting the content being presented.
  • Patents US5440326 and US5898421 disclose gyroscope-based handheld input devices.
  • Patent application 201611109873.1 discloses an implementation that combines image technology with an inertial motion sensor.
  • The methods disclosed above control the display position of the indicator icon in the same way as the prior-art mouse cursor.
  • The basic process is: in each display-position calculation cycle, a coordinate offset relative to the previous cycle is computed from the motion of the input device since that cycle, and the offset is added to the previous cycle's coordinate to obtain the current coordinate.
  • The characteristic of this calculation is: when the indicator icon reaches the edge of the screen and the pointing of the handheld input device keeps moving outward, the icon stays displayed at the edge; as soon as the pointing turns back toward the inside of the screen, the icon immediately moves inward.
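The deviation this edge clamping produces can be reproduced in a few lines. Below is a minimal sketch of the prior-art relative update described above; all names and numeric values are illustrative, not taken from the cited patents:

```python
def relative_update(cursor_x, delta_x, max_x=1920):
    """Prior-art style cursor update: add this cycle's offset, clamp at the screen edge."""
    return min(max(cursor_x + delta_x, 0), max_x)

# The pointing sweeps 300 px past the right edge, then comes back 300 px:
cursor_x = 1800
pointing_x = 1800
for delta in (200, 100, -100, -200):
    cursor_x = relative_update(cursor_x, delta)
    pointing_x += delta

# The pointing returned to 1800, but the clamped cursor sits at 1620:
# a 180 px deviation accrued from a single excursion past the edge.
```

Each round trip across the screen boundary adds another such offset, which is exactly the accumulating deviation the following paragraphs describe.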
  • Because a handheld input device is operated in mid-air, it is harder to control than a desktop mouse and easily points outside the screen during operation. This is especially true of digital wireless presenters: a speaker's pointing gestures are part of their body language and often carry the device past the screen boundary.
  • Each time the handheld input device points outside the screen and is adjusted back, the deviation between the position it points to and the display position of the indicator icon grows a little.
  • As the pointing of the handheld input device moves back and forth between the inside and outside of the screen, the deviation grows larger and larger, operation feels increasingly unnatural, and the user experience suffers.
  • The embodiments of the present application provide a handheld input device and a method and apparatus for controlling the display position of its indicator icon, to solve the prior-art problem that, when the position pointed to by the handheld input device moves between the inside and outside of the screen, the display position of the indicator icon deviates from the pointed-to position, making operation unnatural and degrading the user experience.
  • the first aspect of the embodiments of the present application provides a method for controlling the display position of an indicator icon of a handheld input device.
  • the method for controlling the display position of an indicator icon of the handheld input device includes:
  • Collect the first image pointed to by the handheld input device and extract the image features of the first image; match the image features of the first image with the prefetched screen features, and identify the screen imaging area in the first image;
  • Establish, according to the screen imaging area, a mapping model that maps the pointing of the handheld input device to a pointing position on the screen reference surface, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range and a second pointing range; a pointing in the first pointing range maps to a pointing position on the screen, and a pointing in the second pointing range maps to a pointing position outside the screen;
  • In one implementation, before the step of matching the image features of the first image with the prefetched screen features, the display position control method further includes: collecting a second image displayed on the screen;
  • and extracting the image features of the second image, where the image features of the second image are the prefetched screen features.
  • the display position control method further includes:
  • Before the step of collecting the first image pointed to by the handheld input device, the camera module used to collect the first image is controlled to enter the working mode; after that step, the camera module is controlled to enter a power-saving mode;
  • Before the step of extracting the image features of the first image, the image analysis module used to extract the image features is controlled to enter the working mode; after that step, the image analysis module is controlled to enter the power-saving mode.
  • the step of determining the display position of the indicator icon according to the pointing position includes:
  • The display position of the indicator icon is set as the pointing position; or the display position of the indicator icon is determined according to the pointing position at the current moment and at least one previous pointing position.
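One plausible way to combine the current pointing position with previous ones is a short moving average. The sketch below is illustrative only; the patent does not specify the filter, and the class name and window size are assumptions:

```python
from collections import deque

class IconSmoother:
    """Average the last `window` pointing positions to steady the icon."""
    def __init__(self, window=4):
        self.history = deque(maxlen=window)

    def update(self, pos):
        """Record the current (x, y) pointing position, return the smoothed display position."""
        self.history.append(pos)
        n = len(self.history)
        return (sum(p[0] for p in self.history) / n,
                sum(p[1] for p in self.history) / n)
```

With `window=2`, feeding (0, 0) and then (10, 20) yields the midpoint (5.0, 10.0); larger windows trade responsiveness for steadiness.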
  • In another implementation, the step of determining the display position of the indicator icon according to the pointing position includes: obtaining the current working mode of the handheld input device; querying the sensitivity corresponding to the current working mode in a preset table of working modes and indicator icon movement sensitivities; and determining the display position of the indicator icon according to the pointing position and the queried sensitivity.
  • the parameters of the mapping model include a first orientation, a mapping origin, and a mapping coefficient, where:
  • the first orientation is the orientation of the handheld input device when the first image is collected, and the orientation of the handheld input device includes a yaw angle and a pitch angle of the handheld input device;
  • the mapping origin is determined according to the position of the screen imaging area in the first image
  • the mapping coefficient is determined according to the imaging area of the screen and the angle of view of the camera module that collects the first image.
  • In one implementation, the step of determining, according to the mapping model, the pointing position on the screen reference surface corresponding to the current pointing of the handheld input device includes: determining the current pointing of the handheld input device;
  • determining, according to the angular offset of the current pointing relative to the first pointing and the mapping coefficient, the position offset of the pointing position corresponding to the current pointing relative to the pointing position corresponding to the first pointing; and determining, according to the position offset and the mapping origin, the pointing position on the screen reference surface corresponding to the current pointing.
  • a second aspect of the embodiments of the present application provides an apparatus for controlling the display position of an indicator icon of a handheld input device, and the apparatus for controlling the display position of an indicator icon of the handheld input device includes:
  • The first image acquisition unit is used to collect the first image pointed to by the handheld input device, extract the image features of the first image, match the image features of the first image with the prefetched screen features, and identify the screen imaging area in the first image;
  • The mapping model establishment unit is configured to establish, according to the screen imaging area, a mapping model that maps the pointing of the handheld input device to a pointing position on the screen reference surface, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range and a second pointing range; a pointing in the first pointing range maps to a pointing position on the screen, and a pointing in the second pointing range maps to a pointing position outside the screen;
  • The display position determining unit is configured to determine, according to the mapping model, the pointing position on the screen reference surface corresponding to the current pointing of the handheld input device; if the pointing position is on the screen, it determines the display position of the indicator icon according to the pointing position; otherwise, it sets the display position of the indicator icon within a preset screen edge range or hides the indicator icon.
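The on-screen/off-screen decision this unit makes can be sketched as follows. Screen size, edge-range width, and the hide flag are illustrative parameters, not values from the application:

```python
def icon_display(pos, screen_w=1920, screen_h=1080, edge=8, hide=False):
    """Decide the icon's display position from the mapped pointing position.

    On-screen positions are used directly; off-screen positions are either
    hidden (None) or pinned within a preset edge range `edge` pixels wide.
    """
    x, y = pos
    if 0 <= x < screen_w and 0 <= y < screen_h:
        return pos                      # pointing position is on the screen
    if hide:
        return None                     # blank the icon instead of pinning it
    # pin the icon just inside the preset edge range
    return (min(max(x, edge), screen_w - 1 - edge),
            min(max(y, edge), screen_h - 1 - edge))
```

For example, an on-screen position (100, 200) is displayed as-is, while a mapped position of (2500, 200) on a 1920-px-wide screen is pinned near the right edge.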
  • the display position control device further includes:
  • the second image acquisition unit is configured to acquire a second image displayed on the screen
  • the second image feature extraction unit is configured to extract the image feature of the second image, wherein the image feature of the second image is the prefetched screen feature.
  • the display position control device further includes:
  • The camera module control unit is used to control the camera module for collecting the first image to enter the working mode before the step of collecting the first image pointed to by the handheld input device, and to control it to enter the power-saving mode after that step;
  • The image analysis module control unit is used to control the image analysis module for extracting the image features to enter the working mode before the step of extracting the image features of the first image, and to control it to enter the power-saving mode after that step.
  • the display position determining unit includes:
  • the first setting subunit is configured to set the display position of the indicator icon as the pointing position
  • the second setting subunit is configured to determine the display position of the indicator icon according to the pointing position at the current moment and at least one previous pointing position.
  • the display position determining unit includes:
  • the working mode obtaining subunit is used to obtain the current working mode of the handheld input device
  • the sensitivity query subunit is used to query the sensitivity corresponding to the current working mode according to the preset working mode of the handheld input device and the movement sensitivity table of the indicator icon;
  • the display position determining subunit is configured to determine the display position of the indicator icon according to the pointing position and the sensitivity corresponding to the current working mode.
  • the parameters of the mapping model include a first orientation, a mapping origin, and a mapping coefficient, where:
  • the first orientation is the orientation of the handheld input device when the first image is collected, and the orientation of the handheld input device includes a yaw angle and a pitch angle of the handheld input device;
  • the mapping origin is determined according to the position of the screen imaging area in the first image
  • the mapping coefficient is determined according to the imaging area of the screen and the angle of view of the camera module that collects the first image.
  • the display position determining unit includes:
  • the current pointing determination subunit is used to determine the current pointing of the handheld input device
  • The position offset determination subunit is configured to determine, according to the angular offset of the current pointing relative to the first pointing and the mapping coefficient, the position offset of the pointing position corresponding to the current pointing relative to the pointing position corresponding to the first pointing;
  • the pointing position determining subunit is configured to determine, according to the position offset and the mapping origin, the pointing position of the handheld input device corresponding to the screen reference surface currently pointing.
  • The third aspect of the embodiments of the present application provides a handheld input device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the computer program, the steps of the method for controlling the display position of the indicator icon of the handheld input device according to any one of the first aspect are realized.
  • The fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the method for controlling the display position of the indicator icon of the handheld input device are realized.
  • Compared with the prior art, the embodiments of the present application have the following beneficial effects: during use of the handheld input device, the first image pointed to by the device is collected, the screen imaging area in the first image is recognized, and a mapping model that maps the pointing of the handheld input device to a pointing position on the screen reference surface is determined from the recognized screen imaging area.
  • According to the mapping model, the pointing position on the screen reference surface corresponding to the device's current pointing is determined: when the pointing position is inside the screen, the indicator icon is displayed at the corresponding position on the screen; when it is outside the screen, the indicator icon is displayed within a preset screen edge range or hidden.
  • The display position control method of this application differs from the prior-art mouse cursor control method: this application first determines, based on image technology, the screen position pointed to by the handheld input device and displays the indicator icon there, so the initial pointing of the handheld input device coincides with the first display position of the indicator icon.
  • Thereafter, the display position of the indicator icon remains consistent with the position the handheld input device actually points to, and no deviation accumulates when the pointing moves back and forth between the inside and outside of the screen. The user therefore does not need to restrain the amplitude of their movements and can even swing the handheld input device as part of their body language; operation is natural, which helps improve the user experience.
  • FIG. 1 is a schematic diagram of the implementation process of a method for controlling the display position of an indicator icon of a handheld input device according to an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a pointing control indicator icon through a handheld input device provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of the parameter calculation method of Embodiment 1 of the mapping model provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a first image collected by a handheld input device according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an implementation process of determining a pointing position according to Embodiment 1 of a mapping model provided by an embodiment of the present application;
  • FIG. 6 is a schematic diagram of an apparatus for controlling the display position of an indicator icon of a handheld input device according to an embodiment of the present application
  • Fig. 7 is a schematic diagram of a handheld input device provided by an embodiment of the present application.
  • FIG. 1 is a schematic diagram of the implementation process of a method for controlling the display position of an indicator icon of a handheld input device according to an embodiment of the application, and the details are as follows:
  • In step S101, the first image pointed to by the handheld input device is collected and the image features of the first image are extracted; the image features of the first image are matched with the prefetched screen features, and the screen imaging area in the first image is identified.
  • the handheld input device in this application can be an air mouse, a digital wireless presenter, or other types of input devices.
  • the handheld input device is provided with buttons, and the operator can activate the indicator icon or trigger other operations through button commands.
  • The handheld input device is operated by the operator, who controls an indicator icon moving on the screen of the display device by moving the handheld input device in three-dimensional space during use, wherein the indicator icon is a graphic object on the computer graphical user interface and can be a pointer-shaped icon, such as a mouse cursor.
  • the corresponding operation can be triggered by the function keys on the handheld input device.
  • the display device can be any display device such as a projector, an LCD/LED display, etc.
  • The display position control method described in this application can be triggered when the operator activates the indicator icon, or while the indicator icon is still in the blanked state: for example, it can be triggered to run immediately after the handheld input device is powered on, or when the pitch angle of the handheld input device exceeds a preset angle.
  • With a background trigger that precedes activation of the indicator icon, the display position of the indicator icon is already determined when the operator activates it, so the icon can be displayed on the screen immediately and the response is fast.
  • Figure 2 shows an application scenario of a handheld input device.
  • the handheld input device is held by an operator.
  • the gray area in the figure is a screen 201, which includes but is not limited to an electronic wall screen, a display screen, or a projection screen.
  • the handheld input device is pointed at the screen 201 for operation.
  • The movement of the handheld input device in three-dimensional space can be represented by the change of its pointing D. A mapping model is established through which the pointing D of the handheld input device is mapped to a pointing position on the screen reference surface 202.
  • The operator can thus control the indicator icon's movement on the screen 201 by manipulating the pointing D of the handheld input device.
  • The screen reference surface may be determined according to the surface of the screen. If the screen surface of the display device is flat, then preferably, as shown in FIG. 2, the screen reference surface 202 is set to the plane in which the screen lies. If the screen surface is curved, a screen projection surface can be used to approximate the screen 201, with the screen reference surface 202 being the plane of that projection surface; alternatively, the screen reference surface 202 can itself be a curved surface. It should be noted that although the screen 201 shown in FIG. 2 is rectangular, the screen 201 can in fact be any other shape, or be composed of several separate sub-screens; for a special-shaped screen, the mapping model is established on the same principle as for a rectangular screen. Without loss of generality, this application uses a rectangular screen as an example to discuss the process of establishing the mapping model.
  • The pointing position on the screen reference surface mapped from the pointing D of the handheld input device is conceptually distinct from the position the device actually points to: the mapped position may coincide with the actual pointing position, or it may deviate from it. For example, as shown in FIG. 2, if the pointing position mapped from the pointing D is the intersection P of the device's pointing line 204 with the screen reference surface 202, the mapped position coincides with the actual pointing position; if the mapped position is P1, it deviates from the actual pointing position.
  • The present application discloses a method for controlling the display position of an indicator icon that makes the pointing position mapped from the pointing D of the handheld input device coincide with the position the device actually points to, so that the pointed-to position and the display position of the indicator icon remain consistent, which helps improve the user experience.
  • a camera module 203 is provided in the handheld input device, and the camera module 203 captures an image pointed by the handheld input device, and this image is the first image 401 shown in FIG. 4.
  • The camera module 203 can be installed such that the pointing line 204 of the handheld input device is perpendicular to the plane of the camera module 203's photosensitive sensor array and passes through its center point.
  • In this way, the imaging point in the first image 401 of the intersection P of the pointing line 204 with the screen reference surface 202 is the center point IP of the first image 401, which simplifies calculating the parameters of the mapping model.
  • The screen imaging area 402 in the first image 401 is then identified.
  • The prefetched screen features are used to identify the imaging of the screen 201 by its image features.
  • Image feature analysis and matching algorithms are known technology and are not repeated here. It should be noted that the actual imaging of the screen 201 in the first image is usually deformed; it can be corrected according to the camera's imaging principle to obtain a regular screen imaging area 402 as shown in FIG. 4.
  • Method 1: set up a hardware identification device on the display device. For example, arrange multiple infrared LEDs around the screen 201 and prefetch the number of LEDs and the LED spacing as the screen features; extract the bright points in the first image 401 and match them against the LED count and spacing features to determine the screen imaging area 402.
  • Method 2: when the first image 401 is collected, display one or more special graphic objects at specific positions on the screen 201 (such as the center of the screen), the image features of which have been extracted in advance as the screen features.
  • Method 3: at the same time as, or before, the first image 401 is collected, collect the second image displayed on the screen 201 and extract its image features as the screen features.
  • One implementation of capturing the second image uses an independent hardware device inserted in the image data transmission channel between the smart device and the display device, such as an HDMI cable; the device intercepts the second image output by the smart device.
  • Another implementation of capturing the second image is a software screenshot: software installed on the smart device that outputs the second image intercepts the second image the smart device sends to the display device.
  • A mapping model that maps the pointing D of the handheld input device to a pointing position on the screen reference surface 202 is established; the pointing position is used to control the display position of the indicator icon, so that the position actually pointed to by the handheld input device and the display position of the indicator icon can be kept consistent.
  • a mapping model for mapping the pointing position of the handheld input device to the pointing position of the screen reference surface is established according to the screen imaging area, wherein, in the mapping model, the pointing range of the handheld input device includes The first pointing range and the second pointing range, the pointing position mapped by the pointing in the first pointing range is located inside the screen, and the pointing position mapped by the pointing in the second pointing range is outside the screen.
  • After the screen imaging area in the first image is identified, the mapping model that maps the pointing of the handheld input device to a pointing position on the screen reference surface can be established.
  • There are multiple ways to implement the mapping model; Embodiment 1 of the mapping model is described below.
  • The parameters of the mapping model in this embodiment include the first orientation, the mapping origin, and the mapping coefficient.
  • the calculation method for obtaining the parameters of the mapping model is shown in Fig. 3 and includes:
  • In step S301, the first orientation is determined, that is, the orientation of the handheld input device when the first image is collected, wherein the orientation of the handheld input device includes the yaw angle and the pitch angle of the handheld input device;
  • the yaw angle and pitch angle of the handheld input device are used to model the pointing of the handheld input device.
  • The RY axis is the pitch axis and the RZ axis is the yaw axis.
  • An inertial motion sensor is provided in the handheld input device, including an accelerometer and a gyroscope, and the inertial motion sensor collects acceleration data and gyroscope data of the handheld input device.
  • From the acceleration data and gyroscope data, the pitch angle and yaw angle of the handheld input device can be calculated, that is, the orientation of the handheld input device in the mapping model of this embodiment. Calculating the attitude angles of a handheld input device from acceleration and gyroscope data is a known method, for example the extended Kalman filter algorithm, and is not repeated in this application.
  • Thus the pitch angle and yaw angle of the handheld input device at the moment the first image is collected are determined from the device's acceleration data and gyroscope data; that is, the first orientation is obtained.
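As a rough illustration of how pitch and yaw can come out of accelerometer and gyroscope data, here is a single complementary-filter step. The text names the extended Kalman filter; this simpler filter is only a stand-in, and all names and the blend factor are assumptions:

```python
import math

def attitude_step(pitch, yaw, accel, gyro, dt, alpha=0.98):
    """One complementary-filter update (illustrative stand-in for an EKF).

    accel = (ax, ay, az) in g; gyro = (pitch_rate, yaw_rate) in deg/s.
    Pitch blends gyro integration with the accelerometer's gravity reference;
    yaw has no gravity reference, so it is integrated from the gyroscope alone.
    """
    ax, ay, az = accel
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    pitch = alpha * (pitch + gyro[0] * dt) + (1 - alpha) * accel_pitch
    yaw = yaw + gyro[1] * dt
    return pitch, yaw
```

Starting level and rotating at 10 deg/s in yaw for 0.1 s leaves pitch at 0 and moves yaw to 1 degree; the accelerometer term keeps pitch from drifting the way the pure-integration yaw estimate does.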
  • step S302 the mapping origin is determined according to the position of the screen imaging area in the first image.
  • the top left vertex IO of the screen imaging area 402 is the imaging point of the top left vertex O of the screen 201
  • the point IP is the imaging point at the position P pointed by the handheld input device.
  • the point IP is located at the center point of the first image 401.
  • the screen reference surface 202 is a plane, and a Cartesian coordinate system is set for the screen reference surface 202.
  • The coordinate system takes the top-left vertex O of the screen 201 as the origin, the horizontal axis of the screen as the X axis, and the vertical axis of the screen as the Y axis.
  • According to the screen imaging area 402, the coordinate offset of the position P actually pointed to by the handheld input device relative to the top-left vertex O of the screen 201 can be calculated, thereby obtaining the coordinates of position P on the screen reference surface 202;
  • these coordinates are the mapping origin (VX0, VY0), calculated as follows:
  • VX0 = (CX/SW) * MaxX
  • VY0 = (CY/SH) * MaxY    (1)
  • where CX and CY are the numbers of horizontal and vertical image points of point IP relative to point IO;
  • SW and SH are the numbers of horizontal and vertical image points of the screen imaging area 402, respectively;
  • and MaxX and MaxY are the numbers of horizontal and vertical image points of the screen 201, respectively. For example, if the screen resolution of the screen 201 is 1920x1080, then MaxX is 1920 and MaxY is 1080.
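A quick numeric check of equation (1), with illustrative values for CX, CY, SW, and SH:

```python
def mapping_origin(cx, cy, sw, sh, max_x=1920, max_y=1080):
    """Equation (1): scale IP's offset from IO (in first-image points) into
    screen pixels using the ratio of screen resolution to imaging-area size."""
    return (cx / sw) * max_x, (cy / sh) * max_y

# IP lies 160 image points right of and 90 points below IO, and the screen
# imaging area 402 spans 640 x 360 image points on a 1920 x 1080 screen:
# the mapping origin lands one quarter of the way across and down the screen.
```

With these numbers the mapping origin is (480.0, 270.0), i.e. the pointed-to position P expressed in screen pixel coordinates.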
  • The mapping origin is the pointing position on the screen reference surface corresponding to the first pointing. If the handheld input device points at a position within the screen when the first image 401 is captured, the mapping origin lies within the screen; otherwise, the mapping origin lies outside the screen.
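As an illustration only, the computation of the mapping origin by formula (1) can be sketched in Python (the function name and the sample numbers are hypothetical, not from the patent):

```python
def mapping_origin(cx, cy, sw, sh, max_x, max_y):
    """Formula (1): convert the offset of the pointed-at image point IP,
    relative to the top-left imaging point IO of the screen imaging area,
    into screen coordinates, giving the mapping origin (VX0, VY0).

    cx, cy       -- horizontal/vertical image-point offsets of IP from IO
    sw, sh       -- horizontal/vertical size of the screen imaging area 402
    max_x, max_y -- horizontal/vertical resolution of the screen 201
    """
    vx0 = (cx / sw) * max_x
    vy0 = (cy / sh) * max_y
    return vx0, vy0

# With a 1920x1080 screen whose imaging area spans 600x338 image points,
# and IP lying 300 points right of and 169 points below IO:
print(mapping_origin(300, 169, 600, 338, 1920, 1080))  # (960.0, 540.0)
```

Here the device points at the exact center of the screen, so the mapping origin lands at (960, 540).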
  • In step S303, the mapping coefficient is determined according to the screen imaging area and the field of view of the camera module that collects the first image.
  • The mapping coefficient is the number of X-axis image points on the screen reference surface corresponding to each degree of yaw angle of the handheld input device, and the number of Y-axis image points on the screen reference surface corresponding to each degree of pitch angle of the handheld input device.
  • In this embodiment, the mapping coefficient S is:
  • S = (MaxX/SW)*(PicX/FOVH)    (2)
  • where FOVH denotes the horizontal field of view of the camera module, MaxX is the number of horizontal image points of the screen 201, SW is the number of horizontal image points of the screen imaging area 402, and PicX is the number of horizontal image points of the first image 401.
  • The mapping coefficient determined by formula (2) is a simplification, which makes the calculation simpler.
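A minimal sketch of computing the mapping coefficient, assuming the relation S = (MaxX/SW) * (PicX/FOVH), where FOVH is the camera's horizontal field of view in degrees (this reconstruction of formula (2) and all sample values are illustrative):

```python
def mapping_coefficient(max_x, sw, pic_x, fov_h_deg):
    """Screen image points swept per degree of rotation: one degree of
    yaw covers roughly pic_x / fov_h_deg points of the first image, and
    each first-image point inside the screen imaging area corresponds to
    max_x / sw screen points."""
    return (max_x / sw) * (pic_x / fov_h_deg)

# A 1920-wide screen imaged over 600 of the camera's 1280 horizontal
# points, with a 64-degree horizontal field of view:
s = mapping_coefficient(1920, 600, 1280, 64.0)
print(s)  # about 64 screen points per degree
```

With these numbers, each degree of yaw moves the pointing position by roughly 64 screen points along the X axis.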
  • the corresponding mapping coefficients are determined for different orientations of the handheld input device according to the known imaging principle of the camera.
  • The determined mapping coefficient can be a function whose input parameter is the orientation of the handheld input device, or it can be a discrete table in which each entry is the mapping coefficient for a range of orientations of the handheld input device.
  • the pointing position of the screen reference surface corresponding to the current pointing of the handheld input device can be determined, and the calculation process is shown in FIG. 5.
  • In step S501, the current pointing of the handheld input device is determined.
  • the current pointing of the handheld input device can be determined according to the acceleration data of the handheld input device and the gyroscope data, and the current pointing includes the current pitch angle PA n and the current yaw angle YA n of the handheld input device.
  • The calculation method is the same as the method for calculating the first pointing.
  • In step S502, the position offset of the pointing position corresponding to the current pointing, relative to the pointing position corresponding to the first pointing, is determined according to the angular offset of the current pointing relative to the first pointing and the mapping coefficient.
  • d_VXn = (YAn - YA0)*S
  • d_VYn = (PAn - PA0)*S
  • (d_VX n , d_VY n ) is the position offset
  • PA n and YA n are the pitch angle and yaw angle of the current pointing, respectively
  • PA0 and YA0 are the pitch angle and yaw angle of the first pointing, respectively
  • S is the mapping coefficient
  • Here the mapping coefficient S is a constant obtained by formula (2). If different orientations of the handheld input device have different mapping coefficients, the above formulas become integral or cumulative formulas.
  • In step S503, the pointing position on the screen reference surface corresponding to the current pointing of the handheld input device is determined according to the position offset and the mapping origin.
  • VXn = VX0 + d_VXn
  • VYn = VY0 + d_VYn
  • (VX n , VY n ) is the pointing position
  • (VX 0 , VY 0 ) is the mapping origin
  • (d_VX n , d_VY n ) is the position offset.
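Steps S501 to S503, under the constant-coefficient assumption, can be sketched as follows (function name and sample values are hypothetical, not from the patent):

```python
def pointing_position(yaw, pitch, yaw0, pitch0, vx0, vy0, s):
    """Map the current pointing (yaw, pitch) of the device to a pointing
    position on the screen reference surface: offset the mapping origin
    (vx0, vy0) by the angular offset relative to the first pointing
    (yaw0, pitch0), scaled by the mapping coefficient s."""
    d_vx = (yaw - yaw0) * s      # step S502, X offset
    d_vy = (pitch - pitch0) * s  # step S502, Y offset
    return vx0 + d_vx, vy0 + d_vy  # step S503

# First pointing (yaw0=10, pitch0=5 degrees) mapped to origin (960, 540);
# the device then yaws to 12 degrees and pitches down to 4 degrees:
print(pointing_position(12.0, 4.0, 10.0, 5.0, 960.0, 540.0, 64.0))
# (1088.0, 476.0)
```

Note that the result depends only on the current angles, not on the path the device took, which matches the observation about Figure 5 below.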
  • the pointing range of the handheld input device includes a first pointing range and a second pointing range, and the pointing position mapped by the pointing in the first pointing range is located on the screen, The pointing position mapped by the pointing in the second pointing range is outside the screen.
  • the range of the yaw angle and pitch angle of the handheld input device is 0-360 degrees.
  • The calculation steps of the pointing position shown in Figure 5 show that if the pointing of the handheld input device is the same at two moments, then the pointing positions on the screen reference surface corresponding to the pointing at these two moments are the same, regardless of the movement track of the handheld input device between these two moments.
  • the pointing range of the handheld input device can be divided, including the first pointing range and the second pointing range.
  • mapping coefficient S is a constant obtained by formula (2)
  • The pointing of the handheld input device whose pitch angle PA and yaw angle YA satisfy the following conditions belongs to the first pointing range:
  • 0 ≤ VX0 + (YA - YA0)*S ≤ MaxX and 0 ≤ VY0 + (PA - PA0)*S ≤ MaxY    (3)
  • Here the range of the yaw angle and the pitch angle of the handheld input device is 0-360 degrees, and for simplicity expression (3) does not handle the boundary conditions at 0° and 360°.
  • By expression (3), the range of the pointing positions mapped from the first pointing range onto the screen reference surface is 0 ≤ VX ≤ MaxX and 0 ≤ VY ≤ MaxY. It can be seen that the pointing position mapped from a pointing in the first pointing range is located within the screen.
  • the pointing range that does not belong to the first pointing range belongs to the second pointing range.
  • The pointing position on the screen reference surface mapped from a pointing in the second pointing range is located outside the screen.
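The membership test for the two pointing ranges can be sketched as follows, under the constant-coefficient assumption and ignoring the 0/360-degree wrap-around (names and numbers are illustrative):

```python
def in_first_pointing_range(yaw, pitch, yaw0, pitch0,
                            vx0, vy0, s, max_x, max_y):
    """A pointing belongs to the first pointing range iff the position
    it maps to lies within the screen; otherwise it belongs to the
    second pointing range."""
    vx = vx0 + (yaw - yaw0) * s
    vy = vy0 + (pitch - pitch0) * s
    return 0 <= vx <= max_x and 0 <= vy <= max_y

# A small yaw/pitch change keeps the mapped position on a 1920x1080
# screen; yawing far to the right maps outside it:
print(in_first_pointing_range(12, 4, 10, 5, 960, 540, 64, 1920, 1080))  # True
print(in_first_pointing_range(30, 4, 10, 5, 960, 540, 64, 1920, 1080))  # False
```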
  • the movement of the handheld device in the three-dimensional space is not only the rotational movement such as yaw and pitch, but also translational movement.
  • the translational movement of the handheld input device may cause the pointing position of the screen reference surface determined according to the above mapping model to deviate from the actual pointing position of the handheld input device.
  • the translational movement of the handheld input device is further combined to compensate the pointing position calculated in the step shown in FIG. 5.
  • the translational motion of the handheld input device can be calculated according to the acceleration data and the gyroscope data of the handheld input device, which is a known method and will not be repeated in this application.
  • the mapping model in this embodiment is a three-dimensional model. According to the first image, the position of the screen in the three-dimensional space and the orientation of the handheld input device in the three-dimensional space can be three-dimensionally modeled.
  • the three-dimensional model is the mapping model of this embodiment.
  • Known 3D modeling methods can be used, such as monocular vision 3D modeling, binocular vision 3D modeling, or depth camera 3D modeling, which will not be repeated here.
  • the screen reference surface 202 can be a plane or a curved surface.
  • The intersection point of the pointing line 204 of the handheld input device and the screen reference surface 202 can be obtained; this intersection point is the pointing position corresponding to that pointing. A pointing whose intersection with the screen reference surface lies within the screen belongs to the first pointing range, and a pointing whose intersection with the screen reference surface lies outside the screen belongs to the second pointing range.
  • The three-dimensional model of this embodiment can accurately model a curved screen, ensuring consistency between the pointing position of the handheld input device and the display position of the indicator icon. However, it places high demands on the computing performance of the system and is therefore suited to handheld input devices with strong computing capability and large battery capacity.
  • In step S103, the pointing position on the screen reference surface corresponding to the current pointing of the handheld input device is determined according to the mapping model; if the pointing position is within the screen, the display position of the indicator icon is determined according to the pointing position; otherwise, the display position of the indicator icon is set within a preset screen edge range or the indicator icon is hidden.
  • the display position of the indicator icon is the position of the reference point of the indicator icon, and the geometric center point of the indicator icon can usually be selected as the reference point.
  • The display mode of the indicator icon is controlled according to the area where the pointing position is located; that is, if the pointing position is within the screen, the display position of the indicator icon is determined according to the pointing position; otherwise, the display position of the indicator icon is set within a preset screen edge range or the indicator icon is hidden.
  • When the pointing position is within the screen, there are multiple implementations for determining the display position of the indicator icon according to the pointing position.
  • An implementation manner is that the display position of the indicator icon is set to the pointing position, that is, the indicator icon is directly displayed at the pointing position.
  • Another implementation manner is to perform a fitting or low-pass filtering process between the current pointing position and the pointing position for a period of time before, so as to determine the display position of the indicator icon, so that the movement track of the indicator icon is smoother and more beautiful.
  • Another implementation is to set corresponding anti-shake rules, anti-misoperation rules, and error range limitation rules, etc., and combine the current pointing position and one or more previous pointing positions to realize anti-shake and anti-misoperation Wait.
  • For example, it is judged whether the pointing position at the current moment and the pointing positions over a preceding period of time all fall within a circle of preset radius; if so, the indicator icon remains stationary, otherwise the indicator icon turns from static to moving, thereby realizing debounce of the indicator icon.
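One possible sketch of this anti-shake rule (the patent does not fix the circle's center; taking the centroid of the recent positions is an assumption here, as are the function name and sample values):

```python
import math

def debounced_display(history, radius):
    """If the current pointing position and the recent ones all fall
    within a circle of preset radius (centered, by assumption, on their
    centroid), keep the icon at its earlier position; otherwise follow
    the newest pointing position.

    history -- recent pointing positions, oldest first; the last entry
               is the current pointing position.
    """
    cx = sum(x for x, _ in history) / len(history)
    cy = sum(y for _, y in history) / len(history)
    jitter_only = all(math.hypot(x - cx, y - cy) <= radius
                      for x, y in history)
    return history[0] if jitter_only else history[-1]

# Sub-radius wobble keeps the icon still; a real move tracks the hand:
print(debounced_display([(500, 300), (502, 301), (499, 299)], 5))  # (500, 300)
print(debounced_display([(500, 300), (520, 300), (560, 300)], 5))  # (560, 300)
```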
  • A correspondence table between the working modes of the handheld input device and the movement sensitivities of the indicator icon is preset, and the movement sensitivity of the indicator icon is controlled according to the working mode of the handheld input device, which benefits ease of operation. For example, when simply moving the indicator icon a higher sensitivity can be set, and when dragging a graphic object a lower sensitivity can be set.
  • When determining the display position of the indicator icon according to the pointing position, the current working mode of the handheld input device is first obtained and the sensitivity corresponding to the current working mode is queried; the pointing position is then converted into the display position of the indicator icon according to that sensitivity.
  • the indicator icon is hidden, or the display position of the indicator icon is set within a preset screen edge range.
  • When the pointing position mapped from the pointing of the handheld input device is outside the screen, the indicator icon is hidden and is not displayed on the screen; when the pointing position corresponding to the pointing of the handheld input device moves back into the screen, the indicator icon is displayed again.
  • Alternatively, the indicator icon is controlled to remain stationary at the edge of the screen, or to slide along the screen edge following the pointing of the handheld input device.
  • the screen area preset to be several pixels from the edge of the screen is the screen edge range.
  • For example, a screen area with a width of 10 pixels from the edge of the screen is set as the screen edge range, that is, the screen area whose coordinates satisfy X ∈ [0,9] or X ∈ [MaxX-10,MaxX-1] or Y ∈ [0,9] or Y ∈ [MaxY-10,MaxY-1].
  • The farther the pointing position is from the screen, the closer the display position of the indicator icon is set to the edge of the screen.
  • the indicator icon moves within the edge of the screen to remind the operator whether the pointing of the handheld input device is far away or close to the screen.
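One simple way to keep the icon inside a screen edge band when the pointing position is off-screen is to clamp it, as sketched below (the 10-pixel band width comes from the example above; the clamping strategy itself is an assumption, since the patent allows several behaviors here):

```python
def edge_display_position(vx, vy, max_x, max_y, margin=10):
    """Place the indicator icon inside a `margin`-pixel edge band on
    the side(s) from which the pointing position (vx, vy) left the
    screen, so the icon slides along the edge as the device points
    farther away."""
    # Clamp into the screen's pixel range first.
    x = min(max(vx, 0), max_x - 1)
    y = min(max(vy, 0), max_y - 1)
    # Pull the clamped point into the edge band on the exited side(s).
    if vx < 0:
        x = min(x, margin - 1)
    elif vx >= max_x:
        x = max(x, max_x - margin)
    if vy < 0:
        y = min(y, margin - 1)
    elif vy >= max_y:
        y = max(y, max_y - margin)
    return x, y

# Pointing 200 points left of a 1920x1080 screen, halfway down:
print(edge_display_position(-200, 540, 1920, 1080))  # (0, 540)
```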
  • The display position can be transmitted to the operating system of the smart device, which displays the indicator icon at that position; or it can be transmitted to the application software on the smart device, which displays the indicator icon at that position; other methods may also be used, which is not limited in this application.
  • The method further includes the steps of: before the step of collecting the first image pointed at by the handheld input device, controlling the camera module used to collect the first image to enter the working mode; and after the step of collecting the first image pointed at by the handheld input device, controlling the camera module used to collect the first image to enter the power saving mode, the power saving mode being sleep or power off.
  • The analysis of the first image is performed in the handheld input device. Since the image analysis module consumes a large amount of power, the method further includes the steps of: before the step of extracting the image features of the first image, controlling the image analysis module used to extract the image features to enter the working mode; and after the step of extracting the image features of the first image, controlling the image analysis module used to extract the image features to enter the power saving mode, the power saving mode being sleep or power off.
  • the initial pointing position of the handheld input device is determined based on image analysis technology to ensure that the initial display position of the indicator icon is aligned with the direction of the handheld input device. Then a mapping model is established.
  • In the mapping model, the pointing range of the handheld input device includes a first pointing range and a second pointing range. A pointing in the first pointing range maps to a pointing position within the screen and can control the indicator icon to move within the screen; a pointing in the second pointing range maps to a pointing position outside the screen.
  • When the pointing of the handheld input device changes along any trajectory within the second pointing range, the indicator icon is always displayed within the preset screen edge range or hidden, so no deviation arises between the actual pointing position of the handheld input device and the display position of the indicator icon.
  • The prior-art mouse cursor control method is: move the mouse in one direction until the mouse cursor reaches the edge of the screen; continuing to move the mouse in that direction keeps the cursor displayed at the screen edge, but as soon as the mouse moves in the opposite direction, the cursor immediately moves back into the screen.
  • In the traditional mouse cursor control method there is essentially only one pointing range, and every pointing may control the mouse cursor to move within the screen. Because handheld operation is less stable, it is easy to point outside the screen during operation. If the traditional mouse control method were used, the pointing of the handheld input device would move back and forth inside and outside the screen, causing the deviation between the position pointed at by the handheld input device and the display position of the indicator icon to grow larger and larger, making the operation more and more unnatural. Using the method for controlling the display position of the indicator icon of the present application, the pointing of the handheld input device may move arbitrarily inside and outside the screen during operation, and the display position of the indicator icon remains consistent with the position pointed at by the handheld input device.
  • the display position control method of the indicator icon of this application is particularly suitable for digital wireless presenters.
  • During a presentation, the speaker can control the indicator icon naturally and can wave the handheld input device arbitrarily as part of body language, which conforms to speaking habits.
  • Fig. 6 is a schematic diagram of an apparatus for controlling the display position of an indicator icon of a handheld input device according to an embodiment of the application.
  • the apparatus for controlling the display position of an indicator icon of the handheld input device includes:
  • the first image acquisition unit 601 is used to collect the first image pointed at by the handheld input device, extract the image features of the first image, match the image features of the first image with the prefetched screen features, and identify the screen imaging area in the first image;
  • the mapping model establishing unit 602 is configured to establish, according to the screen imaging area, a mapping model that maps the pointing of the handheld input device to a pointing position on the screen reference surface, wherein, in the mapping model, the pointing range of the handheld input device includes a first pointing range and a second pointing range; a pointing in the first pointing range maps to a pointing position within the screen, and a pointing in the second pointing range maps to a pointing position outside the screen;
  • the display position determining unit 603 is configured to determine, according to the mapping model, the current pointing position of the handheld input device corresponding to the screen reference surface, and if the pointing position is within the screen, determine the position of the indicator icon according to the pointing position Display position; otherwise, set the display position of the indicator icon within a preset screen edge range or hide the indicator icon.
  • the display position control device further includes:
  • the second image acquisition unit is configured to acquire a second image displayed on the screen
  • the second image feature extraction unit is configured to extract the image feature of the second image, wherein the image feature of the second image is the prefetched screen feature.
  • the display position control device further includes:
  • the camera module control unit is used to control the camera module for collecting the first image to enter the working mode before the step of collecting the first image pointed by the handheld input device, and when the first image pointed by the handheld input device is collected After the step of controlling the camera module for collecting the first image to enter the power saving mode;
  • the image analysis module control unit is used to control the image analysis module for extracting the image features to enter the working mode before the step of extracting the image features of the first image, and to control the image analysis module for extracting the image features to enter the power saving mode after the step of extracting the image features of the first image.
  • the display position determining unit 603 includes:
  • the first setting subunit is configured to set the display position of the indicator icon as the pointing position
  • the second setting subunit is configured to determine the display position of the indicator icon according to the pointing position at the current moment and at least one previous pointing position.
  • the display position determining unit 603 includes:
  • the working mode obtaining subunit is used to obtain the current working mode of the handheld input device
  • the sensitivity setting subunit is used to query and set the sensitivity corresponding to the current working mode according to the preset correspondence table between the working modes of the handheld input device and the movement sensitivities of the indicator icon.
  • the display position determining subunit is configured to determine the display position of the indicator icon according to the pointing position and the sensitivity corresponding to the current working mode.
  • mapping model establishing unit 602 includes:
  • the first orientation determination subunit is configured to determine a first orientation, where the first orientation is the orientation of the handheld input device when the first image is collected, wherein the orientation of the handheld input device includes the handheld input device Yaw angle and pitch angle;
  • mapping origin determining subunit is configured to determine the mapping origin according to the position of the screen imaging area in the first image
  • mapping coefficient determining subunit is configured to determine the mapping coefficient according to the imaging area of the screen and the angle of view of the camera module that collects the first image.
  • the device for controlling the display position of the indicator icon of the handheld input device described in FIG. 6 corresponds to the method of controlling the display position of the indicator icon of the handheld input device described in FIG. 1.
  • the electronic devices involved in the display position control method described in this application include handheld input devices, smart devices, and display devices, and possibly other electronic devices, such as data processing devices between the smart device and the display device.
  • the display position control method described in this application can be implemented by the operating system or application program of the above-mentioned related electronic device, or the program of the control method can be stored by a portable storage device.
  • the above-mentioned smart devices may be electronic devices such as desktop computers, mobile phones, tablet computers, embedded devices, and cloud computing devices.
  • the specific deployment methods of the various modules of the display position control device described in this application in the aforementioned handheld input devices, smart devices, display devices, and data processing devices are not limited in this application.
  • the above-mentioned smart device, display device and data processing device may be independent devices or integrated integrated devices.
  • the connection between the above-mentioned devices can be wired connection or wireless connection.
  • Fig. 7 is a schematic diagram of a handheld input device provided by an embodiment of the present application.
  • the handheld input device 7 of this embodiment includes: a processor 70, a memory 71, and a computer program 72 stored in the memory 71 and running on the processor 70, such as a handheld input device The display position control program of the indicator icon.
  • the processor 70 executes the computer program 72, the steps in the embodiment of the method for controlling the display position of the indicator icons of the handheld input devices described above are implemented.
  • the processor 70 executes the computer program 72, the function of each module/unit in the foregoing device embodiments is realized.
  • the computer program 72 may be divided into one or more modules/units, and the one or more modules/units are stored in the memory 71 and executed by the processor 70 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, and the instruction segments are used to describe the execution process of the computer program 72 in the handheld input device 7.
  • the computer program 72 can be divided into:
  • the first image acquisition unit is used to collect the first image pointed at by the handheld input device, extract the image features of the first image, match the image features of the first image with the prefetched screen features, and identify the screen imaging area in the first image;
  • the mapping model establishment unit is configured to establish a mapping model that maps the pointing position of the handheld input device to the pointing position of the screen reference surface according to the screen imaging area, wherein, in the mapping model, the pointing position of the handheld input device is The range includes a first pointing range and a second pointing range, the pointing position mapped by the pointing in the first pointing range is located on the screen, and the pointing position mapped by the pointing in the second pointing range is located outside the screen;
  • the display position determining unit is configured to determine, according to the mapping model, the current pointing position of the handheld input device corresponding to the screen reference surface, and if the pointing position is located on the screen, determine the display of the indicator icon according to the pointing position Position; otherwise, set the display position of the indicator icon within a preset screen edge range or hide the indicator icon.
  • the handheld input device may include, but is not limited to, a processor 70 and a memory 71.
  • FIG. 7 is only an example of the handheld input device 7 and does not constitute a limitation on the handheld input device 7; it may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
  • the handheld input device may also include input and output devices, network access devices, buses, etc.
  • the handheld input device may be one device, or it may be composed of more than one separate sub-devices.
  • For example, the handheld input device includes two sub-devices: one sub-device is operated by the operator, while the other is used to analyze the first image, and the two sub-devices are connected by wireless communication.
  • The so-called processor 70 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory 71 may be an internal storage unit of the handheld input device 7, such as a hard disk or memory of the handheld input device 7.
  • The memory 71 may also be an external storage device of the handheld input device 7, such as a plug-in hard disk equipped on the handheld input device 7, a smart media card (SMC), a secure digital (SD) card, a flash card, etc.
  • the memory 71 may also include both an internal storage unit of the handheld input device 7 and an external storage device.
  • the memory 71 is used to store the computer program and other programs and data required by the handheld input device.
  • the memory 71 can also be used to temporarily store data that has been output or will be output.
  • the disclosed apparatus/terminal device and method may be implemented in other ways.
  • the device/terminal device embodiments described above are only illustrative.
  • The division of the modules or units is only a logical function division; in actual implementation there may be other divisions, for example multiple units or components can be combined or integrated into another system, or some features can be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Based on this understanding, this application implements all or part of the processes in the methods of the above embodiments, which can also be completed by instructing the relevant hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium. When the program is executed by the processor, the steps of the foregoing method embodiments can be implemented.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file, or some intermediate forms.
  • the computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, U disk, mobile hard disk, magnetic disk, optical disk, computer memory, read-only memory (ROM, Read-Only Memory) , Random Access Memory (RAM, Random Access Memory), electrical carrier signal, telecommunications signal, and software distribution media, etc.
  • the content contained in the computer-readable medium can be appropriately added or deleted in accordance with the requirements of the legislation and patent practice in the jurisdiction.
  • For example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for controlling the display position of an indicator icon of a handheld input device includes: collecting a first image pointed at by the handheld input device and extracting image features of the first image; identifying, according to the image features, a screen imaging area in the first image; establishing a mapping model according to the screen imaging area; determining, according to the mapping model, the pointing position on a screen reference surface corresponding to the current pointing of the handheld input device; if the pointing position is within the screen, determining the display position of the indicator icon according to the pointing position; otherwise, setting the display position of the indicator icon within a preset screen edge range or hiding the indicator icon. As a result, during use, even if the pointing of the handheld input device moves in and out of the screen any number of times, the display position of the indicator icon remains consistent with the position actually pointed at by the handheld input device; the user does not need to restrain the amplitude of movements during operation, the operation feels natural, and the user experience is good.

Description

Handheld input device and method and apparatus for controlling the display position of its indicator icon

Technical Field

This application belongs to the field of handheld input devices, and in particular relates to a handheld input device and a method and apparatus for controlling the display position of its indicator icon.

Background

A handheld input device is held and operated by an operator. By manipulating its movement in three-dimensional space, the operator can control an indicator icon to move on a computer graphical user interface (GUI).

One specific product of the handheld input device is the air mouse. For an air mouse, the above indicator icon is the mouse cursor. Unlike a desktop mouse, an air mouse requires no supporting surface; the operator holds the air mouse to control the movement of the mouse cursor on the graphical user interface. Another specific product of the handheld input device is the digital wireless presenter. The wireless presenter is an important device in conference rooms, used by speakers to highlight key content when presenting PPT and other materials. At present, the main category of wireless presenter is the laser presenter, also called the laser pointer: the speaker presses an activation button to trigger the laser, and the laser spot illuminates key content on the screen to highlight it. In recent years the price of LCD/LED large screens has fallen steadily; compared with projectors, they offer higher brightness, greater contrast, and better color reproduction, so more and more conference rooms and training rooms have adopted LCD/LED large screens. Because the surface of an LCD/LED large screen does not diffusely reflect laser light, the spot is very dark in the incident direction and too bright in the exit direction, so laser presenters are not suitable for LCD/LED large screens. The digital wireless presenter solves this problem: by designing the indicator icon it controls to be more conspicuous than the traditional mouse cursor, for example as a highlighted red dot, an effect similar to a laser spot is achieved for highlighting the key content of a presentation.

Patents US5440326 and US5898421 disclose gyroscope-based handheld input devices. Patent application No. 201611109873.1 discloses an implementation combining image technology with inertial motion sensors. The display position control method of the indicator icon of the handheld input devices disclosed above is the same as the prior-art mouse cursor control method. Its basic process is: in each calculation cycle of the display position of the indicator icon, the coordinate offset relative to the previous cycle is calculated from the movement of the input device relative to the previous cycle, and this coordinate offset is added to the coordinates of the previous cycle to obtain the current coordinates. The characteristic of this calculation method is: after the indicator icon reaches the edge of the screen, if the pointing of the handheld input device continues to move toward the outside of the screen, the indicator icon remains displayed at the edge position; once the pointing of the handheld input device turns to move toward the inside of the screen, the indicator icon immediately moves toward the inside of the screen. For handheld input devices, this traditional mouse cursor control method has a serious problem. Since the handheld input device is operated by hand, unlike a desktop mouse its controllability is poor, and it easily points outside the screen during operation. This is especially true of digital wireless presenters: the speaker's pointing gestures are part of body language and often make the digital wireless presenter point outside the screen. Each time the handheld input device points outside the screen and is adjusted back, the deviation between the pointed-at position and the display position of the indicator icon increases a little; as the pointing of the handheld input device moves back and forth inside and outside the screen, the deviation grows larger and larger, making the operation more and more unnatural and degrading the user experience.
发明内容
有鉴于此,本申请实施例提供了一种手持输入设备及其指示图标的显示位置控制方法和装置,以解决现有技术中当手持输入设备所指向的位置在屏幕内外移动后,指示图 标的显示位置与手持输入设备指向的位置出现偏差,导致操作不自然,影响用户体验的问题。
本申请实施例的第一方面提供了一种手持输入设备的指示图标的显示位置控制方法,所述手持输入设备的指示图标的显示位置控制方法包括:
采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域;
根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外;
根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
结合第一方面,在第一方面的第一种可能实现方式中,在所述将所述第一图像的图像特征与预取的屏幕特征进行匹配的步骤之前,所述显示位置控制方法还包括:
采集显示在所述屏幕上的第二图像;
提取所述第二图像的图像特征,其中,所述第二图像的图像特征为所述预取的屏幕特征。
结合第一方面,在第一方面的第二种可能实现方式中,所述显示位置控制方法还包括:
在采集所述手持输入设备指向的第一图像的步骤之前控制用于采集所述第一图像的摄像模块进入工作模式,在采集所述手持输入设备指向的第一图像的步骤之后控制用于采集所述第一图像的摄像模块进入节电模式;
和/或,
在所述提取所述第一图像的图像特征的步骤之前控制用于提取所述图像特征的图像分析模块进入工作模式,在所述提取所述第一图像的图像特征的步骤之后控制用于提取所述图像特征的图像分析模块进入节电模式。
结合第一方面,在第一方面的第三种可能实现方式中,所述根据所述指向位置确定所述指示图标的显示位置的步骤包括:
设置所述指示图标的显示位置为所述指向位置;
或者,根据当前时刻的所述指向位置以及至少一个之前的所述指向位置,确定所述指示图标的显示位置。
结合第一方面,在第一方面的第四种可能实现方式中,所述根据所述指向位置确定所述指示图标的显示位置的步骤包括:
获取所述手持输入设备的当前工作模式;
根据预设的手持输入设备的工作模式与指示图标的移动灵敏度对应表,查询所述当前工作模式对应的灵敏度;
根据所述指向位置以及所述当前工作模式对应的灵敏度，确定所述指示图标的显示位置。
结合第一方面,在第一方面的第五种可能实现方式中,所述映射模型的参数包括第一指向、映射原点和映射系数,其中:
所述第一指向为采集所述第一图像时所述手持输入设备的指向,所述手持输入设备的指向包括所述手持输入设备的偏航角和俯仰角;
所述映射原点根据所述屏幕成像区域在所述第一图像中的位置所确定;
所述映射系数根据所述屏幕成像区域和采集所述第一图像的摄像模块的视场角所确定。
结合第一方面的第五种可能实现方式,在第一方面的第六种可能实现方式中,根据所述映射模型确定所述手持输入设备的当前指向所对应的屏幕参照面的指向位置的步骤包括:
确定所述手持输入设备的当前指向;
根据所述当前指向相对于所述第一指向的角度偏移及所述映射系数确定所述当前指向对应的指向位置相对于所述第一指向对应的指向位置的位置偏移;
根据所述位置偏移和所述映射原点确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置。
本申请实施例的第二方面提供了一种手持输入设备的指示图标的显示位置控制装置,所述手持输入设备的指示图标的显示位置控制装置包括:
第一图像采集单元,用于采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域;
映射模型建立单元,用于根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外;
显示位置确定单元,用于根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
结合第二方面,在第二方面的第一种可能实现方式中,所述显示位置控制装置还包括:
第二图像采集单元,用于采集显示在所述屏幕上的第二图像;
第二图像特征提取单元,用于提取所述第二图像的图像特征,其中,所述第二图像的图像特征为所述预取的屏幕特征。
结合第二方面,在第二方面的第二种可能实现方式中,所述显示位置控制装置还包括:
摄像模块控制单元,用于在采集所述手持输入设备指向的第一图像的步骤之前控制用于采集所述第一图像的摄像模块进入工作模式,在采集所述手持输入设备指向的第一图像的步骤之后控制用于采集所述第一图像的摄像模块进入节电模式;
和/或,
图像分析模块控制单元,用于在所述提取所述第一图像的图像特征的步骤之前控制用于提取所述图像特征的图像分析模块进入工作模式,在所述提取所述第一图像的图像特征的步骤之后控制用于提取所述图像特征的图像分析模块进入节电模式。
结合第二方面,在第二方面的第三种可能实现方式中,所述显示位置确定单元包括:
第一设置子单元,用于设置所述指示图标的显示位置为所述指向位置;
或者,第二设置子单元,用于根据当前时刻的所述指向位置以及至少一个之前的所述指向位置,确定所述指示图标的显示位置。
结合第二方面,在第二方面的第四种可能实现方式中,所述显示位置确定单元包括:
工作模式获取子单元,用于获取所述手持输入设备的当前工作模式;
灵敏度查询子单元,用于根据预设的手持输入设备的工作模式与指示图标的移动灵敏度对应表,查询所述当前工作模式对应的灵敏度;
显示位置确定子单元,用于根据所述指向位置以及所述当前工作模式对应的灵敏度,确定所述指示图标的显示位置。
结合第二方面,在第二方面的第五种可能实现方式中,所述映射模型的参数包括第一指向、映射原点和映射系数,其中:
所述第一指向为采集所述第一图像时所述手持输入设备的指向,所述手持输入设备的指向包括所述手持输入设备的偏航角和俯仰角;
所述映射原点根据所述屏幕成像区域在所述第一图像中的位置所确定;
所述映射系数根据所述屏幕成像区域和采集所述第一图像的摄像模块的视场角所确定。
结合第二方面的第五种可能实现方式,在第二方面的第六种可能实现方式中,所述显示位置确定单元包括:
当前指向确定子单元,用于确定所述手持输入设备的当前指向;
位置偏移确定子单元,用于根据所述当前指向相对于所述第一指向的角度偏移及所述映射系数确定所述当前指向对应的指向位置相对于所述第一指向对应的指向位置的位置偏移;
指向位置确定子单元,用于根据所述位置偏移和所述映射原点确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置。
本申请实施例的第三方面提供了一种手持输入设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,所述处理器执行所述计算机程序时实现如第一方面任一项所述手持输入设备的指示图标的显示位置控制方法的步骤。
本申请实施例的第四方面提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时实现如第一方面任一项所述手持输入设备的指示图标的显示位置控制方法的步骤。
本申请实施例与现有技术相比存在的有益效果是：在手持输入设备的使用过程中，获取手持输入设备指向的第一图像，识别所述第一图像的屏幕成像区域，根据所识别的屏幕成像区域确定将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型。根据所述映射模型确定手持输入设备的指向对应的屏幕参照面的指向位置，当指向位置处于屏幕内时在屏幕的相应位置上显示指示图标，当指向位置处于屏幕外时，在预设的屏幕边沿范围显示指示图标或隐藏所述指示图标。可见，本申请的手持输入设备的显示位置控制方法与现有技术的鼠标光标控制方法不同：本申请首先基于图像技术确定手持输入设备指向的屏幕位置并在该位置显示指示图标，即手持输入设备指向的初始位置和指示图标的首个显示位置是一致的，在操作过程中，即使手持输入设备的指向由屏幕内移至屏幕外，再从屏幕外移回屏幕内，指示图标的显示位置仍然保持与手持输入设备实际指向的位置相符，不会因为手持输入设备的指向在屏幕内外来回移动而导致偏差。因此，用户在操作过程中不用控制动作的幅度，甚至可以作为肢体语言的一部分任意挥动手持输入设备，操作自然，有利于提高用户使用体验。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例提供的一种手持输入设备的指示图标的显示位置控制方法的实现流程示意图;
图2是本申请实施例提供的通过手持输入设备的指向控制指示图标的示意图;
图3是本申请实施例提供的映射模型实施例一的参数的一种计算方法;
图4是本申请实施例提供的一种手持输入设备采集的第一图像的示意图;
图5是本申请实施例提供的根据映射模型实施例一确定指向位置的实现流程示意图;
图6是本申请实施例提供的一种手持输入设备的指示图标的显示位置控制装置的示意图;
图7是本申请实施例提供的手持输入设备的示意图。
具体实施方式
以下描述中,为了说明而不是为了限定,提出了诸如特定系统结构、技术之类的具体细节,以便透彻理解本申请实施例。然而,本领域的技术人员应当清楚,在没有这些具体细节的其它实施例中也可以实现本申请。在其它情况中,省略对众所周知的系统、装置、电路以及方法的详细说明,以免不必要的细节妨碍本申请的描述。
为了说明本申请所述的技术方案,下面通过具体实施例来进行说明。
图1为本申请实施例提供的一种手持输入设备的指示图标的显示位置控制方法的实现流程示意图,详述如下:
在步骤S101中,采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域。
本申请中的手持输入设备可以为空中鼠标、数字无线演示器,或者其它类型的输入设备,通常手持输入设备上设有按键,操作者可以通过按键指令来激活指示图标或触发其它操作。
所述手持输入设备由操作者手持操作,操作者在使用过程中通过控制手持输入设备在三维空间中的运动控制一个指示图标在显示设备的屏幕上移动,其中,所述指示图标是计算机图形用户界面上的一个图形对象,可以为一个指针形状的图标,如鼠标光标。当操作者控制所述图形对象移至目标位置后,即可通过手持输入设备上的功能按键触发相应的操作。所述显示设备可以为投影机、LCD/LED显示器等任何显示设备。
本申请所述的显示位置控制方法可以在操作者激活所述指示图标时触发运行,也可以当所述指示图标处于消隐状态时就触发运行,例如,可以在手持输入设备加电后立刻触发运行,或者当手持输入设备的俯仰角大于预设的角度时触发运行。采用先于指示图标激活的后台触发方式,当操作者激活指示图标时,指示图标的显示位置已确定,可以在屏幕上立刻显示指示图标,响应速度快。
图2所示为手持输入设备的一个应用场景,手持输入设备由操作者手持,图中的灰色区域为屏幕201,所述屏幕201包括但不局限于电子墙屏幕、显示器屏幕或投影屏幕等。手持输入设备指向屏幕201进行操作。如图2所示,手持输入设备在三维空间中的运动可以用手持输入设备的指向D的变化来表示,建立一个映射模型,通过该映射模型将手持输入设备的指向D映射为屏幕参照面202的一个指向位置,并根据该指向位置控制指示图标的显示位置,操作者即可通过操控手持输入设备的指向D控制指示图标在屏幕201上移动。
所述屏幕参照面可以根据屏幕的表面来确定。如果显示设备的屏幕表面为平面,优选的,如图2所示,所述屏幕参照面202设置为屏幕所在平面,其中,图中深灰色区域是屏幕201。如果屏幕表面是曲面,可以用屏幕投射面近似表示屏幕201,而屏幕参照面202为屏幕投射面所在的平面,或者屏幕参照面202也可以是曲面。要说明的是,图2所示的屏幕201是矩形,实际上屏幕201可以为其它任意形状,也可以由几个分离的子屏幕组成,对于异形屏幕,建立映射模型时与矩形屏幕在原理上是相同的。不失一般性的,本申请以矩形屏幕为例论述映射模型的建立过程。
要说明的是,所述手持输入设备的指向D所映射的屏幕参照面的指向位置与手持输入设备实际指向的位置是不同的概念,所述指向位置可能与手持输入设备实际指向的位置相符,也可能与手持输入设备实际指向的位置有偏差。例如,如图2所示,如果手持输入设备的指向D所映射的指向位置为手持输入设备的指向线204与屏幕参照面202的交点P,则指向D所映射的指向位置与手持输入设备实际指向的位置相符;如果手持输入设备的指向D所映射的指向位置为P1,则指向D所映射的指向位置与手持输入设备实际指向的位置有偏差。
在操作过程中,当手持输入设备实际指向的位置与指示图标的显示位置基本一致时,操作会自然舒适,反之,会有不适感。本申请公开一种指示图标的显示位置控制方法,可使手持输入设备的指向D所映射的指向位置与手持输入设备实际指向的位置相符,从而使手持输入设备所指的位置与指示图标的显示位置保持一致性,有利于提升用户体验。
如图2所示，所述手持输入设备中设有摄像模块203，所述摄像模块203拍摄手持输入设备指向的图像，该图像即为如图4所示的第一图像401。优选的，摄像模块203的安装方式可以为：手持输入设备的指向线204垂直于摄像模块203的感光传感器阵列平面并过其中心点。采用这种优选方式，手持输入设备的指向线204与屏幕参照面202的交点P在第一图像401中的成像点为第一图像401的中心点IP，利于映射模型的参数的计算。
获取所述第一图像401后,对所述第一图像401进行分析,提取所述第一图像的图像特征,将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像401中的屏幕成像区域402。所述预取的屏幕特征是用于识别屏幕201的图像特征。图像特征的分析和匹配算法是已知技术,不再赘述。要说明的是,屏幕201在第一图像中的实际成像区域通常是有变形的,根据摄像机的成像原理可以对其矫正,获得如图4所示的规则的屏幕成像区域402。
预取所述屏幕特征有多种实施方式,包括:
方式一:在显示设备上设置硬件识别装置,例如在屏幕201四周设置多个红外LED,预取LED数量和LED间距用作屏幕特征,在第一图像401中提取高亮的光点,将所述光点根据LED数量特征和LED间距特征进行匹配,可确定所述屏幕成像区域402。
方式二:采集第一图像401时,在屏幕201的特定位置(如屏幕正中心)上显示一个或多个特别的图形对象,预先提取这些图形对象的图像特征用作屏幕特征。
方式三:在采集第一图像401的同时或之前,采集显示在屏幕201上的第二图像,提取第二图像的图像特征用作屏幕特征。采集第二图像的一种实现方式可以采用一个独立的硬件装置,该装置连接在智能设备和显示设备之间的图像数据传输通道上,如HDMI线路上,该装置截获智能设备输出的第二图像。采集第二图像的另一种实现方式为软件截屏方式,在输出第二图像的智能设备上安装一个软件,该软件截取智能设备输出给显示设备的第二图像。采用方式三,相对于上述方式一,不需要设置特别的硬件装置,适用性好;相对于上述方式二,屏幕上不会时不时显示用于识别的图形对象,影响用户观感。
根据屏幕成像区域402建立将手持输入设备的指向D映射为屏幕参照面202的指向位置的映射模型,用该指向位置控制指示图标的显示位置,可以实现手持输入设备实际指向的位置与指示图标的显示位置的一致性。
在步骤S102中,根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外。
根据步骤S101获取的屏幕成像区域402,可以建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型。所述建立映射模型有多种实施方式。
映射模型实施例一:
本实施例的映射模型的参数包括第一指向、映射原点和映射系数。获取所述映射模型的参数的计算方法如图3所示,包括:
在步骤S301中,所述第一指向为采集所述第一图像时所述手持输入设备的指向,其中,所述手持输入设备的指向包括所述手持输入设备的偏航角和俯仰角;
在本实施例的映射模型中,用手持输入设备的偏航角和俯仰角对所述手持输入设备的指向进行建模。如图2所示,根据飞行器姿态的描述习惯,设手持输入设备的指向D为机头方向,则RY轴为俯仰轴、RZ轴为偏航轴,当手持输入设备绕着俯仰轴RY旋转时,其指向的位置P在屏幕上上下移动,当手持输入设备绕着偏航轴RZ旋转时,其指向的位置P在屏幕上 左右移动。可见,手持输入设备的偏航角和俯仰角可以自然有效地对手持输入设备的指向进行建模。
手持输入设备中设置惯性运动传感器,包括加速度计和陀螺仪,所述惯性运动传感器采集手持输入设备的加速度数据和陀螺仪数据。根据所述加速度数据和陀螺仪数据可以计算出手持输入设备的俯仰角和偏航角,即本实施例映射模型中的手持输入设备的指向。根据加速度数据和陀螺仪数据计算手持输入设备的姿态角是一种已知方法,如扩展卡尔曼滤波算法,本申请不再赘述。
根据手持输入设备的加速度数据和陀螺仪数据确定采集第一图像时手持输入设备的俯仰角和偏航角,即获得所述第一指向。
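作为参考，下面用一个互补滤波的简化示意说明如何由陀螺仪与加速度计数据估计俯仰角（纯 Python 实现；函数名、参数与滤波系数均为示意性假设，并非本申请限定的算法，实际产品可采用正文所述扩展卡尔曼滤波等已知方法）：

```python
import math

def complementary_pitch(pitch_prev, gyro_rate_y, ax, ay, az, dt, alpha=0.98):
    """互补滤波估计俯仰角（单位：度）。
    gyro_rate_y: 绕俯仰轴的角速度（度/秒）；ax/ay/az: 加速度计三轴读数（单位 g）。"""
    # 短期：对陀螺仪角速度积分，响应快但会漂移
    pitch_gyro = pitch_prev + gyro_rate_y * dt
    # 长期：由重力方向推算俯仰角，无漂移但噪声大
    pitch_acc = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    # 加权融合：高频成分取陀螺仪，低频成分取加速度计
    return alpha * pitch_gyro + (1 - alpha) * pitch_acc
```

偏航角的估计类似，由绕偏航轴的陀螺仪数据积分获得；加速度计无法直接观测偏航角，长期漂移需借助其他手段修正，这也与后文所述"定期采集第一图像并更新映射模型参数"的做法相呼应。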
在步骤S302中,所述映射原点根据所述屏幕成像区域在所述第一图像中的位置所确定。
如图2和图4所示,第一图像401中,屏幕成像区域402的左上角顶点IO为屏幕201的左上角顶点O的成像点,点IP为手持输入设备指向的位置P的成像点,点IP位于第一图像401的中心点。
本实施例的映射模型中,屏幕参照面202是一个平面,为屏幕参照面202设置一个笛卡尔坐标系,不失一般性的,该坐标系以屏幕201的左上角顶点O为原点,以屏幕X轴为其X轴,以屏幕Y轴为其Y轴。
根据第一图像401的中心点IP相对于点IO的位置，可以计算出手持输入设备实际指向的位置P相对于屏幕201的左上角顶点O的坐标偏移，从而获得位置P在屏幕参照面202中的坐标，该坐标即为所述映射原点(VX₀, VY₀)。计算公式如下：
VX₀=(CX/SW)*MaxX，VY₀=(CY/SH)*MaxY    (1)
其中,CX和CY分别为点IP相对点IO的水平像点数和垂直像点数;SW和SH分别为屏幕成像区域402的水平像点数和垂直像点数;MaxX和MaxY分别是屏幕201的水平像点数和垂直像点数,例如,设屏幕201的屏幕分辨率为1920x1080,则MaxX为1920,MaxY为1080。
值得说明的是,所述映射原点为所述第一指向对应的屏幕参照面的指向位置,如果拍摄第一图像401时手持输入设备指向屏幕内的位置,则映射原点位于屏幕内,否则,映射原点位于屏幕外。
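上述公式(1)的计算可示意如下（纯 Python，函数名与变量名为示意性假设）：

```python
def mapping_origin(cx, cy, sw, sh, max_x, max_y):
    """公式(1)：由第一图像中心点IP相对屏幕成像区域左上角IO的像点偏移(cx, cy)、
    屏幕成像区域的像点数(sw, sh)和屏幕的像点数(max_x, max_y)，
    计算映射原点(VX0, VY0)，即第一指向对应的屏幕参照面坐标。"""
    vx0 = cx / sw * max_x
    vy0 = cy / sh * max_y
    return vx0, vy0
```

例如屏幕分辨率为1920x1080、屏幕成像区域为640x360像点、点IP相对点IO偏移(320, 180)时，映射原点为(960.0, 540.0)，即第一指向恰好对应屏幕正中心。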
在步骤S303中,所述映射系数根据所述屏幕成像区域和采集所述第一图像的摄像模块的视场角所确定。
所述映射系数的物理含义是手持输入设备的偏航角的每度角对应的屏幕参照面的X轴像点数,以及手持输入设备的俯仰角的每度角对应的屏幕参照面的Y轴像点数。
所述映射系数的确定有多种实施方式,在一种实施方式中,映射系数S的计算公式为:
S=MaxX/(Φ*SW/PicX)      (2)
其中,如图2和图4所示,Φ为摄像模块的水平视场角,MaxX是屏幕201的水平像点数,SW为屏幕成像区域402的水平像点数,PicX为第一图像401的水平像点数。
要说明的是，公式(2)确定的映射系数是简化的，可使计算更简单。实际上，当手持输入设备指向不同位置时，手持输入设备的偏航角或俯仰角偏移相同的角度对应的屏幕参照面的像点数的偏移是不同的。在另一实施例中，根据已知的摄像机的成像原理，为手持输入设备的不同指向确定相应的映射系数。在该实施例中，确定的映射系数可以是一个输入参数为手持输入设备的指向的函数，也可以是一个离散表，表中的每一项为一个区段的手持输入设备的指向所对应的映射系数。
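公式(2)的简化映射系数计算可示意如下（纯 Python，函数名与变量名为示意性假设）：

```python
def mapping_coefficient(max_x, fov_deg, sw, pic_x):
    """公式(2)的简化映射系数S：偏航角每度对应的屏幕参照面X轴像点数。
    max_x: 屏幕水平像点数；fov_deg: 摄像模块水平视场角Φ（度）；
    sw: 屏幕成像区域水平像点数；pic_x: 第一图像水平像点数。"""
    # Φ*SW/PicX 为屏幕在水平方向上所占的视场角度数
    return max_x / (fov_deg * sw / pic_x)
```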
根据映射模型实施例一所述的映射模型,可以确定所述手持输入设备的当前指向所对应的屏幕参照面的指向位置,其计算过程如图5所示。
在步骤S501中,确定所述手持输入设备的当前指向。
根据手持输入设备的加速度数据和陀螺仪数据可以确定所述手持输入设备的当前指向，所述当前指向包括手持输入设备的当前俯仰角PAₙ和当前偏航角YAₙ。其计算方法与计算所述第一指向的方法相同。
在步骤S502中,根据所述当前指向相对于所述第一指向的角度偏移及所述映射系数确定所述当前指向对应的指向位置相对于所述第一指向对应的指向位置的位置偏移。
所述位置偏移的计算公式如下:
d_VXₙ=(YAₙ-YA₀)*S，d_VYₙ=(PAₙ-PA₀)*S
其中，(d_VXₙ, d_VYₙ)为所述位置偏移，PAₙ和YAₙ分别为所述当前指向的俯仰角和偏航角，PA₀和YA₀分别为所述第一指向的俯仰角和偏航角，S为映射系数。
要说明的是,上述公式中,映射系数S是公式(2)获取的常数,如果手持输入设备的不同指向有不同的映射系数,则上述公式为一个积分算式或累加算式。
在步骤S503中,根据所述位置偏移和所述映射原点确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置。
所述当前指向对应的指向位置的计算公式如下:
VXₙ=VX₀+d_VXₙ，VYₙ=VY₀+d_VYₙ
其中，(VXₙ, VYₙ)为所述指向位置，(VX₀, VY₀)为所述映射原点，(d_VXₙ, d_VYₙ)为所述位置偏移。
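图5所示步骤S501～S503的计算过程可用如下代码示意（纯 Python，函数名与变量名为示意性假设，映射系数按公式(2)取常数）：

```python
def pointing_position(yaw_n, pitch_n, yaw0, pitch0, vx0, vy0, s):
    """由当前指向(yaw_n, pitch_n)相对第一指向(yaw0, pitch0)的角度偏移，
    乘以映射系数s得到位置偏移，再叠加映射原点(vx0, vy0)，
    即得当前指向对应的屏幕参照面指向位置(VXn, VYn)。"""
    d_vx = (yaw_n - yaw0) * s    # 偏航角偏移 -> X轴位置偏移
    d_vy = (pitch_n - pitch0) * s  # 俯仰角偏移 -> Y轴位置偏移
    return vx0 + d_vx, vy0 + d_vy
```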
根据映射模型实施例一所述的映射模型,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外。
手持输入设备的偏航角和俯仰角的范围为0-360度,由图5所示的指向位置的计算步骤可知,如果两个时刻手持输入设备的指向相同,那么,这两个时刻手持输入设备的指向所对应的屏幕参照面的指向位置相同,与这两个时刻间的手持输入设备的移动轨迹无关。根据这个特性可以为手持输入设备的指向划分范围,包括第一指向范围和第二指向范围。
不失一般性的,设映射系数S为公式(2)所得的常数,则俯仰角PA和偏航角YA满足下列条件的手持输入设备的指向属于所述第一指向范围:
PA₀-VY₀/S ≤ PA < PA₀+(MaxY-VY₀)/S，且
YA₀-VX₀/S ≤ YA < YA₀+(MaxX-VX₀)/S    (3)
要说明的是,手持输入设备的偏航角和俯仰角的范围为0-360度,为了使表达式简明,上述表达式(3)中没有对针对0°和360°的边界条件进行处理。
根据图5所示的指向位置的计算步骤,上述公式(3)所示的第一指向范围中的指向所映射的屏幕参照面的指向位置的范围为:0≤VX<MaxX且0≤VY<MaxY。可见第一指向范围中的指向所映射的指向位置位于屏幕内。
不属于第一指向范围的指向属于第二指向范围。第二指向范围中的指向所映射的屏幕参照面的指向位置位于屏幕外。
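表达式(3)的判断逻辑可示意如下（纯 Python，函数名与变量名为示意性假设；与正文的简化一致，未处理0°/360°的边界回绕）：

```python
def in_first_range(pitch, yaw, pitch0, yaw0, vx0, vy0, s, max_x, max_y):
    """判断指向(pitch, yaw)是否属于第一指向范围，
    即其映射的屏幕参照面指向位置是否落在屏幕内（0≤VX<MaxX 且 0≤VY<MaxY）。"""
    return (pitch0 - vy0 / s <= pitch < pitch0 + (max_y - vy0) / s
            and yaw0 - vx0 / s <= yaw < yaw0 + (max_x - vx0) / s)
```

不满足该条件的指向即属于第二指向范围。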
要说明的是,手持设备在三维空间中的运动除了偏航、俯仰等旋转运动,还有平移运动。手持输入设备的平移运动会导致根据上述映射模型确定的屏幕参照面的指向位置与手持输入设备实际指向的位置有偏差。在另一实施例中,进一步结合手持输入设备的平移运动,对图5所示步骤计算所得的指向位置进行补偿。可以根据手持输入设备的加速度数据和陀螺仪数据计算手持输入设备的平移运动,是已知方法,本申请不再赘述。
要说明的是,消费级惯性运动传感器的精度较低,会产生累积误差,需要定期地采集所述第一图像并更新映射模型的参数。
映射模型实施例二:
本实施例的映射模型是一个三维模型。根据所述第一图像,可以将屏幕在三维空间中的位置和手持输入设备在三维空间中的指向进行三维建模,该三维模型即为本实施例的映射模型。有多种已知的三维建模方法,如单目视觉三维建模、双目视觉三维建模或者深度摄像头三维建模等,不再赘述。
在三维模型中,可以对曲面进行建模,因此屏幕参照面202可以是平面,也可以是曲面,根据三维模型可获得手持输入设备的指向线204与屏幕参照面202的交点,该交点即为该指向所对应的指向位置,指向线204与屏幕参照面的交点位于屏幕内的指向属于第一指向范围,指向线与屏幕参照面的交点位于屏幕外的指向属于第二指向范围。本实施例的三维模型可对曲面屏幕准确建模,保证手持输入设备指向的位置与指示图标的显示位置的一致性,不过对系统的计算性能要求较高,适用于计算能力强和电池容量大的手持输入设备。
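在三维映射模型中，指向所对应的指向位置即指向线与屏幕参照面的交点。以平面屏幕参照面为例，射线与平面求交可示意如下（纯 Python，向量用三元组表示，函数名为示意性假设）：

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """求手持输入设备指向线（以origin为起点、direction为方向的射线）
    与屏幕参照面所在平面（过plane_point、法向量plane_normal）的交点；
    指向线与平面平行或交点在射线反方向时返回None。"""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < 1e-9:
        return None  # 指向线与平面平行，无交点
    t = sum((p - o) * n for p, o, n in zip(plane_point, origin, plane_normal))
    t /= dot
    if t < 0:
        return None  # 交点位于射线反方向（设备背对屏幕）
    return tuple(o + t * d for o, d in zip(origin, direction))
```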
在步骤S103中,根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
由于指示图标是有一定大小的,如32x32个像点,所述指示图标的显示位置为指示图标的参考点的位置,通常可以选择指示图标的几何中心点为参考点。根据本申请所述映射模型将手持输入设备的指向映射为屏幕参照面的指向位置后,根据指向位置所在的区域控制显示图标的显示方式,即:如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
如果所述指向位置位于屏幕内，根据所述指向位置确定指示图标的显示位置有多种实施方式。一种实施方式为，将指示图标的显示位置设置为指向位置，即直接在指向位置上显示指示图标。另一种实施方式为，将当前指向位置与之前一段时间的指向位置进行拟合或者低通滤波处理，从而确定指示图标的显示位置，使得指示图标的移动轨迹更平滑美观。另一种实施方式为，设置相应的防抖动规则、防误操作规则以及误差范围限定规则等，结合当前时刻的指向位置以及之前的一个或者多个指向位置，实现防抖动、防误操作等。例如，判断当前时刻的指向位置与之前一段时间内的指向位置是否处于一个预设半径的圆内，如果处于一个圆内，则指示图标保持静止，否则，指示图标从静止转向移动，实现指示图标的去抖动。
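上述防抖动规则（判断当前指向位置与之前一段时间的指向位置是否处于一个预设半径的圆内）可示意如下（纯 Python，函数名与参数为示意性假设）：

```python
def should_stay_still(current, history, radius):
    """防抖动判断：以当前指向位置current为圆心，
    若之前一段时间的指向位置history全部落在半径radius的圆内，
    则认为是抖动，指示图标保持静止。坐标以(x, y)元组表示。"""
    cx, cy = current
    return all((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 for x, y in history)
```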
在另一个实施例中,预设手持输入设备的工作模式与指示图标的移动灵敏度对应表,根据手持输入设备的工作模式对指示图标的移动灵敏度进行控制,有利于操作的自如性。例如,单纯移动指示图标时,可以设置较高的灵敏度,而拖动一个图形对象时,可以设置较低的灵敏度。在根据指向位置确定指示图标的显示位置时,首先获取所述手持输入设备的当前工作模式,查询所述当前工作模式对应的灵敏度,然后根据该灵敏度将所述指向位置转化为指示图标的显示位置。
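按工作模式查询灵敏度并确定显示位置的过程可示意如下（纯 Python；表中的工作模式名称与灵敏度数值均为示意性假设，anchor 表示灵敏度缩放的参考点，也是一种示意性简化）：

```python
# 工作模式与指示图标移动灵敏度对应表（名称与数值均为示意性假设）
SENSITIVITY_TABLE = {
    "move_cursor": 1.0,  # 单纯移动指示图标：较高灵敏度
    "drag_object": 0.5,  # 拖动图形对象：较低灵敏度
}

def display_position(pointing, anchor, mode, table=SENSITIVITY_TABLE):
    """查询当前工作模式mode对应的灵敏度，
    将指向位置pointing相对参考点anchor的偏移按灵敏度缩放，得到显示位置。"""
    s = table[mode]
    px, py = pointing
    ax, ay = anchor
    return ax + (px - ax) * s, ay + (py - ay) * s
```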
如果所述指向位置位于屏幕外,则隐藏所述指示图标,或者将所述指示图标的显示位置设置在预设的屏幕边沿范围内。
当手持输入设备的指向所映射的指向位置位于屏幕外时,隐藏所述指示图标,不在屏幕上显示指示图标,当手持输入设备的指向对应的指向位置移回屏幕内时,再重新显示指示图标。或者,将所述指示图标的显示位置设置在预设的屏幕边沿范围内,有多种实施方式:
一种实施方式中,预设的屏幕边沿范围为屏幕边沿,其中,屏幕边沿的坐标(X,Y)满足条件:X=0或X=MaxX-1或Y=0或Y=MaxY-1。控制指示图标在屏幕边沿上静止不动或在屏幕边沿上随着手持输入设备的指向滑动。
另一种实施方式中,预设距离屏幕边沿若干个像点的屏幕区域为屏幕边沿范围,例如,设置一个距离屏幕边沿10个像点宽度的屏幕区域为所述屏幕边沿范围,即坐标范围为X∈[0,9]或X∈[MaxX–10,MaxX-1]或Y∈[0,9]或Y∈[MaxY–10,MaxY-1]的屏幕区域。指向位置距离屏幕越远,所设置的指示图标的显示位置越靠近屏幕边沿。这种实施方式下,当手持输入设备的指向远离或靠近屏幕时,指示图标在该屏幕边沿范围内移动,提醒操作者手持输入设备的指向是远离还是靠近屏幕。
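其中第一种实施方式（将显示位置钳制到屏幕边沿上）可示意如下（纯 Python，函数名为示意性假设；第二种实施方式可在此基础上按指向位置与屏幕的距离，把坐标进一步映射到距边沿若干像点的边沿范围内）：

```python
def clamp_to_screen_edge(vx, vy, max_x, max_y):
    """当指向位置(vx, vy)位于屏幕外时，把指示图标的显示位置钳制到屏幕边沿，
    使坐标满足 X=0 或 X=MaxX-1 或 Y=0 或 Y=MaxY-1（至少一个方向取边界值）。"""
    x = min(max(vx, 0), max_x - 1)
    y = min(max(vy, 0), max_y - 1)
    return x, y
```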
根据本申请所述的显示位置控制方法确定了所述指示图标的显示位置后,可以将该显示位置传送给智能设备的操作系统,由操作系统在该显示位置上显示指示图标,也可以传送给智能设备上的应用软件,由该应用软件在该显示位置上显示指示图标,或者还可以采用其它方式,本申请不作限定。
进一步的,为了提高手持输入设备的续航使用时间,在另一个实施例中,还包括步骤:在采集所述手持输入设备指向的第一图像的步骤之前控制用于采集所述第一图像的摄像模块进入工作模式,在采集所述手持输入设备指向的第一图像的步骤之后控制用于采集所述第一图像的摄像模块进入节电模式,所述节电模式为休眠或者断电。
进一步的,在另一个实施例中,所述第一图像的分析工作在手持输入设备中进行,由于图像分析模块的耗电量较大,还包括步骤:在所述提取所述第一图像的图像特征的步骤之前控制用于提取所述图像特征的图像分析模块进入工作模式,在所述提取所述第一图像的图像特征的步骤之后控制用于提取所述图像特征的图像分析模块进入节电模式,所述节电模式为休眠或者断电。
本申请的指示图标的显示位置控制方法中，首先基于图像分析技术确定手持输入设备初始指向的位置，保证指示图标的初始显示位置与手持输入设备的指向对准。然后建立一个映射模型，在该映射模型中，手持输入设备的指向的范围包括第一指向范围和第二指向范围，第一指向范围中的指向所映射的指向位置位于屏幕内，可控制指示图标在屏幕内移动，第二指向范围中的指向所映射的指向位置位于屏幕外。当手持输入设备的指向在第二指向范围内以任意轨迹改变时，指示图标总是显示于预设的屏幕边沿范围内或隐藏，不会导致手持输入设备实际指向的位置与指示图标的显示位置之间产生偏差。这一点是本申请的指示图标的显示位置控制方法与现有技术的鼠标光标的显示位置控制方法的根本不同点。现有技术的鼠标光标控制方法为：沿着一个方向移动鼠标，使鼠标光标移至屏幕边沿，继续沿着该方向移动鼠标，鼠标光标会保持显示在屏幕边沿，但是当鼠标沿相反方向移动时，鼠标光标会立刻往屏幕内移动。因此，在传统鼠标光标控制方法中实质上只有一个指向范围，所有指向都可能控制鼠标光标在屏幕内移动。由于手持操作的稳定性较差，在操作过程中，很容易指向屏幕外面，若采用传统鼠标的控制方法，手持输入设备的指向在屏幕内外来回移动会导致手持输入设备指向的位置与指示图标的显示位置的偏差越来越大，操作越来越不自然。采用本申请的指示图标的显示位置控制方法，在操作过程中，手持输入设备的指向在屏幕内外任意移动，指示图标的显示位置仍然保持与手持输入设备指向的位置相符。因此，用户在操作过程中不用控制动作的幅度，有利于提高用户使用体验。本申请的指示图标的显示位置控制方法特别适用于数字无线演示器，在演示过程中，演讲者对指示图标的操控自然，可以作为肢体语言的一部分任意挥动手持输入设备，符合演讲时的动作习惯。
应理解,上述实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
图6为本申请实施例提供的一种手持输入设备的指示图标的显示位置控制装置的示意图,所述手持输入设备的指示图标的显示位置控制装置包括:
第一图像采集单元601,用于采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域;
映射模型建立单元602,用于根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外;
显示位置确定单元603,用于根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置在屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
优选的,所述显示位置控制装置还包括:
第二图像采集单元,用于采集显示在所述屏幕上的第二图像;
第二图像特征提取单元,用于提取所述第二图像的图像特征,其中,所述第二图像的图像特征为所述预取的屏幕特征。
优选的,所述显示位置控制装置还包括:
摄像模块控制单元,用于在采集所述手持输入设备指向的第一图像的步骤之前控制用于采集所述第一图像的摄像模块进入工作模式,在采集所述手持输入设备指向的第一图像的步骤之后控制用于采集所述第一图像的摄像模块进入节电模式;
和/或,
图像分析模块控制单元,用于在所述提取所述第一图像的图像特征的步骤之前控制用于提取所述图像特征的图像分析模块进入工作模式,在所述提取所述第一图像的图像特征的步骤之后控制用于提取所述图像特征的图像分析模块进入节电模式。
优选的,所述显示位置确定单元603包括:
第一设置子单元,用于设置所述指示图标的显示位置为所述指向位置;
或者,第二设置子单元,用于根据当前时刻的所述指向位置以及至少一个之前的所述指向位置,确定所述指示图标的显示位置。
优选的,所述显示位置确定单元603包括:
工作模式获取子单元,用于获取所述手持输入设备的当前工作模式;
灵敏度设置子单元,用于根据预设的手持输入设备的工作模式与指示图标的移动灵敏度对应表,查询并设置所述当前工作模式对应的灵敏度。
显示位置确定子单元,用于根据所述指向位置以及所述当前工作模式对应的灵敏度,确定所述指示图标的显示位置。
优选的,所述映射模型建立单元602包括:
第一指向确定子单元,用于确定第一指向,所述第一指向为采集所述第一图像时所述手持输入设备的指向,其中,所述手持输入设备的指向包括所述手持输入设备的偏航角和俯仰角;
以及,映射原点确定子单元,用于根据所述屏幕成像区域在所述第一图像中的位置确定映射原点;
以及,映射系数确定子单元,用于根据所述屏幕成像区域和采集所述第一图像的摄像模块的视场角确定映射系数。
图6所述手持输入设备的指示图标的显示位置控制装置,与图1所述的手持输入设备的指示图标的显示位置控制方法对应。
本申请所述显示位置控制方法涉及的电子设备有手持输入设备、智能设备和显示设备,还可能有其它电子设备,如在智能设备和显示设备之间的数据处理设备。本申请所述显示位置控制方法可以由上述相关电子设备的操作系统或者应用程序来实施,也可以通过便携式存储设备存储所述控制方法的程序,当所述便携式存储设备连接至上述相关设备时,触发上述相关设备执行所述控制方法的程序实施。上述智能设备可以为桌面电脑、手机、平板电脑、嵌入式设备及云计算设备等电子设备。
本申请所述显示位置控制装置的各个模块在上述手持输入设备、智能设备、显示设备以及数据处理设备中的具体部署方式,本申请不作限定。上述智能设备、显示设备及数据处理设备可以是独立的设备,也可以是集成的一体化设备。上述设备之间的连接方式可以是有线连接,也可以是无线连接。
图7是本申请一实施例提供的手持输入设备的示意图。如图7所示，该实施例的手持输入设备7包括：处理器70、存储器71以及存储在所述存储器71中并可在所述处理器70上运行的计算机程序72，例如手持输入设备的指示图标的显示位置控制程序。所述处理器70执行所述计算机程序72时实现上述各个手持输入设备的指示图标的显示位置控制方法实施例中的步骤。或者，所述处理器70执行所述计算机程序72时实现上述各装置实施例中各模块/单元的功能。
示例性的,所述计算机程序72可以被分割成一个或多个模块/单元,所述一个或者多个模块/单元被存储在所述存储器71中,并由所述处理器70执行,以完成本申请。所述一个或多个模块/单元可以是能够完成特定功能的一系列计算机程序指令段,该指令段用于描述所述计算机程序72在所述手持输入设备7中的执行过程。例如,所述计算机程序72可以被分割成:
第一图像采集单元,用于采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域;
映射模型建立单元,用于根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外;
显示位置确定单元,用于根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
所述手持输入设备可包括,但不仅限于,处理器70、存储器71。本领域技术人员可以理解,图7仅仅是手持输入设备7的示例,并不构成对手持输入设备7的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件,例如所述手持输入设备还可以包括输入输出设备、网络接入设备、总线等。所述手持输入设备可以是一个设备,也可以由一个以上分离的子设备组成,例如手持输入设备包括两个子设备,其中一个子设备由操作者手持操作,另一个子设备用于第一图像的分析,两个子设备间用无线通讯实现连接。
所称处理器70可以是中央处理单元(Central Processing Unit,CPU),还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
所述存储器71可以是所述手持输入设备7的内部存储单元,例如手持输入设备7的硬盘或内存。所述存储器71也可以是所述手持输入设备7的外部存储设备,例如所述手持输入设备7上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,所述存储器71还可以既包括所述手持输入设备7的内部存储单元也包括外部存储设备。所述存储器71用于存储所述计算机程序以及所述手持输入设备所需的其他程序和数据。所述存储器71还可以用于暂时地存储已经输出或者将要输出的数据。
所属领域的技术人员可以清楚地了解到，为了描述的方便和简洁，仅以上述各功能单元、模块的划分进行举例说明，实际应用中，可以根据需要而将上述功能分配由不同的功能单元、模块完成，即将所述装置的内部结构划分成不同的功能单元或模块，以完成以上描述的全部或者部分功能。实施例中的各功能单元、模块可以集成在一个处理单元中，也可以是各个单元单独物理存在，也可以两个或两个以上单元集成在一个单元中，上述集成的单元既可以采用硬件的形式实现，也可以采用软件功能单元的形式实现。另外，各功能单元、模块的具体名称也只是为了便于相互区分，并不用于限制本申请的保护范围。上述系统中单元、模块的具体工作过程，可以参考前述方法实施例中的对应过程，在此不再赘述。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述或记载的部分,可以参见其它实施例的相关描述。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在本申请所提供的实施例中,应该理解到,所揭露的装置/终端设备和方法,可以通过其它的方式实现。例如,以上所描述的装置/终端设备实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通讯连接可以是通过一些接口,装置或单元的间接耦合或通讯连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的模块/单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实现上述实施例方法中的全部或部分流程,也可以通过计算机程序来指令相关的硬件来完成,所述的计算机程序可存储于一计算机可读存储介质中,该计算机程序在被处理器执行时,可实现上述各个方法实施例的步骤。其中,所述计算机程序包括计算机程序代码,所述计算机程序代码可以为源代码形式、对象代码形式、可执行文件或某些中间形式等。所述计算机可读介质可以包括:能够携带所述计算机程序代码的任何实体或装置、记录介质、U盘、移动硬盘、磁碟、光盘、计算机存储器、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、电载波信号、电信信号以及软件分发介质等。需要说明的是,所述计算机可读介质包含的内容可以根据司法管辖区内立法和专利实践的要求进行适当的增减,例如在某些司法管辖区,根据立法和专利实践,计算机可读介质不包括是电载波信号和电信信号。
以上所述实施例仅用以说明本申请的技术方案，而非对其限制；尽管参照前述实施例对本申请进行了详细的说明，本领域的普通技术人员应当理解：其依然可以对前述各实施例所记载的技术方案进行修改，或者对其中部分技术特征进行等同替换；而这些修改或者替换，并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围，均应包含在本申请的保护范围之内。

Claims (16)

  1. 一种手持输入设备的指示图标的显示位置控制方法,其特征在于,所述手持输入设备的指示图标的显示位置控制方法包括:
    采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域;
    根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外;
    根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
  2. 根据权利要求1所述的手持输入设备的指示图标的显示位置控制方法,其特征在于,在所述将所述第一图像的图像特征与预取的屏幕特征进行匹配的步骤之前,所述显示位置控制方法还包括:
    采集显示在所述屏幕上的第二图像;
    提取所述第二图像的图像特征,其中,所述第二图像的图像特征为所述预取的屏幕特征。
  3. 根据权利要求1所述的手持输入设备的指示图标的显示位置控制方法,其特征在于,所述显示位置控制方法还包括:
    在采集所述手持输入设备指向的第一图像的步骤之前控制用于采集所述第一图像的摄像模块进入工作模式,在采集所述手持输入设备指向的第一图像的步骤之后控制用于采集所述第一图像的摄像模块进入节电模式;
    和/或,
    在所述提取所述第一图像的图像特征的步骤之前控制用于提取所述图像特征的图像分析模块进入工作模式,在所述提取所述第一图像的图像特征的步骤之后控制用于提取所述图像特征的图像分析模块进入节电模式。
  4. 根据权利要求1所述的手持输入设备的指示图标的显示位置控制方法,其特征在于,所述根据所述指向位置确定所述指示图标的显示位置的步骤包括:
    设置所述指示图标的显示位置为所述指向位置;
    或者,根据当前时刻的所述指向位置以及至少一个之前的所述指向位置,确定所述指示图标的显示位置。
  5. 根据权利要求1所述的手持输入设备的指示图标的显示位置控制方法,其特征在于,所述根据所述指向位置确定所述指示图标的显示位置的步骤包括:
    获取所述手持输入设备的当前工作模式;
    根据预设的手持输入设备的工作模式与指示图标的移动灵敏度对应表,查询所述当前工作模式对应的灵敏度;
    根据所述指向位置以及所述当前工作模式对应的灵敏度,确定所述指示图标的显示位置。
  6. 根据权利要求1所述的手持输入设备的指示图标的显示位置控制方法，其特征在于，所述映射模型的参数包括第一指向、映射原点和映射系数，其中：
    所述第一指向为采集所述第一图像时所述手持输入设备的指向,所述手持输入设备的指向包括所述手持输入设备的偏航角和俯仰角;
    所述映射原点根据所述屏幕成像区域在所述第一图像中的位置所确定;
    所述映射系数根据所述屏幕成像区域和采集所述第一图像的摄像模块的视场角所确定。
  7. 根据权利要求6所述的手持输入设备的指示图标的显示位置控制方法,其特征在于,根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置的步骤包括:
    确定所述手持输入设备的当前指向;
    根据所述当前指向相对于所述第一指向的角度偏移及所述映射系数确定所述当前指向对应的指向位置相对于所述第一指向对应的指向位置的位置偏移;
    根据所述位置偏移和所述映射原点确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置。
  8. 一种手持输入设备的指示图标的显示位置控制装置,其特征在于,所述手持输入设备的指示图标的显示位置控制装置包括:
    第一图像采集单元,用于采集所述手持输入设备指向的第一图像,提取所述第一图像的图像特征;将所述第一图像的图像特征与预取的屏幕特征进行匹配,识别在所述第一图像中的屏幕成像区域;
    映射模型建立单元,用于根据所述屏幕成像区域建立将所述手持输入设备的指向映射为屏幕参照面的指向位置的映射模型,其中,所述映射模型中,所述手持输入设备的指向的范围包括第一指向范围和第二指向范围,所述第一指向范围中的指向所映射的指向位置位于屏幕内,所述第二指向范围中的指向所映射的指向位置位于屏幕外;
    显示位置确定单元,用于根据所述映射模型确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置,如果该指向位置位于屏幕内,根据所述指向位置确定所述指示图标的显示位置;否则,将所述指示图标的显示位置设置在预设的屏幕边沿范围内或者隐藏所述指示图标。
  9. 根据权利要求8所述的手持输入设备的指示图标的显示位置控制装置,其特征在于,所述显示位置控制装置还包括:
    第二图像采集单元,用于采集显示在所述屏幕上的第二图像;
    第二图像特征提取单元,用于提取所述第二图像的图像特征,其中,所述第二图像的图像特征为所述预取的屏幕特征。
  10. 根据权利要求8所述的手持输入设备的指示图标的显示位置控制装置,其特征在于,所述显示位置控制装置还包括:
    摄像模块控制单元,用于在采集所述手持输入设备指向的第一图像的步骤之前控制用于采集所述第一图像的摄像模块进入工作模式,在采集所述手持输入设备指向的第一图像的步骤之后控制用于采集所述第一图像的摄像模块进入节电模式;
    和/或,
    图像分析模块控制单元，用于在所述提取所述第一图像的图像特征的步骤之前控制用于提取所述图像特征的图像分析模块进入工作模式，在所述提取所述第一图像的图像特征的步骤之后控制用于提取所述图像特征的图像分析模块进入节电模式。
  11. 根据权利要求8所述的手持输入设备的指示图标的显示位置控制装置,其特征在于,所述显示位置确定单元包括:
    第一设置子单元,用于设置所述指示图标的显示位置为所述指向位置;
    或者,第二设置子单元,用于根据当前时刻的所述指向位置以及至少一个之前的所述指向位置,确定所述指示图标的显示位置。
  12. 根据权利要求8所述的手持输入设备的指示图标的显示位置控制装置,其特征在于,所述显示位置确定单元包括:
    工作模式获取子单元,用于获取所述手持输入设备的当前工作模式;
    灵敏度查询子单元,用于根据预设的手持输入设备的工作模式与指示图标的移动灵敏度对应表,查询所述当前工作模式对应的灵敏度;
    显示位置确定子单元,用于根据所述指向位置以及所述当前工作模式对应的灵敏度,确定所述指示图标的显示位置。
  13. 根据权利要求8所述的手持输入设备的指示图标的显示位置控制装置,其特征在于,所述映射模型的参数包括第一指向、映射原点和映射系数,其中:
    所述第一指向为采集所述第一图像时所述手持输入设备的指向,所述手持输入设备的指向包括所述手持输入设备的偏航角和俯仰角;
    所述映射原点根据所述屏幕成像区域在所述第一图像中的位置所确定;
    所述映射系数根据所述屏幕成像区域和采集所述第一图像的摄像模块的视场角所确定。
  14. 根据权利要求13所述的手持输入设备的指示图标的显示位置控制装置,其特征在于,所述显示位置确定单元包括:
    当前指向确定子单元,用于确定所述手持输入设备的当前指向;
    位置偏移确定子单元,用于根据所述当前指向相对于所述第一指向的角度偏移及所述映射系数确定所述当前指向对应的指向位置相对于所述第一指向对应的指向位置的位置偏移;
    指向位置确定子单元,用于根据所述位置偏移和所述映射原点确定所述手持输入设备的当前指向对应的屏幕参照面的指向位置。
  15. 一种手持输入设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现如权利要求1至7任一项所述手持输入设备的指示图标的显示位置控制方法的步骤。
  16. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至7任一项所述手持输入设备的指示图标的显示位置控制方法的步骤。
PCT/CN2020/100298 2019-07-05 2020-07-04 手持输入设备及其指示图标的显示位置控制方法和装置 WO2021004412A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910603033.8 2019-07-05
CN201910603033.8A CN110489027B (zh) 2019-07-05 2019-07-05 手持输入设备及其指示图标的显示位置控制方法和装置

Publications (1)

Publication Number Publication Date
WO2021004412A1 true WO2021004412A1 (zh) 2021-01-14

Family

ID=68546713

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/100298 WO2021004412A1 (zh) 2019-07-05 2020-07-04 手持输入设备及其指示图标的显示位置控制方法和装置

Country Status (2)

Country Link
CN (1) CN110489027B (zh)
WO (1) WO2021004412A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114030355A (zh) * 2021-11-15 2022-02-11 智己汽车科技有限公司 一种车辆控制方法、装置、车辆及介质

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN110489027B (zh) * 2019-07-05 2021-07-23 深圳市格上格创新科技有限公司 手持输入设备及其指示图标的显示位置控制方法和装置
CN110489026A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 一种手持输入设备及其指示图标的消隐控制方法和装置
CN111556223B (zh) * 2020-04-28 2022-08-23 广州视源电子科技股份有限公司 指向信息确定方法、装置、演示设备、显示设备及介质
CN115079888A (zh) * 2021-03-10 2022-09-20 北京有竹居网络技术有限公司 显示设备的控制方法、装置、终端和存储介质

Citations (6)

Publication number Priority date Publication date Assignee Title
CN101398721A (zh) * 2007-09-26 2009-04-01 昆盈企业股份有限公司 空中鼠标的光标位移速度控制方法
CN103902061A (zh) * 2012-12-25 2014-07-02 华为技术有限公司 空中鼠标的光标显示方法、设备及系统
CN106774868A (zh) * 2016-12-06 2017-05-31 杨超峰 一种无线演示装置
US20190121449A1 (en) * 2009-06-04 2019-04-25 Sony Corporation Control device, input device, control system, handheld device, and control method
CN110489027A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 手持输入设备及其指示图标的显示位置控制方法和装置
CN110489026A (zh) * 2019-07-05 2019-11-22 深圳市格上格创新科技有限公司 一种手持输入设备及其指示图标的消隐控制方法和装置

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
TWI396117B (zh) * 2007-10-30 2013-05-11 Imu Solutions Inc 用於控制游標移動的加速度計資料處理方法及游標控制裝置
CN102467254B (zh) * 2010-11-11 2014-06-25 Tcl集团股份有限公司 光标控制方法及系统
CN101980299B (zh) * 2010-11-24 2013-01-02 河海大学 基于棋盘标定的摄像机映射方法
KR20130130453A (ko) * 2012-05-22 2013-12-02 엘지전자 주식회사 영상표시장치 및 그 동작 방법
CN104331212B (zh) * 2013-07-22 2017-11-10 原相科技股份有限公司 手持式指向装置的光标定位方法
CN103645805B (zh) * 2013-12-20 2017-06-20 深圳泰山体育科技股份有限公司 体感方式的控件操控方法及系统
CN108196675B (zh) * 2017-12-29 2021-04-16 珠海市君天电子科技有限公司 用于触控终端的交互方法、装置及触控终端

Also Published As

Publication number Publication date
CN110489027A (zh) 2019-11-22
CN110489027B (zh) 2021-07-23

Similar Documents

Publication Publication Date Title
WO2021004412A1 (zh) 手持输入设备及其指示图标的显示位置控制方法和装置
US11393154B2 (en) Hair rendering method, device, electronic apparatus, and storage medium
US20190129607A1 (en) Method and device for performing remote control
US11231845B2 (en) Display adaptation method and apparatus for application, and storage medium
WO2020019913A1 (zh) 一种人脸图像处理方法、装置及存储介质
CN102622108B (zh) 一种交互式投影系统及其实现方法
CN110427110B (zh) 一种直播方法、装置以及直播服务器
CN110456907A (zh) 虚拟画面的控制方法、装置、终端设备及存储介质
JP7382994B2 (ja) 仮想現実システム内の仮想コントローラの位置および向きの追跡
WO2021097600A1 (zh) 一种隔空交互方法、装置和设备
CN104081307A (zh) 图像处理装置、图像处理方法和程序
JP2015015023A (ja) 3次元モデルのためのテクスチャデータを取得する方法、ポータブル電子デバイス、及びプログラム
CN112581571B (zh) 虚拟形象模型的控制方法、装置、电子设备及存储介质
JP2007323660A (ja) 描画装置、及び描画方法
US8643679B2 (en) Storage medium storing image conversion program and image conversion apparatus
CN111833243B (zh) 一种数据展示方法、移动终端和存储介质
CN112150560A (zh) 确定消失点的方法、装置及计算机存储介质
WO2021004413A1 (zh) 一种手持输入设备及其指示图标的消隐控制方法和装置
CN106598422B (zh) 混合操控方法及操控系统和电子设备
WO2018076720A1 (zh) 单手操控方法及操控系统
WO2022199102A1 (zh) 图像处理方法及装置
WO2021244650A1 (zh) 控制方法、装置、终端及存储介质
CN105657187A (zh) 一种可见光通信方法、设备及系统
CN115529405A (zh) 前置摄像头的图像显示方法及装置
CN112308767B (zh) 一种数据展示方法、装置、存储介质以及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20837833

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20837833

Country of ref document: EP

Kind code of ref document: A1