WO2018171363A1 - Position information determination method, projection device, and computer storage medium - Google Patents

Position information determination method, projection device, and computer storage medium

Info

Publication number
WO2018171363A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
coordinate
projection
screen
projection area
Prior art date
Application number
PCT/CN2018/076438
Other languages
English (en)
French (fr)
Inventor
赵冬晓
Original Assignee
西安中兴通讯终端科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西安中兴通讯终端科技有限公司 filed Critical 西安中兴通讯终端科技有限公司
Publication of WO2018171363A1 publication Critical patent/WO2018171363A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Definitions

  • The present disclosure relates to the field of projection technology, and in particular to a position information determination method, a projection device, and a computer storage medium.
  • In modern offices that pursue high efficiency and a fast pace, projection devices and screen devices are new types of office equipment that meet users' needs and have therefore been widely adopted. As projection technology develops, however, users are no longer content with merely presenting projected content; they pay increasing attention to controlling it directly, that is, performing touch operations on coordinates within the projected content so as to achieve the same effect as performing those touch operations on the system itself.
  • In the related art, projected content is usually controlled through gesture recognition, but gestures are easily occluded, so this approach gives a low success rate when controlling the projected content. To raise the success rate, laser tracking has been proposed: a laser pointer of the projection device casts a light spot on the screen, a camera in the projection device tracks the spot by photographing the screen, and coordinates of the spot relative to the projected content are generated. However, during control, displacement of the projection device makes tracking the spot take a long time, so control of the projected content is inefficient.
  • Embodiments of the present disclosure provide a position information determination method, a projection device, and a computer storage medium.
  • An embodiment of the present disclosure provides a position information determination method, including:
  • determining a positional relationship between a projection area and a screen area, where the projection area is the area onto which a projection device projects within the screen area; and acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area;
  • if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • In the above solution, determining the positional relationship between the projection area and the screen area includes: acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  • In the above solution, determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information includes: performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and determining the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  • In the above solution, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, determining the positional relationship according to the third coordinates of the vertices of the projection area includes: calculating, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
  • An embodiment of the present disclosure also provides a projection device, including a memory and a processor, where the memory is configured to store a computer program executable on the processor, and the processor is configured to execute the following when running the program:
  • determining a positional relationship between the projection area and the screen area, where the projection area is the area onto which the projection device projects within the screen area; acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and, if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • In the above solution, when running the program the processor further executes: acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  • In the above solution, when running the program the processor further executes: performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and determining the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  • In the above solution, when running the program the processor further executes: when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, calculating, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
  • An embodiment of the present disclosure further provides a projection device, the projection device including a determining module, an acquiring module, and a coordinate conversion module, where:
  • the determining module is configured to determine a positional relationship between a projection area and a screen area, the projection area being the area onto which the projection device projects within the screen area;
  • the acquiring module is configured to acquire a first coordinate of a touch point corresponding to a touch operation on the screen area; and
  • the coordinate conversion module is configured to, if the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • In the above solution, the determining module includes an acquiring unit and a determining unit, where:
  • the acquiring unit is configured to acquire screen image information of the screen area, and to acquire projection image information of the projection area of the projection device within the screen area; and
  • the determining unit is configured to determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  • In the above solution, the determining unit includes an identification subunit and a determination subunit, where:
  • the identification subunit is configured to perform image recognition on the projection image information to obtain the projection area, and to perform image recognition on the screen image information to obtain the screen area; and
  • the determination subunit is configured to determine the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  • In the above solution, the determination subunit is configured to, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, calculate, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
  • An embodiment of the present disclosure further provides a computer storage medium storing computer-executable instructions, the computer-executable instructions being used to perform the position information determination method of an embodiment of the present disclosure.
  • In the embodiments of the present disclosure, the first coordinate is transformed into the second coordinate using the positional relationship between the projection area of the projection device on the screen area and the screen area, where the first coordinate is the coordinate information of the touch point corresponding to a touch operation on the screen area and the second coordinate is the coordinate information of the touch point within the projection area. The projected content can therefore be controlled directly according to the second coordinate; moreover, while the projection device is being moved, the calibration of the touch-point coordinates is sped up and the coordinates of a touch point on the screen area can be tracked in real time within the projection area, which improves the efficiency of controlling the projected content and keeps the projection device usable.
  • FIG. 1 is a schematic flowchart of a first embodiment of the position information determination method of the present disclosure;
  • FIG. 2 is a schematic flowchart detailing, within the flow shown in FIG. 1, the determination of the positional relationship between the projection area and the screen area;
  • FIG. 3 is a schematic flowchart detailing, within the flow shown in FIG. 2, the determination of the positional relationship between the projection area and the screen area according to the screen image information and the projection image information;
  • FIG. 4 is a schematic diagram of the positional relationship between the projection area and the screen area according to an embodiment of the present disclosure;
  • FIG. 5 is a schematic flowchart of a projection device acquiring the first coordinate of a touch point on the screen area according to an embodiment of the present disclosure;
  • FIG. 6 is a schematic structural diagram of a screen device according to an embodiment of the present disclosure;
  • FIG. 7 is a schematic structural diagram of a first embodiment of the projection device of the present disclosure;
  • FIG. 8 is a schematic structural diagram of a second embodiment of the projection device of the present disclosure;
  • FIG. 9 is a schematic structural diagram detailing the determining module in the projection device shown in FIG. 8;
  • FIG. 10 is a schematic structural diagram detailing the determining unit in the determining module shown in FIG. 9.
  • The position information determination method provided by the embodiments of the present disclosure is mainly applied in a projection system. The first coordinate is transformed into the second coordinate using the positional relationship between the projection area of the projection device on the screen area and the screen area, the first coordinate being the coordinate information of the touch point corresponding to a touch operation on the screen area and the second coordinate being the coordinate information of the touch point within the projection area. The projected content can thus be controlled directly according to the second coordinate, and while the projection device is being moved, the coordinates of the touch point can be tracked in real time, improving the efficiency of controlling the projected content.
  • FIG. 1 is a schematic flowchart of the first embodiment of the position information determination method of the present disclosure. Referring to FIG. 1, the position information determination method of this embodiment includes the following steps.
  • Step 101: Determine a positional relationship between a projection area and a screen area, where the projection area is the area onto which the projection device projects within the screen area.
  • The position information determination method of this embodiment is mainly applied in the projection device of a projection system, and is used to transform the coordinates of a touch point within the screen area into the coordinates of that touch point within the projection area.
  • FIG. 2 is a schematic flowchart detailing, within the flow shown in FIG. 1, the determination of the positional relationship between the projection area and the screen area. Referring to FIG. 2, step 101 specifically includes the following steps.
  • Step 1011: Acquire screen image information of the screen area, and acquire projection image information of the projection area of the projection device within the screen area.
  • Typically, the projection device includes a camera mounted in the same plane as the projection light engine and able to capture the entire projected picture, so target image information can be acquired; the target image information includes the screen image information of the screen area and the projection image information of the projection area of the projection device within the screen area.
  • Step 1012: Determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  • FIG. 3 is a schematic flowchart detailing, within the flow shown in FIG. 2, the determination of the positional relationship between the projection area and the screen area according to the screen image information and the projection image information. Referring to FIG. 3, step 1012 specifically includes the following steps.
  • Step 10121: Perform image recognition on the projection image information to obtain the projection area, and perform image recognition on the screen image information to obtain the screen area.
  • Digital image processing may be used to identify the projection area and the screen area. In this embodiment, the projection area in the projection image information and a target rectangular area in the screen image information are identified first, the target rectangular area being the largest rectangular area in the screen image information; the screen area is then identified based on the projection area and the target rectangular area.
  • Normally, the brightness of the projection area is higher than that of other areas, and this property can be used to distinguish the projection area from non-projection areas. During identification, external factors such as room lighting or illumination from a mobile phone may cause several regions of the captured projection image information to contain pixels brighter than the rest of the image. In that case the projection area can still be identified accurately from the shape of each region and the continuity of its pixels. Specifically, the average brightness of all pixels in the projection image information is calculated; the target pixels whose brightness exceeds that average are extracted; and the rectangular area formed by the target pixels is set as the projection area.
  • After the projection area has been identified, the target rectangular area in the screen image information is identified through digital image processing and rectangle recognition. If the projection area lies entirely within the target rectangular area, the target rectangular area is the screen area; otherwise, screen-area identification fails and the camera must be re-adjusted so that the target image information it captures contains both the projection area and the screen area.
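  • As a minimal illustration of the identification step above (a sketch only: the patent does not prescribe any particular library, and the OpenCV/NumPy calls, function names, and contour-based rectangle search below are assumptions), the brightness-mean threshold and the containment test could look as follows:

```python
# Sketch of the identification step described above; the use of OpenCV/NumPy is an
# assumption, not part of the original disclosure.
import cv2
import numpy as np

def find_projection_area(gray_frame: np.ndarray):
    """Return (x, y, w, h) of the projection area: the bounding rectangle of the
    pixels whose brightness exceeds the frame's average brightness."""
    mask = gray_frame > gray_frame.mean()
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    x, y = int(xs.min()), int(ys.min())
    return x, y, int(xs.max()) - x + 1, int(ys.max()) - y + 1

def find_screen_area(gray_frame: np.ndarray, projection_rect):
    """Return the largest rectangular contour that fully contains the projection
    area, or None if identification fails (the camera must then be re-adjusted)."""
    _, binary = cv2.threshold(gray_frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    candidates = sorted((cv2.boundingRect(c) for c in contours),
                        key=lambda r: r[2] * r[3], reverse=True)  # largest first
    px, py, pw, ph = projection_rect
    for x, y, w, h in candidates:
        if x <= px and y <= py and x + w >= px + pw and y + h >= py + ph:
            return x, y, w, h
    return None
```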
  • Step 10122: Determine the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  • Here, the rectangular coordinate system of the screen area may be established with the left and lower boundary lines of the screen area as the coordinate axes, with the right and lower boundary lines, with the left and upper boundary lines, or with the right and upper boundary lines. In this embodiment, the case in which the left and lower boundary lines of the screen area serve as the coordinate axes is described in detail. Specifically, FIG. 4 is a schematic diagram of the positional relationship between the projection area and the screen area; as shown in FIG. 4, the rectangular coordinate system of the screen area takes the left boundary line of the screen area as the abscissa axis, with the upward direction positive, and the lower boundary line of the screen area as the ordinate axis, with the rightward direction positive.
  • Likewise, the rectangular coordinate system of the projection area may be established with the left and lower boundary lines of the projection area as the coordinate axes, with the right and lower boundary lines, with the left and upper boundary lines, or with the right and upper boundary lines. In this embodiment, the case in which the left and lower boundary lines of the projection area serve as the coordinate axes is described in detail. Specifically, referring to FIG. 4, the rectangular coordinate system of the projection area takes the left boundary line of the projection area as the abscissa axis, with the upward direction positive, and the lower boundary line of the projection area as the ordinate axis, with the rightward direction positive.
  • When the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, the coordinates of the touch point within the projection area may be calculated, according to a preset processing method, from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system, thereby determining the positional relationship between the projection area and the screen area.
  • Specifically, referring to FIG. 4, the coordinates of vertex A of the projection area in the rectangular coordinate system of the screen area are (M1, N1), the coordinates of vertex B are (M2, N2), the coordinates of vertex C are (M3, N3), and the coordinates of vertex D are (M4, N4). If the coordinates of the touch point relative to the screen area are denoted (X, Y) and its coordinates relative to the projection area are denoted (X1, Y1), the positional relationship between the projection area and the screen area can be expressed by formula (1) and formula (2).
  • Here, a vertex refers to a point at which two adjacent sides of the rectangle intersect.
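  • Formulas (1) and (2) are given as equation images in the original filing and are not reproduced in this text. Purely to make the idea concrete, the sketch below assumes an axis-aligned rectangular projection area and a simple translate-and-scale mapping into the projected content's pixel resolution; the exact formula, the function name, and the resolution parameter are assumptions, not the patent's formulas:

```python
# Hedged sketch: the patent's formulas (1) and (2) are equation images and are not
# reproduced in the text, so this assumes an axis-aligned rectangular projection
# area and a translate-and-scale mapping. Names are illustrative only.
from typing import Tuple

Point = Tuple[float, float]

def screen_to_projection(touch: Point,
                         a: Point, b: Point, c: Point, d: Point,
                         proj_resolution: Tuple[int, int]) -> Point:
    """Map a touch point (X, Y) in the screen-area coordinate system to (X1, Y1)
    in the projection-area coordinate system.

    a, b, c, d are the third coordinates (M1, N1)..(M4, N4) of the projection
    area's vertices in the screen-area coordinate system; proj_resolution is the
    pixel resolution (width, height) of the projected content.
    """
    xs = [p[0] for p in (a, b, c, d)]
    ys = [p[1] for p in (a, b, c, d)]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    x, y = touch
    # Translate into the projection area's frame, then scale to its resolution.
    x1 = (x - x_min) / (x_max - x_min) * proj_resolution[0]
    y1 = (y - y_min) / (y_max - y_min) * proj_resolution[1]
    return x1, y1
```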
  • Step 102: Acquire a first coordinate of a touch point corresponding to a touch operation on the screen area.
  • Here, the projection device may establish a connection with the screen device through a wireless module. The wireless module may connect the projection device and the screen device using Bluetooth, using a short-range wireless transmission technology such as Wireless Fidelity (WiFi), or using Zigbee. In this embodiment, the case in which the wireless module connects the projection device and the screen device over Bluetooth is described in detail: the Bluetooth modules of the screen device and the projection device are switched on and connected by pairing; once the connection is established, the projection device can receive the first coordinate sent by the screen device.
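  • Purely as an illustration of receiving the first coordinate over an already-paired Bluetooth link, the sketch below uses the RFCOMM socket support in Python's standard library (available on Linux builds); the patent does not specify a transport API, and the newline-delimited "x,y" message format is an assumption:

```python
# Illustrative only: receive the first coordinate from the screen device over an
# RFCOMM Bluetooth socket. The patent does not specify the transport API or the
# message format; the "x,y" text framing below is an assumption.
import socket

def receive_first_coordinates(channel: int = 1):
    server = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                           socket.BTPROTO_RFCOMM)
    server.bind((socket.BDADDR_ANY, channel))
    server.listen(1)
    conn, _addr = server.accept()          # the paired screen device connects here
    try:
        buffer = b""
        while True:
            chunk = conn.recv(1024)
            if not chunk:
                break
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                x_str, y_str = line.decode().split(",")
                yield float(x_str), float(y_str)   # the first coordinate (X, Y)
    finally:
        conn.close()
        server.close()
```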
  • Step 103: If the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • Here, whether the touch point is within the projection area may be determined by comparing the first coordinate of the touch point relative to the screen area with the third coordinates of the vertices of the projection area relative to the screen area. Specifically, referring to FIG. 4, whether the touch point is within the projection area can be determined by comparing the coordinates of the touch point with the coordinates of vertices A, B, C, and D.
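  • A minimal sketch of that containment test (axis-aligned case assumed, names illustrative):

```python
# Containment test: compare the first coordinate with the vertex coordinates of
# the projection area (A, B, C, D), assuming an axis-aligned rectangle.
def is_inside_projection(touch, a, b, c, d) -> bool:
    xs = [p[0] for p in (a, b, c, d)]
    ys = [p[1] for p in (a, b, c, d)]
    x, y = touch
    return min(xs) <= x <= max(xs) and min(ys) <= y <= max(ys)
```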
  • Specifically, the projection device captures target image information with its camera, the target image information including screen image information and projection image information; it calculates the average brightness of the pixels in the projection image information, extracts the target pixels whose brightness exceeds that average, and sets the rectangular area formed by the target pixels as the projection area; it identifies the largest rectangular area in the screen image information through digital image processing and rectangle recognition, and determines that rectangular area to be the screen area when it contains the projection area; it establishes the rectangular coordinate system of the screen area with the left boundary line of the screen area as the abscissa axis (upward positive) and the lower boundary line as the ordinate axis (rightward positive), and establishes the rectangular coordinate system of the projection area with the left boundary line of the projection area as the abscissa axis (upward positive) and the lower boundary line as the ordinate axis (rightward positive); it determines the positional relationship between the projection area and the screen area from the third coordinates of vertices A, B, C, and D of the projection area in the rectangular coordinate system of the screen area; it receives the first coordinate of the touch point sent by the screen device, the first coordinate being relative to the screen area; and it converts the first coordinate into the second coordinate, relative to the projection area, according to the positional relationship determined above.
  • It will be appreciated that converting the first coordinate, which is relative to the screen area, into the second coordinate, which is relative to the projection area, through the determined positional relationship is what enables direct control of the projected content and improves control efficiency. In practice, when the projection device projects content onto the screen area, the projected image may be tilted and may not cover the whole screen area, so when the screen area is touched, the first coordinate of the touch point located by the screen device cannot equal the second coordinate of that touch point relative to the projected content. After the first coordinate is converted into the second coordinate, the system can locate the specific position of the touch point within the projected content according to the second coordinate, thereby directly controlling the projected content. Moreover, while the projection device is being moved, the first coordinate is converted into the second coordinate using the positional relationship between the projection area and the screen area determined in real time, so the coordinates of the touch point can be tracked in real time and the efficiency of controlling the projected content is improved.
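  • Tying the sketches above together, a hypothetical main loop on the projection device might look as follows. Every helper name comes from the earlier sketches and is an assumption; in particular, this glosses over the calibration between camera pixels and the units in which the screen device reports touches, which a real system would need:

```python
# Hypothetical glue code combining the sketches above; not the patent's implementation.
# It assumes the screen device reports touches in the same units as the camera-derived
# screen-area coordinate system (a real system would need a calibration step).
import cv2

def run(projector_resolution=(1920, 1080)):
    camera = cv2.VideoCapture(0)                      # camera co-planar with the projector
    ok, frame = camera.read()
    if not ok:
        raise RuntimeError("could not read a frame from the camera")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    proj_rect = find_projection_area(gray)            # projection area in the camera image
    if proj_rect is None:
        raise RuntimeError("projection area not found")
    screen_rect = find_screen_area(gray, proj_rect)   # largest rectangle containing it
    if screen_rect is None:
        raise RuntimeError("screen area not found; re-adjust the camera")

    # Vertices A..D of the projection area expressed in the screen-area coordinate
    # system (origin at the bottom-left corner of the screen area, y pointing up).
    px, py, pw, ph = proj_rect
    sx, sy, sw, sh = screen_rect
    bottom = sy + sh
    a, b = (px - sx, bottom - (py + ph)), (px - sx + pw, bottom - (py + ph))
    c, d = (px - sx + pw, bottom - py), (px - sx, bottom - py)

    for touch in receive_first_coordinates():         # first coordinates from the screen device
        if is_inside_projection(touch, a, b, c, d):
            x1, y1 = screen_to_projection(touch, a, b, c, d, projector_resolution)
            print("second coordinate:", x1, y1)       # hand off to the system's input handling
```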
  • Further, the second embodiment of the position information determination method of the present disclosure describes in detail how the projection device acquires the first coordinate of the touch point corresponding to a touch operation on the screen area. FIG. 5 is a schematic flowchart of the projection device acquiring the first coordinate of a touch point on the screen area; as shown in FIG. 5, the steps by which the projection device acquires the first coordinate include:
  • Step 201: When it is detected that the screen area of the screen device is touched, acquire the first coordinate of the touch point on the screen area, the first coordinate being the coordinate information of the touch point within the screen area.
  • Here, when the screen device detects that its screen area is touched, it acquires the first coordinate of the touch point on the screen area.
  • Typically, the screen device includes a touch sensing module that can detect touches of the screen area. The touch sensing module may use an infrared, capacitive, acoustic-wave, or pressure-based method to detect touches of the screen area. In this embodiment, infrared detection is described in detail as an example. Specifically, FIG. 6 is a schematic structural diagram of the screen device. Referring to FIG. 6, the screen device includes infrared transmitting tubes and infrared receiving tubes fitted closely around the periphery of the screen area, which continuously emit and receive infrared rays to form a criss-crossed infrared matrix. A touch is located by continuously scanning for blocked infrared rays: when an object touches the screen area, it blocks the horizontal and vertical infrared rays passing through the touch point; when the scan finds infrared receiving tubes that receive no infrared light, the position of the touch point can be located from the positions of those receiving tubes, and the first coordinate of the touch point is obtained in the pre-established rectangular coordinate system of the screen area.
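  • A rough sketch of how the blocked beams could be turned into the first coordinate (the patent only describes the principle; the beam pitch and the scanning interface below are assumptions):

```python
# Rough sketch: convert the indices of the blocked horizontal/vertical infrared
# beams into a first coordinate in the screen area's coordinate system.
# The beam pitch and the scanning interface are assumptions, not from the patent.
from typing import List, Optional, Tuple

def locate_touch(blocked_columns: List[int], blocked_rows: List[int],
                 beam_pitch_mm: float = 5.0) -> Optional[Tuple[float, float]]:
    """blocked_columns / blocked_rows are indices of receiving tubes that did not
    receive infrared light during the last scan (0 = left / bottom edge)."""
    if not blocked_columns or not blocked_rows:
        return None                     # no touch detected in this scan
    # Use the centre of the blocked span so a finger wider than one beam still
    # yields a single touch point.
    x = (min(blocked_columns) + max(blocked_columns)) / 2 * beam_pitch_mm
    y = (min(blocked_rows) + max(blocked_rows)) / 2 * beam_pitch_mm
    return x, y                         # first coordinate (X, Y) in the screen area
```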
  • It should be noted that touching the screen area does not necessarily require direct physical contact with it: as long as an object comes close enough to the screen area for the touch sensing module of the screen device to sense that the screen area is touched, this counts as a touch. In this embodiment, a touch is registered whenever the horizontal and vertical infrared rays of the screen area are blocked.
  • Step 202: Send the first coordinate to the projection device, so that the projection device performs coordinate transformation on the first coordinate to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • Here, the screen device may establish a connection with the projection device through a wireless module, which may use Bluetooth, a short-range wireless transmission technology such as WiFi, or Zigbee. In this embodiment, the case in which the wireless module connects the screen device and the projection device over Bluetooth is described in detail: the Bluetooth modules of the screen device and the projection device are switched on and connected by pairing; once the connection is established, the screen device sends the first coordinate of the touch point to the projection device, so that the projection device can obtain the first coordinate.
  • An embodiment of the present disclosure provides a projection device configured to implement the above position information determination method, achieving the same effect.
  • FIG. 7 is a schematic structural diagram of the first embodiment of the projection device of the present disclosure. Referring to FIG. 7, the projection device of this embodiment includes a memory 31 and a processor 32, where the memory 31 is configured to store a computer program executable on the processor 32, and the processor 32 is configured to execute the following when running the program:
  • determining a positional relationship between the projection area and the screen area, where the projection area is the area onto which the projection device projects within the screen area; acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and, if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • Optionally, when running the program the processor 32 further executes: acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  • Optionally, when running the program the processor 32 further executes: performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and determining the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  • Optionally, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, when running the program the processor 32 further executes: calculating, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
  • An embodiment of the present disclosure provides a computer storage medium, specifically a computer-readable storage medium, storing computer-executable instructions used to perform the above position information determination method.
  • FIG. 8 is a schematic structural diagram of the second embodiment of the projection device of the present disclosure. Referring to FIG. 8, the projection device of this embodiment includes a determining module 41, an acquiring module 42, and a coordinate conversion module 43, where:
  • the determining module 41 is configured to determine a positional relationship between the projection area and the screen area, the projection area being the area onto which the projection device projects within the screen area;
  • the acquiring module 42 is configured to acquire a first coordinate of a touch point corresponding to a touch operation on the screen area; and
  • the coordinate conversion module 43 is configured to, if the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  • In the above solution, the determining module includes an acquiring unit 411 and a determining unit 412, where:
  • the acquiring unit 411 is configured to acquire screen image information of the screen area, and to acquire projection image information of the projection area of the projection device within the screen area; and
  • the determining unit 412 is configured to determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  • In the above solution, the determining unit includes an identification subunit 4121 and a determination subunit 4122, where:
  • the identification subunit 4121 is configured to perform image recognition on the projection image information to obtain the projection area, and to perform image recognition on the screen image information to obtain the screen area; and
  • the determination subunit 4122 is configured to determine the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  • Optionally, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, the determination subunit 4122 is specifically configured to calculate, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
  • In the solutions provided by the embodiments of the present disclosure, the first coordinate is transformed into the second coordinate using the positional relationship between the projection area of the projection device on the screen area and the screen area, where the first coordinate is the coordinate information of the touch point corresponding to a touch operation on the screen area and the second coordinate is the coordinate information of the touch point within the projection area. The projected content can therefore be controlled directly according to the second coordinate; moreover, while the projection device is being moved, the calibration of the touch-point coordinates is sped up and the coordinates of a touch point on the screen area can be tracked in real time within the projection area, which improves the efficiency of controlling the projected content and keeps the projection device usable.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A position information determination method, a projection device, and a computer storage medium. The method includes: determining a positional relationship between a projection area and a screen area, where the projection area is the area onto which a projection device projects within the screen area (101); acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area (102); and, if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area (103).

Description

Position information determination method, projection device, and computer storage medium
Cross-Reference to Related Application
This application is filed on the basis of, and claims priority to, Chinese patent application No. 201710183162.7 filed on March 24, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of projection technology, and in particular to a position information determination method, a projection device, and a computer storage medium.
Background
In modern offices that pursue high efficiency and a fast pace, projection devices and screen devices are new types of office equipment that meet users' needs and have therefore been widely adopted. As projection technology develops, however, users are no longer content with merely presenting projected content; they pay increasing attention to controlling it directly, that is, performing touch operations on coordinates within the projected content so as to achieve the same effect as performing those touch operations on the system itself.
In the related art, projected content is usually controlled through gesture recognition, but gestures are easily occluded, so this approach gives a low success rate when controlling the projected content. To raise the success rate, laser tracking has been proposed: a laser pointer of the projection device casts a light spot on the screen, a camera in the projection device then tracks the spot by photographing the screen, and coordinates of the spot relative to the projected content are generated. However, during control, displacement of the projection device makes tracking the spot take a long time, so control of the projected content is inefficient.
Summary
Embodiments of the present disclosure provide a position information determination method, a projection device, and a computer storage medium.
The technical solutions of the embodiments of the present disclosure are implemented as follows.
An embodiment of the present disclosure provides a position information determination method, including:
determining a positional relationship between a projection area and a screen area, where the projection area is the area onto which a projection device projects within the screen area;
acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and
if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
In the above solution, determining the positional relationship between the projection area and the screen area includes:
acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and
determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
In the above solution, determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information includes:
performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and
determining the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
In the above solution, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, determining the positional relationship between the projection area and the screen area according to the third coordinates of the vertices of the projection area in the preset rectangular coordinate system includes:
calculating, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
An embodiment of the present disclosure further provides a projection device, including a memory and a processor, where the memory is configured to store a computer program executable on the processor, and the processor is configured to execute the following when running the program:
determining a positional relationship between a projection area and a screen area, where the projection area is the area onto which the projection device projects within the screen area; acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and, if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
In the above solution, when running the program the processor executes: acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
In the above solution, when running the program the processor executes: performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and determining the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
In the above solution, when running the program the processor executes: when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, calculating, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
An embodiment of the present disclosure further provides a projection device, including a determining module, an acquiring module, and a coordinate conversion module, where:
the determining module is configured to determine a positional relationship between a projection area and a screen area, the projection area being the area onto which the projection device projects within the screen area;
the acquiring module is configured to acquire a first coordinate of a touch point corresponding to a touch operation on the screen area; and
the coordinate conversion module is configured to, if the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
In the above solution, the determining module includes an acquiring unit and a determining unit, where:
the acquiring unit is configured to acquire screen image information of the screen area, and to acquire projection image information of the projection area of the projection device within the screen area; and
the determining unit is configured to determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
In the above solution, the determining unit includes an identification subunit and a determination subunit, where:
the identification subunit is configured to perform image recognition on the projection image information to obtain the projection area, and to perform image recognition on the screen image information to obtain the screen area; and
the determination subunit is configured to determine the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
In the above solution, the determination subunit is configured to, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, calculate, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
An embodiment of the present disclosure further provides a computer storage medium storing computer-executable instructions, the computer-executable instructions being used to perform the position information determination method of an embodiment of the present disclosure.
It can be seen that, in the embodiments of the present disclosure, the first coordinate is transformed into the second coordinate using the positional relationship between the projection area of the projection device on the screen area and the screen area, where the first coordinate is the coordinate information of the touch point corresponding to a touch operation on the screen area and the second coordinate is the coordinate information of the touch point within the projection area. The projected content can therefore be controlled directly according to the second coordinate; moreover, while the projection device is being moved, the calibration of the touch-point coordinates is sped up and the coordinates of a touch point on the screen area can be tracked in real time within the projection area, which improves the efficiency of controlling the projected content and keeps the projection device usable.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a first embodiment of the position information determination method of the present disclosure;
FIG. 2 is a schematic flowchart detailing, within the flow shown in FIG. 1, the determination of the positional relationship between the projection area and the screen area;
FIG. 3 is a schematic flowchart detailing, within the flow shown in FIG. 2, the determination of the positional relationship between the projection area and the screen area according to the screen image information and the projection image information;
FIG. 4 is a schematic diagram of the positional relationship between the projection area and the screen area according to an embodiment of the present disclosure;
FIG. 5 is a schematic flowchart of a projection device acquiring the first coordinate of a touch point on the screen area according to an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of a screen device according to an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of a first embodiment of the projection device of the present disclosure;
FIG. 8 is a schematic structural diagram of a second embodiment of the projection device of the present disclosure;
FIG. 9 is a schematic structural diagram detailing the determining module in the projection device shown in FIG. 8;
FIG. 10 is a schematic structural diagram detailing the determining unit in the determining module shown in FIG. 9.
Detailed Description
The position information determination method provided by the embodiments of the present disclosure is mainly applied in a projection system. The first coordinate is transformed into the second coordinate using the positional relationship between the projection area of the projection device on the screen area and the screen area, the first coordinate being the coordinate information of the touch point corresponding to a touch operation on the screen area and the second coordinate being the coordinate information of the touch point within the projection area. The projected content can thus be controlled directly according to the second coordinate, and while the projection device is being moved, the coordinates of the touch point can be tracked in real time, improving the efficiency of controlling the projected content.
The implementation, functional features, and advantages of the present disclosure are further described below with reference to the accompanying drawings in conjunction with the embodiments. It should be understood that the specific embodiments described here are merely intended to explain the present disclosure and are not intended to limit it.
FIG. 1 is a schematic flowchart of the first embodiment of the position information determination method of the present disclosure. Referring to FIG. 1, the position information determination method of this embodiment includes the following steps.
Step 101: Determine a positional relationship between a projection area and a screen area, where the projection area is the area onto which the projection device projects within the screen area.
The position information determination method of this embodiment is mainly applied in the projection device of a projection system, and is used to transform the coordinates of a touch point within the screen area into the coordinates of that touch point within the projection area.
FIG. 2 is a schematic flowchart detailing, within the flow shown in FIG. 1, the determination of the positional relationship between the projection area and the screen area. Referring to FIG. 2, step 101 specifically includes the following steps.
Step 1011: Acquire screen image information of the screen area, and acquire projection image information of the projection area of the projection device within the screen area.
Typically, the projection device includes a camera mounted in the same plane as the projection light engine and able to capture the entire projected picture, so target image information can be acquired; the target image information includes the screen image information of the screen area and the projection image information of the projection area of the projection device within the screen area.
Step 1012: Determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
FIG. 3 is a schematic flowchart detailing, within the flow shown in FIG. 2, the determination of the positional relationship between the projection area and the screen area according to the screen image information and the projection image information. Referring to FIG. 3, step 1012 specifically includes the following steps.
Step 10121: Perform image recognition on the projection image information to obtain the projection area, and perform image recognition on the screen image information to obtain the screen area.
Here, digital image processing may be used to identify the projection area and the screen area. In this embodiment, the projection area in the projection image information and a target rectangular area in the screen image information are identified first, the target rectangular area being the largest rectangular area in the screen image information; the screen area is then identified based on the projection area and the target rectangular area.
Normally, the brightness of the projection area is higher than that of other areas, and this property can be used to distinguish the projection area from non-projection areas. During identification, external factors such as room lighting or illumination from a mobile phone may cause several regions of the captured projection image information to contain pixels brighter than the rest of the image. In that case the projection area can still be identified accurately from the shape of each region and the continuity of its pixels. Specifically, the average brightness of all pixels in the projection image information is calculated; the target pixels whose brightness exceeds that average are extracted; and the rectangular area formed by the target pixels is set as the projection area.
After the projection area has been identified, the target rectangular area in the screen image information is identified through digital image processing and rectangle recognition. If the projection area lies entirely within the target rectangular area, the target rectangular area is the screen area; otherwise, screen-area identification fails and the camera must be re-adjusted so that the target image information it captures contains both the projection area and the screen area.
Step 10122: Determine the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
Here, the rectangular coordinate system of the screen area may be established with the left and lower boundary lines of the screen area as the coordinate axes, with the right and lower boundary lines, with the left and upper boundary lines, or with the right and upper boundary lines. In this embodiment, the case in which the left and lower boundary lines of the screen area serve as the coordinate axes is described in detail. Specifically, FIG. 4 is a schematic diagram of the positional relationship between the projection area and the screen area; as shown in FIG. 4, the rectangular coordinate system of the screen area takes the left boundary line of the screen area as the abscissa axis, with the upward direction positive, and the lower boundary line of the screen area as the ordinate axis, with the rightward direction positive.
Likewise, the rectangular coordinate system of the projection area may be established with the left and lower boundary lines of the projection area as the coordinate axes, with the right and lower boundary lines, with the left and upper boundary lines, or with the right and upper boundary lines. In this embodiment, the case in which the left and lower boundary lines of the projection area serve as the coordinate axes is described in detail. Specifically, referring to FIG. 4, the rectangular coordinate system of the projection area takes the left boundary line of the projection area as the abscissa axis, with the upward direction positive, and the lower boundary line of the projection area as the ordinate axis, with the rightward direction positive.
When the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, the coordinates of the touch point within the projection area may be calculated, according to a preset processing method, from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system, thereby determining the positional relationship between the projection area and the screen area.
Specifically, referring to FIG. 4, the coordinates of vertex A of the projection area in the rectangular coordinate system of the screen area are (M1, N1), the coordinates of vertex B are (M2, N2), the coordinates of vertex C are (M3, N3), and the coordinates of vertex D are (M4, N4). If the coordinates of the touch point relative to the screen area are denoted (X, Y) and its coordinates relative to the projection area are denoted (X1, Y1), the positional relationship between the projection area and the screen area can be expressed by formula (1) and formula (2).
[Formula (1) — rendered as an equation image (PCTCN2018076438-appb-000001) in the original filing]
[Formula (2) — rendered as an equation image (PCTCN2018076438-appb-000002) in the original filing]
Here, a vertex refers to a point at which two adjacent sides of the rectangle intersect.
Step 102: Acquire a first coordinate of a touch point corresponding to a touch operation on the screen area.
Here, the projection device may establish a connection with the screen device through a wireless module. The wireless module may connect the projection device and the screen device using Bluetooth, using a short-range wireless transmission technology such as Wireless Fidelity (WiFi), or using Zigbee. In this embodiment, the case in which the wireless module connects the projection device and the screen device over Bluetooth is described in detail: the Bluetooth modules of the screen device and the projection device are switched on and connected by pairing; once the connection is established, the projection device can receive the first coordinate sent by the screen device.
Step 103: If the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
Here, whether the touch point is within the projection area may be determined by comparing the first coordinate of the touch point relative to the screen area with the third coordinates of the vertices of the projection area relative to the screen area. Specifically, referring to FIG. 4, whether the touch point is within the projection area can be determined by comparing the coordinates of the touch point with the coordinates of vertices A, B, C, and D.
Specifically, the projection device captures target image information with its camera, the target image information including screen image information and projection image information;
it calculates the average brightness of the pixels in the projection image information;
it extracts the target pixels in the projection image information whose brightness exceeds that average;
it sets the rectangular area formed by the target pixels as the projection area;
it identifies the largest rectangular area in the screen image information through digital image processing and rectangle recognition, and determines that rectangular area to be the screen area when it contains the projection area;
it establishes the rectangular coordinate system of the screen area with the left boundary line of the screen area as the abscissa axis (upward positive) and the lower boundary line of the screen area as the ordinate axis (rightward positive), and establishes the rectangular coordinate system of the projection area with the left boundary line of the projection area as the abscissa axis (upward positive) and the lower boundary line of the projection area as the ordinate axis (rightward positive);
it determines the positional relationship between the projection area and the screen area from the third coordinates of vertices A, B, C, and D of the projection area in the rectangular coordinate system of the screen area;
it receives the first coordinate of the touch point sent by the screen device, the first coordinate being relative to the screen area; and
it converts the first coordinate into the second coordinate, relative to the projection area, according to the positional relationship determined above.
It will be appreciated that converting the first coordinate, which is relative to the screen area, into the second coordinate, which is relative to the projection area, through the determined positional relationship is what enables direct control of the projected content and improves control efficiency. In practice, when the projection device projects content onto the screen area, the projected image may be tilted and may not cover the whole screen area, so when the screen area is touched, the first coordinate of the touch point located by the screen device cannot equal the second coordinate of that touch point relative to the projected content. After the first coordinate is converted into the second coordinate, the system can locate the specific position of the touch point within the projected content according to the second coordinate, thereby directly controlling the projected content. Moreover, while the projection device is being moved, the first coordinate is converted into the second coordinate using the positional relationship between the projection area and the screen area determined in real time, so the coordinates of the touch point can be tracked in real time and the efficiency of controlling the projected content is improved.
Further, the second embodiment of the position information determination method of the present disclosure describes in detail how the projection device acquires the first coordinate of the touch point corresponding to a touch operation on the screen area. FIG. 5 is a schematic flowchart of the projection device acquiring the first coordinate of a touch point on the screen area. As shown in FIG. 5, the steps by which the projection device acquires the first coordinate of a touch point on the screen area include:
Step 201: When it is detected that the screen area of the screen device is touched, acquire the first coordinate of the touch point on the screen area, the first coordinate being the coordinate information of the touch point within the screen area.
Here, when the screen device detects that its screen area is touched, it acquires the first coordinate of the touch point on the screen area.
Typically, the screen device includes a touch sensing module that can detect touches of the screen area. The touch sensing module may use an infrared, capacitive, acoustic-wave, or pressure-based method to detect touches of the screen area. In this embodiment, infrared detection is described in detail as an example. Specifically, FIG. 6 is a schematic structural diagram of the screen device. Referring to FIG. 6, the screen device includes infrared transmitting tubes and infrared receiving tubes fitted closely around the periphery of the screen area, which continuously emit and receive infrared rays to form a criss-crossed infrared matrix. A touch is located by continuously scanning for blocked infrared rays: when an object touches the screen area, it blocks the horizontal and vertical infrared rays passing through the touch point; when the scan finds infrared receiving tubes that receive no infrared light, the position of the touch point can be located from the positions of those receiving tubes, and the first coordinate of the touch point is obtained in the pre-established rectangular coordinate system of the screen area.
It should be noted that touching the screen area does not necessarily require direct physical contact with it: as long as an object comes close enough to the screen area for the touch sensing module of the screen device to sense that the screen area is touched, this counts as a touch. In this embodiment, a touch is registered whenever the horizontal and vertical infrared rays of the screen area are blocked.
Step 202: Send the first coordinate to the projection device, so that the projection device performs coordinate transformation on the first coordinate to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
Here, the screen device may establish a connection with the projection device through a wireless module, which may use Bluetooth, a short-range wireless transmission technology such as WiFi, or Zigbee. In this embodiment, the case in which the wireless module connects the screen device and the projection device over Bluetooth is described in detail: the Bluetooth modules of the screen device and the projection device are switched on and connected by pairing; once the connection is established, the screen device sends the first coordinate of the touch point to the projection device, so that the projection device can obtain the first coordinate.
An embodiment of the present disclosure provides a projection device configured to implement the above position information determination method, achieving the same effect.
FIG. 7 is a schematic structural diagram of the first embodiment of the projection device of the present disclosure. Referring to FIG. 7, the projection device of this embodiment includes a memory 31 and a processor 32, where the memory 31 is configured to store a computer program executable on the processor 32, and the processor 32 is configured to execute the following when running the program:
determining a positional relationship between the projection area and the screen area, where the projection area is the area onto which the projection device projects within the screen area; acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and, if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
Optionally, when running the program the processor 32 further executes: acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
Optionally, when running the program the processor 32 further executes: performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and determining the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
Optionally, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, when running the program the processor 32 further executes: calculating, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
An embodiment of the present disclosure provides a computer storage medium, specifically a computer-readable storage medium, storing computer-executable instructions used to perform the above position information determination method.
An embodiment of the present disclosure further provides a projection device. FIG. 8 is a schematic structural diagram of the second embodiment of the projection device of the present disclosure. Referring to FIG. 8, the projection device of this embodiment includes a determining module 41, an acquiring module 42, and a coordinate conversion module 43, where:
the determining module 41 is configured to determine a positional relationship between the projection area and the screen area, the projection area being the area onto which the projection device projects within the screen area;
the acquiring module 42 is configured to acquire a first coordinate of a touch point corresponding to a touch operation on the screen area; and
the coordinate conversion module 43 is configured to, if the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
FIG. 9 is a schematic structural diagram detailing the determining module in the projection device shown in FIG. 8. Referring to FIG. 9, the determining module includes an acquiring unit 411 and a determining unit 412, where:
the acquiring unit 411 is configured to acquire screen image information of the screen area, and to acquire projection image information of the projection area of the projection device within the screen area; and
the determining unit 412 is configured to determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
FIG. 10 is a schematic structural diagram detailing the determining unit in the determining module shown in FIG. 9. Referring to FIG. 10, the determining unit includes an identification subunit 4121 and a determination subunit 4122, where:
the identification subunit 4121 is configured to perform image recognition on the projection image information to obtain the projection area, and to perform image recognition on the screen image information to obtain the screen area; and
the determination subunit 4122 is configured to determine the positional relationship between the projection area and the screen area according to third coordinates of the vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
Optionally, when the preset rectangular coordinate system is established with the left boundary line of the screen area as the abscissa axis and the lower boundary line of the screen area as the ordinate axis, the determination subunit 4122 is specifically configured to calculate, according to a preset processing method, the coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and the coordinates of the touch point in the preset rectangular coordinate system.
What is described above is only the preferred embodiments of the present disclosure and is not intended to limit its scope of protection. Any modification, equivalent replacement, or improvement made within the spirit and scope of the present disclosure falls within the scope of protection of the present disclosure.
Industrial Applicability
In the solutions provided by the embodiments of the present disclosure, the first coordinate is transformed into the second coordinate using the positional relationship between the projection area of the projection device on the screen area and the screen area, where the first coordinate is the coordinate information of the touch point corresponding to a touch operation on the screen area and the second coordinate is the coordinate information of the touch point within the projection area. The projected content can therefore be controlled directly according to the second coordinate; moreover, while the projection device is being moved, the calibration of the touch-point coordinates is sped up and the coordinates of a touch point on the screen area can be tracked in real time within the projection area, which improves the efficiency of controlling the projected content and keeps the projection device usable.

Claims (13)

  1. A position information determination method, comprising:
    determining a positional relationship between a projection area and a screen area, wherein the projection area is the area onto which a projection device projects within the screen area;
    acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and
    if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  2. The method according to claim 1, wherein determining the positional relationship between the projection area and the screen area comprises:
    acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and
    determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  3. The method according to claim 2, wherein determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information comprises:
    performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and
    determining the positional relationship between the projection area and the screen area according to third coordinates of vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  4. The method according to claim 3, wherein, when the preset rectangular coordinate system is a coordinate system established with a left boundary line of the screen area as an abscissa axis and a lower boundary line of the screen area as an ordinate axis, determining the positional relationship between the projection area and the screen area according to the third coordinates of the vertices of the projection area in the preset rectangular coordinate system comprises:
    calculating, according to a preset processing method, coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and coordinates of the touch point in the preset rectangular coordinate system.
  5. A projection device, comprising a memory and a processor, wherein the memory is configured to store a computer program executable on the processor, and the processor is configured to execute the following when running the program:
    determining a positional relationship between a projection area and a screen area, wherein the projection area is the area onto which the projection device projects within the screen area; acquiring a first coordinate of a touch point corresponding to a touch operation on the screen area; and, if the first coordinate is within the projection area, performing coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  6. The projection device according to claim 5, wherein the processor is configured to execute the following when running the program:
    acquiring screen image information of the screen area, and acquiring projection image information of the projection area of the projection device within the screen area; and determining the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  7. The projection device according to claim 6, wherein the processor is configured to execute the following when running the program:
    performing image recognition on the projection image information to obtain the projection area, and performing image recognition on the screen image information to obtain the screen area; and determining the positional relationship between the projection area and the screen area according to third coordinates of vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  8. The projection device according to claim 7, wherein the processor is configured to execute the following when running the program:
    when the preset rectangular coordinate system is a coordinate system established with a left boundary line of the screen area as an abscissa axis and a lower boundary line of the screen area as an ordinate axis, calculating, according to a preset processing method, coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and coordinates of the touch point in the preset rectangular coordinate system.
  9. A computer storage medium storing computer-executable instructions, the computer-executable instructions being used to perform the position information determination method according to any one of claims 1 to 4.
  10. A projection device, comprising a determining module, an acquiring module, and a coordinate conversion module, wherein:
    the determining module is configured to determine a positional relationship between a projection area and a screen area, wherein the projection area is the area onto which the projection device projects within the screen area;
    the acquiring module is configured to acquire a first coordinate of a touch point corresponding to a touch operation on the screen area; and
    the coordinate conversion module is configured to, if the first coordinate is within the projection area, perform coordinate transformation on the first coordinate according to the positional relationship to determine a second coordinate, the second coordinate being the coordinate information of the touch point within the projection area.
  11. The projection device according to claim 10, wherein the determining module comprises an acquiring unit and a determining unit, wherein:
    the acquiring unit is configured to acquire screen image information of the screen area, and to acquire projection image information of the projection area of the projection device within the screen area; and
    the determining unit is configured to determine the positional relationship between the projection area and the screen area according to the screen image information and the projection image information.
  12. The projection device according to claim 11, wherein the determining unit comprises an identification subunit and a determination subunit, wherein:
    the identification subunit is configured to perform image recognition on the projection image information to obtain the projection area, and to perform image recognition on the screen image information to obtain the screen area; and
    the determination subunit is configured to determine the positional relationship between the projection area and the screen area according to third coordinates of vertices of the projection area in a preset rectangular coordinate system, the preset rectangular coordinate system being a coordinate system established with adjacent boundary lines of the screen area as coordinate axes.
  13. The projection device according to claim 12, wherein the determination subunit is configured to, when the preset rectangular coordinate system is a coordinate system established with a left boundary line of the screen area as an abscissa axis and a lower boundary line of the screen area as an ordinate axis, calculate, according to a preset processing method, coordinates of the touch point within the projection area from the third coordinates of the vertices of the projection area in the preset rectangular coordinate system and coordinates of the touch point in the preset rectangular coordinate system.
PCT/CN2018/076438 2017-03-24 2018-02-12 Position information determination method, projection device, and computer storage medium WO2018171363A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710183162.7A CN108628487A (zh) 2017-03-24 2017-03-24 Position information determination method, projection device, and computer storage medium
CN201710183162.7 2017-03-24

Publications (1)

Publication Number Publication Date
WO2018171363A1 (zh) 2018-09-27

Family

ID=63584928

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/076438 WO2018171363A1 (zh) 2018-02-12 2018-09-27 Position information determination method, projection device, and computer storage medium

Country Status (2)

Country Link
CN (1) CN108628487A (zh)
WO (1) WO2018171363A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110413184A (zh) * 2019-06-20 2019-11-05 视联动力信息技术股份有限公司 Projection control method and apparatus
CN110837322A (zh) * 2019-09-29 2020-02-25 深圳市火乐科技发展有限公司 Projection touch control method, projection device, projection screen, and storage medium
CN116418958A (zh) * 2021-12-31 2023-07-11 深圳光峰科技股份有限公司 Projection calibration method and apparatus, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859210A (zh) * 2010-06-10 2010-10-13 深圳市德力信科技有限公司 Interactive projection system and implementation method thereof
CN102591531A (zh) * 2011-12-26 2012-07-18 深圳市巨龙科教高技术股份有限公司 Coordinate mapping method and apparatus for an electronic whiteboard, and electronic whiteboard
CN102819327A (zh) * 2012-07-26 2012-12-12 郑州信大捷安信息技术股份有限公司 Interactive projection system with infrared positioning and wireless control, and implementation method thereof
CN102880356A (zh) * 2012-09-13 2013-01-16 福州锐达数码科技有限公司 Method for dual-pen writing based on an electronic whiteboard
CN103631401A (zh) * 2013-11-04 2014-03-12 南京信息职业技术学院 Portable interactive projection device and projection method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5828671B2 (ja) * 2011-05-10 2015-12-09 キヤノン株式会社 Information processing apparatus and information processing method
CN102445998A (zh) * 2011-09-16 2012-05-09 海信集团有限公司 Method for acquiring the projection position of a remote-control light spot, and interactive projection system
CN103092432B (zh) * 2011-11-08 2016-08-03 深圳市中科睿成智能科技有限公司 Trigger control method and system for human-computer interaction operation instructions, and laser emitting device
CN102799317B (zh) * 2012-07-11 2015-07-01 联动天下科技(大连)有限公司 Intelligent interactive projection system


Also Published As

Publication number Publication date
CN108628487A (zh) 2018-10-09

Similar Documents

Publication Publication Date Title
US9690388B2 (en) Identification of a gesture
US10319110B2 (en) Display control method and system
US20140300542A1 (en) Portable device and method for providing non-contact interface
KR20130004357A (ko) 컴퓨팅 디바이스 인터페이스
WO2018171363A1 (zh) Position information determination method, projection device, and computer storage medium
US9727171B2 (en) Input apparatus and fingertip position detection method
US20140354784A1 (en) Shooting method for three dimensional modeling and electronic device supporting the same
JP6543924B2 (ja) 情報処理方法、情報処理プログラム、及び情報処理装置
CN110726971B (zh) 可见光定位方法、装置、终端及存储介质
KR20130037430A (ko) 모바일기기 및 그 제어방법
TWI499938B (zh) 觸控系統
TWI510082B (zh) 用於影像辨識之影像擷取方法及其系統
US10073614B2 (en) Information processing device, image projection apparatus, and information processing method
US20160019424A1 (en) Optical touch-control system
WO2021004413A1 (zh) 一种手持输入设备及其指示图标的消隐控制方法和装置
JP2011002292A (ja) 三次元指先位置検出方法、三次元指先位置検出装置、およびプログラム
KR102191061B1 (ko) 2차원 카메라를 이용하여 객체 제어를 지원하는 방법, 시스템 및 비일시성의 컴퓨터 판독 가능 기록 매체
JP2018055685A (ja) 情報処理装置、その制御方法、プログラム、および記憶媒体
US9684415B2 (en) Optical touch-control system utilizing retro-reflective touch-control device
CN103576906A (zh) 鼠标图标控制方法及系统
JP6057407B2 (ja) タッチ位置入力装置及びタッチ位置入力方法
US8493362B2 (en) Image-based coordinate input apparatus and method utilizing buffered images
CN115174879B (zh) 投影画面校正方法、装置、计算机设备和存储介质
TWI537775B (zh) 滑鼠圖示控制方法及系統
EP4113251A1 (en) Calibration method of a system comprising an eye tracking device and a computing device comprising one or multiple screens

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18771126

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18771126

Country of ref document: EP

Kind code of ref document: A1
