WO2017197779A1 - Method and system for implementing interactive projection - Google Patents

Method and system for implementing interactive projection

Info

Publication number
WO2017197779A1
WO2017197779A1 (application PCT/CN2016/093407)
Authority
WO
WIPO (PCT)
Prior art keywords
interaction
image
interactive
projector
projection
Prior art date
Application number
PCT/CN2016/093407
Other languages
English (en)
French (fr)
Inventor
杨伟樑
高志强
宁曼莹
许剑波
Original Assignee
广景视睿科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广景视睿科技(深圳)有限公司
Publication of WO2017197779A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers characterised by opto-electronic transducing means
    • G06F 3/0425: Digitisers using a single imaging device, such as a video camera, for tracking the absolute position of one or more objects with respect to an imaged reference surface, e.g. a video camera imaging a display or a projection screen, a table, or a wall surface on which a computer-generated image is displayed or projected

Definitions

  • the present invention relates to the field of projection technology, and in particular, to a method and system for implementing interactive projection.
  • a projector is a device that can project an image or video onto a projection screen.
  • the image or video projected onto the projection screen is magnified several or even tens of times to facilitate viewing, and also gives people an open field of view, so projectors are very popular with users.
  • Interactive projection refers to technology that uses computer vision and projection display to recognize a user operating with a foot or hand on the virtual scene in the projection area, and adjusts the projection content according to the operation, thereby creating a dynamic interactive experience.
  • At present, interactive projection identifies the user's operation by directly detecting the operation action with infrared technology. When the projection environment changes, it is difficult to accurately recognize and locate the dynamic trajectory of the operation, which greatly degrades the effect of the interactive projection.
  • The technical problem to be solved by the present invention is to provide a method and system for implementing interactive projection that can obtain relatively accurate interactive operation feature information even when the projection environment is unstable, so that interactive projection can be realized anytime and anywhere and the user can fully enjoy a smooth interactive projection experience.
  • One technical solution adopted by the present invention is to provide a method for implementing interactive projection, comprising: inputting projection content to a projector so that the projector projects according to the projection content;
  • when an interactive operating body located between the projection screen of the projector and an image acquisition device performs an interactive operation, collecting, by the image acquisition device, an interaction image of the position of the projection screen of the projector;
  • recognizing the operating-body picture from the interaction image according to a preset recognition algorithm, and extracting from that picture the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image;
  • adjusting the projection content of the projector according to the interactive operation feature information, and returning to the step of inputting projection content to the projector.
  • After the step of extracting the interactive operation feature information, the method further includes: dividing the interaction image into several interactive operation areas according to a preset blocking algorithm; obtaining the position of the operating-body picture within the interaction image; and determining, from that position, the interactive operation area to which the operating-body picture belongs.
  • The step of adjusting the projection content of the projector then includes: generating an operation instruction according to the interactive operation feature information combined with the interactive operation area to which the operating-body picture belongs, and adjusting the projection content of the projector according to the operation instruction.
  • the method further includes: extracting, from a preset audio library, the audio file corresponding to the operation instruction, and, while the projection content of the projector is being adjusted, outputting interactive audio according to the acquired audio file.
  • the interactive operation feature information includes the motion trajectory, color, and shape of the interactive operating body.
  • Another technical solution adopted by the present invention is to provide a system for implementing interactive projection, including a processor, a projector, and an image acquisition device, wherein the processor is connected to the projector and to the image acquisition device respectively.
  • The processor is configured to: input projection content to the projector so that the projector projects according to the projection content; when an interactive operating body located between the projector and the image acquisition device performs an interactive operation, collect, by the image acquisition device, an interaction image of the position of the projection screen of the projector; recognize the operating-body picture from the interaction image according to a preset recognition algorithm; extract from that picture the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image; adjust the projection content of the projector according to the interactive operation feature information; and return to the step of inputting projection content to the projector.
  • The processor is further configured, after the interaction image is collected, to divide the interaction image into several interactive operation areas according to a preset blocking algorithm, obtain the position of the operating-body picture within the interaction image, and determine from that position the interactive operation area to which the operating-body picture belongs.
  • The step in which the processor adjusts the projection content of the projector according to the interactive operation feature information includes: generating an operation instruction according to the feature information combined with the interactive operation area to which the operating-body picture belongs, and adjusting the projection content of the projector according to the operation instruction.
  • the system further includes an audio output device and a storage device, each connected to the processor; the storage device is configured to store a preset audio library; the processor is further configured to extract, from the preset audio library of the storage device, the audio file corresponding to the operation instruction, and, while the projection content of the projector is being adjusted, to output the acquired audio file to the audio output device so that the audio output device outputs interactive audio according to the acquired audio file.
  • the image acquisition device is a camera.
  • the interactive operation feature information includes the motion trajectory, color, and shape of the interactive operating body.
  • Unlike the prior art, when an interactive operating body located between the projection screen of the projector and the image acquisition device performs an interactive operation, the present invention collects, through the image acquisition device, an interaction image of the position of the projection screen. Because the interactive operating body is located in front of the image acquisition device, the collected interaction image contains the operating-body picture, from which the interactive operation feature information produced by the interactive operation is recognized, and the projection content of the projector is adjusted accordingly to realize interactive projection.
  • Since the invention recognizes the interactive operation feature information from an interaction image that contains the operating-body picture, relatively accurate feature information can be obtained even in an unstable projection environment, interactive projection can be realized anytime and anywhere, and the user can fully enjoy a smooth interactive projection experience.
  • FIG. 1 is a flow chart of a first embodiment of a method for implementing interactive projection according to the present invention
  • FIG. 2 is a schematic diagram of the interaction picture covering the projected picture in an embodiment of the present invention.
  • FIG. 3 is a flow chart of a second embodiment of a method for implementing interactive projection according to the present invention.
  • FIG. 4 is a schematic diagram of dividing four interactive operation areas on an interactive screen in an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of the operating-body picture located in area H according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of the operating-body picture located in area G in an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of an embodiment of a system for implementing interactive projection of the present invention.
  • Referring to FIG. 1, the method for implementing interactive projection includes:
  • Step S201: input the projection content to the projector so that the projector projects according to the projection content.
  • When projecting, the projector casts a projected picture, and the projected picture changes as the projection content changes.
  • Step S202: when the interactive operating body located between the projection screen of the projector and the image acquisition device performs an interactive operation, collect, through the image acquisition device, the interaction image of the position of the projection screen of the projector.
  • The framing range of the image acquisition device covers the projected picture; in this embodiment, the framing range is preferably larger than the range of the projection screen. Because the interactive operating body is located between the image acquisition device and the projected picture, and therefore in front of the image acquisition device, the image collected by the device contains both the operating-body picture and the projected picture, as shown in FIG. 2.
  • The operating-body picture and the projected picture may overlap partially or completely, or they may not overlap at all.
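The three possibilities just described (partial overlap, complete overlap, no overlap) can be distinguished with a simple rectangle test. The sketch below is only illustrative; the axis-aligned rectangle representation of the two pictures is an assumption, not something the patent specifies:

```python
def overlap_case(body, proj):
    """Classify how the operating-body rectangle relates to the projection
    rectangle. Rectangles are (x0, y0, x1, y1) in image coordinates."""
    bx0, by0, bx1, by1 = body
    px0, py0, px1, py1 = proj
    # Intersection rectangle of the two pictures
    ix0, iy0 = max(bx0, px0), max(by0, py0)
    ix1, iy1 = min(bx1, px1), min(by1, py1)
    if ix0 >= ix1 or iy0 >= iy1:
        return "none"      # the pictures do not overlap
    if bx0 >= px0 and by0 >= py0 and bx1 <= px1 and by1 <= py1:
        return "full"      # the body picture lies entirely inside the projection
    return "partial"
```

For example, a body rectangle entirely inside the projection rectangle is classified as `"full"`, one straddling an edge as `"partial"`, and one outside as `"none"`.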
  • Step S203: according to a preset recognition algorithm, recognize the operating-body picture from the interaction image, and extract from it the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image.
  • The interactive operating body generates interactive operation feature information while operating, and this feature information appears in the interaction image, so it can be extracted from the interaction image. In this embodiment, the interactive operation feature information includes the motion trajectory, color, and shape of the interactive operating body.
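As a toy illustration of extracting one kind of feature information, the motion trajectory, the sketch below assumes the operating body has already been segmented into a per-frame boolean mask; the patent does not specify the recognition algorithm, so this is a minimal stand-in:

```python
def centroid(mask):
    """Centroid (x, y) of truthy cells in a 2-D grid (row-major)."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def motion_track(masks):
    """Motion trajectory of the operating body: one centroid per frame,
    oldest frame first."""
    return [centroid(m) for m in masks]
```

A body moving from the right edge to the left edge over two frames yields a trajectory whose x coordinate decreases, which later steps can interpret as a leftward gesture.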
  • Step S204: adjust the projection content of the projector according to the interactive operation feature information, and return to the step of inputting the projection content to the projector.
  • Different interactive operation feature information can represent different operations. For example, if the interactive operation feature information is a fist, that is, the interactive operating body is a fist, the current projection content is closed; if the feature information is a leftward motion trajectory of the operating body, the projection content is switched.
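The two examples above (a fist closes the content, a leftward trajectory switches it) could be dispatched roughly as follows; the shape label, command names, and pixel threshold are illustrative assumptions, not part of the patent:

```python
def interpret(shape, track, min_shift=30):
    """Map interactive-operation feature information to a projector command.

    shape: recognized shape of the operating body, e.g. "fist"
    track: list of (x, y) centroids of the body, oldest frame first
    """
    if shape == "fist":
        return "close_content"            # a fist closes the current content
    if len(track) >= 2 and track[-1][0] - track[0][0] <= -min_shift:
        return "switch_content"           # net leftward motion switches content
    return "no_op"
```

The dispatcher is deliberately tiny: each new gesture only needs one more branch (or one more table entry in a dict-based variant).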
  • Because the interactive operating body is located in front of the image acquisition device, the interaction image collected at the position of the projected picture contains the operating-body picture, so the operating-body picture within the interaction image can be recognized and the interactive operation feature information obtained from it.
  • Compared with infrared-based interactive projection, this approach is less susceptible to the environment and more interactive: operating directly on the interface by hand is more intuitive, the interactive operating body can be detected directly, the method is more flexible and less limited, and color, shape, and dynamic motion trajectories can all be detected and then processed by the processor to obtain a result, none of which infrared interactive projection can do.
  • To enrich the interactive operations, the interaction image may also be divided into blocks: the same interactive operation feature information can represent different operations in different blocks, and different feature information within the same block likewise represents different operations. Referring to FIG. 3, after step S203 the method further includes:
  • Step S205: according to the preset blocking algorithm, divide the interaction image into several interactive operation areas.
  • The preset blocking algorithm can be defined in advance, and different projected pictures may use different blocking schemes, because the contents of two different projection pages differ and the objects to be operated are likely to differ. For example, when a mobile phone's screen is projected by a projector, different interfaces have different operation objects, so the blocking scheme is naturally not always the same.
  • Step S206: obtain the position of the operating-body picture within the interaction image, and determine, from that position, the interactive operation area to which the operating-body picture belongs.
  • Step S204 then specifically is: generate an operation instruction according to the interactive operation feature information combined with the interactive operation area to which the operating-body picture belongs, and adjust the projection content of the projector according to the operation instruction.
  • Determining the operation instruction from the two-dimensional combination of the interactive operation feature information and the interactive operation area to which the operating-body picture belongs greatly expands the number of possible operation instructions.
  • As shown in FIG. 4, the interaction image is divided into four interactive operation areas, namely areas E, F, G, and H. The same interactive operation feature information produces different interactive reactions at different coordinate positions, and different feature information at the same coordinate position also produces different reactions.
  • For example, as shown in FIG. 5, when the user performs an interactive operation within the projection area, the image acquisition device sends the collected interaction image to the processor; the processor preprocesses the image, removes interference factors, identifies area H as the interactive operation area where the operation is located, and recognizes the operation instruction by combining area H with the interactive operation feature information.
  • As shown in FIG. 6, when the recognized operation is located in area G, the operation instructions generated in the two figures differ even if the interactive operation feature information in FIG. 5 is the same as that in FIG. 6, because the two operations are located in different interactive operation areas. Further, to help the operator recognize the block in which the interactive operating body is currently located, another projector superimposes projected block lines onto the position of the projection screen.
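The (area, feature) pairing can be sketched as a two-level lookup. The quadrant letter assignment and the instruction names below are illustrative assumptions; the patent only requires that the areas be distinct and that the same gesture in different areas map to different instructions:

```python
def region_of(point, width, height):
    """Quadrant of the interaction image a point falls in.
    Letter assignment (E top-left, F top-right, G bottom-left,
    H bottom-right) is an illustrative assumption."""
    x, y = point
    top, left = y < height / 2, x < width / 2
    return {(True, True): "E", (True, False): "F",
            (False, True): "G", (False, False): "H"}[(top, left)]

# Two-dimensional instruction table: the same gesture in different areas
# yields different instructions, and different gestures in the same area
# also differ. Entries are made-up examples.
INSTRUCTIONS = {
    ("H", "swipe_left"): "previous_page",
    ("G", "swipe_left"): "volume_down",
    ("H", "fist"):       "close_content",
}

def instruction_for(region, feature):
    return INSTRUCTIONS.get((region, feature), "no_op")
```

Note how `"swipe_left"` in area H and area G resolves to two different instructions, which is exactly the expansion of the instruction space the text describes.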
  • To make the projection more interactive, related audio can also be output according to the different operations performed through the interactive operating body. Referring again to FIG. 3, the method further includes:
  • Step S207: extract the audio file corresponding to the operation instruction from the preset audio library.
  • Different audio files represent different sounds, and audio files correspond one-to-one with operation instructions.
  • Step S208: while the projection content of the projector is being adjusted, output interactive audio according to the acquired audio file.
  • The interactive audio indicates that the interactive operating body's operation succeeded, increasing the interactivity between the projection and the operator.
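The one-to-one mapping between instructions and audio files reduces to a plain table lookup; the file names and instruction names here are hypothetical placeholders, not from the patent:

```python
# Hypothetical preset audio library: each operation instruction maps to
# exactly one audio file (one-to-one, as steps S207/S208 require).
AUDIO_LIBRARY = {
    "close_content":  "close.wav",
    "switch_content": "switch.wav",
    "previous_page":  "page.wav",
}

def audio_for(instruction):
    """Fetch the audio file for an instruction; None means the library
    has no entry and no interactive audio is played."""
    return AUDIO_LIBRARY.get(instruction)
```

In a real system the returned file name would be handed to the audio output device while the projection content is being adjusted.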
  • In this embodiment of the present invention, when the interactive operating body located between the projection screen of the projector and the image acquisition device performs an interactive operation, the interaction image of the position of the projection screen is collected through the image acquisition device. Because the interactive operating body is located in front of the image acquisition device, the interaction image contains the operating-body picture; the interactive operation feature information produced by the interactive operation is recognized through the interaction image, and the projection content of the projector is adjusted accordingly to realize interactive projection.
  • Since the invention recognizes the interactive operation feature information from an interaction image that contains the operating-body picture, relatively accurate feature information can be obtained even in an unstable projection environment, interactive projection can be realized anytime and anywhere, and the user can fully enjoy a smooth interactive projection experience.
  • The present invention further provides an embodiment of a system for implementing interactive projection.
  • the system 30 for implementing interactive projection includes a processor 31, a projector 32, and an image capture device 33, wherein the processor 31 is coupled to the projector 32 and the image capture device 33, respectively.
  • The processor 31 is configured to: input projection content to the projector 32 so that the projector 32 projects according to the projection content; when the interactive operating body located between the projector 32 and the image acquisition device 33 performs an interactive operation, collect, through the image acquisition device 33, the interaction image of the position of the projection screen of the projector 32; recognize the operating-body picture from the interaction image according to the preset recognition algorithm; extract from that picture the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image; adjust the projection content of the projector 32 according to the interactive operation feature information; and return to the step of inputting projection content to the projector 32.
  • In this embodiment, the interactive operation feature information includes the motion trajectory, color, and shape of the interactive operating body.
  • Because the interactive operating body is located in front of the image acquisition device 33, the interaction image collected at the position of the projected picture contains the operating-body picture, so the operating-body picture within the interaction image can be recognized and the interactive operation feature information obtained from it. In this way, relatively accurate feature information can be obtained even in an unstable projection environment, interactive projection can be realized anytime and anywhere, and the user can fully enjoy a smooth interactive projection experience.
  • To enrich the interactive operations, the interaction image may also be divided into blocks, with the operation determined by the two-dimensional combination of the interactive operation feature information and the block to which it belongs. Specifically, after the interaction image is collected through the image acquisition device 33, the processor 31 is further configured to divide the interaction image into several interactive operation areas according to a preset blocking algorithm, obtain the position of the operating-body picture within the interaction image, and determine from that position the interactive operation area to which the operating-body picture belongs.
  • The step in which the processor 31 adjusts the projection content of the projector 32 according to the interactive operation feature information includes: generating an operation instruction according to the feature information combined with the interactive operation area to which the operating-body picture belongs, and adjusting the projection content of the projector 32 according to the operation instruction.
  • the system 30 further includes an audio output device 34 and a storage device 35.
  • the audio output device 34 and the storage device 35 are respectively connected to the processor 31, and the storage device 35 is configured to store a preset audio library.
  • The processor 31 is further configured to extract the audio file corresponding to the operation instruction from the preset audio library of the storage device 35 and, while the projection content of the projector 32 is being adjusted, to output the acquired audio file to the audio output device 34 so that the audio output device 34 outputs interactive audio according to the acquired file.
  • Outputting interactive audio associated with the operation increases the interactivity of the interactive projection.
  • In this embodiment, the image acquisition device 33 is preferably a camera.
  • In summary, when the interactive operating body performs an interactive operation, the interaction image of the position of the projection screen of the projector 32 is collected through the image acquisition device 33. Because the interactive operating body is located in front of the image acquisition device 33, the interaction image contains the operating-body picture; from it the interactive operation feature information is recognized, and the projection content of the projector 32 is adjusted according to that feature information to realize interactive projection.
  • The present invention recognizes the interactive operation feature information through an interaction image containing the operating-body picture, so even when the projection environment is unstable it can obtain relatively accurate feature information, realize interactive projection anytime and anywhere, and let users enjoy a smooth interactive projection experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and system (30) for implementing interactive projection. The method includes: inputting projection content to a projector (32) so that the projector (32) projects according to the projection content; when an interactive operating body located between the projection screen of the projector (32) and an image acquisition device (33) performs an interactive operation, collecting, through the image acquisition device (33), an interaction image of the position of the projection screen of the projector (32); according to a preset recognition algorithm, recognizing the operating-body picture from the interaction image, and extracting from it the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image; and adjusting the projection content of the projector (32) according to the interactive operation feature information and returning to the step of inputting projection content to the projector (32). In this way, interactive projection can be realized anytime and anywhere, and the user can enjoy a smooth interactive projection experience.

Description

Method and System for Implementing Interactive Projection
Technical Field
The present invention relates to the field of projection technology, and in particular to a method and system for implementing interactive projection.
Background Art
A projector is a device that can project an image or video onto a projection screen. The projected image or video is magnified several or even tens of times while remaining clear, which makes it easy for people to view and gives them an open field of view; projectors are therefore very popular with users.
Interactive projection has become increasingly popular in recent years. It refers to technology that uses computer vision and projection display to recognize a user operating with a foot or hand on the virtual scene in the projection area, and adjusts the projection content according to the operation, thereby creating a dynamic interactive experience. At present, interactive projection mainly relies on infrared technology to directly detect the user's operation actions; when the projection environment changes, it is difficult to accurately recognize and locate the dynamic trajectory of the operation, which greatly degrades the effect of the interactive projection.
Summary of the Invention
The main technical problem solved by the present invention is to provide a method and system for implementing interactive projection that can obtain relatively accurate interactive operation feature information even when the projection environment is unstable, so that interactive projection can be realized anytime and anywhere and the user can fully enjoy a smooth interactive projection experience.
To solve the above technical problem, one technical solution adopted by the present invention is to provide a method for implementing interactive projection, comprising: inputting projection content to a projector so that the projector projects according to the projection content; when an interactive operating body located between the projection screen of the projector and an image acquisition device performs an interactive operation, collecting, through the image acquisition device, an interaction image of the position of the projection screen of the projector; according to a preset recognition algorithm, recognizing the operating-body picture from the interaction image, and extracting from it the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image; and adjusting the projection content of the projector according to the interactive operation feature information and returning to the step of inputting projection content to the projector.
After the step of extracting the interactive operation feature information, the method further includes: dividing the interaction image into several interactive operation areas according to a preset blocking algorithm; obtaining the position of the operating-body picture within the interaction image, and determining, from that position, the interactive operation area to which the operating-body picture belongs. The step of adjusting the projection content of the projector according to the interactive operation feature information then includes: generating an operation instruction according to the feature information combined with the interactive operation area to which the operating-body picture belongs, and adjusting the projection content of the projector according to the operation instruction.
The method further includes: extracting, from a preset audio library, the audio file corresponding to the operation instruction; and, while the projection content of the projector is being adjusted, outputting interactive audio according to the acquired audio file.
The interactive operation feature information includes the motion trajectory, color, and shape of the interactive operating body.
To solve the above technical problem, another technical solution adopted by the present invention is to provide a system for implementing interactive projection, including a processor, a projector, and an image acquisition device, wherein the processor is connected to the projector and to the image acquisition device respectively. The processor is configured to: input projection content to the projector so that the projector projects according to the projection content; when an interactive operating body located between the projector and the image acquisition device performs an interactive operation, collect, through the image acquisition device, an interaction image of the position of the projection screen of the projector; according to a preset recognition algorithm, recognize the operating-body picture from the interaction image, and extract from it the interactive operation feature information produced when the interactive operating body performs the interactive operation, wherein the operating-body picture refers to the picture of the interactive operating body within the interaction image; adjust the projection content of the projector according to the interactive operation feature information; and return to the step of inputting projection content to the projector.
After the step of collecting the interaction image through the image acquisition device, the processor is further configured to divide the interaction image into several interactive operation areas according to a preset blocking algorithm, obtain the position of the operating-body picture within the interaction image, and determine from that position the interactive operation area to which the operating-body picture belongs. The step in which the processor adjusts the projection content of the projector according to the interactive operation feature information includes: generating an operation instruction according to the feature information combined with the interactive operation area to which the operating-body picture belongs, and adjusting the projection content of the projector according to the operation instruction.
The system further includes an audio output device and a storage device, each connected to the processor; the storage device is configured to store a preset audio library. The processor is further configured to extract, from the preset audio library of the storage device, the audio file corresponding to the operation instruction and, while the projection content of the projector is being adjusted, to output the acquired audio file to the audio output device so that the audio output device outputs interactive audio according to the acquired audio file.
The image acquisition device is a camera.
The interactive operation feature information includes the motion trajectory, color, and shape of the interactive operating body.
The beneficial effects of the present invention are as follows. Unlike the prior art, when an interactive operating body located between the projection screen of the projector and the image acquisition device performs an interactive operation, the present invention collects, through the image acquisition device, the interaction image of the position of the projection screen of the projector. Because the interactive operating body is located in front of the image acquisition device, the interaction image collected at the position of the projected picture contains the operating-body picture; the interactive operation feature information produced by the interactive operation is recognized through the interaction image, and the projection content of the projector is adjusted according to it to realize interactive projection. Since the present invention recognizes the interactive operation feature information from an interaction image containing the operating-body picture, relatively accurate feature information can be obtained even in an unstable projection environment, interactive projection can be realized anytime and anywhere, and the user can fully enjoy a smooth interactive projection experience.
Brief Description of the Drawings
FIG. 1 is a flowchart of a first embodiment of the method for implementing interactive projection according to the present invention;
FIG. 2 is a schematic diagram of the interaction picture covering the projected picture in an embodiment of the present invention;
FIG. 3 is a flowchart of a second embodiment of the method for implementing interactive projection according to the present invention;
FIG. 4 is a schematic diagram of dividing the interaction picture into four interactive operation areas in an embodiment of the present invention;
FIG. 5 is a schematic diagram of the operating-body picture located in area H in an embodiment of the present invention;
FIG. 6 is a schematic diagram of the operating-body picture located in area G in an embodiment of the present invention;
FIG. 7 is a schematic diagram of an embodiment of the system for implementing interactive projection according to the present invention.
Detailed Description
The present invention is described in detail below with reference to the accompanying drawings and embodiments.
请参阅图1,实现互动投影的方法包括:
步骤S201:向投影仪输入投影内容,以使投影仪根据投影内容进行投影;
投影仪在投影时会投射出投影画面,并且投影画面会根据投影内容的变化而变化。
步骤S202:在位于投影仪的投影画面与图像采集装置之间的交互操作体执行交互操作时,通过图像采集装置采集投影仪的投影画面所在位置的交互图像;
图像采集装置的取景范围覆盖投影画面,在本实施方式中,优选的,图像采集装置的取景范围所覆盖的范围大于投景画面的范围。由于交互操作体位于图像采集装置与投影画面之间,并且交互操作体位于图像采集装置的前方,因此,图像采集装置采集到的图像包含交互操作体画面和投影画面,如图2所示。当然,交互操作体画面与投影画面可以部分或者全部重叠,当然,交互操作体画面与投影画面也可以不重叠。
步骤S203:根据预设识别算法,从交互图像中识出交互操作体画面,并且根据交互操作体画面提取交互操作体进行交互操作时的交互操作 特征信息,其中,交互操作体画面是指交互操作体在交互图像中的画面;
交互操作体在进行操作时会产生交互操作特征信息,而交互操作特征信息会在交互图像呈现,因此,可以从交互图像中提取出交互操作特征信息,在本实施方式中,交互操作特征信息包括交互操作体的运动轨迹、颜色和形状。
步骤S204:根据交互操作特征信息,调整投影仪的投影内容,并且返回向投影仪输入投影内容的步骤。
不同交互操作特征信息,可以代表不同的操作,例如:若交互操作特征信息为拳头,也即为交互操作体为拳头,则关闭当前的投影内容;若交互操作特征信息为交互操作体呈向左运动的运动轨迹,则切换投影内容。
由于交互操作体位于图像采集装置的前方,在图像采集装置采集投影画面所在位置的交互图像时,交互图像会包含交互操作体画面,从而可以通过识别交互图像内的交互操作体画面,并且根据交互操作体画面获取交互操作特征信息,相比于,红外技术互动投影的方法,其不容易受到环境的影响,互动性更强,比如:用手在操作界面直接操作,看起来更直观,并且能够直接侦测到交互操作体,更加灵活,局限性小,同时可以对颜色,形状,或者动态的运动轨迹都可以进行侦测得到,再通过处理器处理得到结果,而这些均是红外技术互动投影做不到。
为了扩展交互操作的内容,也可以对交互图像进行分块,同一交互操作特征信息在不同分块时,可以代表不同操作,当然,同一分块的不同的交互操作特征信息也代表不同操作,则请参阅图3,步骤S203之后,方法还包括:
步骤S205:根据预设分块算法,在交互图像划分出若干个交互操作区域;
预设分块算法可以预先定义好,而不同的投影画面采用的分块方式可以不同,因为不同的两个投影页面里面的内容不一样,要操作的对象很有可能不一样。可以理解为将手机用投影仪投影出去,不同的界面,操作的对象不尽相同,分块方式当然也不一定相同。
Step S206: obtaining the position of the interaction-object image within the interaction image, and determining, from that position, the interaction region to which the interaction-object image belongs;
Step S204 is specifically: generating an operation instruction according to the interaction feature information in combination with the interaction region to which the interaction-object image belongs, and adjusting the projection content of the projector according to the operation instruction.
Determining the operation instruction from two dimensions of information, the interaction feature information and the interaction region to which the interaction-object image belongs, greatly expands the number of possible operation instructions. As shown in FIG. 4, the interaction image is divided into four interaction regions, E, F, G, and H. The same interaction feature information produces different interactive responses at different positions, and different interaction feature information produces different responses at the same position. For example, as shown in FIG. 5, when the user performs an interactive operation within the projection area, the image capture device sends the captured interaction image to the processor; the processor preprocesses the image to remove interference, identifies that the interactive operation lies in region H, and derives the operation instruction from region H together with the interaction feature information. As shown in FIG. 6, when the recognized interactive operation lies in region G, then even if the interaction feature information in FIG. 5 is the same as that in FIG. 6, the two lie in different interaction regions, so the resulting operation instructions differ. Further, to help the operator see which region the interaction object is currently in, another projector may superimpose the region dividing lines onto the projection picture.
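The two-dimensional lookup above, in which region E, F, G, or H together with the feature information selects the instruction, can be sketched as follows. The quadrant split, gesture names, and instruction strings are illustrative assumptions.

```python
def region_of(x, y, width, height):
    """Map a position in the interaction image to one of four quadrants E/F/G/H."""
    left = x < width / 2
    top = y < height / 2
    if top and left:
        return "E"
    if top and not left:
        return "F"
    if not top and left:
        return "G"
    return "H"

# Same gesture in different regions yields different instructions.
INSTRUCTIONS = {
    ("H", "tap"): "open_item",
    ("G", "tap"): "go_back",
    ("H", "swipe_left"): "next_page",
}

def instruction_for(x, y, width, height, feature):
    """Combine region and interaction feature information into an instruction."""
    return INSTRUCTIONS.get((region_of(x, y, width, height), feature), "no_op")

print(instruction_for(600, 400, 640, 480, "tap"))  # region H -> open_item
print(instruction_for(100, 400, 640, 480, "tap"))  # region G -> go_back
```

A table with R regions and G recognizable gestures distinguishes up to R × G instructions, which is the expansion the text describes.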
To make the projection more interactive, related audio may also be output according to the operation performed by the interaction object. Referring again to FIG. 3, the method further includes:
Step S207: extracting, from a preset audio library, the audio file corresponding to the operation instruction;
Different audio files represent different sounds, and audio files correspond one-to-one with operation instructions.
Step S208: while adjusting the projection content of the projector, outputting interaction audio according to the obtained audio file;
The interaction audio signals to the operator that the interaction object's operation succeeded, increasing the interactivity between the projection and the operator.
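The one-to-one correspondence between operation instructions and audio files in the preset audio library can be sketched as a plain dictionary lookup. The file names and instruction strings are hypothetical.

```python
# Hypothetical preset audio library: one audio file per operation instruction.
AUDIO_LIBRARY = {
    "close_content": "close.wav",
    "switch_content": "swoosh.wav",
    "open_item": "click.wav",
}

def audio_for(instruction):
    """Return the audio file for an instruction, or None if none is defined."""
    return AUDIO_LIBRARY.get(instruction)

print(audio_for("switch_content"))  # swoosh.wav
```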
In this embodiment of the present invention, when an interaction object located between the projection picture of the projector and the image capture device performs an interactive operation, the image capture device captures an interaction image at the location of the projection picture. Because the interaction object is in front of the image capture device, the captured interaction image contains the interaction-object image. The interaction feature information produced by the interactive operation is recognized from the interaction image, and the projection content of the projector is adjusted according to that information, thereby realizing interactive projection. Since the interaction feature information is recognized from an interaction image that contains the interaction-object image, reasonably accurate interaction feature information can be obtained even in an unstable projection environment, so interactive projection can be realized anytime and anywhere and users can enjoy a smooth interactive projection experience.
The present invention further provides an embodiment of a system for realizing interactive projection. Referring to FIG. 7, the system 30 for realizing interactive projection includes a processor 31, a projector 32, and an image capture device 33, where the processor 31 is connected to the projector 32 and to the image capture device 33.
The processor 31 is configured to: input projection content to the projector 32, so that the projector 32 projects according to the projection content; when an interaction object located between the projector 32 and the image capture device 33 performs an interactive operation, capture, by the image capture device 33, an interaction image at the location of the projection picture of the projector 32; recognize, according to a preset recognition algorithm, the interaction-object image from the interaction image, and extract from it the interaction feature information produced when the interaction object performs the interactive operation, where the interaction-object image refers to the picture of the interaction object within the interaction image; and adjust the projection content of the projector 32 according to the interaction feature information, returning to the step of inputting projection content to the projector 32. In this embodiment, the interaction feature information includes the motion trajectory, color, and shape of the interaction object.
Because the interaction object is in front of the image capture device 33, the interaction image captured at the location of the projection picture contains the interaction-object image; the interaction-object image can therefore be recognized within the interaction image and the interaction feature information obtained from it. Even in an unstable projection environment, reasonably accurate interaction feature information can be obtained, interactive projection can be realized anytime and anywhere, and users can enjoy a smooth interactive projection experience.
To enrich the set of interactive operations, the interaction image may also be partitioned, and the operation determined from two dimensions of information: the interaction feature information and the region to which it belongs. Specifically, after the step of capturing, by the image capture device 33, the interaction image at the location of the projection picture of the projector 32, the processor 31 is further configured to divide the interaction image into a plurality of interaction regions according to a preset partitioning algorithm, obtain the position of the interaction-object image within the interaction image, and determine from that position the interaction region to which the interaction-object image belongs. The step in which the processor 31 adjusts the projection content of the projector 32 according to the interaction feature information includes: generating an operation instruction according to the interaction feature information in combination with the interaction region to which the interaction-object image belongs, and adjusting the projection content of the projector 32 according to the operation instruction.
Further, the system 30 also includes an audio output device 34 and a storage device 35, each connected to the processor 31; the storage device 35 is configured to store a preset audio library. The processor 31 is further configured to extract, from the preset audio library of the storage device 35, the audio file corresponding to the operation instruction, and, while adjusting the projection content of the projector 32, output the obtained audio file to the audio output device 34, so that the audio output device 34 outputs interaction audio according to the obtained audio file. Outputting interaction audio related to the operation increases the interactivity of the interactive projection. In this embodiment, the image capture device 33 is preferably a camera.
In this embodiment of the present invention, when an interaction object located between the projection picture of the projector 32 and the image capture device 33 performs an interactive operation, the image capture device 33 captures an interaction image at the location of the projection picture of the projector 32. Because the interaction object is in front of the image capture device 33, the captured interaction image contains the interaction-object image. The interaction feature information produced by the interactive operation is recognized from the interaction image, and the projection content of the projector 32 is adjusted according to that information, thereby realizing interactive projection. Since the interaction feature information is recognized from an interaction image that contains the interaction-object image, reasonably accurate interaction feature information can be obtained even in an unstable projection environment, so interactive projection can be realized anytime and anywhere and users can enjoy a smooth interactive projection experience.
The above describes only embodiments of the present invention and does not thereby limit the patent scope of the present invention. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims (9)

  1. A method for realizing interactive projection, characterized by comprising:
    inputting projection content to a projector, so that the projector projects according to the projection content;
    when an interaction object located between the projection picture of the projector and an image capture device performs an interactive operation, capturing, by the image capture device, an interaction image at the location of the projection picture of the projector;
    recognizing, according to a preset recognition algorithm, the interaction-object image from the interaction image, and extracting, from the interaction-object image, interaction feature information produced when the interaction object performs the interactive operation, wherein the interaction-object image refers to the picture of the interaction object within the interaction image;
    adjusting the projection content of the projector according to the interaction feature information, and returning to the step of inputting projection content to the projector.
  2. The method according to claim 1, characterized in that, after the step of extracting, from the interaction-object image, the interaction feature information produced when the interaction object performs the interactive operation, the method further comprises:
    dividing the interaction image into a plurality of interaction regions according to a preset partitioning algorithm;
    obtaining the position of the interaction-object image within the interaction image, and determining, from the position of the interaction-object image within the interaction image, the interaction region to which the interaction-object image belongs;
    wherein the step of adjusting the projection content of the projector according to the interaction feature information comprises:
    generating an operation instruction according to the interaction feature information in combination with the interaction region to which the interaction-object image belongs;
    adjusting the projection content of the projector according to the operation instruction.
  3. The method according to claim 2, characterized in that the method further comprises:
    extracting, from a preset audio library, the audio file corresponding to the operation instruction;
    while adjusting the projection content of the projector, outputting interaction audio according to the obtained audio file.
  4. The method according to claim 1, characterized in that
    the interaction feature information includes the motion trajectory, color, and shape of the interaction object.
  5. A system for realizing interactive projection, characterized by comprising a processor, a projector, and an image capture device, wherein the processor is connected to the projector and to the image capture device;
    the processor is configured to: input projection content to the projector, so that the projector projects according to the projection content; when an interaction object located between the projector and the image capture device performs an interactive operation, capture, by the image capture device, an interaction image at the location of the projection picture of the projector; recognize, according to a preset recognition algorithm, the interaction-object image from the interaction image, and extract, from the interaction-object image, interaction feature information produced when the interaction object performs the interactive operation, wherein the interaction-object image refers to the picture of the interaction object within the interaction image; and adjust the projection content of the projector according to the interaction feature information, and return to the step of inputting projection content to the projector.
  6. The system according to claim 5, characterized in that
    after the step of capturing, by the image capture device, the interaction image at the location of the projection picture of the projector, the processor is further configured to divide the interaction image into a plurality of interaction regions according to a preset partitioning algorithm, obtain the position of the interaction-object image within the interaction image, and determine, from the position of the interaction-object image within the interaction image, the interaction region to which the interaction-object image belongs;
    the step in which the processor adjusts the projection content of the projector according to the interaction feature information comprises: generating an operation instruction according to the interaction feature information in combination with the interaction region to which the interaction-object image belongs, and adjusting the projection content of the projector according to the operation instruction.
  7. The system according to claim 6, characterized in that the system further comprises an audio output device and a storage device, each connected to the processor, the storage device being configured to store a preset audio library;
    the processor is further configured to extract, from the preset audio library of the storage device, the audio file corresponding to the operation instruction, and, while adjusting the projection content of the projector, output the obtained audio file to the audio output device, so that the audio output device outputs interaction audio according to the obtained audio file.
  8. The system according to any one of claims 5 to 7, characterized in that
    the image capture device is a camera.
  9. The system according to any one of claims 5 to 7, characterized in that
    the interaction feature information includes the motion trajectory, color, and shape of the interaction object.
PCT/CN2016/093407 2016-05-18 2016-08-05 Method and system for realizing interactive projection WO2017197779A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610333295.3 2016-05-18
CN201610333295.3A CN106055092A (zh) 2016-05-18 2016-05-18 Method and system for realizing interactive projection

Publications (1)

Publication Number Publication Date
WO2017197779A1 true WO2017197779A1 (zh) 2017-11-23

Family

ID=57177840

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/093407 WO2017197779A1 (zh) 2016-05-18 2016-08-05 Method and system for realizing interactive projection

Country Status (2)

Country Link
CN (1) CN106055092A (zh)
WO (1) WO2017197779A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064494A (zh) * 2021-05-25 2021-07-02 广东机电职业技术学院 Contactless mid-air interactive virtual key device and method of using the same

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106775557A (zh) * 2016-11-28 2017-05-31 墨宝股份有限公司 Interaction system and method between an intelligent robot and virtual 3D
CN107818584B (zh) * 2017-09-27 2020-03-17 歌尔科技有限公司 Method and device for determining user finger position information, projector, and projection system
TWI721429B (zh) * 2018-05-21 2021-03-11 仁寶電腦工業股份有限公司 Interactive projection system and interactive projection method
CN110109364A (zh) * 2019-03-25 2019-08-09 深圳绿米联创科技有限公司 Device control method and apparatus, camera, and storage medium
CN114694545B (zh) * 2020-12-30 2023-11-24 成都极米科技股份有限公司 Image display method and device, projector, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201194096Y (zh) * 2007-11-07 2009-02-11 干宏江 Interactive dynamic visual display device
CN102789310A (zh) * 2011-05-17 2012-11-21 天津市卓立成科技有限公司 Interaction system and realization method thereof
US20140204120A1 (en) * 2013-01-23 2014-07-24 Fujitsu Limited Image processing device and image processing method
CN104090664A (zh) * 2014-07-29 2014-10-08 广景科技有限公司 Interactive projection method, device, and system
CN105446623A (zh) * 2015-11-20 2016-03-30 广景视睿科技(深圳)有限公司 Multi-interaction projection method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581727A (zh) * 2013-10-17 2014-02-12 四川长虹电器股份有限公司 Gesture recognition interaction system and method based on a smart TV platform
CN104866081A (zh) * 2014-02-25 2015-08-26 中兴通讯股份有限公司 Terminal operation method and device, and terminal



Also Published As

Publication number Publication date
CN106055092A (zh) 2016-10-26

Similar Documents

Publication Publication Date Title
WO2017197779A1 (zh) Method and system for realizing interactive projection
JP7457082B2 (ja) Reactive video generation method and generation program
US9135753B2 (en) Apparatus and method of augmented reality interaction
Suau et al. Real-time head and hand tracking based on 2.5 D data
KR101250619B1 (ko) Augmented reality system and method using a virtual user interface
KR101929077B1 (ko) Image identification method and image identification device
US20110164032A1 (en) Three-Dimensional User Interface
US20220221943A1 (en) Using natural movements of a hand-held device to manipulate digital content
CN109982054B (zh) 一种基于定位追踪的投影方法、装置、投影仪及投影系统
JP2018517984A (ja) Apparatus and method for video zooming by selecting and tracking an image region
JP2012243007A (ja) Video display device and video region selection method using the same
JP2003316510A (ja) Display device for displaying a point indicated on a display screen, and display program
WO2015184841A1 (zh) Method and device for controlling projection display
CN106200942B (zh) Information processing method and electronic device
JP2013172446A (ja) Information processing device, terminal device, imaging device, information processing method, and information providing method in an imaging device
WO2019128086A1 (zh) Stage interactive projection method, apparatus, and system
CN113709545A (zh) Video processing method and apparatus, computer device, and storage medium
WO2022206304A1 (zh) Video playback method and apparatus, device, storage medium, and program product
US20160127651A1 (en) Electronic device and method for capturing image using assistant icon
US20230319234A1 (en) System and Methods for Enhanced Videoconferencing
Alcoverro et al. Gesture control interface for immersive panoramic displays
JP5850188B2 (ja) Image display system
US20180047169A1 (en) Method and apparatus for extracting object for sticker image
KR101414362B1 (ko) Method and apparatus for a spatial bezel interface based on image recognition
US11557065B2 (en) Automatic segmentation for screen-based tutorials using AR image anchors

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16902159

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26.04.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16902159

Country of ref document: EP

Kind code of ref document: A1