CN113160806A - Projection system and control method thereof - Google Patents


Info

Publication number
CN113160806A
Authority
CN
China
Prior art keywords
target
instruction information
gesture
processor
voice
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010012978.5A
Other languages
Chinese (zh)
Inventor
孙琳琳
杨光照
杨华旭
金美灵
杨瑞锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Optoelectronics Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Optoelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Optoelectronics Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202010012978.5A priority Critical patent/CN113160806A/en
Publication of CN113160806A publication Critical patent/CN113160806A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00: Speech recognition
    • G10L 15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00: Speech recognition
    • G10L 15/26: Speech to text systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof

Abstract

The invention discloses a projection system and a control method thereof, relating to the technical field of projection display. The main technical scheme of the invention is as follows: the projection system comprises a projector, an upper computer and a voice acquisition device; a processor and a projection display element are arranged in the projector, and the processor controls the projection display element to perform projection display; the upper computer is electrically connected with the projector and provides a calling target library; the voice acquisition device is in signal connection with the processor and acquires voice instruction information of a user; the processor receives the voice instruction information acquired by the voice acquisition device, determines and calls a target in the calling target library according to the voice instruction information, and controls the projection display element to project and display a target picture of the called target. The invention can not only play slide content prepared in advance, but can also temporarily call a target for real-time projection display according to the actual situation on site and the user's voice instruction information.

Description

Projection system and control method thereof
Technical Field
The invention relates to the technical field of projection display, in particular to a projection system and a control method thereof.
Background
A projector is an optical instrument that uses optical elements to magnify the outline of a workpiece and project it onto a screen; it can be connected to a computer or other terminal through various interfaces to play the corresponding video signal.
However, the existing projector has a simple structure and a single function: it can only play pre-made slides, its monotonous playing mode cannot hold viewers' attention, and the played content cannot be changed in response to unplanned situations at the lecture site.
Disclosure of Invention
In view of this, embodiments of the present invention provide a projection system and a control method thereof, mainly aiming to solve the problems that the existing projector has a simple structure and a single function, can only play pre-made slides, cannot hold viewers' attention with its monotonous playing mode, and cannot change the played content in response to unplanned situations at the lecture site.
To achieve the above purpose, the invention mainly provides the following technical solutions:
An embodiment of the present invention provides a projection system, which includes:
the projector is internally provided with a processor and a projection display element, and the processor controls the projection display element to perform projection display;
the upper computer is electrically connected with the projector and is used for providing a calling target library;
the voice acquisition device is in signal connection with the processor and is used for acquiring voice instruction information of a user;
the processor receives the voice instruction information acquired by the voice acquisition device, determines a target in the calling target library according to the voice instruction information and calls the target, and controls the projection display element to project and display a target picture of the called target.
Optionally, in the foregoing projection system, the processor comprises a speech recognition and analysis circuit;
the voice recognition and analysis circuit is electrically connected with the voice acquisition device and is used for converting the audio signal of the voice instruction information into a digital signal and comparing and translating the digital signal with a digital signal library to obtain an instruction and a target in the voice instruction information;
the digital signal library is preset in the processor.
Optionally, in the foregoing projection system, the processor further comprises a counting circuit;
the counting circuit is electrically connected with the voice recognition and analysis circuit and is used for counting the number of times a target appears in the voice instruction information;
and when the number of occurrences of a certain target reaches a preset number, the processor performs a prominent projection display of that target.
Optionally, the projection system further includes at least two cameras;
the at least two cameras are respectively arranged at different positions and are in signal connection with the processor for acquiring gesture action pictures of a user;
the processor receives the gesture motion pictures collected by the at least two cameras, converts the gesture motion pictures into gesture instruction information, and controls the conversion display of a target picture according to the gesture instruction information.
Optionally, in the foregoing projection system, the processor comprises an image processing circuit;
the image processing circuit is electrically connected with the at least two cameras and is used for converting the gesture action picture into gesture instruction information and comparing and matching the gesture instruction information with a gesture instruction library to obtain an instruction and a target in the gesture instruction information;
wherein the gesture instruction library is preset in the processor.
Optionally, in the projection system, the image processing circuit includes a preprocessing circuit, a positioning circuit, and a comparing circuit;
the preprocessing circuit is electrically connected with the at least two cameras and is used for receiving at least two gesture action pictures and performing matching cost, cost aggregation, parallax optimization and parallax refinement on the at least two gesture action pictures to obtain a final parallax map;
the positioning circuit is electrically connected with the preprocessing circuit and used for receiving the disparity map and generating position coordinates of the disparity map;
the comparison circuit is electrically connected with the positioning circuit and used for comparing the coordinate difference of the parallax images, judging gesture tracks through the coordinate difference to generate gesture instruction information, and comparing and matching the gesture instruction information with a gesture instruction library to obtain an instruction and a target in the gesture instruction information.
Optionally, the projection system further includes a remote controller;
the remote controller is in signal connection with the processor; when the remote controller is in a first working state, the processor controls the at least two cameras to stop shooting, receives position information from the remote controller, converts the position information into gesture instruction information, and controls the conversion display of a target picture according to the gesture instruction information.
Optionally, in the projection system, the remote controller is electrically connected to the positioning circuit, and the positioning circuit is configured to receive a position signal of the remote controller and generate a corresponding position coordinate;
the comparison circuit is electrically connected with the positioning circuit and is used for comparing coordinate differences among a plurality of remote controller positions, judging the trajectory of the remote controller from the coordinate differences to generate gesture instruction information, and comparing and matching the gesture instruction information with a gesture instruction library to obtain the instruction and target in the gesture instruction information.
Optionally, the projection system further includes an external speaker device;
the speaker device is in signal connection with the processor and is used for playing an audio target.
In another aspect, the present invention provides a control method based on the projection system, which includes the following steps: collecting voice instruction information of a user;
determining a target according to the voice instruction information and calling;
and projecting and displaying a target picture of the called target.
Optionally, the method for determining a target according to the voice instruction information and calling includes:
and converting the audio signal of the voice instruction information into a digital signal, comparing the digital signal with a digital signal library, and translating to obtain an instruction and a target in the voice instruction information.
Optionally, the method for displaying the target screen of the called target in a projection manner includes:
and calling the target in a calling target library according to the instruction and the target to perform projection display corresponding to the instruction.
Optionally, in the method for displaying a target screen of the called target by projection, the method further includes:
and when the number of times a certain target appears in the voice instruction information reaches a preset number, performing a prominent projection display of that target.
Optionally, in the method for displaying a target screen of the called target by projection, the method further includes:
acquiring a gesture action picture, determining a gesture track to generate gesture instruction information, comparing and matching the gesture instruction information with a gesture instruction library to obtain an instruction and a target in the gesture instruction information, and controlling the conversion display of a target picture according to the instruction; alternatively,
collecting remote controller position information, determining a remote controller track to generate gesture instruction information, comparing and matching the gesture instruction information with a gesture instruction library to obtain an instruction and a target in the gesture instruction information, and controlling the conversion display of a target picture according to the instruction.
The projection system and the control method thereof provided by the embodiments of the invention have at least the following beneficial effects: to solve the problems that the existing projector has a simple structure and a single function, can only play pre-made slides, cannot hold viewers' attention with its monotonous playing mode, and cannot change the played content in response to unplanned situations at the lecture site, the projection system provided by the invention uses a voice acquisition device to acquire the user's voice instruction information; the processor of the projector determines and calls a target in the calling target library of the upper computer according to the voice instruction information, and controls the projection display element to project and display a target picture of the called target. The system can not only play slide content prepared in advance, but can also temporarily call a target for real-time projection display according to the actual situation on site and the user's voice instruction information.
Drawings
Fig. 1 is a schematic structural diagram of a projection system according to an embodiment of the present invention;
FIG. 2 is a detailed structural diagram of a projection system according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating a control method of a projection system according to an embodiment of the invention;
in the figure: the system comprises a projector 1, a processor 11, a voice recognition and analysis circuit 111, a counting circuit 112, a preprocessing circuit 113, the positioning circuit 114, the comparison circuit 115, a timing circuit 116, a projection display element 12, an upper computer 2, a voice acquisition device 3, a sound box device 4, a camera 5 and a remote controller 6.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve the intended objects and their effects, the following detailed description of the embodiments, structures, features and effects of the projection system and its control method according to the present invention is given with reference to the accompanying drawings and preferred embodiments. In the following description, different instances of "one embodiment" or "an embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In order to solve the technical problems, the embodiment of the invention has the following general idea:
example 1
Referring to fig. 1, the present invention provides a projection system, which includes a projector 1, an upper computer 2 and a voice acquisition device 3; a processor 11 and a projection display element 12 are arranged in the projector 1, and the processor 11 controls the projection display element 12 to perform projection display; the upper computer 2 is electrically connected with the projector 1 and is used for providing a calling target library; the voice acquisition device 3 is in signal connection with the processor 11 and is used for acquiring voice instruction information of a user;
the processor 11 receives the voice instruction information acquired by the voice acquisition device 3, determines a target in the call target library according to the voice instruction information, and calls the target, and the processor 11 controls the projection display element 12 to project and display a target picture of the called target.
Specifically, to solve the problems that the existing projector has a simple structure and a single function, can only play pre-made slides, cannot hold viewers' attention with its monotonous playing mode, and cannot change the played content in response to unplanned situations at the lecture site, the invention provides a projection system comprising a projector 1, an upper computer 2 and a voice acquisition device 3; the voice acquisition device 3 acquires the user's voice instruction information and transmits it to the processor 11 of the projector 1 for processing and analysis; the processor 11 determines and calls a target in the calling target library of the upper computer 2 according to the voice instruction information, and controls the projection display element 12 to project and display a target picture of the called target.
The projector 1 is structurally similar to prior-art projectors and is likewise used for projection display; the difference is that the projector in the embodiment of the present invention is internally provided with the processor 11, which integrates data processing, analysis, comparison and instruction control functions; these functions of the processor 11 may be implemented by program editing and are not specifically limited herein. The projection display element 12 is the same as the projection display element in a prior-art projector and is not described again here. Of course, the projector 1 is also provided with an internal memory, flash storage, a Bluetooth receiver, a Wi-Fi communication module, an infrared receiver and the like, which are conventional in the prior art and are not described in further detail. The upper computer 2 has the same structure as a prior-art computer and provides data search, data storage, data transmission and similar functions; the calling target library is a material library preset in the upper computer 2 (including text, audio, PPT templates, pictures, formulas, instructions and the like), which can be downloaded over the network through the upper computer 2 or stored via a USB device (a USB flash drive, a hard disk and the like); the manner of establishing the calling target library is well known to those skilled in the art and is not described in detail here. The voice acquisition device 3 is a device with a sound-receiving function; in this embodiment it is a microphone with a filtering function, so as to ensure clear acquisition of the user's voice instruction information.
In light of the foregoing, the projection system and the control method thereof according to the embodiments of the invention have at least the following advantages: in the projection system provided by the invention, the user's voice instruction information is acquired through the voice acquisition device 3, and the processor 11 of the projector 1 determines and calls a target in the calling target library of the upper computer 2 according to the voice instruction information and controls the projection display element 12 to project and display a target picture of the called target; the system can not only play slide content prepared in advance, but can also temporarily call a target for real-time projection display according to the actual situation on site and the user's voice instruction information.
The term "and/or" herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A and B together, A alone, or B alone, any of which satisfies the expression. References to inside and outside are with respect to the inside and outside of the actual installation.
Further, as shown in fig. 2 and fig. 3, an embodiment of the present invention provides a projection system, in which in a specific implementation, the processor 11 includes a speech recognition and analysis circuit 111;
the voice recognition and analysis circuit 111 is electrically connected with the voice acquisition device 3, and is used for converting the audio signal of the voice instruction information into a digital signal and comparing and translating the digital signal with a digital signal library to obtain an instruction and a target in the voice instruction information; the digital signal library is preset in the processor 11.
Specifically, in order to display a target picture according to the user's voice instruction information, the voice recognition and analysis circuit 111 is disposed in the processor 11 in this embodiment; the circuit provides voice recognition, analysis, comparison and translation functions, which can be implemented by program editing and are not described in detail here. Referring to fig. 3, after the user's voice instruction information is collected by the voice acquisition device 3 and filtered, it is transmitted to the voice recognition and analysis circuit 111. The circuit first converts the voice instruction information from an audio signal into a digital signal and cuts the digital signal into a plurality of segments, i.e. a plurality of square-wave sections; it then compares these square-wave sections with the square waves in a digital signal library (the digital signal library is preset in the processor 11 and covers vocabulary such as open file, new page, page up, page down, edit text, enlarge font, bold font, underline font, backspace, picture, enlarge, shrink, move up, move down, move left, move right, shape, color, delete, formula and the like) to obtain the vocabulary corresponding to each square-wave section. Finally, the matched vocabulary is spliced together to obtain the voice recognition result, namely the instruction (a verb such as open, new, page turn, edit, backspace, delete and the like) and the target (a noun object such as file, page, text, font, picture, shape, color, formula and the like) in the voice instruction information.
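For illustration only, the compare-and-translate step described above can be sketched as template matching over fixed-length signal segments. The templates, the segment length and the vocabulary below are assumptions for the sketch, not the patent's actual digital signal library or firmware.

```python
import numpy as np

# Illustrative "digital signal library": each vocabulary word is keyed by a
# reference square-wave-like segment. Real templates would be preset in the
# processor; these random +/-1 sequences are synthetic stand-ins.
rng = np.random.default_rng(0)
WORDS = ["new", "edit", "delete", "insert", "text", "picture", "formula", "font"]
TEMPLATES = {word: rng.choice([-1.0, 1.0], size=64) for word in WORDS}

VERBS = {"new", "edit", "delete", "insert"}      # instruction words (verbs)
NOUNS = {"text", "picture", "formula", "font"}   # target words (noun objects)


def best_match(segment):
    """Return the library word whose template correlates best with the segment."""
    scores = {w: float(np.dot(segment, t)) / len(t) for w, t in TEMPLATES.items()}
    return max(scores, key=scores.get)


def parse_voice_command(digital_signal, seg_len=64):
    """Cut the digitised utterance into fixed-length segments, translate each
    segment against the library, and split the result into instruction/target."""
    words = [best_match(digital_signal[s:s + seg_len])
             for s in range(0, len(digital_signal) - seg_len + 1, seg_len)]
    instruction = next((w for w in words if w in VERBS), None)
    target = next((w for w in words if w in NOUNS), None)
    return instruction, target, words


# An utterance built from two known templates translates back to its words.
utterance = np.concatenate([TEMPLATES["insert"], TEMPLATES["picture"]])
print(parse_voice_command(utterance))   # ('insert', 'picture', ['insert', 'picture'])
```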
The process of voice-controlled projection display in this embodiment is as follows:
The user says "text 1". The voice acquisition device 3 filters the audio signal and transmits the filtered "text 1" audio signal to the processor 11; the voice recognition and analysis circuit 111 converts the audio signal into a digital signal and compares and translates it against the digital signal library in the processor 11, obtaining the instruction "new" and the target "text 1" (an input block for text 1 is created if no text 1 exists on the current page; if text 1 already exists on the page, text 1 is selected instead). The processor 11 then issues the text-input-block instruction and controls the projection display element 12 to pop up a text block on the projection screen (the default text block size is 50 mm; this default can be adjusted according to actual needs and can be implemented by program editing). The user continues with "text 1 edit"; the voice acquisition device 3 and the voice recognition and analysis circuit 111 cooperate to compare and translate the instruction "edit" and the target "text 1", and the processor 11 places the text cursor at the center of the aforementioned text 1 (this is a default item, can be adjusted according to actual needs, and can be implemented by program editing). After the user then dictates a piece of content, the voice acquisition device 3 and the voice recognition and analysis circuit 111 continue to compare and translate it, and the processor 11 controls the projection display element 12 to project and display the dictated text on the projection screen.
If the font needs to be adjusted, the user says "text 1"; the previously created text 1 is then locked and all the characters in text 1 are selected. The user then says "adjust font"; the instruction obtained through the comparison and translation of the voice recognition and analysis circuit 111 is "adjust" and the target is "font", the processor 11 displays a number of fonts (taken from the calling target library in the upper computer) arranged in the order 1/2/3/4/5 …, and the user says the corresponding number to select the corresponding font.
Alternatively, after selecting text 1 the user says "enlarge font / shrink font"; the instruction obtained by the comparison and translation of the voice recognition and analysis circuit 111 is "enlarge/shrink" and the target is "font", the processor 11 displays a number of fonts (taken from the calling target library in the upper computer) arranged in the order 1/2/3/4/5 …, and the user says the corresponding number to select the corresponding font and then continues adding text blocks to the current page.
Similarly, the adjustment of the fonts, the typefaces, the colors, and the like in the text refers to the description of the specific implementation process, and is not described in detail herein.
If another text input block needs to be created, the user says "new text 2", and a second text block is added to the same page (no page-change instruction has been given, so the processor 11 defaults to creating text 2 on the same page; this default item can be adjusted according to actual needs and can be implemented by program editing). Text 2 is then edited, or its font, typeface, color and the like are adjusted, following the process described above, which is not repeated here.
If a text block needs to be deleted, the user first locks the text block to be deleted by saying, for example, "text 1", and then says "delete"; for other commands, refer to the detailed description above, which is not repeated here. A sketch of this create/edit/delete dispatch is given after this paragraph.
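The sketch below is a toy model of the dispatch logic in the worked example above, assuming a 200 mm x 120 mm editable page with blocks placed at the page centre by default; the class and method names are illustrative, not part of the patent.

```python
# Hypothetical page state mirroring the defaults described above.
PAGE_W, PAGE_H = 200, 120   # editable page size in mm

class Page:
    def __init__(self):
        self.blocks = {}     # block name -> {"center": (x, y), "content": str}
        self.locked = None   # currently selected block, e.g. "text 1"

    def handle(self, instruction, target):
        if instruction == "new":
            # Create the block at the page centre if absent, otherwise just lock it.
            self.blocks.setdefault(target, {"center": (PAGE_W / 2, PAGE_H / 2),
                                            "content": ""})
            self.locked = target
        elif instruction == "edit" and target in self.blocks:
            self.locked = target            # subsequent dictation appends here
        elif instruction == "delete" and target in self.blocks:
            del self.blocks[target]
            self.locked = None

page = Page()
page.handle("new", "text 1")      # "text 1" creates and locks the first block
page.handle("edit", "text 1")     # cursor placed in text 1
page.handle("new", "text 2")      # second block on the same page (default behaviour)
page.handle("delete", "text 1")   # lock then delete
print(sorted(page.blocks))        # ['text 2']
```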
As another example, if a picture needs to be inserted, the user says "insert picture". The voice acquisition device 3 filters the collected "insert picture" audio signal and transmits it to the processor 11; the voice recognition and analysis circuit 111 converts the "insert picture" audio signal into a digital signal and compares and translates it against the digital signal library in the processor 11 to obtain the instruction "insert" and the target "picture". The processor 11 then jumps to the calling target library of the upper computer 2 and calls up the pictures in the target library as thumbnails (a default item that can be adjusted according to actual needs and can be implemented by program editing), arranged in the order picture 1/picture 2/picture 3/…. After the processor 11 analyses the user's selection voice and determines the picture chosen by the user, it inserts the corresponding picture into the page. The default position is the middle of the page (a default item that can be adjusted according to actual needs and can be implemented by program editing; if the editable page is 200 mm x 120 mm, then (100 mm, 60 mm) is the midpoint of the page), and the default picture size is 60 mm x 40 mm (also a default item that can be adjusted and implemented by program editing).
If the size of the picture needs to be adjusted, the user says "picture 1" so that the inserted picture 1 is locked, and then says "enlarge/shrink" (by default the processor 11 recognises scaling factors of 0.1, 0.2, 0.3, 0.4, 0.5, 0.8, 1, 2 and 3 and enlarges or shrinks the picture accordingly); for example, if the inserted picture is enlarged by a factor of 0.5, the picture shown on the screen becomes 90 mm x 60 mm, and its center remains at the middle position (100 mm, 60 mm).
If the position of the picture needs to be adjusted, the user says "picture 1" to lock the inserted picture 1 and then says "move up/down" (by default the processor 11 recognises quantized distances, the default being 30 mm for an upward move), after which the center of picture 1 moves to (100 mm, 90 mm).
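The default geometry in this picture example can be checked with a short sketch. Reading "enlarge by a factor of 0.5" as growth by 50% (so 60 mm x 40 mm becomes 90 mm x 60 mm) is an assumption that is consistent with the numbers above; the function names are illustrative.

```python
# Worked check of the picture defaults, assuming a 200 mm x 120 mm editable page.

def insert_picture(page_w=200, page_h=120):
    # Default insertion: 60 mm x 40 mm, centred at the page midpoint.
    return {"center": (page_w / 2, page_h / 2), "size": (60, 40)}

def enlarge(picture, factor):
    # Interpreted as growing each side by the given factor: 60x40 -> 90x60 at 0.5.
    w, h = picture["size"]
    picture["size"] = (w * (1 + factor), h * (1 + factor))
    return picture

def move_up(picture, distance=30):
    # Default upward move of 30 mm: centre (100, 60) -> (100, 90).
    x, y = picture["center"]
    picture["center"] = (x, y + distance)
    return picture

pic = insert_picture()     # {'center': (100.0, 60.0), 'size': (60, 40)}
pic = enlarge(pic, 0.5)    # size becomes (90.0, 60.0)
pic = move_up(pic)         # centre becomes (100.0, 90.0)
print(pic)
```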
Similarly, the adjustment of the shape and the color of the picture is not described herein, and the detailed process can be obtained by referring to the above description.
If the picture needs to be deleted, the picture to be deleted needs to be locked first, that is, "picture 1" is said, and then "delete" is said, and other commands refer to the above detailed description process, which is not described herein in any greater detail.
For another example: if a formula needs to be inserted, the user needs to speak an "insertion formula", the analysis process is as before, the processor 11 jumps to the calling target library of the upper computer 2 to display a plurality of formulas (which may be in a picture form or in an editable form), the user speaks the name of the formula "shrinkage formula" according to the need, and the processor 11 invokes a corresponding "shrinkage formula" from the plurality of formulas after analysis; the above description of the other adjustment processes is not repeated herein.
Further, referring to fig. 2 and fig. 3, in a specific implementation of a projection system according to an embodiment of the present invention, the processor 11 further includes a counting circuit 112; the counting circuit 112 is electrically connected with the voice recognition and analysis circuit 111 and is used for counting the number of times a target appears in the voice instruction information; when the number of occurrences of a certain target reaches a preset number, the processor 11 performs a prominent projection display of that target.
Specifically, in order to match the presenter's emphasis on the display content and raise the viewers' interest, the counting circuit 112 is provided in the processor 11 in this embodiment; the counting circuit 112 has a counting function, which can be implemented by program editing and is not limited here. The counting circuit 112 is electrically connected with the voice recognition and analysis circuit 111; after the voice recognition and analysis circuit 111 recognises a target in the voice instruction information, it sends a signal to the counting circuit 112, which counts the occurrences of the corresponding target. When the number of occurrences of a certain target reaches the preset number (5 times by default, adjustable according to actual needs and implementable by program editing), the processor 11 controls the projection display element 12 to display that target prominently. For example, when "picture 1" appears five times in the user's voice instruction information, the counting circuit 112 feeds a highlight signal back to the processor 11, and the processor 11 controls the projection display element 12 to highlight "picture 1" (for example by flashing, circling or underlining it; the highlight style is a default item, can be adjusted according to actual needs, and can be implemented by program editing).
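A minimal sketch of this counting-circuit behaviour follows. The threshold of 5 matches the default above; the class name and the highlight callback are hypothetical.

```python
from collections import Counter

PRESET_COUNT = 5   # default number of mentions that triggers the highlight

class TargetCounter:
    """Count how often each target is named and highlight it at the threshold."""
    def __init__(self, highlight):
        self.counts = Counter()
        self.highlight = highlight   # e.g. flash, circle or underline the target

    def on_target_recognised(self, target):
        self.counts[target] += 1
        if self.counts[target] == PRESET_COUNT:
            self.highlight(target)

counter = TargetCounter(highlight=lambda t: print(f"highlight {t}"))
for _ in range(5):
    counter.on_target_recognised("picture 1")   # the 5th mention triggers the highlight
```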
Further, referring to fig. 1, in a specific implementation, a projection system according to an embodiment of the present invention further includes an external speaker device 4; the speaker device 4 is in signal connection with the processor 11 and is used for playing an audio target.
Specifically, in order to play audio targets, the speaker device 4 is provided in this embodiment; when a certain piece of audio, music or the like appears in the user's voice instruction information, the processor 11 controls the speaker device 4 to play the corresponding audio or music. For the detailed process, refer to the description above, which is not repeated here.
Example 2
Based on the projection system of embodiment 1, a projection system according to an embodiment of the present invention further includes at least two cameras 5; the at least two cameras 5 are respectively arranged at different positions, and the at least two cameras 5 are in signal connection with the processor 11 and are used for acquiring gesture action pictures of a user;
the processor 11 receives the gesture motion pictures acquired by the at least two cameras 5, converts the gesture motion pictures into gesture instruction information, and controls the conversion display of the target picture according to the gesture instruction information.
Specifically, in order to allow gesture control to work in synchronization with voice control, at least two cameras 5 are provided in this embodiment; the cameras 5 have the same structure as prior-art cameras and provide an image acquisition function. The at least two cameras 5 are usually arranged at the two upper corners of the projection screen (the shooting angle is adjustable and is mainly aimed at the position of the user's hands); if there is no projection screen, the positions of the at least two cameras 5 can be adapted to the actual situation. The cameras 5 take pictures continuously, for example once every 5 ms, which can be adjusted according to actual needs.
Referring to fig. 2 and fig. 3, the preprocessing circuit 113, the positioning circuit 114 and the comparison circuit 115 are disposed in the processor 11. The cameras 5 transmit the photographed gesture action pictures to the preprocessing circuit 113, and the preprocessing circuit 113 performs matching cost computation, cost aggregation, parallax optimization and parallax refinement on at least two gesture action pictures (the two pictures photographed by the two cameras 5 at the same moment) to obtain a final parallax map; this preprocessing can be implemented by program editing. The matching cost step (AD-Census Cost Initialization) converts the images to grayscale and improves the contrast so that each pixel and parallax level can be processed; cost aggregation (Cross-based Cost Aggregation) reduces the influence of matching blur and image noise in the grayscale images (i.e. noise reduction), after which an edge segmentation method based on the Canny operator is used to segment the background from the image so that the hand and the background in the grayscale image are better distinguished; parallax optimization (Scanline Optimization) selects the optimal parallax range and generates a parallax image, i.e. cuts the surrounding background away so that only the hand remains; parallax refinement (Multi-step Disparity Refinement) uses the left-right consistency criterion to detect outliers, occluded regions and parallax-discontinuous regions, i.e. the two grayscale maps are superimposed, the overlapping part is retained, and the surrounding non-overlapping pixels, discontinuous regions and occluded regions are removed to obtain a final parallax map in which only the overlapping part remains, from which it can be recognised whether the user has extended one hand or two hands. It should be noted that the processing and recognition of gesture action pictures is disclosed in the prior art (see, for example, the recognition principle of the Kinect motion-sensing game console) and is not described in detail here.
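Since the AD-Census implementation itself is not given here, the sketch below substitutes OpenCV's semi-global matcher plus a Canny-based mask as stand-ins for the four-step pipeline; it is not the patent's implementation, and the file names are placeholders for synchronized frames from the two cameras.

```python
import cv2
import numpy as np

# Placeholder frames from the two cameras, loaded directly as grayscale.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Canny edges loosely mirror the edge-based hand/background segmentation step.
edges = cv2.Canny(left, 50, 150)

# StereoSGBM stands in for AD-Census cost + aggregation + scanline optimisation.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,   # must be a multiple of 16
    blockSize=7,
)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to float

# Crude stand-in for the refinement step: keep only valid, edge-adjacent pixels,
# roughly analogous to the overlap/occlusion filtering described above.
mask = (disparity > 0) & (cv2.dilate(edges, np.ones((5, 5), np.uint8)) > 0)
hand_disparity = np.where(mask, disparity, 0)
```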
The positioning circuit has the function of receiving the disparity map and generating the position coordinates of the disparity map, and can be realized by program editing; specifically, the positioning circuit receives the disparity map, simulates a checkerboard with the center of the projection display element 12 as a (0,0) point, and projects and displays the hand in the disparity map in the form of a cursor or an arrow at a position (any position in the actual projection, which corresponds to the coordinate position in the checkerboard in the processor 11); by analogy, during the movement of the user's hand, the camera 5, the preprocessing circuit 113 and the positioning circuit 114 cooperate to obtain a plurality of coordinate information.
The comparison circuit 115 has comparison functions, which can be implemented by program editing. The comparison circuit 115 receives the coordinate information, takes the difference between the first coordinate and the last coordinate to obtain the user's gesture trajectory and generate a gesture instruction, and compares the gesture instruction with a gesture instruction library (the gesture instruction library is preset in the processor 11), for example:
one hand slides to the left to realize automatic turning to the next page;
one hand slides to the right to realize automatic turning to the previous page;
the two hands are separated from the middle to the two sides, so that the amplification from the center position is realized;
after one hand stays over a target position on the page for 3 s, the other hand is added and the two hands separate from the middle towards the two sides, enlarging the target;
after one hand stays over a target position on the page for 3 s, the other hand is added and the two hands move from the two sides towards the middle, shrinking the target;
after one hand stays over any blank position on the page for 3 s, the page moves along with the position of the hand;
a one-handed double click performs selection or confirmation) to obtain the instruction and the target. For example: the coordinates of the first parallax map are (120, 60) and the coordinates of the last parallax map are (160, 60); the coordinate difference obtained by the comparison circuit 115 is a forward motion along the X axis, i.e. the user slides one hand to the right, and the instruction obtained by comparison with the gesture instruction library is "automatically turn to the previous page".
As another example, to complete a selection or confirmation through a gesture instruction, the user's hand needs to double-click at the same position; the gesture action pictures photographed by the cameras 5 are processed to obtain identical or only slightly different coordinates (the allowed difference can be set to a preset range, which can be configured in a conventional manner by a person skilled in the art and implemented by programming), which indicates that the user is performing a double-click confirmation. Other gesture instructions follow the same process; any gesture trajectory that can be determined from the coordinate positions can serve as a gesture instruction in this embodiment. The gesture instructions above are only examples and do not limit the scope of this embodiment; all methods of determining a gesture trajectory from coordinate differences fall within the protection scope of the present invention and are not enumerated here. With reference to the description in embodiment 1, gesture control can be used in synchronization with voice control; for example, after text 1 is newly created or picture 1 is inserted, one hand stays above text 1 or picture 1 for 3 s and the other hand is added with the two hands separating from the middle towards the two sides, so that all the characters in text 1 or picture 1 are enlarged accordingly.
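A minimal sketch of this first-versus-last coordinate comparison follows, using the example above ((120, 60) to (160, 60) as a rightward swipe, i.e. the previous page); the thresholds and gesture names are assumptions.

```python
# Checkerboard coordinates with the projection centre at (0, 0), as described above.
SWIPE_THRESHOLD = 20          # minimum displacement treated as a swipe
DOUBLE_CLICK_TOLERANCE = 5    # "identical or only slightly different coordinates"

GESTURE_LIBRARY = {
    "swipe_right": "automatically turn to the previous page",
    "swipe_left": "automatically turn to the next page",
    "double_click": "select / confirm",
}

def classify(first, last):
    """Classify the trajectory from the first and last hand coordinates."""
    dx, dy = last[0] - first[0], last[1] - first[1]
    if abs(dx) <= DOUBLE_CLICK_TOLERANCE and abs(dy) <= DOUBLE_CLICK_TOLERANCE:
        return "double_click"
    if dx >= SWIPE_THRESHOLD and abs(dy) < SWIPE_THRESHOLD:
        return "swipe_right"
    if dx <= -SWIPE_THRESHOLD and abs(dy) < SWIPE_THRESHOLD:
        return "swipe_left"
    return None

gesture = classify((120, 60), (160, 60))
print(gesture, "->", GESTURE_LIBRARY.get(gesture))   # swipe_right -> previous page
```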
Further, referring to fig. 2, in a projection system according to an embodiment of the present invention, the processor 11 further includes a timing circuit 116; the timing circuit 116 is electrically connected with the image processing circuit and is used for measuring how long a piece of gesture instruction information is held; when the holding time of a certain piece of gesture instruction information reaches a preset duration, the processor 11 performs a prominent projection display of the target.
Specifically, in order to support special gesture instruction information in gesture control, the timing circuit 116 is arranged in the processor 11 in this embodiment; after the comparison circuit 115 obtains gesture instruction information, the timing circuit 116 times how long that gesture is held, and if the preset duration is reached (3 s by default, adjustable according to actual needs and implementable by program editing), the target is displayed prominently. For example, after one hand has stayed over a target position on the page for 3 s, the target is considered selected for highlighting; the highlighting may be the flashing, circling or underlining mentioned above, or a further effect may be applied according to further gesture instruction information, for example: after one hand stays over the target position on the page for 3 s, the other hand is added and the two hands separate from the middle towards the two sides, enlarging the target.
Example 3
On the basis of embodiment 1 and embodiment 2, referring to fig. 2 and fig. 3, a projection system according to an embodiment of the present invention further includes a remote controller 6; the remote controller 6 is in signal connection with the processor 11, and when the remote controller 6 is in a first working state, the processor 11 controls the at least two cameras 5 to stop shooting, receives position information from the remote controller 6, converts the position information into gesture instruction information, and controls the conversion display of a target picture according to the gesture instruction information.
Specifically, in order to provide diversified control of the projection system, the external remote controller 6 is provided in this embodiment, with a laser gyroscope built into it, and the remote controller 6 is in signal connection with the processor 11. When the remote controller 6 is switched on or picked up, it sends an enabling signal to the processor 11, and the processor 11 controls the cameras 5 to stop image acquisition. The laser gyroscope in the remote controller 6 then judges the direction of movement from the direction in which the user tilts the remote controller 6, determines its own displacement or motion trajectory, converts it into a pulse signal and transmits it to the positioning circuit 114. The positioning circuit 114 determines the position of the laser gyroscope (that is, of the remote controller 6 in the user's hand) and assigns it virtual coordinates; in the initial state these virtual coordinates represent only the starting point of the gyroscope and are set to the center position (0, 0) of the projection display element 12. During the movement of the remote controller 6, the laser gyroscope keeps sending position information to the positioning circuit 114 so that the trajectory shown in the projection display can be determined (alternatively, the comparison circuit 115 can compare the coordinate information of the gyroscope at its initial and final positions to determine the motion trajectory; for the detailed process refer to the description of embodiment 2, which is not repeated here). This motion trajectory is equivalent to the user's gesture trajectory: gesture instruction information is generated from it and compared with the gesture instructions in the gesture instruction library to determine the final gesture instruction; for the detailed process refer to the content above, which is not repeated here. The way the gyroscope sends position information to the processor 11 is similar or even identical to the way a mouse cooperates with a computer in the prior art, i.e. displacement is detected by laser, converted into an electrical pulse signal, and processed and converted by a program to control the movement of a cursor arrow on the screen; this is well known to those skilled in the art and is therefore not described further here.
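A hedged sketch of this remote-controller path: displacement pulses reported by the gyroscope are integrated from the (0, 0) starting point at the centre of the projection display, and the net displacement of the resulting trail is mapped to a gesture instruction. The pulse format, thresholds and function names are assumptions for illustration.

```python
SWIPE_THRESHOLD = 20   # minimum net displacement treated as a swipe

def track_remote(pulses, start=(0.0, 0.0)):
    """pulses: iterable of (dx, dy) displacement steps reported by the gyroscope."""
    x, y = start
    trail = [start]
    for dx, dy in pulses:
        x, y = x + dx, y + dy
        trail.append((x, y))
    return trail

def trail_to_instruction(trail):
    (x0, y0), (x1, y1) = trail[0], trail[-1]
    dx, dy = x1 - x0, y1 - y0
    if dx >= SWIPE_THRESHOLD and abs(dy) < SWIPE_THRESHOLD:
        return "automatically turn to the previous page"   # rightward motion, as in embodiment 2
    if dx <= -SWIPE_THRESHOLD and abs(dy) < SWIPE_THRESHOLD:
        return "automatically turn to the next page"
    return None

trail = track_remote([(5, 0), (10, 0), (15, 0), (10, 0)])
print(trail_to_instruction(trail))   # previous page
```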
Example 4
The embodiment of the invention provides a control method of a projection system, which comprises the following steps:
collecting voice instruction information of a user;
specifically, the sound collection device 3 collects the sound instruction information of the user and transmits the sound instruction information to the processor 11 of the projector 1.
Determining a target according to the voice instruction information and calling;
specifically, the processor 11 of the projector 1 performs recognition analysis and processing on voice command information of the user to obtain a command and a target, and calls the target in a call target library of the upper computer 2.
Projecting and displaying a target picture of the called target;
specifically, the processor 11 of the projector 1 performs projection display of a target picture on the target according to the projection display element 12 in the command controller.
Specifically, referring to fig. 3, the following are the detailed steps of the method of this embodiment:
101. starting the whole system equipment;
102. judging whether voice instruction information from the user exists; if so, executing step 103; if not, executing step 108;
103. the voice acquisition device 3 acquires voice instruction information of a user and carries out filtering processing;
104. the voice recognition and analysis circuit 111 converts the audio signal of the voice instruction information subjected to the filtering processing in step 103 into a digital signal;
105. the voice recognition and analysis circuit 111 compares the voice instruction information converted into the digital signal with a digital signal library in the processor 11 for translation;
the voice recognition and analysis circuit 111 first converts the voice instruction information from an audio signal into a digital signal and cuts the digital signal into a plurality of segments, i.e. a plurality of square-wave sections; it then compares these square-wave sections with the square waves in the digital signal library (the digital signal library is preset in the processor 11 and covers vocabulary such as open file, new page, page up, page down, edit text, enlarge font, bold font, underline font, backspace, picture, enlarge, shrink, move up, move down, move left, move right, shape, color, delete, formula and the like) to obtain the vocabulary corresponding to each square-wave section; finally, the matched vocabulary is spliced together to obtain the voice recognition result;
106. obtaining the command and the target in the user voice command information according to the recognition result in the step 105;
the instructions in the voice instruction information are verbs such as open, new, page turn, edit, backspace, delete and the like, and the targets are noun objects such as file, page, text, font, picture, shape, color, formula and the like;
107. according to the instruction and the target in the step 106, the processor 11 controls the projection display element 12 to perform projection display on the target picture; or controlling the conversion display of the target picture;
the detailed process refers to the illustration of embodiment 1, which is not described herein in detail;
108. judging whether a remote controller 6 is in signal connection with the processor 11, if not, executing the step 109, and if so, executing the step 115;
the remote controller 6 is internally provided with a laser gyroscope, the remote controller 6 is in signal connection with the processor 11, and when the remote controller 6 turns on a power switch or is taken up, the remote controller 6 sends a starting signal to the processor 11;
109. the processor 11 controls at least two cameras 5 to take the gesture action pictures of the user;
110. after the camera 5 takes a picture, sending the gesture action picture to the preprocessing circuit 113 to process the picture to obtain a disparity map; the detailed process refers to the illustration of embodiment 2, which is not described herein for further details;
111. the positioning circuit 114 generates corresponding coordinate information;
112. the comparison circuit 115 compares the coordinate difference of the first and last gesture motion pictures;
113. determining a gesture track according to the coordinate difference in the step 112, generating a gesture instruction, and comparing the gesture instruction with a gesture instruction library in the processor 11 to obtain an instruction and a target;
114. controlling the conversion display of the target picture according to the instruction and target of step 113;
the detailed process refers to the illustration of embodiment 2, which is not described herein for further details;
115. the processor 11 receives the starting signal of the remote controller 6 and controls at least two cameras 5 to stop photographing;
116. the remote control 6 sends the position information to the positioning circuit 114 and performs step 111.
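As a hedged sketch of the overall flow of steps 101 to 116, the loop below prefers voice input, then the remote controller, then camera-based gesture recognition; every handler name is a placeholder passed in as a callback, not an API of the actual projector firmware.

```python
def control_loop(get_voice, remote_active, get_remote_trail, get_gesture_trail,
                 run_voice_pipeline, run_gesture_pipeline, stop_cameras):
    """Illustrative main loop mirroring the flowchart of fig. 3."""
    while True:
        audio = get_voice()                   # step 102: is there a voice command?
        if audio is not None:
            run_voice_pipeline(audio)         # steps 103-107: recognise, translate, display
            continue
        if remote_active():                   # step 108: is the remote controller enabled?
            stop_cameras()                    # step 115: stop camera capture
            trail = get_remote_trail()        # step 116: gyroscope positions -> positioning circuit
        else:
            trail = get_gesture_trail()       # steps 109-111: cameras -> disparity -> coordinates
        run_gesture_pipeline(trail)           # steps 112-114: trajectory -> instruction -> display
```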
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, and any simple modification, equivalent change and modification made to the above embodiment according to the technical spirit of the present invention are still within the scope of the technical solution of the present invention.

Claims (14)

1. A projection system, characterized in that it comprises:
The projector is internally provided with a processor and a projection display element, and the processor controls the projection display element to perform projection display;
the upper computer is electrically connected with the projector and is used for providing a calling target library;
the voice acquisition device is in signal connection with the processor and is used for acquiring voice instruction information of a user;
the processor receives the voice instruction information acquired by the voice acquisition device, determines a target in the calling target library according to the voice instruction information and calls the target, and controls the projection display element to project and display a target picture of the called target.
2. The projection system of claim 1, wherein:
the processor includes speech recognition and analysis circuitry;
the voice recognition and analysis circuit is electrically connected with the voice acquisition device and is used for converting the audio signal of the voice instruction information into a digital signal and comparing and translating the digital signal with a digital signal library to obtain an instruction and a target in the voice instruction information;
the digital signal library is preset in the processor.
3. The projection system of claim 2, wherein:
the processor further includes a counting circuit;
the counting circuit is electrically connected with the voice recognition and analysis circuit and is used for counting the number of times a target appears in the voice instruction information;
and when the number of occurrences of a certain target reaches a preset number, the processor performs a prominent projection display of that target.
4. The projection system of claim 1, wherein:
the system also comprises at least two cameras;
the at least two cameras are respectively arranged at different positions and are in signal connection with the processor for acquiring gesture action pictures of a user;
the processor receives the gesture motion pictures collected by the at least two cameras, converts the gesture motion pictures into gesture instruction information, and controls the conversion display of a target picture according to the gesture instruction information.
5. The projection system of claim 4, wherein:
the processor comprises an image processing circuit;
the image processing circuit is electrically connected with the at least two cameras and is used for converting the gesture action picture into gesture instruction information and comparing and matching the gesture instruction information with a gesture instruction library to obtain an instruction and a target in the gesture instruction information;
wherein the gesture instruction library is preset in the processor.
6. The projection system of claim 5, wherein:
the image processing circuit comprises a preprocessing circuit, a positioning circuit and a comparison circuit;
the preprocessing circuit is electrically connected with the at least two cameras and is used for receiving at least two gesture action pictures and performing matching cost, cost aggregation, parallax optimization and parallax refinement on the at least two gesture action pictures to obtain a final parallax map;
the positioning circuit is electrically connected with the preprocessing circuit and used for receiving the disparity map and generating position coordinates of the disparity map;
the comparison circuit is electrically connected with the positioning circuit and used for comparing the coordinate difference of the parallax images, judging gesture tracks through the coordinate difference to generate gesture instruction information, and comparing and matching the gesture instruction information with a gesture instruction library to obtain instructions and targets in the gesture instruction information.
7. The projection system of claim 5, wherein:
the remote control also comprises a remote controller;
the remote controller is in signal connection with the processor, the remote controller is in a first working state, the processor controls at least two cameras to stop shooting, the processor receives position information of the remote controller, converts the position information into gesture instruction information, and controls the conversion display of a target picture according to the gesture instruction information.
8. The projection system of claim 7, wherein:
the remote controller is electrically connected with the positioning circuit, and the positioning circuit is used for receiving a position signal of the remote controller and generating a corresponding position coordinate;
the comparison circuit is electrically connected with the positioning circuit and is used for comparing coordinate differences among a plurality of remote controller positions, judging the trajectory of the remote controller from the coordinate differences to generate gesture instruction information, and comparing and matching the gesture instruction information with a gesture instruction library to obtain the instruction and target in the gesture instruction information.
9. The projection system of any of claims 1-8, wherein:
the loudspeaker box also comprises external loudspeaker box equipment;
the sound box equipment is in signal connection with the processor and used for playing an audio target.
10. A method of controlling a projection system according to any of claims 1-9, comprising the steps of:
collecting voice instruction information of a user;
determining a target according to the voice instruction information and calling the target; and
projecting and displaying a target picture of the called target.
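Read as software, the three steps of claim 10 form a simple capture-resolve-project loop. The sketch below only illustrates that flow; capture_voice, parse_command and project are hypothetical placeholders for the voice acquisition device, the recognition step and the projection display element.

```python
# Illustrative control loop for claim 10. capture_voice, parse_command and project
# are hypothetical callables standing in for the system components described above.
def run_projection_loop(target_library, capture_voice, parse_command, project):
    """Collect a voice instruction, resolve it to a target, and project that target."""
    while True:
        audio = capture_voice()                      # step 1: collect voice instruction
        command, target_name = parse_command(audio)  # recognize instruction and target
        if command == "stop":
            break
        target = target_library.get(target_name)     # step 2: determine and call target
        if target is not None:
            project(target, command)                 # step 3: project the target picture
```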
11. The method for controlling a projection system according to claim 10, wherein the method for determining a target and calling according to the voice command information comprises:
converting the audio signal of the voice instruction information into a digital signal, comparing the digital signal with a digital signal library, and translating the result to obtain the instruction and the target in the voice instruction information.
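Assuming the digital-signal comparison of claim 11 is realized as speech recognition followed by vocabulary matching, the translation step can be sketched as a lookup against a command vocabulary and the calling target library; the vocabularies below are invented examples.

```python
# Minimal sketch of the comparison/translation step: match a recognized transcript
# against a command vocabulary and a calling target library. Both dictionaries are
# invented examples; a real system would use the host computer's target library.
COMMANDS = {"show": "display", "play": "play", "close": "close"}
TARGET_LIBRARY = {"sales chart": "sales_chart.png", "demo video": "demo.mp4"}

def translate(transcript: str):
    """Return (command, target) parsed from the transcript, or (None, None)."""
    text = transcript.lower()
    command = next((c for word, c in COMMANDS.items() if word in text), None)
    target = next((t for name, t in TARGET_LIBRARY.items() if name in text), None)
    return command, target

# e.g. translate("please show the sales chart") -> ("display", "sales_chart.png")
```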
12. The method for controlling a projection system according to claim 11, wherein the method for displaying the target screen of the called target by projection includes:
calling the target from the calling target library according to the instruction and the target, and performing the projection display corresponding to the instruction.
13. The method for controlling a projection system according to claim 12, wherein the method for displaying the target screen of the called target by projection further comprises:
when the number of times a target appears in the voice instruction information reaches a preset count, carrying out prominent projection display of the target.
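The frequency condition of claim 13 amounts to counting how often each target is named across successive voice instructions and flagging a target once a preset count is reached. A sketch with a hypothetical threshold of three mentions:

```python
# Sketch of the frequency rule: highlight a target once it has been named a preset
# number of times in the voice instructions. The default of 3 is a hypothetical value.
from collections import Counter

class TargetHighlighter:
    def __init__(self, preset_count: int = 3):
        self.preset_count = preset_count
        self.mentions = Counter()

    def register_mention(self, target_name: str) -> bool:
        """Record one mention; return True when the target should be displayed prominently."""
        self.mentions[target_name] += 1
        return self.mentions[target_name] >= self.preset_count
```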
14. The method for controlling a projection system according to claim 12, wherein the method for displaying the target screen of the called target by projection further comprises:
acquiring a gesture motion picture, determining a gesture track to generate gesture instruction information, comparing and matching the gesture instruction information with a gesture instruction library to obtain the instruction and the target in the gesture instruction information, and controlling the switching display of the target picture according to the instruction; or,
collecting position information of a remote controller, determining the track of the remote controller to generate gesture instruction information, comparing and matching the gesture instruction information with the gesture instruction library to obtain the instruction and the target in the gesture instruction information, and controlling the switching display of the target picture according to the instruction.
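Claim 14 lets either input source, a camera-derived gesture track or a remote-controller track, drive the same display-control step. In software the two sources can share a single matching stage; in the sketch below, classify and gesture_library stand for the hypothetical trajectory classifier and gesture instruction library sketched after claim 8.

```python
# Sketch of claim 14's shared control path: either source (camera gesture track or
# remote-controller track) yields a coordinate sequence that is mapped through one
# gesture instruction library. classify and gesture_library are hypothetical helpers.
def handle_track(points, classify, gesture_library, apply_display_command):
    """Turn a coordinate track from either input source into a display-control command."""
    label = classify(points)                  # gesture instruction information
    if label is None:
        return
    command = gesture_library.get(label)      # compare/match against the gesture library
    if command is not None:
        apply_display_command(command)        # control switching display of the target picture
```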
CN202010012978.5A 2020-01-07 2020-01-07 Projection system and control method thereof Pending CN113160806A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010012978.5A CN113160806A (en) 2020-01-07 2020-01-07 Projection system and control method thereof

Publications (1)

Publication Number Publication Date
CN113160806A true CN113160806A (en) 2021-07-23

Family

ID=76881414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010012978.5A Pending CN113160806A (en) 2020-01-07 2020-01-07 Projection system and control method thereof

Country Status (1)

Country Link
CN (1) CN113160806A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113923425A (en) * 2021-09-13 2022-01-11 科大讯飞股份有限公司 Projector control method and related device, projector, projection kit and medium
CN115237306A (en) * 2022-07-22 2022-10-25 联想(北京)有限公司 Processing method, device, equipment and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102346647A (en) * 2011-09-19 2012-02-08 盛乐信息技术(上海)有限公司 Control method and system of projector
CN105472283A (en) * 2014-08-27 2016-04-06 中兴通讯股份有限公司 Projection control method, projection equipment and mobile terminal
US20160170491A1 (en) * 2014-12-12 2016-06-16 Alpine Electronics, Inc. Gesture assistive zoomable selector for screen
CN106356059A (en) * 2015-07-17 2017-01-25 中兴通讯股份有限公司 Voice control method, device and projector
CN105912106A (en) * 2016-04-05 2016-08-31 深圳市祈锦通信技术有限公司 Interaction system for intelligent projector and interaction method thereof
CN106043150A (en) * 2016-08-04 2016-10-26 歌尔科技有限公司 Vehicle-mounted projection system with speech recognition function
CN106502424A (en) * 2016-11-29 2017-03-15 上海小持智能科技有限公司 Based on the interactive augmented reality system of speech gestures and limb action
CN208422402U (en) * 2018-04-24 2019-01-22 长城汽车股份有限公司 A kind of voice interactive system
CN109754801A (en) * 2019-01-15 2019-05-14 东莞松山湖国际机器人研究院有限公司 A kind of voice interactive system and method based on gesture identification
CN209683408U (en) * 2019-03-12 2019-11-26 苏州车萝卜汽车电子科技有限公司 Side window Projection Display interactive system

Similar Documents

Publication Publication Date Title
US11030987B2 (en) Method for selecting background music and capturing video, device, terminal apparatus, and medium
KR101858531B1 (en) Display apparatus controled by a motion, and motion control method thereof
KR102294078B1 (en) Information processing device, information processing method, program, and imaging system
US20200404162A1 (en) Control apparatus, control method, and storage medium
US9437246B2 (en) Information processing device, information processing method and program
CN108900771B (en) Video processing method and device, terminal equipment and storage medium
US20140125661A1 (en) Image processing apparatus, image processing method, and program
US20110273551A1 (en) Method to control media with face detection and hot spot motion
US10990226B2 (en) Inputting information using a virtual canvas
CN103379274A (en) Camera apparatus and control method thereof
CN103209291A (en) Method, apparatus and device for controlling automatic image shooting
CN110809101B (en) Image zooming processing method and device, electronic equipment and storage medium
KR20150005270A (en) Method for previewing images captured by electronic device and the electronic device therefor
KR101491760B1 (en) Apparatus and method for providing virtual reality of stage
CN113160806A (en) Projection system and control method thereof
JP2013162333A (en) Image processing device, image processing method, program, and recording medium
CN113747050A (en) Shooting method and equipment
JP2013046151A (en) Projector, projection system, and information search display method
CN106168895A (en) Sound control method and intelligent terminal for intelligent terminal
JP2014146066A (en) Document data generation device, document data generation method, and program
CN116939364A (en) Method, user equipment and system for automatically generating full-focus image through mobile camera
JP2010008620A (en) Imaging apparatus
US20120092457A1 (en) Stereoscopic image display apparatus
EP2753094A2 (en) Method and apparatus for controlling contents in electronic device
US20120229678A1 (en) Image reproducing control apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination