CN114827563A - Projection apparatus and projection region correction method - Google Patents

Projection apparatus and projection region correction method

Info

Publication number
CN114827563A
Authority
CN
China
Prior art keywords
projection
target
projection area
vertex
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210389054.6A
Other languages
Chinese (zh)
Inventor
卢平光
郑晴晴
王昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Publication of CN114827563A
Priority to PCT/CN2022/132250 (WO2023088304A1)
Priority to CN202280063350.5A (CN118077192A)
Legal status: Pending (current)

Classifications

    • H04N 9/317: Convergence or focusing systems (H ELECTRICITY › H04 ELECTRIC COMMUNICATION TECHNIQUE › H04N PICTORIAL COMMUNICATION, e.g. TELEVISION › H04N 9/00 Details of colour television systems › H04N 9/12 Picture reproducers › H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] › H04N 9/3141 Constructional details thereof)
    • G03B 21/53: Means for automatic focusing, e.g. to compensate thermal effects (G PHYSICS › G03 PHOTOGRAPHY; CINEMATOGRAPHY › G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM › G03B 21/00 Projectors or projection-type viewers; Accessories therefor › G03B 21/14 Details)
    • H04N 9/3179: Video signal processing therefor (under H04N 9/31)
    • H04N 9/3185: Geometric adjustment, e.g. keystone or convergence (under H04N 9/3179)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Automatic Focus Adjustment (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Focusing (AREA)

Abstract

Some embodiments of the present application provide a projection apparatus and a projection region correction method. On receiving a projection region correction instruction, the projection apparatus obtains the mapping relation between the graphic card projected by the light engine and the projection surface. From this mapping relation, it determines the initial projection region, that is, the region of the projection surface into which the light engine projects the graphic card before correction. The apparatus then performs contour correction on the initial projection region to obtain a target projection region, thereby locating the target area set by the user, which may be a curtain. Target position information for the target projection region is determined from the mapping relation, so that the light engine can project the target graphic card into the target projection region. The projection region is thus corrected, the projected image is guaranteed to fall entirely within the curtain, and the user experience is improved.

Description

Projection apparatus and projection region correction method
The present application claims priority to Chinese patent application No. 202111355866.0, entitled "Projection Apparatus and Display Control Method Based on Geometric Correction", filed with the Chinese Patent Office on November 16, 2021, which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the technical field of projection equipment, and in particular to a projection apparatus and a projection region correction method.
Background
A projection device is a display device that projects an image or video onto a screen. Through the refraction of its optical lens assembly, it projects laser light of specific colors onto a projection surface to form an image.
To meet user needs, a projection device may project the image into a projection area of the projection surface set by the user, for example a curtain placed in advance. If the position of the device shifts during use, it can no longer project the image fully into that area, and the user experience suffers. The projection area therefore needs to be determined anew so that the image is projected into it again.
With existing projection devices, determining the projection area again requires the user to adjust the position and projection angle of the device until the image falls entirely within the area. This requires the user to choose the adjustment direction manually, makes the process tedious, cannot locate the projection area accurately, and results in a poor user experience.
Disclosure of Invention
The present application provides a projection device and a projection region correction method, addressing the problems in the related art that the projection area cannot be determined accurately and the user experience is poor.
In one aspect, the present application provides a projection device comprising a light engine and a controller. The light engine is configured to project a graphic card onto the projection surface; the controller is configured to perform the following steps:
in response to a projection region correction instruction, obtaining the mapping relation between the graphic card projected by the light engine and the projection surface;
obtaining, according to the mapping relation, the initial projection region into which the light engine projects the graphic card on the projection surface;
performing contour correction on the initial projection region to obtain a target projection region;
determining target position information of the target projection region according to the mapping relation;
and controlling the light engine to project the target graphic card into the target projection region according to the target position information.
In another aspect, the present application provides a projection region correction method applied to a projection device, the method comprising:
in response to a projection region correction instruction, obtaining the mapping relation between the graphic card projected by the light engine and the projection surface;
obtaining, according to the mapping relation, the initial projection region into which the light engine projects the graphic card on the projection surface;
performing contour correction on the initial projection region to obtain a target projection region;
determining target position information of the target projection region according to the mapping relation;
and controlling the light engine to project the target graphic card into the target projection region according to the target position information.
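The five claimed steps can be sketched as the following control flow. Everything here is illustrative: the device methods and the toy geometry (a mapping modeled as a fixed offset, contour correction modeled as insetting the detected region by one unit) are assumptions for demonstration, not the patent's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class FakeDevice:
    """In-memory stand-in for the projection device; all methods are illustrative."""
    projected: dict = field(default_factory=dict)

    def get_card_to_surface_mapping(self):
        # Pretend the optical path simply shifts the card by (5, 5) on the surface.
        return (5, 5)

    def locate_initial_region(self, mapping):
        # Step 2: where the 10x10 card currently lands on the projection surface.
        dx, dy = mapping
        return [(x + dx, y + dy) for x, y in [(0, 0), (10, 0), (10, 10), (0, 10)]]

    def refine_contour(self, region):
        # Step 3: contour correction, modeled here as insetting the region by 1 unit.
        xs = [p[0] for p in region]
        ys = [p[1] for p in region]
        x0, x1, y0, y1 = min(xs) + 1, max(xs) - 1, min(ys) + 1, max(ys) - 1
        return [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]

    def to_light_engine_coords(self, mapping, region):
        # Step 4: invert the mapping to get target position information.
        dx, dy = mapping
        return [(x - dx, y - dy) for x, y in region]

    def project(self, card, position):
        # Step 5: drive the light engine.
        self.projected = {"card": card, "position": position}

def correct_projection_region(device):
    mapping = device.get_card_to_surface_mapping()                      # step 1
    initial = device.locate_initial_region(mapping)                     # step 2
    target_region = device.refine_contour(initial)                      # step 3
    target_pos = device.to_light_engine_coords(mapping, target_region)  # step 4
    device.project(card="target", position=target_pos)                  # step 5
    return target_pos
```

Note that the real mapping relation is a geometric transform between light-engine and projection-surface coordinates, not a mere offset; the offset keeps the sketch minimal.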
The above technical solutions provide a projection device and a projection region correction method. On receiving a projection region correction instruction, the projection device obtains the mapping relation between the graphic card projected by the light engine and the projection surface. From this mapping relation it determines the initial projection region, that is, the region of the projection surface into which the light engine projects the graphic card before correction. The device performs contour correction on the initial projection region to obtain a target projection region, thereby locating the target area set by the user, which may be a curtain. Target position information for the target projection region is determined from the mapping relation, so that the light engine can project the target graphic card into the target projection region. The projection region is thus corrected, the projected image is guaranteed to fall entirely within the curtain, and the user experience is improved.
Drawings
To explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. Other drawings can evidently be obtained from them by those skilled in the art without creative effort.
FIG. 1 illustrates a schematic view of a projection curtain in some embodiments;
FIG. 2 illustrates a schematic layout of a projection device in some embodiments;
FIG. 3 is a schematic diagram of the optical path of a projection device in some embodiments;
FIG. 4 is a schematic diagram of a circuit architecture of a projection device in some embodiments;
FIG. 5 shows a schematic diagram of a projection device in some embodiments;
FIG. 6 is a schematic diagram of a circuit configuration of a projection device in some embodiments;
FIG. 7 is a system block diagram that illustrates a projection device implementing display control in some embodiments;
FIG. 8 shows a schematic view of a change in position of a projection device in some embodiments;
FIG. 9 illustrates a flow diagram for interaction of components of the projection device in some embodiments;
FIG. 10 shows a schematic view of a first graphic card in some embodiments;
FIG. 11 illustrates a schematic diagram of a first graphic card in some embodiments;
FIG. 12 is a schematic diagram illustrating a process for obtaining mappings in some embodiments;
FIG. 13 shows a schematic diagram of a target projection area in some embodiments;
FIG. 14 shows a schematic diagram of a target projection area in some embodiments;
FIG. 15 shows a schematic diagram of a target projection area in some embodiments;
FIG. 16 is a schematic diagram illustrating the projection region before and after modification in some embodiments;
FIG. 17 illustrates a flow diagram of one embodiment of a method of projection region modification.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It should be understood that the described exemplary embodiments are only a part, not all, of the embodiments of the present application.
All other embodiments obtained by a person of ordinary skill in the art from the exemplary embodiments described herein without inventive effort fall within the scope of the appended claims. In addition, although this disclosure is presented in terms of one or more exemplary examples, each aspect of the disclosure may separately constitute a complete embodiment. It should be noted that the brief explanations of terms in the present application are provided only to ease understanding of the embodiments described below and are not intended to limit them; unless otherwise indicated, these terms are to be understood in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The embodiment of the application can be applied to various types of projection equipment. The projection apparatus will be explained below by taking a projector as an example.
A projector is a device that projects images or video onto a screen. Through different interfaces it can be connected to a computer, a broadcast television network, the Internet, a Video Compact Disc (VCD) player, a Digital Versatile Disc (DVD) player, a game console, a digital video (DV) device, and the like to play the corresponding video signals. Projectors are widely used in homes, offices, schools, entertainment venues, and so on.
The projection device may be provided with a light engine that projects an image, for example a graphic card selected by the user, onto the projection surface, and specifically into a projection area of the projection surface designated by the user. The projection surface may be a wall, and the projection area may be a specific region of the wall or a projection curtain positioned in front of it.
In some embodiments, the projection curtain is a tool for displaying images and video files in cinemas, offices, home theaters, large conferences, and similar settings, and can be made in different sizes according to actual needs. To better suit users' viewing habits, the aspect ratio of the curtain matched to a projector is usually 16:9, fitting the size of the image the projection device projects. FIG. 1 shows a schematic view of a projection curtain in some embodiments. The edge of the curtain may be a dark border band that highlights the boundary of the screen, and the central area may be a white screen serving as the projection area that displays the projected image.
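As a small worked example of the 16:9 sizing mentioned above, a curtain's width and height follow from its diagonal by the Pythagorean theorem. The helper below is illustrative and not part of the patent:

```python
import math

def curtain_size_from_diagonal(diagonal, aspect=(16, 9)):
    """Width and height (same unit as the diagonal) of a screen with the given
    diagonal and aspect ratio; `aspect` defaults to the 16:9 discussed above."""
    w, h = aspect
    # The diagonal of a w:h rectangle scaled by s is s * hypot(w, h).
    scale = diagonal / math.hypot(w, h)
    return w * scale, h * scale
```

For a 100-inch 16:9 curtain this gives a width of roughly 87.2 inches and a height of roughly 49.0 inches.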
To achieve the best projection effect, the projection curtain and the projection device can be installed by professional after-sales technicians, who place both at the positions that give the best projection effect in operation, so that the image projected by the device lies entirely within the curtain and the user experience is improved. Users may also set the positions of the two devices themselves; the embodiments of the present application place no limitation on this.
Fig. 2 shows a schematic layout of a projection device in some embodiments. In some embodiments, the projection curtain provided by the present application may be fixed at a first position, and the projection device is placed at a second position, so that a picture projected by the projection device is matched with the projection curtain.
Fig. 3 shows a schematic diagram of the optical path of the projection device in some embodiments.
The embodiment of the present application provides a projection device comprising a laser light source 100, a light engine 200, a lens 300, and a projection medium 400. The laser light source 100 provides illumination for the light engine 200; the light engine 200 modulates the light-source beam and outputs it to the lens 300 for imaging, and the beam is then projected onto the projection medium 400 to form the projected image.
In some embodiments, the laser light source 100 of the projection apparatus includes a laser assembly and an optical lens assembly; a light beam emitted by the laser assembly passes through the optical lens assembly to provide illumination for the light engine. The optical lens assembly, for example, requires a higher level of environmental cleanliness and hermetic-grade sealing, whereas the chamber housing the laser assembly can use a lower, dustproof sealing grade to reduce sealing cost.
In some embodiments, the light engine 200 of the projection apparatus may be implemented to include a blue light engine, a green light engine, a red light engine, and may further include a heat dissipation system, a circuit control system, and the like. It should be noted that, in some embodiments, the light emitting component of the projector may also be implemented by an LED light source.
In some embodiments, the present application provides a projection device comprising a three-color light engine and a controller. The three-color light engine, which integrates a blue, a green, and a red light engine, modulates the laser that renders the pixels of the user interface. The controller is configured to: acquire the average gray value of the user interface; and, when the average gray value exceeds a first threshold and has done so for longer than a time threshold, reduce the working current of the red light engine according to a preset gradient value so as to reduce the heat generated by the three-color light engine. Overheating of the red light engine integrated in the three-color light engine, and hence of the engine and the projection device as a whole, can thus be controlled by lowering the red light engine's working current.
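The controller logic just described can be sketched as follows. The gray threshold, time threshold, and current step are placeholder values: the patent names only a first threshold, a time threshold, and a preset gradient value without fixing them.

```python
def adjust_red_current(avg_gray, duration_s, current_ma,
                       gray_threshold=200, time_threshold_s=5.0, step_ma=20):
    """Lower the red light engine's working current by one preset gradient step
    when the user interface has stayed brighter than the first threshold for
    longer than the time threshold. All numeric defaults are placeholders."""
    if avg_gray > gray_threshold and duration_s > time_threshold_s:
        # Reduce by one gradient step, never below zero.
        return max(current_ma - step_ma, 0)
    return current_ma
```

In a running system this function would be called periodically, so a persistently bright interface walks the current down one step at a time.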
The light engine 200 may be implemented as a three-color light engine integrating a blue, a green, and a red light engine.
The technical solution provided in the present application is described below with the light engine 200 implemented in this way.
In some embodiments, the optical system of the projection device is composed of a light source part and an optical machine part, the light source part is used for providing illumination for the optical machine, and the optical machine part is used for modulating illumination light beams provided by the light source and finally emitting through the lens to form a projection picture.
Fig. 4 shows a schematic circuit architecture diagram of a projection device in some embodiments. In some embodiments, the projection device may include a display control circuit 10, a laser light source 20, at least one laser driving assembly 30, and at least one brightness sensor 40; the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving assembly 30. Here, "at least one" means one or more, and "a plurality" means two or more.
Based on this circuit architecture, the projection device can adjust itself adaptively. For example, with the brightness sensor 40 placed in the exit light path of the laser light source 20, the sensor can detect a first brightness value of the laser light source and send it to the display control circuit 10.
The display control circuit 10 obtains a second brightness value corresponding to the driving current of each laser, and determines that a laser has a COD (catastrophic optical damage) fault when the difference between its second brightness value and its first brightness value exceeds a difference threshold. The circuit then adjusts the current control signal of that laser's driving assembly until the difference is less than or equal to the threshold, eliminating the COD fault. The projection device can thus clear a laser's COD fault promptly, reduce the laser damage rate, and improve its image display.
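The lookup of the second brightness value and the COD check can be sketched as below. The current-to-brightness table, the use of linear interpolation, and the threshold value are illustrative assumptions; the patent only says the circuit obtains the second brightness value from a stored correspondence.

```python
import bisect

# Hypothetical correspondence between drive current (mA) and the brightness a
# healthy laser emits at that current (the "second brightness value").
CURRENT_TO_BRIGHTNESS = [(100, 10.0), (200, 22.0), (300, 35.0), (400, 47.0)]

def expected_brightness(drive_ma):
    """Linearly interpolate the second brightness value for the present current,
    clamping to the table's end points."""
    currents = [c for c, _ in CURRENT_TO_BRIGHTNESS]
    i = bisect.bisect_left(currents, drive_ma)
    if i == 0:
        return CURRENT_TO_BRIGHTNESS[0][1]
    if i == len(currents):
        return CURRENT_TO_BRIGHTNESS[-1][1]
    (c0, b0), (c1, b1) = CURRENT_TO_BRIGHTNESS[i - 1], CURRENT_TO_BRIGHTNESS[i]
    return b0 + (b1 - b0) * (drive_ma - c0) / (c1 - c0)

def has_cod_fault(measured_first, expected_second, diff_threshold=3.0):
    """Flag a COD fault when the expected (second) brightness exceeds the
    measured (first) brightness by more than the stored difference threshold."""
    return (expected_second - measured_first) > diff_threshold
```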
In some embodiments, the laser light source 20 includes three lasers, which may be a blue laser 201, a red laser 202, and a green laser 203, respectively, in a one-to-one correspondence with the laser driving assembly 30. The blue laser 201 is used for emitting blue laser, the red laser 202 is used for emitting red laser, and the green laser 203 is used for emitting green laser. In some embodiments, the laser driving assembly 30 may be implemented to include a plurality of sub-laser driving assemblies, each corresponding to a laser of a different color.
The display control circuit 10 outputs primary-color enable signals and primary-color current control signals to the laser driving assemblies 30 to drive the lasers. Connected to each laser driving assembly 30, the display control circuit 10 outputs at least one enable signal corresponding to the three primary colors of each frame of the multi-frame display image and transmits it to the corresponding laser driving assembly 30, and likewise outputs at least one current control signal corresponding to the three primary colors of each frame and transmits it to the corresponding assembly. For example, the display control circuit 10 may be a Micro Controller Unit (MCU), also called a single-chip microcomputer, and the current control signal may be a Pulse Width Modulation (PWM) signal.
In some embodiments, blue laser 201, red laser 202, and green laser 203 are each connected to laser drive assembly 30. The laser driving assembly 30 may provide a corresponding driving current to the blue laser 201 in response to the blue PWM signal and the enable signal sent by the display control circuit 10. The blue laser 201 is used to emit light under the drive of the drive current.
The brightness sensor is disposed in the exit light path of the laser light source, generally to one side of the path so as not to block it. As shown in FIG. 4, at least one brightness sensor 40 is disposed in the exit light path of the laser light source 20; each sensor is connected to the display control circuit 10, detects a first brightness value of one laser, and sends it to the display control circuit 10.
In some embodiments, the display control circuit 10 obtains from the stored correspondence a second brightness value for the driving current of each laser, where the driving current is the laser's present actual operating current and the second brightness value is the brightness the laser should emit when operating normally under that current. The difference threshold may be a fixed value stored in advance in the display control circuit 10.
In some embodiments, when adjusting the current control signal of the laser driving assembly 30 corresponding to a laser, the display control circuit 10 may reduce the duty cycle of that current control signal, thereby reducing the laser's driving current.
In some embodiments, the brightness sensor 40 may detect a first brightness value of the blue laser 201 and send it to the display control circuit 10. The display control circuit 10 obtains the driving current of the blue laser 201 and, from the current-brightness correspondence, the second brightness value for that current. It then checks whether the difference between the second and first brightness values exceeds the difference threshold; if so, the blue laser 201 has a COD fault, and the display control circuit 10 reduces the current control signal of the laser driving assembly 30 corresponding to the blue laser 201. The circuit then reads the first brightness value and the second brightness value for the blue laser 201 again, and reduces the current control signal once more if the difference still exceeds the threshold. This cycle repeats until the difference is less than or equal to the threshold, so that the COD fault of the blue laser 201 is eliminated by reducing its driving current.
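The iterative reduction described above amounts to a feedback loop of the following shape. The callback interface, step size, thresholds, and iteration cap are illustrative, not values from the patent:

```python
def clear_cod_fault(read_measured, read_expected, set_duty, duty,
                    diff_threshold=3.0, duty_step=0.02, max_iters=50):
    """Repeatedly lower the PWM duty cycle of the laser driving assembly until
    the measured (first) brightness is within the difference threshold of the
    expected (second) brightness, mirroring the loop described above.

    read_measured(): sensor reading; read_expected(): table lookup for the
    present current; set_duty(d): apply a new PWM duty cycle."""
    for _ in range(max_iters):
        if read_expected() - read_measured() <= diff_threshold:
            break  # fault cleared: difference is at or below the threshold
        duty = max(duty - duty_step, 0.0)
        set_duty(duty)  # lower duty cycle -> lower drive current
    return duty
```

The iteration cap guards against a sensor or laser so degraded that the loop would otherwise never converge.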
Fig. 5 shows a schematic diagram of a projection device in some embodiments.
In some embodiments, the laser light source 20 in the projection apparatus may include an independently disposed blue laser 201, red laser 202, and green laser 203; such a projection apparatus may also be called a three-color projection apparatus. Each of the three lasers is a small-volume MCL-packaged laser, which benefits the compact arrangement of the optical paths.
In some embodiments, the controller includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first through nth input/output interfaces, a communication bus (Bus), and the like.
In some embodiments, the projection device may be configured with a camera that cooperates with it to adjust and control the projection process. For example, the camera may be implemented as a 3D camera, a depth camera, or a binocular camera; when implemented as a binocular camera, it comprises a left camera and a right camera. The binocular camera can capture the image or graphic card presented on the curtain corresponding to the projector, i.e., on the projection surface, as projected by the projector's built-in light engine.
When the projector is moved, its projection angle and its distance to the projection surface change, deforming the projected image into a trapezoid or another distorted shape. Based on the image captured by the camera and the coupled angle between the light engine and the projection surface, the projector's controller can perform automatic keystone correction so that the projected image is displayed correctly.
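Automatic keystone correction of this kind is commonly implemented with a planar homography estimated from corresponding points, for example the corners of the projected card as seen by the camera. The direct linear transform sketch below is a standard technique and an assumption about how this patent's correction might work internally:

```python
import numpy as np

def homography(src, dst):
    """3x3 perspective transform taking four src corners to four dst corners,
    estimated with the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null vector of A (smallest right singular vector) is H up to scale.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_points(H, pts):
    """Apply H to 2-D points via homogeneous coordinates."""
    p = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
    q = p @ H.T
    return q[:, :2] / q[:, 2:3]
```

To correct keystone distortion, the renderer pre-warps the source frame by the inverse of the camera-estimated homography, so that the physically projected result appears rectangular on the surface.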
Fig. 6 shows a schematic circuit diagram of a projection device in some embodiments. In some embodiments, the laser drive assembly 30 may include a drive circuit 301, a switching circuit 302, and an amplification circuit 303. The driving circuit 301 may be a driving chip. The switch circuit 302 may be a metal-oxide-semiconductor (MOS) transistor. The driving circuit 301 is connected to the switching circuit 302, the amplifying circuit 303, and the corresponding laser included in the laser light source 20. The driving circuit 301 is configured to output a driving current to a corresponding laser in the laser light source 20 through the VOUT terminal based on a current control signal sent by the display control circuit 10, and transmit a received enable signal to the switch circuit 302 through the ENOUT terminal. Wherein the laser may comprise n sub-lasers LD1 to LDn connected in series. n is a positive integer greater than 0.
The switch circuit 302 is connected in series in the laser's current path and switches the path on when the received enable signal is at its active level. The amplifying circuit 303 is connected to the detection node E in the current path of the laser light source 20 and to the display control circuit 10; it converts the detected driving current of the laser assembly into a driving voltage, amplifies it, and transmits the amplified voltage to the display control circuit 10. The display control circuit 10 is further configured to determine the laser's driving current from the amplified driving voltage and to obtain the second brightness value corresponding to that current.
In some embodiments, the amplification circuit 303 may include: the circuit comprises a first operational amplifier A1, a first resistor (also called a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
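Assuming the usual arrangement in which the sampling resistor R1 senses the laser current and the first operational amplifier A1 amplifies the sensed voltage by a fixed gain set by the remaining resistors (the exact topology is not specified here), the display control circuit can recover the drive current from its ADC reading as:

```python
def drive_current_from_adc(v_adc, r_sample_ohm=0.1, gain=20.0):
    """Recover the laser drive current (A) from the amplified voltage across the
    sampling resistor R1. The resistance and amplifier gain are placeholder
    values, not taken from the patent."""
    # V_adc = gain * I * R1, so I = V_adc / (gain * R1).
    return v_adc / (gain * r_sample_ohm)
```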
In some embodiments, the display control circuit 10, the driving circuit 301, the switching circuit 302, and the amplifying circuit 303 form a closed loop that implements feedback adjustment of the laser driving current. The display control circuit 10 can therefore adjust the driving current, and hence the actual emitted brightness of the laser, promptly on the basis of the difference between the laser's second and first brightness values, avoiding a prolonged COD fault and improving the accuracy of the laser's light-emission control. Note that if the laser light source 20 includes a blue laser, a red laser, and a green laser, the blue laser 201 may be disposed at position L1, the red laser 202 at position L2, and the green laser 203 at position L3.
The laser light at position L1 is transmitted once through the fourth dichroic plate 604 and reflected once by the fifth dichroic plate 605 before entering the first lens 901. The light efficiency at position L1 is therefore P1 = Pt × Pf, where Pt denotes the transmittance of the fourth dichroic plate and Pf denotes the reflectance of the fifth dichroic plate.
In some embodiments, among the three positions L1, L2, and L3, the light efficiency of the laser light at the L3 position is the highest, and the light efficiency of the laser light at the L1 position is the lowest. Since the maximum optical power Pb output by the blue laser 201 is 4.5 watts (W), the maximum optical power Pr output by the red laser 202 is 2.5W, and the maximum optical power Pg output by the green laser 203 is 1.5W. That is, the maximum optical power output by the blue laser 201 is the largest, the maximum optical power output by the red laser 202 is the next largest, and the maximum optical power output by the green laser 203 is the smallest. Thus, the green laser 203 is disposed at the L3 position, the red laser 202 is disposed at the L2 position, and the blue laser 201 is disposed at the L1 position. That is, the green laser 203 is disposed in the optical path with the highest light efficiency, thereby ensuring that the projection apparatus can obtain the highest light efficiency.
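The placement rule described above, i.e. giving the weakest laser the most efficient optical path, can be expressed as a small assignment. The power values (4.5 W, 2.5 W, 1.5 W) come from the text; the per-position efficiency numbers are illustrative:

```python
def assign_lasers(max_powers_w, path_efficiencies):
    """Map each laser to an optical-path position so that the laser with the
    lowest maximum power gets the most efficient path (and so on down the
    ranking). Efficiency values passed in are illustrative."""
    weakest_first = sorted(max_powers_w, key=max_powers_w.get)
    best_path_first = sorted(path_efficiencies, key=path_efficiencies.get,
                             reverse=True)
    return dict(zip(weakest_first, best_path_first))
```

With the powers from the text and any efficiencies ranked L3 > L2 > L1, this reproduces the arrangement described: green at L3, red at L2, blue at L1.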
In some embodiments, the controller includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), a first interface to an nth interface for input/output, a communication Bus, and the like. In some embodiments, after the projector is started, it may directly enter the display interface of the signal source selected last time, or a signal source selection interface. The signal source may be a preset video-on-demand program, or at least one of an HDMI interface and a live TV interface; after the user selects a different signal source, the projector displays the content obtained from that source.
FIG. 7 is a block diagram of a system framework for implementing display control for a projection device in some embodiments.
In some embodiments, the projector provided by the present application has the characteristics of long-focus micro-projection and includes a controller, where the controller can perform display control on the projection picture through preset algorithms, so as to implement functions such as automatic keystone correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, and anti-glare for the display picture. It can be understood that the projector can be moved flexibly in a long-focus micro-projection scene by means of a display control method based on geometric correction; each time the device is moved, for problems that may occur, such as distortion of the projection picture, foreign matter blocking the projection surface, or the projection picture running off the curtain, the controller can control the projector to perform automatic display correction so that the projector automatically restores normal display.
In some embodiments, a geometry correction based display control system includes an application Service layer (APK Service), a Service layer, and an underlying algorithm library.
The application program service layer is used to implement interaction between the projector and the user; based on the display of the user interface, the user can configure the various parameters and display pictures of the projector, and by coordinating and calling the algorithm services corresponding to the various functions, the controller can automatically correct the display picture when the projector's display is abnormal.
The Service layer may include a correction service, a camera service, a time-of-flight (TOF) service, and other contents. Upward, the service layer corresponds to the application program service layer (APK Service) and implements the specific functions of the projector's different configuration services; downward, it connects to data acquisition services such as the algorithm library, the camera, and the time-of-flight sensor, encapsulating the complex logic of the bottom layer.
The bottom algorithm library provides the correction service and the control algorithms for implementing the various functions of the projector; it can complete various mathematical operations based on OpenCV and provides basic computing capability for the correction service. OpenCV is a cross-platform computer vision and machine learning software library released under the BSD license, which can run in a variety of existing operating system environments.
In some embodiments, the projector is configured with a gyroscope sensor. While the device is moving, the gyroscope sensor can sense the position movement and actively collect movement data; the collected data is then sent through the system framework layer to the application program service layer, to support the application data required during user interface interaction and application program interaction, and can also be used for data calls by the controller when implementing the algorithm services.
In some embodiments, the projector is configured with a time-of-flight (TOF) sensor; after the TOF sensor collects the corresponding data, the data is sent to the time-of-flight service corresponding to the service layer.
After the time-of-flight service acquires the data, it sends the collected data to the application program service layer through the process communication framework, where the data is used for data calls by the controller, the user interface, program applications, and other interactive uses.
In some embodiments, the data collected by the camera is sent to the camera service, and the camera service then sends the acquired image data to the process communication framework and/or the projector correction service; the projector correction service can receive the camera data sent by the camera service, and the controller can call the corresponding control algorithms in the algorithm library according to the different functions to be implemented.
In some embodiments, data interaction with the application service is performed through the process communication framework, and the calculation result is then fed back to the correction service through the process communication framework; the correction service sends the acquired calculation result to the projector operating system to generate a control signaling, and sends the control signaling to the optical engine control driver to control the working state of the optical engine and implement automatic correction of the projection area.
After the user starts the projection device, the projection device can project an image onto the projection surface, specifically into a projection area set in advance by the user, which may be a certain region of a wall or a curtain hung in advance; the projection image is displayed in this projection area for the user to watch. However, when the relative position between the user's setup and the projection device is improper, the projection device cannot project the image completely into the projection area. Alternatively, during use, the user may touch the projection device and change its position, so that the projected image shifts and is no longer completely located in the projection area. In the above situations, the projection device needs to determine the projection area again, so as to project the image completely into the projection area for the user to view. FIG. 8 shows a schematic diagram of a change in position of the projection device in some embodiments. The projection device is initially at position A and can project fully into the projection area, which is rectangular. When the projection device moves from position A to position B, the projected image shifts and can no longer be completely located in the projection area.
In the related art, a user may manually adjust the position and projection angle of the projection device so that the projection image is completely located in the projection area, but such manual adjustment is cumbersome and not very accurate. The projection device can also photograph the projection surface to obtain an image of the projection surface, and then perform binarization processing on the acquired image so that the contours of objects in the image are displayed more clearly. From the binarized image, all closed contours can be extracted, and the closed contour with the largest area and a uniform internal color can be determined as the projection area for projection. However, when a large solid-color wall exists around the projection area and the wall edges form a closed contour, the projection apparatus recognizes the wall as the projection area, so that the image is projected onto the wall instead of the area designated by the user.
Therefore, the projection device provided by the embodiment of the application can accurately determine the projection area so as to solve the problem.
In order to enable the projection apparatus to project an image completely into the projection area set by the user, the projection apparatus has a projection area correction function. When the image projected by the projection device cannot be completely located in the projection area set by the user, the projection device can determine the projection area again so that the image can be projected completely into it, thereby correcting the projection area and achieving the effect of automatically bringing the projection image onto the screen.
In some embodiments, when the projection area correction instruction is received, the projection device may start a projection area correction function to correct the current projection area of the projection device. The projection area correction instruction is a control instruction for triggering the projection device to automatically correct the projection area.
In some embodiments, the projection area correction instruction may be an instruction actively input by a user. For example, the projection device may project an image onto the projection surface after it is powered on. At this time, the user may press a projection area correction switch preset on the projection device, or a projection area correction key on a remote controller matched with the projection device, so that the projection device starts the projection area correction function and re-determines the projection area to correct it.
In some embodiments, the projection area correction instruction may also be automatically generated according to a control program built into the projection device.
The projection device may actively generate the projection area correction instruction after being turned on. For example, when the projection device detects that a video signal is input for the first time after the projection device is turned on, a projection area correction instruction may be generated, thereby triggering a projection area correction function.
Alternatively, the projection device can generate the projection area correction instruction by itself during operation. Considering that the user may actively move the projection device, or accidentally touch it during use so that its placement posture or position changes, the projection image may then shift, the projection area changes, and the image can no longer be projected completely into the projection area set by the user. In this case, to ensure the user's viewing experience, the projection device may automatically perform projection area correction.
Specifically, the projection device may detect its own condition in real time. When the projection device detects that its placement posture or position has changed, it can generate a projection area correction instruction, thereby triggering the projection area correction function.
In some embodiments, the projection device can monitor device movement in real time through its configured components and immediately feed the monitoring result back to the controller, so that after the projection device moves, the controller starts the projection area correction function in time and the projection area is corrected automatically as soon as possible. For example, the controller receives monitoring data from a gyroscope or a time-of-flight (TOF) sensor configured in the projection device to determine whether the projection device has moved.
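The movement check described above can be sketched as follows; the function name and the threshold values are assumptions for illustration, not the device's actual firmware logic.

```python
def device_moved(gyro_deg_per_s: float, tof_delta_m: float,
                 gyro_thresh: float = 2.0, tof_thresh: float = 0.05) -> bool:
    """Flag movement when either the gyroscope's angular rate or the change
    in TOF-measured distance exceeds an (assumed) threshold."""
    return abs(gyro_deg_per_s) > gyro_thresh or abs(tof_delta_m) > tof_thresh

# A projection area correction instruction would be generated only when
# movement is flagged.
print(device_moved(5.0, 0.0))   # True: the device was rotated
print(device_moved(0.1, 0.01))  # False: readings within the noise thresholds
```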
After determining that the projection device moves, the controller may generate a projection area correction command and turn on a projection area correction function.
The time-of-flight sensor implements distance measurement and position-movement monitoring by adjusting the frequency variation of the transmitted pulses; its measurement accuracy does not decrease as the measurement distance increases, and it has strong anti-interference capability.
In some embodiments, when the projection area correction instruction is detected, the projection device may re-determine the projection area, so as to completely project the image into the projection area, thereby implementing automatic correction of the projection area, and achieving an effect of automatically entering the screen of the image.
FIG. 9 illustrates a flow diagram for interaction of components of the projection device in some embodiments.
In some embodiments, when the projection region correction instruction is acquired, the projection device may automatically correct the projection region. Specifically, the projection area correction function can be realized on the projection image through a preset projection area correction algorithm.
In order to be able to correct the projection area, the projection device may first determine a projection relationship between the projection surface and the projection device. In the embodiment of the present application, the projection relationship refers to a projection relationship in which the projection device projects an image onto a projection surface, specifically, a mapping relationship between a graphic card projected by an optical machine of the projection device and the projection surface. After the projection relationship between the projection plane and the projection device is determined, the projection device may determine the position information of the projection area in the projection plane, so as to perform projection, thereby implementing the correction of the projection area.
The projection device may project a first graphic card in the projection surface and determine location information of the projection surface based on the first graphic card. According to the position information of the projection surface, the projection relation between the projection surface and the projection equipment can be further determined.
It should be noted that, in the embodiments of the present application, a conversion matrix between the projection plane and the optical engine coordinate system in the world coordinate system may be constructed based on the binocular camera. The conversion matrix is the homography between the projection image on the projection surface and the graphic card projected by the optical engine; this homography is also referred to as the projection relationship, and any projection conversion between the projection image and the graphic card can be implemented using it.
In some embodiments, when the projection area correction instruction is acquired, the projection device may first project the first graphic card. Specifically, the controller may control the optical engine to project the first graphic card onto the projection surface.
After the first graphic card is projected, the controller can also control the camera to shoot the first graphic card displayed in the projection surface to obtain a first image.
The first graphic card may include a plurality of feature points, so the first image captured by the camera also includes all the feature points in the first graphic card. The position information of the projection surface can be determined through these feature points. It should be noted that, for a plane, once the positions of three points in the plane are determined, the position information of the plane is determined. Therefore, in order to determine the position information of the projection surface, the positions of at least three points in the projection surface need to be determined; that is, the first graphic card needs to include at least three feature points, and the position information of the projection surface is determined according to these at least three feature points.
In some embodiments, the first graphic card may contain a pattern and color features preset by the user. The first card may be a checkerboard card configured as a checkerboard of alternating black and white squares, as shown in FIG. 10; the feature points included in the checkerboard card are the corner points of the rectangles. The pattern in the first graphic card may also be configured as a circular-ring card including ring patterns, as shown in FIG. 11; the feature points included in the circular-ring card are the corresponding solid points on each ring. In some embodiments, the first card may be configured as a combination of the two types of patterns, or as another pattern with recognizable feature points.
FIG. 12 illustrates a flow diagram for obtaining a mapping relationship in some embodiments.
In some embodiments, after the optical engine projects the first graphic card onto the projection surface, the controller may control the camera to shoot the first graphic card to obtain a first image, so as to obtain the position of the feature point.
Specifically, the camera may be a binocular camera, with the two cameras respectively disposed on the two sides of the optical engine. The first graphic card is photographed by the binocular camera; the left camera obtains one image and the right camera obtains the other.
The controller may perform image recognition processing on the two images to obtain a first coordinate of a feature point in either image. In the embodiments of the present application, the first coordinate refers to the coordinate information of the feature point in the image coordinate system corresponding to the first image.
The image coordinate system refers to a coordinate system whose origin is the center of the image, with the X and Y axes parallel to the two sides of the image. The image coordinate system in the embodiments of the present application may be set as follows: for the projection area preset by the user, the center point of the projection area may be set as the origin, with the horizontal direction as the X axis and the vertical direction as the Y axis. This image coordinate system may be set in advance according to the preset projection area.
The coordinate information of the feature points in the first image can be determined according to the image coordinate system.
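A minimal sketch of this determination, assuming the origin is placed at the image center with the X axis rightward and the Y axis upward (the resolution figures and function name are illustrative):

```python
def to_image_coords(u: float, v: float, width: int, height: int):
    """Convert pixel coordinates (u, v), with the origin at the top-left
    corner and v increasing downward, into an image coordinate system whose
    origin is the image center and whose Y axis points up."""
    return (u - width / 2.0, height / 2.0 - v)

print(to_image_coords(960, 540, 1920, 1080))   # (0.0, 0.0): the image center
print(to_image_coords(1020, 500, 1920, 1080))  # (60.0, 40.0)
```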
The position of a feature point in the image captured by the left camera may differ from the position of the same feature point in the image captured by the right camera. The coordinates of the feature point in the camera coordinate system of either the left or the right camera can be determined from the two positions of the same feature point in the two images. In the embodiments of the present application, the second coordinate is used to represent the coordinate information of the feature point in the camera coordinate system.
In the embodiments of the present application, the camera coordinate system is specifically a spatial rectangular coordinate system established with the optical center of the camera as the origin, the optical axis as the Z axis, and the XOY plane parallel to the projection plane.
For the binocular camera, the second coordinate of the feature point may be determined in the camera coordinate system corresponding to either camera; the left camera is taken as an example in the embodiments of the present application.
Specifically, the second coordinate of the feature point in the camera coordinate system of the left camera may be determined from the position information of the same feature point in the two images captured by the left and right cameras, and is denoted P(x, y, z).
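The recovery of P(x, y, z) from the two image positions can be sketched with the standard rectified-stereo triangulation relation z = f · B / disparity. This is a hedged illustration, not the device's actual algorithm; the focal length and baseline values are assumptions, and a real device would use calibrated parameters.

```python
import numpy as np

def triangulate(x_left: float, x_right: float, y: float,
                f: float = 1000.0, baseline: float = 0.06) -> np.ndarray:
    """Second coordinate P(x, y, z) of a feature point in the left-camera
    frame, from its horizontal positions in rectified left/right images.
    f is the focal length in pixels; baseline is the camera spacing in meters."""
    disparity = x_left - x_right   # horizontal shift between the two views
    z = f * baseline / disparity   # depth along the optical axis
    return np.array([x_left * z / f, y * z / f, z])

P = triangulate(x_left=120.0, x_right=100.0, y=80.0)
# x = 0.36, y = 0.24, z = 3.0 (meters) with the assumed f and baseline
```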
In some embodiments, after acquiring the second coordinate of the feature point in the camera coordinate system of the left camera, the controller may convert the second coordinate into a third coordinate. In the embodiments of the present application, the third coordinate refers to the coordinate information of the feature point in the optical engine coordinate system.
In the embodiments of the present application, the optical engine coordinate system is specifically a spatial rectangular coordinate system established with the optical center of the optical engine as the origin, the optical axis as the Z axis, and the XOY plane parallel to the projection plane. It should be noted that the optical engine coordinate system and the camera coordinate system are mutually convertible, so the coordinates of a feature point in the camera coordinate system can be converted into coordinates in the optical engine coordinate system. Specifically, the feature points can be converted between the two coordinate systems according to the extrinsic parameters between the optical engine and the camera. These extrinsic parameters are device parameters marked on the device housing or in the specification of the projection device when it is manufactured and shipped; they are usually set based on the functions, assembly, manufacture, and parts of the projection device, are applicable to all projection devices of the same model, and may include a rotation matrix and a translation matrix between the optical engine and the camera.
According to the extrinsic parameters between the optical engine and the camera, the conversion relationship between the optical engine coordinate system and the camera coordinate system can be determined, and the coordinates of the feature points in the optical engine coordinate system can then be obtained.
The conversion formula is as follows:

P′(x′, y′, z′) = R * P + T  (1)

where P′(x′, y′, z′) is the coordinate of the feature point in the optical engine coordinate system, R is the rotation matrix between the optical engine and the camera, and T is the translation matrix between the optical engine and the camera.
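Formula (1) is a standard rigid-body change of coordinates. A minimal numeric sketch follows; the rotation and translation values are illustrative assumptions, not a real device's extrinsic parameters.

```python
import numpy as np

# Extrinsic parameters between camera and optical engine (assumed values).
R = np.eye(3)                    # rotation matrix (identity for illustration)
T = np.array([0.05, 0.0, 0.0])   # translation: camera offset 5 cm from the engine

P = np.array([0.36, 0.24, 3.0])  # feature point in the camera coordinate system
P_prime = R @ P + T              # formula (1): P' = R * P + T
print(P_prime)                   # the point in the optical engine coordinate system
```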
In some embodiments, after the coordinates of the feature points in the optical-mechanical coordinate system are obtained, a projection plane equation of the projection plane in the optical-mechanical coordinate system can be determined.
It should be noted that the coordinate information of at least three points is needed to determine the position information of a plane. Therefore, the controller may acquire the first positions of at least three feature points in the first image and determine their coordinate information in the camera coordinate system according to those first positions; this coordinate information in the camera coordinate system can then be converted into coordinate information in the optical engine coordinate system.
After the coordinate information of the at least three feature points in the optical engine coordinate system is determined, the controller can fit the coordinates of the feature points to obtain the projection surface equation of the projection surface in the optical engine coordinate system. The projection surface equation can be expressed as:

z = ax + by + c  (2)

or, equivalently, in implicit form:

ax + by − z + c = 0  (3)
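The fitting step can be sketched as a least-squares solve for (a, b, c) in equation (2); the sample points below are synthetic stand-ins for the feature points' coordinates in the optical engine coordinate system.

```python
import numpy as np

# Synthetic feature points lying on the plane z = 0.1*x + 0.2*y + 3.0 (assumed).
pts = np.array([[0.0, 0.0, 3.0],
                [1.0, 0.0, 3.1],
                [0.0, 1.0, 3.2],
                [1.0, 1.0, 3.3]])

# Least-squares fit of z = a*x + b*y + c, i.e. equation (2): build the design
# matrix [x, y, 1] and solve for the plane coefficients.
A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
(a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
print(a, b, c)  # recovers approximately 0.1, 0.2, 3.0
```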
in some embodiments, after determining the projection surface equation of the projection surface in the optical-mechanical coordinate system, the controller may obtain a transformation matrix of the optical-mechanical coordinate system and the world coordinate system according to the projection surface equation, where the transformation matrix is used to characterize the projection relationship.
In the embodiments of the present application, the world coordinate system is set as follows: the image coordinate system forms the XOY plane, that is, the projection plane is the XOY plane, with the origin at the center point of the projection area preset by the user; the Z axis is set in the direction perpendicular to the projection plane, establishing a spatial coordinate system.
To obtain the conversion matrix between the optical engine coordinate system and the world coordinate system, the controller may determine the representation of the projection surface in the world coordinate system and its representation in the optical engine coordinate system, respectively.
Specifically, the controller may first determine a unit normal vector of the projection plane in a world coordinate system.
Since the projection plane itself is the XOY plane of the world coordinate system, the unit normal vector of the projection plane in the world coordinate system can be expressed as:

m = (0, 0, 1)^T  (4)
the controller can also obtain a unit normal vector of the projection surface in the optical machine coordinate system according to a projection surface equation of the projection surface in the optical machine coordinate system, and the unit normal vector of the projection surface in the optical machine coordinate system can be expressed as a formula:
Figure BDA0003594842350000111
according to the unit normal vector of the projection plane under the two coordinate systems, a conversion matrix between the optical machine coordinate system and the world coordinate system can be obtained, and the correlation is expressed as the following formula:
m=R1*n (6)
wherein: r1 represents a transformation matrix between the opto-mechanical coordinate system and the world coordinate system.
The conversion matrix can represent the mapping relation between the graphic card projected by the optical machine and the projection surface.
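A rotation R1 satisfying formula (6) can be built with the Rodrigues formula. This is a hedged sketch of one possible construction: a single pair of normals does not fix the in-plane rotation, so the code picks the minimal rotation taking n to m.

```python
import numpy as np

def rotation_between(n: np.ndarray, m: np.ndarray) -> np.ndarray:
    """Minimal rotation matrix R1 with m = R1 @ n, for unit vectors n and m,
    built via the Rodrigues formula."""
    v = np.cross(n, m)              # rotation axis (unnormalized)
    s = np.linalg.norm(v)           # sine of the rotation angle
    c = float(np.dot(n, m))         # cosine of the rotation angle
    if s < 1e-12:
        return np.eye(3)            # parallel case; antiparallel omitted in this sketch
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])  # skew-symmetric cross-product matrix of v
    return np.eye(3) + K + K @ K * ((1 - c) / s**2)

# Normal of the fitted plane z = 0.1x + 0.2y + 3 per formula (5), unit length:
n = np.array([0.1, 0.2, -1.0])
n /= np.linalg.norm(n)
m = np.array([0.0, 0.0, 1.0])       # world-frame normal, formula (4)
R1 = rotation_between(n, m)
print(R1 @ n)                        # approximately (0, 0, 1), i.e. m
```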
After the mapping relation is determined, the controller can realize the conversion of the coordinates of a certain point between the world coordinate system and the optical machine coordinate system.
For a certain target area in the projection surface, the coordinate representation of the target area in the world coordinate system can be determined from its position information in the projection surface; according to the conversion matrix, the controller can convert this coordinate representation into a coordinate representation in the optical engine coordinate system, thereby determining the position information of the target area for the projection device, so that the graphic card can be projected directly into the target area.
For the projection device, when no projection process is being performed, there is no projection area on the projection surface, and the specific projection area cannot be determined directly. The projection device can determine the state of the graphic card to be projected, the projection angle, and so on, and thereby determine the coordinate representation, in the optical engine coordinate system, of the projection area to be projected. According to the conversion matrix, the controller can convert this coordinate representation into a coordinate representation in the world coordinate system, thereby determining the position information, in the projection plane, of the projection area to be projected.
That is, the coordinate representation of the projection region between the world coordinate system and the optomechanical coordinate system can be arbitrarily converted based on the conversion matrix.
In some embodiments, the projection area may be determined again according to a transformation matrix, that is, a mapping relationship between the graphics card projected by the optical engine and the projection surface, so as to correct the projection area.
According to the mapping relationship, the initial projection area into which the optical engine projects the graphic card on the projection surface can first be obtained. Specifically, the initial projection area to be corrected is the display area on the projection surface in which the image projected by the projection device is displayed before the projection area is corrected.
It should be noted that the projection device may be performing a projection process, with the projected image displayed in an initial projection area on the projection surface. For example, the user has turned on the projection device and the projection device performs projection, so that a projection image is formed on the projection surface; the area of the projection surface in which the image is currently displayed is the initial projection area. However, the current initial projection area and the target projection area set by the user may not be the same area, so that the projected image is shifted from the position set by the user, and the projection device needs to correct the initial projection area.
Alternatively, the projection device may not be performing a projection process, and the projection surface displays no projection image, so the initial projection area is not directly visible. In this case, the area into which the projection device is about to project is the initial projection area. The projection device can still determine the position information of the area into which it is about to project, that is, the position information of the initial projection area. However, since the projection process is not actually performed, the user cannot see the specific initial projection area and therefore cannot determine whether it is the same as the target projection area that was set. In this case, in order to ensure that the projection device can project the image completely into the target projection area, the user may control the projection device to perform projection area correction directly, or the projection device may perform it actively, for example when the projector is turned on.
In some embodiments, the controller may acquire the initial projection region according to the mapping relationship.
Specifically, the controller may determine the initial position information of the optical engine when projecting the graphic card onto the projection surface. In this embodiment, the initial position information is the position information of the initial projection area in the optical engine coordinate system, and may specifically be the vertex coordinates of the initial projection area.
The user may control the projection device to project a graphic card to form a projected image on the projection surface. The graphic card may be an image selected by the user, such as a picture or a video, and is generally rectangular in shape. Therefore, the image projected onto the projection surface from the graphic card also appears rectangular to the user. That is, the projection area corresponding to the projection image is a rectangular area, and once the coordinate information of the four vertices of the rectangular area is determined, the position of the rectangular area is determined.
It should be noted that, whether or not the projection device has performed the projection process, it can determine what it is about to project or has projected. The projection device can determine the state of the graphic card to be projected, the projection angle, and so on, and thereby determine the position to which the optical engine projects, that is, the initial position information at which the optical engine projects the graphic card. Thus, the controller may obtain the coordinate representation of the initial projection area in the optical engine coordinate system.
According to the mapping relationship, the controller may convert the initial position information, that is, the vertex coordinates of the initial projection area, into position information in the world coordinate system. Specifically, the vertex coordinates of the initial projection area may be converted into vertex coordinates in the world coordinate system. Once the position information of the initial projection area in the world coordinate system is determined, the specific initial projection area is determined.
In some embodiments, if the projection device is performing a projection process, the area of the image displayed in the projection surface at that time is the initial projection area. The initial projection region may also be obtained by:
the controller may control the camera to shoot the projection plane, and the obtained projection plane image may include a graphic card projected by the projection device, and an area where the graphic card is located is the initial projection area. The controller can identify the projection plane image, determine the coordinate information of the four vertexes of the graphic card, and accordingly determine the position of the initial projection area.
Consider that the graphic card projected by the projection device may not be completely recognizable. For example, if the edge area of the graphic card is white and the projection surface is also white, the edge of the card cannot be distinguished from the projection surface; in the projection plane image shot by the camera, the four vertices of the graphic card then cannot be identified, so the initial projection area cannot be determined. Therefore, the controller can control the optical machine to project a preset graphic card onto the projection surface. The preset graphic card is chosen to stand out from the projection surface and may, for example, be a green card. The controller then controls the camera to shoot the projection surface, so that the complete card contour can be identified and the position information of the initial projection area determined.
In some embodiments, after the initial projection area is obtained, the controller may perform contour correction on it to obtain the target projection area. The target projection area is the projection area set by the user. Performing contour correction on the initial projection area therefore determines the user-set projection area anew, realizing the correction of the projection area.
Specifically, the controller may control the camera to shoot the projection plane first, so as to obtain the projection plane image. The projection plane image may include various environmental objects, such as a curtain, a television cabinet, a wall, a ceiling, a tea table, and the like. The position information of the object can be determined by determining the outline of the object. For example, coordinates of four vertices of the curtain are determined, and position information of the curtain in the projection plane is also determined.
In the projection plane image, an initial projection area image corresponding to the initial projection area may be generated. Specifically, the vertex coordinates of the initial projection area have been acquired in the world coordinate system; in the projection plane image, the four vertices of the initial projection area are first marked and then connected, forming the initial projection area. The initial projection area is thus displayed within the projection plane image.
When performing contour correction on the initial projection region, four vertices of the initial projection region may be corrected, respectively, to obtain four corrected target vertices. The region formed by the four target vertices can be considered as a target projection region.
In some embodiments, the target vertices may be obtained from the vertices of the initial projection area image. For the four vertices of the initial projection area image, each vertex corresponds to a modified target vertex.
Specifically, the controller may perform corner detection processing on the initial projection area image. The image is processed with a corner detection algorithm, such as the Shi-Tomasi algorithm, to obtain all the corner points it contains. A corner point is an extreme point of change in gray value or brightness in the initial projection area image, such as a vertex of an object contour. The vertices of object contours in the image, for example the vertices of the curtain, can therefore be found among the corner points.
It should be noted that there is no geometric relationship among the corner points themselves, so the specific curtain contour cannot be determined by corner identification alone. Instead, a corrected corner point is determined for each vertex of the initial projection area image; these four corner points can be regarded as the four vertices of the curtain, which determines the specific curtain contour.
For each vertex of the initial projection area image, a first correction area of a predetermined first size may be determined, centered on that vertex. Initial projection area images of different sizes may correspond to different first sizes; for example, for a 1280 × 720 initial projection area image, the first size may be set to 20 pixels. For each vertex, the vertex is then taken as the circle center and 20 pixels as the radius, giving the first correction area corresponding to that vertex.
In the first correction area, the controller may obtain a corner closest to a center of the circle, that is, a vertex of the image of the initial projection area, which is referred to as a first corner in this embodiment of the present application. The first corner point can be regarded as a target vertex corresponding to the vertex of the initial projection area image.
In this way, the target vertex corresponding to each of the four vertices of the initial projection area image can be determined. The four target vertices form the target projection area.
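The corner search within the first correction area can be sketched as follows (Python). In practice the corner list would come from a Shi-Tomasi detector such as OpenCV's cv2.goodFeaturesToTrack; here the corner coordinates, the vertex, and the 20-pixel radius are hypothetical inputs.

```python
import math

def first_corner(vertex, corners, radius=20):
    """Return the corner point nearest to `vertex` within `radius`
    pixels (the first correction area), or None if no corner falls
    inside that area.
    """
    vx, vy = vertex
    best, best_d = None, radius
    for cx, cy in corners:
        d = math.hypot(cx - vx, cy - vy)  # Euclidean distance to vertex
        if d <= best_d:
            best, best_d = (cx, cy), d
    return best

# Hypothetical corners detected in a 1280x720 projection plane image
corners = [(105, 98), (400, 400), (112, 103)]
target = first_corner((100, 100), corners, radius=20)
```

The corner returned for each vertex serves as the candidate target vertex for that corner of the area.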
It should be noted that the error between the target projection area obtained by this method and the real curtain contour is very small, so the target projection area can be taken as the curtain area, achieving pixel-level curtain recognition. The projected image can thus be fitted precisely into the curtain (screen entry), greatly improving the user's viewing experience.
In some embodiments, consider a darker scene in which the curtain contour is relatively dark. If the projection device is in the middle of projecting, a projection image is displayed on the projection surface and therefore appears in the projection plane image. Because the projected image is bright while the curtain contour is relatively dark, the edges of the projected image may be identified as corner points, so the first corner point may actually belong to the projected image, and the projected image would mistakenly be identified as the curtain. Therefore, after the first corner point is obtained, it can be checked further.
Specifically, the controller may determine the position information of the first corner point, that is, its coordinates in the projection plane image. The controller may convert these image coordinate system coordinates into coordinates in the world coordinate system and then, according to the mapping relationship, into coordinates in the optical machine coordinate system, referred to in this embodiment as the corner coordinates. In the optical machine coordinate system the controller already knows the vertex coordinates of the initial projection area, so the vertex coordinates and the corner coordinates can be compared. The controller may determine whether the vertex coordinates and the corner coordinates satisfy a preset second size condition.
In the embodiment of the present application, the preset second size condition is that the distance between the vertex coordinates and the corner coordinates is larger than a preset second size. The second size may be preset by a technician and may be set the same as or different from the first size; the embodiment of the present application is not limited in this respect.
If the vertex coordinates and the corner coordinates satisfy the preset second size condition, the vertex of the initial projection area is far from the first corner point, so the first corner point does not belong to the edge of the initial projection area. In this case, the first corner point may be determined as a target vertex, yielding the target projection area, which is the curtain area.
If the vertex coordinates and the corner coordinates do not satisfy the preset second size condition, the vertex of the initial projection area is close to the first corner point, so the first corner point belongs to the edge of the initial projection area. The identified curtain area is then not the real one, so the first corner point cannot be used as a target vertex, and the screen-entry operation is not performed, that is, the projection device does not carry out the normal projection process.
In some embodiments, when the vertex coordinates and the corner coordinates do not satisfy the preset second size condition, that is, the first corner cannot be the target vertex, the first corner may be deleted. Meanwhile, new first corners can be continuously searched from the remaining corners in the initial projection area image, and the new first corners can be continuously detected until the first corners meeting the conditions are obtained, and the first corners can be used as target vertexes. If the first corner point which meets the condition does not exist, the screen entering operation cannot be executed, and the projection equipment cannot project.
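This validation loop, which discards a first corner point that fails the second size condition and falls back to the next candidate, can be sketched as follows (Python). The distance threshold and the candidate list are hypothetical, and candidates are assumed already sorted by distance from the image vertex.

```python
import math

def pick_target_vertex(vertex_om, candidates_om, second_size=20):
    """Choose a target vertex from candidate corners, skipping corners
    that lie on the edge of the initial projection area.

    `vertex_om` is the initial projection area vertex and
    `candidates_om` the candidate corner coordinates, both already
    converted into the optical machine coordinate system. A corner
    qualifies only when its distance from `vertex_om` exceeds
    `second_size` (the preset second size condition); otherwise it
    likely belongs to the projected image's own edge and is discarded.
    """
    for cx, cy in candidates_om:
        if math.hypot(cx - vertex_om[0], cy - vertex_om[1]) > second_size:
            return (cx, cy)
    return None  # no valid corner: screen entry fails, do not project

candidates = [(102, 101), (160, 150)]  # first candidate lies on the edge
chosen = pick_target_vertex((100, 100), candidates)
```

Returning None corresponds to the case where no qualifying first corner point exists and the screen-entry operation cannot be executed.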
In some embodiments, if no target vertex can be acquired, the projection device does not project. Meanwhile, the projection device can prompt the user, for example by projecting a screen-entry failure message indicating that screen entry is not possible at the current placement position. The user may then adjust the position of the projection device, and the projection device performs projection area correction again until the target projection area, that is, the curtain area, is identified.
In some embodiments, once the target projection area is determined, its position information in the projection plane is obtained. It should be noted that since the target projection area has four vertices and the graphic card projected by the projection device also has four vertices, a one-to-one correspondence between the two sets of vertices may be established so that projection is performed accurately. For example, the upper-left vertex of the graphic card is projected to the upper-left vertex of the target projection area, so that the graphic card is displayed normally within the target projection area.
For a normally placed curtain, the top-left, top-right, bottom-left and bottom-right vertices can be determined, giving the correspondence between the target projection area and the graphic card.
Specifically, the controller may first determine the state of the target projection area. In the embodiment of the present application, the target projection area has either a forward-projection state or a tilted state, where the forward-projection state means that, relative to the reference region, the area is level and has no tilt. The controller may acquire the coordinates of the four target vertices in the image coordinate system or the world coordinate system; either of the two may be used.
For the four target vertices, their X coordinates may be examined. The controller may further determine whether there exist two target vertices whose X-coordinate difference satisfies a preset coordinate condition. The preset coordinate condition may be that the difference between the X coordinates of the two target vertices is less than a predetermined size. When the condition is satisfied, the straight line through those two target vertices can be regarded as vertical, and therefore the whole target projection area can be regarded as being in the forward-projection state; otherwise, the target projection area is regarded as tilted.
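The forward-projection test above can be sketched as follows (Python; the vertex coordinates and the 5-pixel threshold are hypothetical):

```python
def projection_state(vertices, max_dx=5):
    """Classify the target projection area as 'forward' or 'tilted'.

    `vertices` are the four target vertices as (x, y) pairs. The area
    counts as forward-projected when some pair of vertices has an
    X-coordinate difference below `max_dx`, i.e. the edge through
    them is essentially vertical.
    """
    for i in range(4):
        for j in range(i + 1, 4):
            if abs(vertices[i][0] - vertices[j][0]) < max_dx:
                return "forward"
    return "tilted"

upright = [(0, 0), (0, 90), (160, 0), (160, 90)]    # vertical edges
leaning = [(0, 0), (12, 90), (160, 8), (172, 98)]   # everything skewed
```

A forward-projected area can then be handled by the simple Y-coordinate ordering described next, while a tilted area needs the slope test.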
And the corresponding relation of the four target vertexes can be further determined according to the state of the target projection area.
In some embodiments, if the target projection area is in the forward-projection state, the controller may first classify the four target vertices, determining the two with the smaller X coordinate and the two with the larger X coordinate. FIG. 13 shows a schematic diagram of a target projection area in some embodiments. The coordinates of the four target vertices of the target projection area are, in order: A (x1, y2), B (x1, y1), C (x2, y2) and D (x2, y1). Vertices A and B are the two target vertices with the smaller X coordinate, and C and D are the two with the larger X coordinate.
The controller may determine, of the two target vertices with the smaller X coordinate, the one with the larger Y coordinate as the first target vertex and the one with the smaller Y coordinate as the second target vertex; that is, vertex A is the first target vertex and vertex B is the second. Of the two target vertices with the larger X coordinate, the one with the larger Y coordinate is the third target vertex and the one with the smaller Y coordinate is the fourth; that is, vertex C is the third target vertex and vertex D is the fourth.
In the embodiment of the present application, the first target vertex may be an upper left vertex, the second target vertex may be a lower left vertex, the third target vertex may be an upper right vertex, and the fourth target vertex may be a lower right vertex. The four target vertexes and the four vertexes of the target graphic card are in one-to-one correspondence, so that the specific projection relation between the target projection area and the graphic card is determined.
In some embodiments, if the target projection area is in a tilted state, the controller may first determine two target vertices with smaller X coordinates and two target vertices with larger X coordinates.
It should be noted that the target projection area may be tilted to the left, and possibly to the right, and therefore, the controller may determine the tilt direction of the target projection area first.
The controller may obtain the slope of the straight line through the two target vertices with the smaller X coordinate. FIG. 14 shows a schematic diagram of a target projection area in some embodiments. The coordinates of the four target vertices of the target projection area are, in order: A (x1, y1), B (x2, y2), C (x3, y3) and D (x4, y4). Vertices A and B are the two target vertices with the smaller X coordinate, and the slope K of the straight line AB is determined.
K = (x2 - x1) / (y2 - y1)
It should be noted that K is the reciprocal of the usual geometric slope (change in X per unit change in Y). Because the edge AB is close to vertical, this form remains well defined, and its sign still indicates the direction of the lean.
The tilt direction of the target projection area can be determined from the slope K. Specifically, if K is less than 0, the target projection area is determined to be in the first tilt state, that is, tilted to the left; if K is greater than 0, it is in the second tilt state, that is, tilted to the right. In FIG. 15, the slope K of the line AB is clearly less than 0, so the area is tilted to the left.
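The direction test can be sketched as follows (Python; the Y axis is taken as pointing up, consistent with the vertex ordering in the text, and the sample coordinates are hypothetical):

```python
def tilt_direction(top, bottom):
    """Determine the lean of the near-vertical left edge of the target
    projection area from its two leftmost vertices.

    K = dx / dy is the inverse slope used in the text; it stays well
    defined for near-vertical edges. K < 0 means the area leans left
    (first tilt state); K > 0 means it leans right (second tilt state).
    """
    dx = bottom[0] - top[0]
    dy = bottom[1] - top[1]
    k = dx / dy
    return "left" if k < 0 else "right"

# A is the upper-left vertex, B the lower-left vertex; the top of the
# edge tips to the left of the bottom, so the area leans left.
A, B = (10, 90), (20, 0)
direction = tilt_direction(A, B)
```

With Y pointing up, dy is negative for the top-to-bottom edge, so a leftward lean (top X smaller than bottom X) indeed produces K < 0.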
After determining the tilt direction of the target projection area, the controller can further determine which position, such as upper-left or lower-right, each of the four target vertices occupies.
In the embodiment of the present application, a curtain with a 16:9 aspect ratio is taken as an example.
If the target projection area is in the first tilt state, the target projection area is tilted to the left. The controller may order the X coordinates of the four target vertices from small to large. And sequentially determining the four target vertexes as a first target vertex, a second target vertex, a third target vertex and a fourth target vertex according to the sequence from small to large. As shown in fig. 15, since x1< x2< x3< x4, vertex a is the first target vertex, i.e., the top left vertex. Vertex B is the second target vertex, the lower left vertex. Vertex C is the third target vertex, the top right vertex. Vertex D is the fourth target vertex, the lower right vertex.
If the target projection area is in a second inclined state, namely, inclined to the right, the controller may sequentially determine the four target vertices as a second target vertex, a first target vertex, a fourth target vertex, and a third target vertex in order from small to large X coordinates. As shown in fig. 15, since x2< x1< x4< x3, vertex a is the first target vertex, i.e., the top left vertex. Vertex B is the second target vertex, the lower left vertex. Vertex C is the third target vertex, the top right vertex. Vertex D is the fourth target vertex, the lower right vertex.
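Under the stated 16:9 assumption, the role assignment for both tilt states can be sketched as follows (Python; the role names and sample coordinates are illustrative, with Y pointing up):

```python
def order_tilted_vertices(vertices, lean):
    """Assign top-left, bottom-left, top-right and bottom-right roles
    to the four target vertices of a tilted area by sorting on X.

    For a left lean, the ascending-X order is (top-left, bottom-left,
    top-right, bottom-right); for a right lean it is (bottom-left,
    top-left, bottom-right, top-right). Assumes a roughly 16:9 area,
    as in the text, so X separates the vertices reliably.
    """
    s = sorted(vertices, key=lambda p: p[0])
    if lean == "left":
        tl, bl, tr, br = s
    else:
        bl, tl, br, tr = s
    return {"top_left": tl, "bottom_left": bl,
            "top_right": tr, "bottom_right": br}

# Area leaning left: each vertex shifted progressively rightward
roles = order_tilted_vertices([(0, 90), (10, 0), (160, 98), (170, 8)],
                              "left")
```

If the curtain were taller than wide, the same procedure would sort on the Y coordinate instead, as the text notes below.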
It should be noted that, in the embodiment of the present application, the curtain size 16:9 is used for description, and the length of the curtain is larger than the width, so the calculation is performed by using the X coordinate. If the length of the curtain is smaller than the width, the calculation can be performed by using the Y coordinate, and the specific method is similar to that described above and is not described herein again.
In some embodiments, a rotating assembly may also be provided behind the curtain. The controller can calculate a rotation angle capable of enabling the target projection area to recover the orthographic projection state according to the slope of the target projection area, sends the rotation angle to the rotating assembly, and controls the rotating assembly to rotate according to the rotation angle, so that the target projection area recovers the orthographic projection state, and the watching experience of a user is improved.
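The rotation angle sent to the rotating assembly can be sketched as follows (Python). This is a sketch only: it assumes the inverse slope K from the text, and the sign convention would depend on the actual geometry of the assembly.

```python
import math

def rotation_angle_deg(k):
    """Rotation, in degrees, that the rotating assembly behind the
    curtain would apply to restore the forward-projection state.

    `k` is the inverse slope dx/dy of the leftmost edge. For a
    near-vertical edge, atan(k) is its deviation from vertical, so
    rotating by -atan(k) levels the area.
    """
    return -math.degrees(math.atan(k))

angle = rotation_angle_deg(-0.1)  # edge leaning left
```

A K of -0.1 corresponds to a lean of roughly 5.7 degrees, so the assembly would rotate by about +5.7 degrees to compensate.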
In some embodiments, after the target projection area is determined, position information of the target projection area in the world coordinate system may be acquired. According to the projection relation, the controller can convert the position information under the world coordinate system into the position information under the optical-mechanical coordinate system to obtain the target position information.
According to the target position information, the controller can control the optical machine to project the target graphic card to the target projection area, thereby correcting the projection area. FIG. 16 is a schematic diagram illustrating the projection area before and after correction in some embodiments. After the projection device moves from position A to position B, the projected image shifts and no longer falls entirely within the projection area. The projection device may perform the projection area correction process so that the projected image falls completely within the projection area even at position B.
An embodiment of the present application further provides a projection region correction method, which is applied to a projection device, and as shown in fig. 17, the method includes:
Step 1701, in response to a projection area correction instruction, acquiring the mapping relationship between the graphic card projected by the optical machine and the projection surface.
Step 1702, acquiring, according to the mapping relationship, an initial projection area in the projection surface to which the optical machine projects the graphic card.
Step 1703, performing contour correction on the initial projection area to obtain a target projection area.
Step 1704, determining target position information of the target projection area according to the mapping relationship.
Step 1705, controlling the optical machine to project the target graphic card to the target projection area according to the target position information.
The same and similar parts in the embodiments in this specification may be referred to one another, and are not described herein again.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be substantially or partially embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, or the like, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the method of the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A projection device, comprising:
an optical machine configured to project a graphic card to a projection surface;
a controller configured to:
responding to a projection area correction instruction, and acquiring a mapping relation between a graphic card projected by the optical machine and the projection surface;
acquiring, according to the mapping relationship, an initial projection area in the projection surface to which the optical machine projects the graphic card;
carrying out contour correction on the initial projection area to obtain a target projection area;
determining target position information of the target projection area according to the mapping relation;
and controlling the optical machine to project the target graphic card to the target projection area according to the target position information.
2. The projection device of claim 1, further comprising:
a camera configured to capture an image displayed in the projection plane;
the controller is configured to:
in the step of acquiring the mapping relation between the graphic card projected by the optical machine and the projection surface,
controlling the optical machine to project a first graphic card to the projection surface, and controlling the camera to shoot the first graphic card to obtain a first image, wherein the first graphic card comprises feature points;
acquiring a first coordinate of the feature point in an image coordinate system corresponding to the first image;
acquiring a second coordinate of the feature point in a camera coordinate system according to the first coordinate;
converting the second coordinate into a third coordinate of the feature point under a light machine coordinate system;
acquiring a projection surface equation of the projection surface under the optical machine coordinate system according to the third coordinate;
and acquiring a conversion matrix of the optical machine coordinate system and the world coordinate system according to the projection surface equation, wherein the conversion matrix is used for representing the mapping relation.
3. The projection device of claim 2, wherein the controller is configured to:
in the step of acquiring the first coordinates of the feature points in the image coordinate system corresponding to the first image,
determining at least three characteristic points in the first image, and respectively acquiring first coordinates of the at least three characteristic points;
in the step of obtaining the projection surface equation of the projection surface in the optical machine coordinate system according to the third coordinate,
and fitting the third coordinates corresponding to the at least three characteristic points to obtain a projection surface equation of the projection surface under the optical machine coordinate system.
4. The projection device of claim 2, wherein the controller is configured to:
in the step of performing contour correction on the initial projection area to obtain a target projection area,
controlling the camera to shoot a projection surface to obtain a projection surface image;
generating an initial projection area image corresponding to the initial projection area in the projection plane image;
acquiring a target vertex according to the vertex of the initial projection area image;
and determining a region formed by the target vertex as a target projection region.
5. The projection device of claim 4, wherein the controller is configured to:
in performing the step of acquiring the target vertex from the vertex of the initial projection area image,
carrying out corner detection processing on the initial projection area image to obtain all corners contained in the initial projection area image;
taking the vertex of the initial projection area image as a center, and acquiring a first corner point which is closest to the vertex of the initial projection area image in a preset first size;
determining the nearest first corner point as a target vertex.
6. The projection device of claim 5, wherein the controller is configured to:
in performing the step of acquiring, according to the mapping relationship, the initial projection area in the projection surface to which the optical machine projects the graphic card,
determining initial position information of the optical machine for projecting the graphic card into the projection surface, wherein the initial position information is a vertex coordinate of the initial projection area in the optical machine coordinate system;
converting the vertex coordinates into position information under the world coordinate system according to the mapping relation;
and determining an initial projection area according to the position information under the world coordinate system.
7. The projection device of claim 6, wherein the controller is further configured to:
before performing the step of determining the nearest first corner point as the target vertex,
determining coordinates of the first corner point under the world coordinate system, and converting the coordinates into coordinates of the corner point under the optical-mechanical coordinate system according to the mapping relation;
judging whether the vertex coordinates and the corner coordinates meet a preset second size condition or not;
if yes, determining the nearest first corner point as a target vertex.
8. The projection device of claim 5, wherein the controller is configured to:
after performing the step of determining the nearest first corner point as the target vertex,
acquiring coordinate information of the target vertex in the image coordinate system or the world coordinate system, wherein the number of the target vertex is four;
judging whether a difference value of X coordinates of two target vertexes meets a preset coordinate condition or not;
if so, determining that the target projection area is in a forward projection state, and if not, determining that the target projection area is in an inclined state;
if the target projection area is in a forward projection state, determining two target vertexes with smaller X coordinates and two target vertexes with larger X coordinates;
determining, of the two target vertices with the smaller X coordinate, the target vertex with the larger Y coordinate as a first target vertex and the target vertex with the smaller Y coordinate as a second target vertex; determining, of the two target vertices with the larger X coordinate, the target vertex with the larger Y coordinate as a third target vertex and the target vertex with the smaller Y coordinate as a fourth target vertex; and the four target vertices being in one-to-one correspondence with the four vertices of the target graphic card.
9. The projection device of claim 8, wherein the controller is configured to:
if the target projection area is in an inclined state, determining two target vertexes with smaller X coordinates and two target vertexes with larger X coordinates;
acquiring the slope of a straight line where the two target vertexes with the smaller X coordinate are located; if the slope is less than 0, determining that the target projection area is in a first inclined state; if the slope is greater than 0, determining that the target projection area is in a second inclined state;
if the target projection area is in a first inclined state, sequentially determining four target vertexes as a first target vertex, a second target vertex, a third target vertex and a fourth target vertex according to the sequence of the X coordinates from small to large; and if the target projection area is in a second inclined state, sequentially determining the four target vertexes as a second target vertex, a first target vertex, a fourth target vertex and a third target vertex according to the sequence of the X coordinates from small to large.
10. A projection region correction method applied to a projection device is characterized by comprising the following steps:
responding to a projection area correction instruction, and acquiring a mapping relation between a graphic card projected by the optical machine and the projection surface;
acquiring, according to the mapping relationship, an initial projection area in the projection surface to which the optical machine projects the graphic card;
carrying out contour correction on the initial projection area to obtain a target projection area;
determining target position information of the target projection area according to the mapping relation;
and controlling the optical machine to project the target graphic card to the target projection area according to the target position information.
CN202210389054.6A 2021-11-16 2022-04-13 Projection apparatus and projection region correction method Pending CN114827563A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/132250 WO2023088304A1 (en) 2021-11-16 2022-11-16 Projection device and projection area correction method
CN202280063350.5A CN118077192A (en) 2021-11-16 2022-11-16 Projection equipment and projection area correction method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111355866 2021-11-16
CN2021113558660 2021-11-16

Publications (1)

Publication Number Publication Date
CN114827563A true CN114827563A (en) 2022-07-29

CN202210325721.4A Active CN114885136B (en) 2021-11-16 2022-03-29 Projection apparatus and image correction method
CN202210343443.5A Active CN114885137B (en) 2021-11-16 2022-03-31 Projection equipment and automatic focusing method
CN202210343444.XA Pending CN114885138A (en) 2021-11-16 2022-03-31 Projection equipment and automatic focusing method
CN202210345204.3A Pending CN114727079A (en) 2021-11-16 2022-03-31 Projection equipment and focusing method based on position memory

Family Applications After (5)

Application Number Title Priority Date Filing Date
CN202210583357.1A Active CN115022606B (en) 2021-11-16 2022-05-25 Projection equipment and obstacle avoidance projection method
CN202210709400.4A Active CN115174877B (en) 2021-11-16 2022-06-21 Projection device and focusing method thereof
CN202280063192.3A Pending CN118104230A (en) 2021-11-16 2022-09-29 Projection equipment and display control method
CN202280063329.5A Pending CN118104231A (en) 2021-11-16 2022-11-16 Projection apparatus and projection image correction method
CN202280063350.5A Pending CN118077192A (en) 2021-11-16 2022-11-16 Projection equipment and projection area correction method

Country Status (2)

Country Link
CN (13) CN114466173A (en)
WO (3) WO2023087950A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023088304A1 (en) * 2021-11-16 2023-05-25 Hisense Visual Technology Co., Ltd. Projection device and projection area correction method

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114760454A (en) * 2022-05-24 2022-07-15 Hisense Visual Technology Co., Ltd. Projection equipment and trigger correction method
WO2023087951A1 (en) * 2021-11-16 2023-05-25 Hisense Visual Technology Co., Ltd. Projection device, and display control method for projected image
WO2023087947A1 (en) * 2021-11-16 2023-05-25 Hisense Visual Technology Co., Ltd. Projection device and correction method
WO2023087948A1 (en) * 2021-11-16 2023-05-25 Hisense Visual Technology Co., Ltd. Projection device and display control method
CN115002432A (en) * 2022-05-30 2022-09-02 Hisense Visual Technology Co., Ltd. Projection equipment and obstacle avoidance projection method
WO2023087960A1 (en) * 2021-11-16 2023-05-25 Hisense Visual Technology Co., Ltd. Projection device and focusing method
CN114640832A (en) * 2022-02-11 2022-06-17 厦门聚视智创科技有限公司 Automatic correction method for projected image
CN115002429B (en) * 2022-05-07 2023-03-24 深圳市和天创科技有限公司 Projector capable of automatically calibrating projection position based on camera calculation
CN114885142B (en) * 2022-05-27 2024-05-17 Hisense Visual Technology Co., Ltd. Projection equipment and method for adjusting projection brightness
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN115314691B (en) * 2022-08-09 2023-05-09 北京淳中科技股份有限公司 Image geometric correction method and device, electronic equipment and storage medium
CN115061415B (en) * 2022-08-18 2023-01-24 赫比(成都)精密塑胶制品有限公司 Automatic process monitoring method and device and computer readable storage medium
CN115474032B (en) * 2022-09-14 2023-10-03 深圳市火乐科技发展有限公司 Projection interaction method, projection device and storage medium
CN115529445A (en) * 2022-09-15 2022-12-27 Hisense Visual Technology Co., Ltd. Projection equipment and projection image quality adjusting method
WO2024066776A1 (en) * 2022-09-29 2024-04-04 Hisense Visual Technology Co., Ltd. Projection device and projection-picture processing method
CN115361540B (en) * 2022-10-20 2023-01-24 潍坊歌尔电子有限公司 Method and device for self-checking abnormal cause of projected image, projector and storage medium
CN115760620B (en) * 2022-11-18 2023-10-20 荣耀终端有限公司 Document correction method and device and electronic equipment
CN116723395A (en) * 2023-04-21 2023-09-08 深圳市橙子数字科技有限公司 Non-inductive focusing method and device based on camera
CN116993879B (en) * 2023-07-03 2024-03-12 广州极点三维信息科技有限公司 Method for automatically avoiding obstacle and distributing light, electronic equipment and storage medium
CN117278735B (en) * 2023-09-15 2024-05-17 山东锦霖智能科技集团有限公司 Immersive image projection equipment
CN117830437B (en) * 2024-03-01 2024-05-14 中国科学院长春光学精密机械与物理研究所 Device and method for calibrating internal and external parameters of large-view-field long-distance multi-view camera

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101827221B1 (en) * 2017-09-07 2018-02-07 Joyfun Co., Ltd. Mixed reality content providing device with coordinate system calibration and method of coordinate system calibration using it
CN108206946A (en) * 2016-12-16 2018-06-26 CJ CGV Co., Ltd. Method for automatically correcting a projection area based on an image captured by a photographing apparatus, and system therefor
CN109495729A (en) * 2018-11-26 2019-03-19 青岛海信激光显示股份有限公司 Projected picture correcting method and system
CN110336987A (en) * 2019-04-03 2019-10-15 北京小鸟听听科技有限公司 A kind of projector distortion correction method, device and projector
CN110636273A (en) * 2019-10-15 2019-12-31 歌尔股份有限公司 Method and device for adjusting projection picture, readable storage medium and projector
CN113489961A (en) * 2021-09-08 2021-10-08 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment

Family Cites Families (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005031267A (en) * 2003-07-09 2005-02-03 Sony Corp Picture projection device and picture projection method
JP3951984B2 (en) * 2003-08-22 2007-08-01 日本電気株式会社 Image projection method and image projection apparatus
JP2006109088A (en) * 2004-10-05 2006-04-20 Olympus Corp Geometric correction method in multi-projection system
JP4984968B2 (en) * 2007-02-28 2012-07-25 カシオ計算機株式会社 Projection apparatus, abnormality control method and program
US7872637B2 (en) * 2007-04-25 2011-01-18 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. System and method for tracking a laser spot on a projected computer screen image
JP4831219B2 (en) * 2008-10-29 2011-12-07 セイコーエプソン株式会社 Projector and projector control method
CN102236784A (en) * 2010-05-07 2011-11-09 株式会社理光 Screen area detection method and system
CN102681312B (en) * 2011-03-16 2015-06-24 宏瞻科技股份有限公司 Human eye safety protection system of laser projection system
JP2013033206A (en) * 2011-07-06 2013-02-14 Ricoh Co Ltd Projection display device, information processing device, projection display system, and program
CN103293836A (en) * 2012-02-27 2013-09-11 联想(北京)有限公司 Projection method and electronic device
CN103002240B (en) * 2012-12-03 2016-11-23 深圳创维数字技术有限公司 A kind of method and apparatus setting avoiding obstacles projection
JP6201359B2 (en) * 2013-03-22 2017-09-27 カシオ計算機株式会社 Projection system, projection method, and projection program
JP2015128242A (en) * 2013-12-27 2015-07-09 ソニー株式会社 Image projection device and calibration method of the same
CN103905762B (en) * 2014-04-14 2017-04-19 上海索广电子有限公司 Method for automatically detecting projection picture for projection module
CN103942796B (en) * 2014-04-23 2017-04-12 清华大学 High-precision projector and camera calibration system and method
JP2016014712A (en) * 2014-07-01 2016-01-28 キヤノン株式会社 Shading correction value calculation device and shading correction value calculation method
WO2016103543A1 (en) * 2014-12-25 2016-06-30 パナソニックIpマネジメント株式会社 Projection apparatus
CN104536249B (en) * 2015-01-16 2016-08-24 努比亚技术有限公司 The method and apparatus of regulation projector focal length
CN104835143A (en) * 2015-03-31 2015-08-12 中国航空无线电电子研究所 Rapid projector system parameter calibration method
JP2016197768A (en) * 2015-04-02 2016-11-24 キヤノン株式会社 Image projection system and control method of projection image
WO2016194191A1 (en) * 2015-06-04 2016-12-08 日立マクセル株式会社 Projection-type picture display apparatus and picture display method
CN105208308B (en) * 2015-09-25 2018-09-04 广景视睿科技(深圳)有限公司 A kind of method and system for the best projection focus obtaining projecting apparatus
US10630884B2 (en) * 2016-03-23 2020-04-21 Huawei Technologies Co., Ltd. Camera focusing method, apparatus, and device for terminal
CN107318007A (en) * 2016-04-27 2017-11-03 中兴通讯股份有限公司 The method and device of projected focus
CN107547881B (en) * 2016-06-24 2019-10-11 上海顺久电子科技有限公司 A kind of auto-correction method of projection imaging, device and laser television
CN106713879A (en) * 2016-11-25 2017-05-24 重庆杰夫与友文化创意有限公司 Obstacle avoidance projection method and apparatus
CN109215082B (en) * 2017-06-30 2021-06-22 杭州海康威视数字技术股份有限公司 Camera parameter calibration method, device, equipment and system
CN109426060A (en) * 2017-08-21 2019-03-05 深圳光峰科技股份有限公司 Projector automatic focusing method and projector
CN107479168A (en) * 2017-08-22 2017-12-15 深圳市芯智科技有限公司 A kind of projector that can realize rapid focus function and focusing method
CN109856902A (en) * 2017-11-30 2019-06-07 中强光电股份有限公司 Projection arrangement and Atomatic focusing method
CN110058483B (en) * 2018-01-18 2022-06-10 深圳光峰科技股份有限公司 Automatic focusing system, projection equipment, automatic focusing method and storage medium
US11240475B2 (en) * 2018-04-17 2022-02-01 Sony Corporation Information processing apparatus and method
CN110769214A (en) * 2018-08-20 2020-02-07 成都极米科技股份有限公司 Automatic tracking projection method and device based on frame difference
CN109544643B (en) * 2018-11-21 2023-08-11 北京佳讯飞鸿电气股份有限公司 Video camera image correction method and device
CN110769225B (en) * 2018-12-29 2021-11-09 成都极米科技股份有限公司 Projection area obtaining method based on curtain and projection device
CN110769227A (en) * 2019-02-27 2020-02-07 成都极米科技股份有限公司 Focusing method and focusing device of ultra-short-focus projector and readable storage medium
CN110769226B (en) * 2019-02-27 2021-11-09 成都极米科技股份有限公司 Focusing method and focusing device of ultra-short-focus projector and readable storage medium
CN110336951A (en) * 2019-08-26 2019-10-15 厦门美图之家科技有限公司 Contrast formula focusing method, device and electronic equipment
CN112799275B (en) * 2019-11-13 2023-01-06 青岛海信激光显示股份有限公司 Focusing method and focusing system of ultra-short-focus projection lens and projector
CN111028297B (en) * 2019-12-11 2023-04-28 凌云光技术股份有限公司 Calibration method of surface structured light three-dimensional measurement system
CN111050150B (en) * 2019-12-24 2021-12-31 成都极米科技股份有限公司 Focal length adjusting method and device, projection equipment and storage medium
CN111050151B (en) * 2019-12-26 2021-08-17 成都极米科技股份有限公司 Projection focusing method and device, projector and readable storage medium
CN110996085A (en) * 2019-12-26 2020-04-10 成都极米科技股份有限公司 Projector focusing method, projector focusing device and projector
CN111311686B (en) * 2020-01-15 2023-05-02 浙江大学 Projector defocus correction method based on edge perception
CN113554709A (en) * 2020-04-23 2021-10-26 华东交通大学 Camera-projector system calibration method based on polarization information
CN111429532B (en) * 2020-04-30 2023-03-31 南京大学 Method for improving camera calibration accuracy by utilizing multi-plane calibration plate
CN113301314B (en) * 2020-06-12 2023-10-24 阿里巴巴集团控股有限公司 Focusing method, projector, imaging apparatus, and storage medium
CN112050751B (en) * 2020-07-17 2022-07-22 深圳大学 Projector calibration method, intelligent terminal and storage medium
CN111932571B (en) * 2020-09-25 2021-01-22 歌尔股份有限公司 Image boundary identification method and device and computer readable storage medium
CN112584113B (en) * 2020-12-02 2022-08-30 深圳市当智科技有限公司 Wide-screen projection method and system based on mapping correction and readable storage medium
CN112598589A (en) * 2020-12-17 2021-04-02 青岛海信激光显示股份有限公司 Laser projection system and image correction method
CN112904653A (en) * 2021-01-26 2021-06-04 四川长虹电器股份有限公司 Focusing method and focusing device for projection equipment
CN112995625B (en) * 2021-02-23 2022-10-11 峰米(北京)科技有限公司 Trapezoidal correction method and device for projector
CN112995624B (en) * 2021-02-23 2022-11-08 峰米(北京)科技有限公司 Trapezoidal error correction method and device for projector
CN112689136B (en) * 2021-03-19 2021-07-02 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN113099198B (en) * 2021-03-19 2023-01-10 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN112804507B (en) * 2021-03-19 2021-08-31 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device
CN113038105B (en) * 2021-03-26 2022-10-18 歌尔股份有限公司 Projector adjusting method and adjusting apparatus
CN113160339B (en) * 2021-05-19 2024-04-16 中国科学院自动化研究所苏州研究院 Projector calibration method based on Molaque law
CN113286134A (en) * 2021-05-25 2021-08-20 青岛海信激光显示股份有限公司 Image correction method and shooting equipment
CN113473095B (en) * 2021-05-27 2022-10-21 广景视睿科技(深圳)有限公司 Method and device for obstacle avoidance dynamic projection
CN114466173A (en) * 2021-11-16 2022-05-10 海信视像科技股份有限公司 Projection equipment and projection display control method for automatically throwing screen area



Also Published As

Publication number Publication date
CN118104231A (en) 2024-05-28
CN118077192A (en) 2024-05-24
CN114885136B (en) 2024-05-28
CN114885136A (en) 2022-08-09
WO2023088304A1 (en) 2023-05-25
CN118104230A (en) 2024-05-28
CN114466173A (en) 2022-05-10
CN114885137A (en) 2022-08-09
WO2023088329A1 (en) 2023-05-25
CN114885137B (en) 2024-05-31
WO2023087950A1 (en) 2023-05-25
CN115022606A (en) 2022-09-06
CN115174877B (en) 2024-05-28
CN115022606B (en) 2024-05-17
CN114885138A (en) 2022-08-09
CN114727079A (en) 2022-07-08
CN114205570A (en) 2022-03-18
CN114401390A (en) 2022-04-26
CN115174877A (en) 2022-10-11

Similar Documents

Publication Publication Date Title
CN114827563A (en) Projection apparatus and projection region correction method
CN111435985B (en) Projector control method, projector and projection system
US9554105B2 (en) Projection type image display apparatus and control method therefor
JP2007078821A (en) Projector, projecting method and program
WO2023087947A1 (en) Projection device and correction method
CN114866751A (en) Projection equipment and trigger correction method
CN115002432A (en) Projection equipment and obstacle avoidance projection method
US8031271B2 (en) Calibrating a projection system
CN115883803A (en) Projection equipment and projection picture correction method
JP2012181264A (en) Projection device, projection method, and program
CN116055696A (en) Projection equipment and projection method
CN116320335A (en) Projection equipment and method for adjusting projection picture size
CN115529445A (en) Projection equipment and projection image quality adjusting method
CN115623181A (en) Projection equipment and projection picture moving method
CN114760454A (en) Projection equipment and trigger correction method
CN114928728A (en) Projection apparatus and foreign matter detection method
CN115604445A (en) Projection equipment and projection obstacle avoidance method
JP2017173488A (en) Projector and control method for the same
CN114885141A (en) Projection detection method and projection equipment
CN114885142B (en) Projection equipment and method for adjusting projection brightness
JP2006235073A (en) Projector, projection method and program
WO2023087951A1 (en) Projection device, and display control method for projected image
WO2024124978A1 (en) Projection device and projection method
US20170214894A1 (en) Projector and method of controlling projector
CN118233608A (en) Laser projection correction method and projection display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination