CN105677030A - Control method and electronic device - Google Patents

Control method and electronic device

Info

Publication number
CN105677030A
CN105677030A CN201610004521.3A
Authority
CN
China
Prior art keywords
image data
preset
light source
preset image
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610004521.3A
Other languages
Chinese (zh)
Other versions
CN105677030B (en)
Inventor
智建军
张强
钟将为
李建国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201610004521.3A priority Critical patent/CN105677030B/en
Publication of CN105677030A publication Critical patent/CN105677030A/en
Priority to US15/397,971 priority patent/US20170193869A1/en
Application granted granted Critical
Publication of CN105677030B publication Critical patent/CN105677030B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

An embodiment of the invention discloses a control method. In the method, an electronic device obtains first image data and projects the first image data through a first light source; obtains preset image data and projects the obtained preset image data through a second light source whose wavelength parameter meets a preset condition; and obtains a target image corresponding to the first image data and a detection image corresponding to the preset image data, where the projection power of the second light source corresponds to the preset image data. The target image and the detection image are presented in a first area outside the electronic device. An embodiment of the invention further discloses the electronic device.

Description

Control method and electronic device
Technical Field
The present invention relates to control technologies, and in particular, to a control method and an electronic device.
Background
Among the hardware innovations of the computerVisionPhone project, the defining feature of the micro projector is that it can project an image and recognize user gestures through the projected image; an image through which user gestures can be recognized is called an interactive projection interface. In practical applications, the distance between the interactive projection interface and the micro projector lies in the range of 0.2 m to 2.5 m, enabling both long-range and close-range interaction. In the existing method, a micro-electro-mechanical system (MEMS) is typically used to control the output of the three primary colors red (R), green (G), and blue (B), which are combined into a full-color pattern; in addition to the full-color pattern, a fixed Pattern is presented, an acquisition device captures the fixed Pattern, and gesture recognition is achieved through it. The fixed Pattern is the interactive projection interface.
In the prior art, because the Pattern is usually fixed, the gesture recognition algorithm is also fixed: no matter how far the interactive projection interface is from the micro projector, the complexity of the algorithm stays the same, and this uniform algorithm consumes considerable power. A method is therefore urgently needed that lets the gesture recognition algorithm adapt automatically to the distance between the interactive projection interface and the micro projector.
Disclosure of Invention
In order to solve the existing technical problem, embodiments of the present invention provide a control method and an electronic device.
The technical scheme of the embodiment of the invention is realized as follows:
the embodiment of the invention provides a control method, which comprises the following steps:
an electronic device obtains first image data, and projects the first image data using a first light source;
acquiring preset image data, and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions;
obtaining a target image corresponding to the first image data and a detection image corresponding to the preset image data; the projection power of the second light source corresponds to the preset image data;
presenting the target image and the detection image in a first area outside the electronic device.
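The claimed steps are not tied to any concrete API. Purely as an illustration, the flow can be sketched in Python, where every name (`Frame`, `project`, the 780 nm visibility cutoff) is an assumption of this sketch rather than anything specified by the patent:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One projected frame: visible content plus an invisible detection pattern."""
    target_image: list      # driven by the first (RGB) light source
    detection_image: list   # driven by the second (e.g. infrared) light source

def wavelength_meets_condition(wavelength_nm: float) -> bool:
    # Assumed preset condition: the second light source is invisible light,
    # modelled here as near-infrared, i.e. beyond ~780 nm.
    return wavelength_nm > 780.0

def project(first_image_data, preset_image_data, second_wavelength_nm):
    """Project the first image data and the preset image data together."""
    if not wavelength_meets_condition(second_wavelength_nm):
        raise ValueError("second light source must be invisible light")
    # Both images are presented in the same first area outside the device.
    return Frame(target_image=first_image_data,
                 detection_image=preset_image_data)

frame = project([1, 2, 3], [0, 1, 0], second_wavelength_nm=850.0)
```

The sketch only models the pairing of the two projections; the actual optical path (MEMS scanning, DOE, and so on) is described in the embodiments below.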
An embodiment of the present invention further provides an electronic device, including:
the electronic device comprises a first transmitting unit, a second transmitting unit, and a processing unit, wherein the first transmitting unit is configured to acquire first image data and project the first image data using a first light source;
the second transmitting unit is used for acquiring preset image data and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions;
the processing unit is used for obtaining a target image corresponding to the first image data and a detection image corresponding to the preset image data, and presenting the target image and the detection image in a first area outside the electronic equipment; the projection power of the second light source corresponds to the preset image data.
According to the control method and the electronic device provided by the embodiments of the present invention, while the original first light source projects the first image data, the second light source projects the preset image data; the target image corresponding to the first image data and the detection image corresponding to the preset image data are then presented in a first area outside the electronic device. Both the target image and the detection image are thus presented, laying a foundation for recognizing user operations through the detection image.
Furthermore, since the detection image is presented by projecting the preset image data with the second light source, different preset image data can be selected according to the user-operation recognition requirement actually needed (for example, a distance recognition requirement), and the preset image data matching that requirement is then projected by the second light source to obtain the detection image. The control method provided by the embodiments of the present invention can therefore adapt automatically to distance, optimize the recognition algorithm, and reduce power consumption.
Drawings
FIG. 1 is a first schematic flow chart illustrating an implementation of a control method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a projection of first image data according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a conventional laser micro-projection;
FIG. 4 is a schematic diagram of a control method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a second implementation flow of the control method according to the embodiment of the present invention;
FIG. 6 is a third schematic flow chart illustrating an implementation of the control method according to the embodiment of the present invention;
FIG. 7 is a first schematic structural diagram of an electronic device according to an embodiment of the invention;
FIG. 8 is a second schematic structural diagram of an electronic device according to an embodiment of the invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
So that the manner in which the features and aspects of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings.
Example one
FIG. 1 is a first schematic flow chart illustrating an implementation of a control method according to an embodiment of the present invention; the method is applied to the electronic equipment; as shown in fig. 1, the method includes:
step 101: the electronic device acquires first image data, and projects the first image data using a first light source;
in this embodiment, the electronic device may be a notebook computer or a smartphone; of course, it may also be a wearable device, such as a smart watch, a smart bracelet, smart glasses, a smart headset, or another device with data-processing capability.
In practical applications, the first light source may specifically be a light source composed of the three RGB primary colors; that is, after the electronic device acquires the first image data, it projects the first image data using the three RGB primary colors as light sources. Referring to fig. 2, after acquiring the first image data, the electronic device projects it using a red-light digital light processing (DLP) device, a green-light DLP device, and a blue-light DLP device.
Step 102: acquiring preset image data, and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions;
in this embodiment, the preset condition specifies that the wavelength of the second light source lies within the wavelength range of invisible light; for example, the second light source may specifically be an infrared laser. That is to say, in this embodiment the second light source, which emits invisible light, projects the preset image data, producing a detection image corresponding to the preset image data; to the user's eye, the detection image is an image presented in invisible light.
In practical applications, the preset image data is image data preset according to actual requirements; for example, it may specifically be a dynamic Pattern, and the dynamic Pattern may specifically be a bitmap. In a specific embodiment, after obtaining the preset image data, the electronic device further detects it to obtain its pixel features, determines a first projection power matched to the preset image data according to those pixel features, and projects the preset image data at the first projection power using the second light source whose wavelength parameter meets the preset condition. That is to say, the electronic device can determine the projection power of the second light source from the pixel features of the preset image data, laying a foundation for recognizing user operations in different environments and under different conditions.
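The patent does not disclose a concrete mapping from pixel features to the first projection power. A minimal sketch, assuming the pixel feature is the fraction of lit pixels in a binary bitmap and that power scales linearly with that density (both assumptions of this sketch, including the milliwatt figures):

```python
def pixel_density(bitmap):
    """Fraction of lit pixels in a binary bitmap (the assumed 'pixel feature')."""
    cells = [px for row in bitmap for px in row]
    return sum(cells) / len(cells)

def first_projection_power(bitmap, base_mw=50.0, max_mw=200.0):
    """Assumed rule: a denser pattern needs more power to stay detectable."""
    return base_mw + (max_mw - base_mw) * pixel_density(bitmap)

sparse = [[1, 0, 0, 0], [0, 0, 0, 0]]   # 1/8 of pixels lit
dense  = [[1, 1, 1, 1], [1, 1, 1, 1]]   # all pixels lit
```

Any monotone mapping would serve the same purpose; the point is only that the power is derived from the pixel features before projection.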
Step 103: obtaining a target image corresponding to the first image data and a detection image corresponding to the preset image data; the projection power of the second light source corresponds to the preset image data;
in this embodiment, the correspondence between the projection power of the second light source and the preset image data may specifically be that the projection power of the second light source matches with a pixel feature of the preset image data.
Step 104: presenting the target image and the detection image in a first area outside the electronic device.
In this embodiment, the detection image represents an interactive projection interface capable of identifying user operations, and further, the preset image data is a dynamic Pattern, which may be specifically a bitmap; therefore, the detection image may also be a dynamic Pattern corresponding to the preset image data, such as a bitmap.
The embodiments of the present invention are explained in further detail below with reference to specific application scenarios:
FIG. 3 is a schematic diagram of conventional laser micro-projection. As shown in fig. 3, the conventional method uses a MEMS to control the output of the three RGB primary colors, which are finally combined into a full-color pattern, while the fixed Pattern is generated by a diffractive optical element (DOE). Because the Pattern is fixed, a Pattern that supports close-range recognition of user operations fails once the distance between the interactive projection interface and the micro projector increases. Conversely, a fixed Pattern that supports long-range recognition can obviously also support close-range recognition, but the recognition algorithm is then complex and power-hungry, and such an algorithm is hard to integrate into a mobile device. Finally, if long-range and close-range recognition are achieved through different fixed Patterns, multiple DOEs are needed, which raises cost; and because DOE placement is closely tied to optical-path coverage, layout design also becomes more difficult. The existing approaches therefore limit the application of laser micro-projection to mobile devices.
FIG. 4 is a schematic diagram of a control method according to an embodiment of the present invention. As shown in fig. 4, the control method adds, on the basis of the original three RGB primary colors (i.e., the first light source), an additional light source, namely a second light source such as an infrared laser, and projects the preset image data through this second light source. Specifically, the MEMS performs time-sharing scanning over the first and second light sources (i.e., all four light sources) to obtain the target image and the detection image, laying a foundation for recognizing user operations through the detection image.
Furthermore, since the wavelength of the second light source lies within the invisible-light range, that is, the second light source is invisible light, projecting the preset image data with the newly added second light source obviously does not affect the display of the target image from the user's point of view. Moreover, because the control method of this embodiment projects the preset image data through the second light source, preset image data with different pixel features can be selected according to the user-operation recognition requirement actually needed (for example, a distance recognition requirement), and the preset image data matching that requirement is projected by the second light source to obtain the detection image. Since distance-dependent recognition is achieved by adjusting the pixel features of the preset image data, different optimized algorithms can be adopted for different presentation distances, saving power; and because the recognition algorithm can be optimized according to the presentation distance, the method can be applied to mobile devices.
Specifically, the embodiment of the present invention uses the MEMS to control the infrared laser to output different dynamic Patterns. When the dot lattice of the dynamic Pattern is arranged sparsely, the electronic device implementing the control method can recognize close-range user operations; when the lattice is arranged densely, it can recognize long-range user operations. In addition, the first light source and the second light source can be made to coincide well, that is, the presented target image and detection image can overlap well, which improves recognition accuracy.
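The sparse-versus-dense dot-lattice idea can be illustrated with a toy generator; the pitch values and the 1 m threshold below are invented for illustration and are not taken from the patent:

```python
def dynamic_pattern(width, height, pitch):
    """Binary bitmap with one dot every `pitch` pixels in each direction."""
    return [[1 if (x % pitch == 0 and y % pitch == 0) else 0
             for x in range(width)]
            for y in range(height)]

def pattern_for_distance(distance_m, width=16, height=16):
    # Assumed rule: sparse lattice for close range (simpler recognition),
    # dense lattice for long range (finer sampling of the gesture).
    pitch = 4 if distance_m <= 1.0 else 2
    return dynamic_pattern(width, height, pitch)
```

On a 16 by 16 bitmap, a pitch of 4 yields 16 dots while a pitch of 2 yields 64, so the long-range lattice is four times denser in this sketch.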
According to the control method provided by the embodiment of the present invention, while the original first light source projects the first image data, the second light source projects the preset image data; the target image corresponding to the first image data and the detection image corresponding to the preset image data are then presented in a first area outside the electronic device. Both the target image and the detection image are thus presented, laying a foundation for recognizing user operations through the detection image.
Furthermore, since the detection image is presented by projecting the preset image data with the second light source, different preset image data can be selected according to the user-operation recognition requirement actually needed (for example, a distance recognition requirement), and the preset image data matching that requirement is then projected by the second light source to obtain the detection image. The control method provided by the embodiments of the present invention can therefore adapt automatically to distance, optimize the recognition algorithm, and reduce power consumption.
Example two
Based on the control method according to the first embodiment, after step 104, as shown in fig. 5, the control method further includes:
step 105: acquiring the detection image and user operation in the first area;
step 106: analyzing the user operation according to the acquired detection image;
step 107: determining a control instruction according to the analysis result, so as to control the electronic device to respond to the user operation.
In practical applications, the electronic device may acquire the detection image through an acquisition unit while simultaneously acquiring the user operation in the first area, so as to determine an operation feature of the user operation relative to the detection image, for example the three-dimensional coordinate information of the user operation relative to the detection image, and generate a control instruction from that information to control the electronic device's response to the user operation. Here, the analysis result may be the three-dimensional coordinate information of the user operation relative to the detection image.
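As a hedged illustration of turning the analysis result into a control instruction, the sketch below assumes the three-dimensional coordinate is a fingertip position with z measured above the projection surface; the touch threshold and command names are invented, not taken from the patent:

```python
def analyze_operation(fingertip_xyz, touch_threshold_m=0.02):
    """Map a fingertip position (metres; z is the height above the
    projection surface) to a control instruction. The threshold and
    command names are illustrative assumptions."""
    x, y, z = fingertip_xyz
    if z <= touch_threshold_m:
        # Fingertip close enough to the surface: treat as a touch.
        return {"command": "tap", "position": (x, y)}
    return {"command": "hover", "position": (x, y)}
```

The electronic device would then dispatch the returned instruction so that it responds to the user operation.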
Example three
FIG. 6 is a third schematic flow chart of an implementation of the control method according to an embodiment of the present invention. In this embodiment, the electronic device can select, according to a preset presentation distance, the preset image data matched with that distance and then present the detection image; this allows the preset image data to be selected in a targeted way and lays a foundation for simplifying the recognition algorithm. Specifically, as shown in fig. 6, the method includes:
step 601: the electronic equipment acquires a preset presentation distance between a detection image expected to be presented and the electronic equipment;
in this embodiment, the electronic device may be a notebook computer or a smartphone; of course, it may also be a wearable device, such as a smart watch, a smart bracelet, smart glasses, a smart headset, or another device with data-processing capability.
In this embodiment, the preset presentation distance may be obtained through a user operation; for example, the user directly inputs a numerical value, and the electronic device sets that value as the preset presentation distance. Alternatively, the electronic device may determine the preset presentation distance directly by laser ranging; for example, when the user expects the detection image to be projected onto a first screen, the electronic device measures the distance between itself and the first screen by laser ranging and sets the preset presentation distance according to the result. This embodiment does not limit how the electronic device obtains the preset presentation distance; in practical applications, any existing detection method may be used.
Step 602: selecting preset image data from a preset image list according to a preset presentation distance; wherein the pixel characteristics of the preset image data are matched with the preset presentation distance;
in this embodiment, before step 601, the electronic device needs to set a preset image list, where preset image data with different pixel characteristics are stored in the preset image list, so that the electronic device can select the preset image data matched with the presentation distance according to the presentation distance.
In practical applications, steps 601 and 602 may be executed before or after step 603, as long as they are executed before step 604.
Step 603: acquiring first image data, and projecting the first image data by using a first light source;
in practical applications, the first light source may specifically be a light source composed of the three RGB primary colors; that is, after the electronic device acquires the first image data, it projects the first image data using the three RGB primary colors as light sources. Referring to fig. 2, after acquiring the first image data, the electronic device projects it using a red-light digital light processing (DLP) device, a green-light DLP device, and a blue-light DLP device.
Step 604: acquiring preset image data, and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions;
in this embodiment, the preset condition specifies that the wavelength of the second light source lies within the wavelength range of invisible light; for example, the second light source may specifically be an infrared laser. That is to say, in this embodiment the second light source, which emits invisible light, projects the preset image data, producing a detection image corresponding to the preset image data; to the user's eye, the detection image is an image presented in invisible light.
In practical applications, the preset image data is image data preset according to actual requirements; for example, it may be a dynamic Pattern, and the dynamic Pattern may be a bitmap.
Further, in practical applications, the electronic device also needs to determine a second projection power according to the determined preset image data and the preset presentation distance, and then project the acquired preset image data at the second projection power using the second light source whose wavelength parameter meets the preset condition. In this way, the pixel features of the detection image are adjusted by adjusting the projection power, laying a foundation for recognizing user operations in different environments and under different conditions.
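The patent gives no formula for the second projection power. One plausible model, stated purely as an assumption of this sketch, scales a base power by pattern density and by the square of the presentation distance (inverse-square falloff):

```python
def second_projection_power(dot_pitch, distance_m, base_mw=20.0):
    """Assumed model: required power grows with pattern density
    (dots per pixel area) and with the square of the distance."""
    density = 1.0 / (dot_pitch * dot_pitch)
    return base_mw * density * distance_m ** 2
```

Under this model a dense, far-away pattern draws the most power and a sparse, close-range one the least, consistent with selecting sparse patterns for close range.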
Step 605: obtaining a target image corresponding to the first image data and a detection image corresponding to the preset image data; the projection power of the second light source corresponds to the preset image data;
in this embodiment, the correspondence between the projection power of the second light source and the preset image data may specifically be that the projection power of the second light source matches with a pixel feature of the preset image data.
Step 606: presenting the target image and the detection image in a first area outside the electronic device.
In this embodiment, the detection image represents an interactive projection interface capable of identifying user operations, and further, the preset image data is a dynamic Pattern, which may be specifically a bitmap; therefore, the detection image may also be a dynamic Pattern corresponding to the preset image data, such as a bitmap.
Example four
FIG. 7 is a first schematic structural diagram of an electronic device according to an embodiment of the invention; as shown in FIG. 7, the electronic device includes:
a first transmitting unit 71, configured to acquire first image data and project the first image data using a first light source;
a second transmitting unit 72, configured to acquire preset image data, and project the acquired preset image data by using a second light source whose wavelength parameter meets a preset condition;
a processing unit 73, configured to obtain a target image corresponding to the first image data and a detection image corresponding to the preset image data, and present the target image and the detection image in a first area outside the electronic device; the projection power of the second light source corresponds to the preset image data.
In this embodiment, the second transmitting unit 72 is further configured to detect the preset image data to obtain its pixel characteristics, and to project the acquired preset image data at a first projection power using the second light source whose wavelength parameter meets the preset condition.
Those skilled in the art should understand that the functions of each processing unit in the electronic device according to the embodiment of the present invention can be understood by referring to the description of the foregoing control method, and are not described herein again. Each processing unit in the electronic device according to the embodiment of the present invention may be implemented by an analog circuit that implements the functions described in the embodiment of the present invention, or may be implemented by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
Example five
FIG. 8 is a second schematic structural diagram of an electronic device according to an embodiment of the invention; as shown in FIG. 8, the electronic device includes:
a first transmitting unit 71, configured to acquire first image data and project the first image data using a first light source;
a second transmitting unit 72, configured to acquire preset image data, and project the acquired preset image data by using a second light source whose wavelength parameter meets a preset condition;
a processing unit 73, configured to obtain a target image corresponding to the first image data and a detection image corresponding to the preset image data, and present the target image and the detection image in a first area outside the electronic device; the projection power of the second light source corresponds to the preset image data;
an acquisition unit 74, configured to acquire the detection image and a user operation in the first region;
and a control unit 75, configured to analyze the user operation according to the acquired detection image, and determine a control instruction according to an analysis result, so as to control the electronic device to respond to the user operation.
In this embodiment, the second transmitting unit 72 is further configured to detect the preset image data to obtain its pixel characteristics, and to project the acquired preset image data at a first projection power using the second light source whose wavelength parameter meets the preset condition.
In this embodiment, the second transmitting unit 72 is further configured to obtain a preset presentation distance between the detection image expected to be presented and the electronic device, and to select preset image data from a preset image list according to the preset presentation distance, where the pixel characteristics of the selected preset image data match the preset presentation distance.
In this embodiment, the second transmitting unit 72 is further configured to determine a second projection power according to the determined preset image data and the preset presentation distance, and to project the acquired preset image data at the second projection power using the second light source whose wavelength parameter meets the preset condition.
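The selection logic described for the second transmitting unit 72 (choose preset image data whose pixel characteristics match a preset presentation distance, then derive a projection power from the chosen data and the distance) can be sketched as follows. The list entries, the density values, and the power formula are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the preset-image-list lookup and the derivation
# of the second projection power. Entries pair a maximum presentation
# distance (metres) with a dot density (dots/cm^2); both are invented.

PRESET_IMAGE_LIST = [
    (0.5, 1.0),    # sparse pattern for close range
    (1.5, 4.0),
    (3.0, 16.0),   # dense pattern for long range
]

def select_preset_image(preset_distance: float):
    """Return the first list entry whose distance range covers the
    preset presentation distance."""
    for max_dist, density in PRESET_IMAGE_LIST:
        if preset_distance <= max_dist:
            return (max_dist, density)
    return PRESET_IMAGE_LIST[-1]

def second_projection_power(density: float, preset_distance: float,
                            base_mw: float = 10.0) -> float:
    """Scale power with dot density (more dots need more total output)
    and with the square of distance (inverse-square falloff)."""
    return base_mw * density * preset_distance ** 2

entry = select_preset_image(2.0)                 # dense, 3.0 m pattern
power = second_projection_power(entry[1], 2.0)   # power in mW (assumed)
```

Matching the power to the selected pattern in this way is one plausible reading of "the projection power of the second light source corresponds to the preset image data".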
Those skilled in the art should understand that the functions of each processing unit in the electronic device according to the embodiment of the present invention can be understood by referring to the description of the foregoing control method, and are not described herein again. Each processing unit in the electronic device according to the embodiment of the present invention may be implemented by an analog circuit that implements the functions described in the embodiment of the present invention, or may be implemented by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
The division of the units in the electronic device above is only schematic and is a division by logical function; other divisions are possible in actual implementation. Another division of the electronic device, and the specific process by which the electronic device so divided executes the control method of the embodiment of the present invention, are given below.
Example six
FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the invention; as shown in FIG. 9, the electronic device includes: a dynamic Pattern memory, an infrared laser (IR Laser) generator, an infrared laser control circuit, a digital video signal memory, a Central Processing Unit (CPU) or frame-buffer video processing integrated circuit, an RGB laser control circuit, a signal feedback controller, and a MEMS; wherein,
in the electronic device shown in FIG. 9, conventional RGB image signals still follow the normal processing flow: the first image data is stored in the digital video signal memory and, after processing by the CPU or frame-buffer video processing integrated circuit and the RGB laser control circuit, is output as a light source composed of the three RGB primary colors. In this embodiment, an infrared laser generator is additionally provided. The infrared laser generator acquires preset image data from the dynamic Pattern memory, and the preset image data serves as the drive-image input source of the IR laser; the four light sources R, G, B, and IR are then scanned in a time-shared manner under MEMS control, so as to obtain the target image corresponding to the first image data and the detection image corresponding to the preset image data. The detection image is used for recognizing user operations. Although the two images exist at the same time, from the user's perspective the target image is visible and the detection image is invisible, so the user is not affected. Furthermore, the amount of preset image data that can be stored is determined by the capacity of the dynamic Pattern memory, which lays a foundation both for recognizing user operations in environments at different distances and for effectively saving power consumption.
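The time-shared scanning of the four light sources can be sketched as a simple slot scheduler. The slot order (R, G, B, IR in rotation) and the per-slot data sources are assumptions for illustration; the patent only states that the MEMS scans the four sources in a time-shared manner, with RGB data coming from the digital video signal memory and IR data from the dynamic Pattern memory:

```python
# Hypothetical sketch of the MEMS time-sharing schedule over the four
# light sources R, G, B, and IR described for FIG. 9.

from itertools import cycle

def scan_frames(n_slots: int):
    """Yield (slot, channel, source) triples; RGB slots draw the visible
    target image, the IR slot draws the invisible detection image."""
    channels = cycle(["R", "G", "B", "IR"])
    for slot, ch in zip(range(n_slots), channels):
        source = ("dynamic Pattern memory" if ch == "IR"
                  else "digital video signal memory")
        yield slot, ch, source

schedule = list(scan_frames(8))  # two full R/G/B/IR rotations
```

Each rotation thus interleaves one invisible IR slot among the three visible primaries, which is why the target image and the detection image coexist without the user noticing the latter.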
The signal feedback controller is configured to receive signal feedback information sent by the MEMS and to send it to the CPU or frame-buffer video processing integrated circuit, so that the time-sharing scanning strategy of the MEMS can be adjusted according to the feedback information.
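One way such a feedback path could adjust the time-sharing strategy is sketched below. The patent does not specify what the feedback information contains or how the strategy is adjusted; the choice of scan-timing jitter as the feedback quantity, the IR slot ratio as the adjusted parameter, and all numeric limits are invented for demonstration:

```python
# Hypothetical sketch: adjust the share of MEMS scan slots given to the
# IR channel based on feedback from the MEMS. Every quantity here is an
# assumption, not taken from the patent.

def adjust_ir_slot_ratio(current_ratio: float, reported_jitter_us: float,
                         limit_us: float = 50.0, step: float = 0.05) -> float:
    """Shrink the IR share when reported scan jitter exceeds the limit,
    restore it otherwise; keep the ratio within [0.05, 0.25]."""
    if reported_jitter_us > limit_us:
        current_ratio -= step
    else:
        current_ratio += step
    return max(0.05, min(0.25, current_ratio))

ratio = adjust_ir_slot_ratio(0.25, reported_jitter_us=80.0)
```

The CPU or frame-buffer video processing integrated circuit would run such a rule on each feedback report and reprogram the MEMS slot table accordingly.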
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; that is, they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by program instructions running on related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Alternatively, if the integrated unit of the present invention is implemented in the form of a software functional module and sold or used as a separate product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto; any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present invention shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. A control method, comprising:
the method comprises the steps that electronic equipment obtains first image data, and a first light source is used for projecting the first image data;
acquiring preset image data, and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions;
obtaining a target image corresponding to the first image data and a detection image corresponding to the preset image data; the projection power of the second light source corresponds to the preset image data;
presenting the target image and the detection image in a first area outside the electronic device.
2. The method of claim 1, further comprising:
acquiring the detection image and user operation in the first area;
analyzing the user operation according to the acquired detection image;
and determining a control instruction according to the analysis result so as to control the electronic equipment to respond to the user operation.
3. The method of claim 1, further comprising:
detecting the preset image data to obtain pixel characteristics of the preset image data;
determining a first projection power matched with the preset image data according to the pixel characteristics of the preset image data;
correspondingly, the projecting the acquired preset image data by using the second light source with the wavelength parameter meeting the preset condition includes:
and projecting the acquired preset image data by using the first projection power by using a second light source with wavelength parameters meeting preset conditions.
4. The method of claim 1, further comprising:
acquiring a preset presentation distance between a detection image expected to be presented and the electronic equipment;
selecting preset image data from a preset image list according to the preset presentation distance, wherein the pixel characteristics of the preset image data match the preset presentation distance.
5. The method of claim 4, further comprising:
determining a second projection power according to the determined preset image data and the preset presenting distance;
correspondingly, the projecting the acquired preset image data by using the second light source with the wavelength parameter meeting the preset condition includes:
and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions at the second projection power.
6. An electronic device, comprising:
the device comprises a first transmitting unit, a second transmitting unit and a control unit, wherein the first transmitting unit is used for acquiring first image data and projecting the first image data by using a first light source;
the second transmitting unit is used for acquiring preset image data and projecting the acquired preset image data by using a second light source with wavelength parameters meeting preset conditions;
the processing unit is used for obtaining a target image corresponding to the first image data and a detection image corresponding to the preset image data, and presenting the target image and the detection image in a first area outside the electronic equipment; the projection power of the second light source corresponds to the preset image data.
7. The electronic device of claim 6, further comprising:
the acquisition unit is used for acquiring the detection image and the user operation in the first area;
and the control unit is used for analyzing the user operation according to the acquired detection image and determining a control instruction according to an analysis result so as to control the electronic equipment to respond to the user operation.
8. The electronic device of claim 6, wherein the second transmitting unit is further configured to detect the preset image data to obtain pixel characteristics of the preset image data; and the second light source with wavelength parameters meeting preset conditions is used for projecting the acquired preset image data at the first projection power.
9. The electronic device of claim 6, wherein the second transmitting unit is further configured to obtain a preset presentation distance between a detection image expected to be presented and the electronic device; selecting preset image data from a preset image list according to a preset presentation distance; and matching the pixel characteristics of the preset image data with the preset presenting distance.
10. The electronic device according to claim 9, wherein the second transmitting unit is further configured to determine a second projection power according to the determined preset image data and the preset presentation distance, and project the acquired preset image data with the second projection power by using a second light source whose wavelength parameter meets a preset condition.
CN201610004521.3A 2016-01-04 2016-01-04 A kind of control method and electronic equipment Active CN105677030B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610004521.3A CN105677030B (en) 2016-01-04 2016-01-04 A kind of control method and electronic equipment
US15/397,971 US20170193869A1 (en) 2016-01-04 2017-01-04 Method and electronic device for projected image processing

Publications (2)

Publication Number Publication Date
CN105677030A true CN105677030A (en) 2016-06-15
CN105677030B CN105677030B (en) 2018-11-09

Family

ID=56298852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610004521.3A Active CN105677030B (en) 2016-01-04 2016-01-04 A kind of control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105677030B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106022319A (en) * 2016-06-30 2016-10-12 联想(北京)有限公司 Gesture recognition method and gesture recognition system
CN109358909A (en) * 2018-08-28 2019-02-19 努比亚技术有限公司 Show page control method, terminal and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130044054A1 (en) * 2011-08-19 2013-02-21 Electronics And Telecommunications Research Institute Of Daejeon Method and apparatus for providing bare-hand interaction
US20130222427A1 (en) * 2012-02-29 2013-08-29 Electronics And Telecommunications Research Institute System and method for implementing interactive augmented reality
US20140002421A1 (en) * 2012-07-02 2014-01-02 Electronics And Telecommunications Research Institute User interface device for projection computer and interface method using the same
CN104076914A (en) * 2013-03-28 2014-10-01 联想(北京)有限公司 Electronic equipment and projection display method
CN104977785A (en) * 2014-04-09 2015-10-14 全视科技有限公司 Combined visible and non-visible projection system

Similar Documents

Publication Publication Date Title
US11087728B1 (en) Computer vision and mapping for audio applications
EP3419024B1 (en) Electronic device for providing property information of external light source for interest object
KR101930657B1 (en) System and method for immersive and interactive multimedia generation
KR102560689B1 (en) Method and apparatus for displaying an ar object
US10178379B2 (en) Method and apparatus for testing virtual reality head display device
KR102664723B1 (en) Method for providing preview and electronic device using the same
US11217031B2 (en) Electronic device for providing second content for first content displayed on display according to movement of external object, and operating method therefor
KR102423295B1 (en) An apparatus for composing objects using depth map and a method thereof
EP3641294A1 (en) Electronic device and method for obtaining images
JP2019095595A (en) Communication device, display device, control method and program of communication and display devices, as well as display system thereof
KR102641738B1 (en) Image processing method and electronic device supporting the same
US10055065B2 (en) Display system, projector, and control method for display system
KR102655532B1 (en) Electronic device and method for acquiring biometric information using light of display
CN105677030B (en) A kind of control method and electronic equipment
US20230124173A1 (en) Information terminal device and application operation mode control method of same
KR20210138923A (en) Electronic device for providing augmented reality service and operating method thereof
KR20190095716A (en) Electronic device for conrolling display of content and operating method thereof
KR20190035358A (en) An electronic device controlling a camera based on an external light and control method
CN105578164B (en) A kind of control method and electronic equipment
KR20210136659A (en) Electronic device for providing augmented reality service and operating method thereof
US11889181B2 (en) Electronic device having plurality of lenses where the device changes a visual object corresponding to a recommended lens
KR20210101416A (en) Image sensor controlling method and apparatus
KR102574730B1 (en) Method of providing augmented reality TV screen and remote control using AR glass, and apparatus and system therefor
US20110285624A1 (en) Screen positioning system and method based on light source type
US20170193869A1 (en) Method and electronic device for projected image processing

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant