CN104793861A - Information processing method and system and electronic devices - Google Patents

Information processing method and system and electronic devices

Info

Publication number
CN104793861A
Authority
CN
China
Prior art keywords
electronic device
projected image
preset condition
display content
operation object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510169484.7A
Other languages
Chinese (zh)
Inventor
刘文静
马骞
陈柯
杨晨
肖蔓君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201510169484.7A
Publication of CN104793861A
Legal status: Pending


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention disclose an information processing method and system and electronic devices. The information processing method comprises the steps of: obtaining display content and controlling a projection unit of a first electronic device to project light corresponding to the display content, wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image containing the display content can be formed on the supporting surface; when the projected image is located in a detection area, detecting an operation performed by an operating body on the projected image and determining the operation object in the display content corresponding to the operation; and at least judging whether the operation meets a first preset condition, and controlling the operation object based on the judgment result.

Description

Information processing method, system, and electronic device
Technical field
The present invention relates to information processing technology, and in particular to an information processing method, an information processing system, and an electronic device.
Background technology
With the development of technology, a projection unit can be added to an electronic device (such as a smart phone or a notebook computer) so that the display content of the electronic device is projected to form a projected image on a projection plane. However, this projection plane is usually a vertical plane (such as a wall), and a user who wants to change the content in the projected image (for example, to move or delete it) still has to operate through the input devices of the electronic device (such as a mouse or keyboard). The projected content cannot be manipulated directly on the projected image, which is unfavorable to the user's operating experience.
Summary of the invention
To solve the above problem in the prior art, the embodiments of the present invention provide an information processing method, system, and electronic device, so that content in a projected image can be operated and controlled directly, improving the user's operating experience.
To achieve the above object, the technical solutions of the embodiments of the present invention are implemented as follows.
An embodiment of the present invention provides an information processing method, the method comprising:
obtaining display content, and controlling a projection unit of a first electronic device to project light corresponding to the display content; wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image containing the display content can be formed on the supporting surface;
when the projected image is located in a detection area, detecting an operation performed by an operating body on the projected image, and determining an operation object in the display content corresponding to the operation;
at least judging whether the operation meets a first preset condition, and controlling the operation object based on the judgment result.
In the above solution, before detecting the operation performed by the operating body on the projected image, the method further comprises: detecting an object in the detection area, and when the detection result shows that the object meets a second preset condition, initiating a wireless connection with a second electronic device;
after receiving a success response from the second electronic device within a first preset time, establishing the wireless connection with the second electronic device.
In the above solution, the object meeting the second preset condition comprises: when the shape and/or size of the object meet a predetermined condition, determining that the object meets the second preset condition.
In the above solution, at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
after the first electronic device and the second electronic device establish the wireless connection, detecting the operation trajectory of the operation, and after determining that the operation trajectory meets a third preset condition, sending first information to the second electronic device;
after receiving first trigger information from the second electronic device within a second preset time, controlling the operation object to be sent to the second electronic device; wherein the first trigger information is triggered after the second electronic device detects an operation trajectory that meets a fourth preset condition.
In the above solution, at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
after the first electronic device and the second electronic device establish the wireless connection, detecting a first operation trajectory of the operation, and after determining that the first operation trajectory meets a fifth preset condition, identifying second information sent by the second electronic device, the second information comprising a second operation trajectory detected by the second electronic device;
when the first electronic device determines that the second operation trajectory meets a sixth preset condition, or determines that the combination of the first operation trajectory and the second operation trajectory meets a seventh preset condition, controlling the operation object to be sent to the second electronic device.
In the above solution, the first trigger information is triggered after the projection unit of the second electronic device, while facing the supporting surface, projects light corresponding to second display content of the second electronic device so that a second projected image containing the second display content can be formed on the supporting surface, and a second operation performed by an operating body on the second projected image is detected whose second operation trajectory meets an eighth preset condition.
In the above solution, at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
detecting the operation trajectory of the operation, and after determining that the operation trajectory meets the third preset condition, sending the first information to the second electronic device;
when no trigger information from the second electronic device is received within the second preset time, or when trigger information from the second electronic device that does not meet a preset requirement is received within the second preset time, generating and executing a first instruction to delete the operation object.
In the above solution, at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
detecting the operation trajectory of the operation, and after determining that the operation trajectory meets a ninth preset condition, judging whether the first electronic device is currently connected to a second electronic device, to obtain a second judgment result;
when the second judgment result is that the first electronic device is not currently connected to the second electronic device, generating and executing a first instruction to delete the operation object;
when the second judgment result is that the first electronic device is connected to the second electronic device, generating and executing a second instruction to send the operation object to the second electronic device.
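As a minimal sketch of the connection-dependent control just described, assuming for illustration only that the ninth preset condition is a drag from inside the projected image to outside it; the class and method names below are illustrative and are not defined by the patent:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in projected-image coordinates


@dataclass
class FirstDevice:
    connected_to_second_device: bool
    display_objects: List[str] = field(default_factory=list)
    sent_objects: List[str] = field(default_factory=list)

    def control_operation_object(self, trajectory: List[Point],
                                 operation_object: str,
                                 image_bounds: Tuple[float, float]) -> str:
        """Apply the rule: delete when not connected, send when connected."""
        width, height = image_bounds

        def inside(p: Point) -> bool:
            return 0.0 <= p[0] <= width and 0.0 <= p[1] <= height

        # Assumed ninth preset condition: the trajectory starts inside the
        # projected image and ends outside it (a "drag out" gesture).
        meets_ninth = inside(trajectory[0]) and not inside(trajectory[-1])
        if not meets_ninth:
            return "ignored"
        if self.connected_to_second_device:
            self.sent_objects.append(operation_object)   # second instruction
            return "sent"
        self.display_objects.remove(operation_object)    # first instruction
        return "deleted"


# Usage example: not connected, so the drag-out deletes the object
dev = FirstDevice(connected_to_second_device=False, display_objects=["photo.jpg"])
print(dev.control_operation_object([(0.2, 0.3), (1.5, 0.3)], "photo.jpg", (1.0, 1.0)))
# -> "deleted"
```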
An embodiment of the present invention further provides an electronic device, the electronic device being a first electronic device and comprising:
a processor, configured to obtain display content and control the projection unit to project light corresponding to the display content;
a projection unit, arranged on a first surface of the first electronic device and configured to project the light corresponding to the display content; wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image containing the display content can be formed on the supporting surface;
a detection unit, arranged on the first surface of the first electronic device and configured to detect, when the projected image is located in a detection area, an operation performed by an operating body on the projected image;
the processor being further configured to determine an operation object in the display content corresponding to the operation, to at least judge whether the operation meets a first preset condition, and to control the operation object based on the judgment result.
In the above solution, the electronic device further comprises a wireless communication unit;
the detection unit is further configured to detect an object in the detection area;
the processor is further configured to initiate a wireless connection with a second electronic device through the wireless communication unit when the detection result obtained by the detection unit shows that the object meets a second preset condition, and to establish the wireless connection with the second electronic device after receiving a success response from the second electronic device within a first preset time.
An embodiment of the present invention further provides an information processing system, the system comprising a first electronic device and a second electronic device;
the first electronic device is configured to detect an object in a detection area and, when the detection result shows that the object meets a second preset condition, to initiate a wireless connection with the second electronic device; it is further configured to obtain display content and control its own projection unit to project light corresponding to the display content, wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image containing the display content can be formed on the supporting surface; to detect, when the projected image is located in the detection area, an operation performed by an operating body on the projected image and determine an operation object in the display content corresponding to the operation; and to at least judge whether the operation meets a first preset condition and, based on the judgment result, control the operation object to be sent to the second electronic device;
the second electronic device is configured to send, after detecting the wireless connection initiated by the first electronic device, a success response to the first electronic device within a first preset time so as to establish the wireless connection with the first electronic device.
In the above solution, the second electronic device is configured to obtain second display content and, when its own projection unit faces the supporting surface, to control the projection unit to project light corresponding to the second display content so as to form a second projected image containing the second display content on the supporting surface; and further to detect, when the second projected image is located in a detection area, an operation performed by an operating body on the second projected image.
The embodiments of the present invention provide an information processing method, system, and electronic device. The first electronic device obtains display content and controls its projection unit to project light corresponding to the display content; when the projection unit faces the supporting surface of the first electronic device, a projected image containing the display content can be formed on the supporting surface. When the projected image is located in a detection area, an operation performed by an operating body on the projected image is detected and the operation object in the display content corresponding to the operation is determined; it is at least judged whether the operation meets a first preset condition, and the operation object is controlled based on the judgment result. In this way, the technical solutions of the embodiments of the present invention, on the one hand, allow the supporting surface of the first electronic device to be used as the projection plane and, on the other hand, allow the operation object in the projected image to be changed by operating on the projection plane, that is, the display content of the first electronic device can be changed by operating on the projection plane, which greatly improves the user's operating experience.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the information processing method according to Embodiment 1 of the present invention;
Fig. 2 is a schematic diagram of an application scenario of the information processing method according to Embodiment 1 of the present invention;
Fig. 3 is a schematic flowchart of the information processing method according to Embodiment 2 of the present invention;
Fig. 4 is a schematic diagram of an application scenario of the information processing method according to Embodiment 2 of the present invention;
Fig. 5 is a schematic flowchart of the information processing method according to Embodiment 3 of the present invention;
Fig. 6 is a schematic flowchart of the information processing method according to Embodiment 4 of the present invention;
Fig. 7 is a first schematic structural diagram of the electronic device according to an embodiment of the present invention;
Fig. 8 is a second schematic structural diagram of the electronic device according to an embodiment of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and specific embodiments.
Embodiment 1
An embodiment of the present invention provides an information processing method. Fig. 1 is a schematic flowchart of the information processing method according to Embodiment 1 of the present invention. As shown in Fig. 1, the information processing method comprises the following steps.
Step 101: obtain display content, and control a projection unit of a first electronic device to project light corresponding to the display content; wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image containing the display content can be formed on the supporting surface.
The information processing method of this embodiment is applied to a first electronic device, which may specifically be a notebook computer. The first electronic device comprises a projection unit. Fig. 2 is a schematic diagram of an application scenario of the information processing method according to Embodiment 1 of the present invention. As shown in Fig. 2, the projection unit 11 of the first electronic device is arranged on a first surface of the first electronic device. For the notebook computer shown in Fig. 2, the first surface is the surface presented to the user when the first electronic device is closed and laid flat on the supporting surface. When the first electronic device is in use, with the plane of its display screen at a certain angle to the plane of its keyboard, the projection unit of the first electronic device faces the supporting surface of the first electronic device and can form the projected image containing the display content on the supporting surface.
The execution subject of the information processing method of this embodiment is the first electronic device. Thus, in this step, obtaining the display content and controlling the projection unit of the first electronic device to project light corresponding to the display content comprises: the first electronic device obtains the display content and controls its own projection unit to project the light corresponding to the display content.
Step 102: when the projected image is located in a detection area, detect an operation performed by an operating body on the projected image, and determine an operation object in the display content corresponding to the operation.
In this embodiment, the first electronic device further comprises a detection unit that can detect the operation performed by the operating body on the projected image of the projection unit. The detection unit has a detection area, and the projected image is located in the detection area.
In one implementation, the detection unit may be a depth image acquisition unit, and the detection area is then the depth image capture area of the depth image acquisition unit. The depth image acquisition unit can capture depth images within the depth image capture area. Taking Fig. 2 as an example, the depth image acquisition unit may be arranged on the first surface of the first electronic device or in the hinge of the first electronic device, facing the projected image, to detect the operating body performing the operation on the projected image. Specifically, the first electronic device captures depth image data through the depth image acquisition unit. When no operating body enters the depth image capture area (which can be understood as the operating body not operating on the projected image), the depth data captured by the depth image acquisition unit shows no abrupt change; when the depth data in some region of the captured data changes abruptly and the changed data is recognized as presenting an irregular columnar shape, it is determined that an operating body is detected in the detection area.
Further, the first electronic device determines the distance range between the projected image and the electronic device based on the position of the projection unit on the first surface and the angle between the plane of the display screen and the plane of the keyboard of the first electronic device. When the depth image acquisition unit is arranged on the first surface of the first electronic device (in practice, the depth image acquisition unit may be arranged adjacent to the projection unit), the distance range between the projected image and the electronic device is the distance from each pixel of the projected image to the projection unit (or, equivalently, the depth image acquisition unit) of the electronic device. When the depth image acquisition unit is arranged in the hinge of the first electronic device, the distance range between the projected image and the electronic device is the distance from each pixel of the projected image to the depth image acquisition unit of the electronic device. When the depth image acquisition unit detects an operating body, the position of the operating body relative to the projected image is determined according to the depth value of the operating body, and the operation object in the projected image corresponding to that position is then determined based on the position of the projected image. The operation object may specifically be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file, or a picture.
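The depth-based detection just described can be illustrated with a short sketch. The jump threshold, the synthetic frame sizes, and the fingertip heuristic below are assumptions chosen for illustration only, not values taken from the patent:

```python
import numpy as np

def detect_operating_body(prev_depth: np.ndarray, cur_depth: np.ndarray,
                          jump_mm: float = 30.0, min_pixels: int = 200):
    """Return a boolean mask of pixels whose depth changed abruptly, or None.

    A large enough region of abrupt change is treated as the operating body.
    """
    changed = np.abs(cur_depth.astype(float) - prev_depth.astype(float)) > jump_mm
    return changed if changed.sum() >= min_pixels else None

def pick_operation_object(mask: np.ndarray, objects: dict):
    """Map the lowest detected point of the operating body (assumed to be the
    fingertip touching the projected image) to the object whose bounding box
    contains it. `objects` maps names to (x0, y0, x1, y1) pixel boxes."""
    ys, xs = np.nonzero(mask)
    tip = (int(xs[ys.argmax()]), int(ys.max()))
    for name, (x0, y0, x1, y1) in objects.items():
        if x0 <= tip[0] <= x1 and y0 <= tip[1] <= y1:
            return name, tip
    return None, tip

# Usage example with synthetic depth frames (values in millimetres)
prev = np.full((120, 160), 500.0)
cur = prev.copy()
cur[80:110, 40:60] = 430.0            # a finger entering the capture area
mask = detect_operating_body(prev, cur)
print(pick_operation_object(mask, {"icon_a": (30, 70, 70, 115)}))
```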
In another implementation, the detection unit may be an image acquisition unit, and the detection area is then the image capture area of the image acquisition unit. Taking Fig. 2 as an example, the image acquisition unit may be arranged on the first surface of the first electronic device (in practice it may be arranged adjacent to the projection unit), facing the projected image, to detect the operating body performing the operation on the projected image. Specifically, the first electronic device captures image data through the image acquisition unit and analyzes the captured image data; when it is determined that the image data contains an irregular columnar shape, it can be determined that an operating body is detected in the detection area (the irregular columnar shape being the operating body).
Further, when the image acquisition unit captures image data containing the operating body, the image data also contains the projected image; by recognizing the image data, the operation object in the projected image targeted by the operating body can be determined. The operation object may specifically be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file, or a picture.
Step 103: at least judge whether the operation meets a first preset condition, and control the operation object based on the judgment result.
Here, at least judging whether the operation meets the first preset condition comprises judging whether the operation trajectory of the operation meets a predetermined condition. In one implementation, when the detection unit is a depth image acquisition unit, the operation trajectory of the operating body can be determined from the depth values of the operating body detected by the depth image acquisition unit; when the operation trajectory meets the predetermined condition (for example, it indicates that the operating body moves from inside the projected image to outside the projected image), it can be determined that the operation trajectory of the operation meets the predetermined condition. In another implementation, when the detection unit is an image acquisition unit, the trajectory of the operating point of the operating body is detected by the image acquisition unit; when the operating point meets the predetermined condition (for example, it gradually moves from within the projection area toward the edge of the projection area, or even disappears from the projection area), it can be determined that the operation trajectory of the operation meets the predetermined condition.
Further, when the operation meets the first preset condition, the operation object is controlled to change, which can be understood as changing the operation object in the display content of the first electronic device, for example, deleting the operation object or moving the operation object.
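A minimal sketch of this trajectory test and the resulting control of the display content, assuming the first preset condition is a drag from inside the projected image to outside it (one of the examples given above); the function names are illustrative only:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def meets_first_preset_condition(trace: List[Point],
                                 width: float, height: float) -> bool:
    """Assumed first preset condition: the operating body starts inside the
    projected image and ends outside it (a drag-out gesture)."""
    def inside(p: Point) -> bool:
        return 0.0 <= p[0] <= width and 0.0 <= p[1] <= height
    return inside(trace[0]) and not inside(trace[-1])

def control_operation_object(trace: List[Point], display_content: List[str],
                             operation_object: str,
                             width: float = 1.0, height: float = 1.0) -> List[str]:
    """Delete the operation object from the display content when the
    condition is met; otherwise leave the content unchanged."""
    if meets_first_preset_condition(trace, width, height):
        return [obj for obj in display_content if obj != operation_object]
    return display_content

# Usage: a drag from the middle of the image to beyond its right edge
print(control_operation_object([(0.5, 0.5), (0.9, 0.5), (1.2, 0.5)],
                               ["icon_a", "icon_b"], "icon_a"))
# -> ['icon_b']
```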
With the technical solution of the embodiment of the present invention, on the one hand, the supporting surface of the first electronic device is used as the projection plane; on the other hand, the operation object in the projected image can be changed by operating on the projection plane, that is, the display content of the first electronic device can be changed by operating on the projection plane, which greatly improves the user's operating experience.
Embodiment 2
An embodiment of the present invention further provides an information processing method, which is applied to a first electronic device. Fig. 3 is a schematic flowchart of the information processing method according to Embodiment 2 of the present invention. As shown in Fig. 3, the information processing method comprises the following steps.
Step 201: detect an object in a detection area, and when the detection result shows that the object meets a second preset condition, initiate a wireless connection with a second electronic device.
The information processing method of this embodiment is applied to a first electronic device, which may specifically be a notebook computer. Thus, in this step, detecting the object in the detection area and initiating the wireless connection with the second electronic device when the detection result shows that the object meets the second preset condition comprises: the first electronic device detects the object in the detection area and, when the detection result shows that the object meets the second preset condition, initiates the wireless connection with the second electronic device.
In this step, the object meeting the second preset condition comprises: when the shape and/or size of the object meet a predetermined condition, determining that the object meets the second preset condition.
In this embodiment, the first electronic device comprises a detection unit, and the detection unit has a detection area.
In one implementation, the detection unit may be a depth image acquisition unit, and the detection area is then the depth image capture area of the depth image acquisition unit. The depth image acquisition unit can capture depth images within the depth image capture area. The depth image acquisition unit may be arranged on the first surface of the first electronic device or in the hinge of the first electronic device, facing outward from the first electronic device, to detect objects other than the first electronic device. Specifically, the first electronic device captures depth image data through the depth image acquisition unit. When no other object appears in the depth image capture area, the captured depth data shows no abrupt change. When the depth data in some region of the captured data changes abruptly, the changed data is recognized as presenting a regular trapezoid or rectangle, and/or the physical size characterized by the changed data matches the size of a typical second electronic device (such as a notebook computer), it is determined that the object described in this embodiment is detected in the detection area, the object being the second electronic device.
In another implementation, the detection unit may be an image acquisition unit, and the detection area is then the image capture area of the image acquisition unit. The image acquisition unit may be arranged on the first surface of the first electronic device to detect objects other than the first electronic device. Specifically, the first electronic device captures image data through the image acquisition unit and analyzes the captured image data; when it is determined that the image data contains a regular trapezoidal or rectangular object, and that the object carries a specific mark (the specific mark including a trademark and/or a model, such as LENOVO or Thinkx220), it can be determined that the object described in this embodiment is detected in the detection area, the object being the second electronic device.
In this embodiment, when the first electronic device determines that the object meets the second preset condition, that is, when it discovers the second electronic device, it initiates the wireless connection with the second electronic device; that is, it discovers the second electronic device using a preset communication mechanism and sends a wireless connection request to the second electronic device. The preset communication mechanism is not limited to any wireless communication protocol in the prior art, such as Bluetooth or Wireless Fidelity (Wi-Fi).
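A minimal sketch of this shape-and-size test and the resulting connection request is given below. The pixel scale, fill ratio, and size ranges are illustrative assumptions, not parameters defined by the patent:

```python
import numpy as np

def looks_like_second_device(mask: np.ndarray,
                             pixel_size_mm: float = 2.0,
                             min_fill: float = 0.85,
                             size_range_mm=((200.0, 450.0), (150.0, 350.0))):
    """Heuristic second-preset-condition test: a region of abruptly changed
    depth pixels counts as a candidate device if it fills most of its
    bounding box (roughly rectangular or trapezoidal) and its physical
    footprint matches a typical notebook computer."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return False
    h = (ys.max() - ys.min() + 1) * pixel_size_mm
    w = (xs.max() - xs.min() + 1) * pixel_size_mm
    fill = ys.size / ((ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1))
    (wmin, wmax), (hmin, hmax) = size_range_mm
    return fill >= min_fill and wmin <= w <= wmax and hmin <= h <= hmax

def maybe_initiate_connection(mask: np.ndarray, send_request) -> bool:
    """Send a wireless connection request (e.g. over Bluetooth or Wi-Fi)
    when the detected object meets the second preset condition."""
    if looks_like_second_device(mask):
        send_request()
        return True
    return False

# Usage with a synthetic 300 mm x 220 mm rectangular region
mask = np.zeros((300, 400), dtype=bool)
mask[50:160, 60:210] = True           # 110 x 150 pixels -> 220 mm x 300 mm
print(maybe_initiate_connection(mask, lambda: print("connection request sent")))
```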
Step 202: after receiving a success response from the second electronic device within a first preset time, establish the wireless connection with the second electronic device.
Here, after the second electronic device receives the wireless connection request sent by the first electronic device, it accepts the wireless connection request within the first preset time and sends a success response to the wireless connection request to the first electronic device, so as to complete the wireless connection between the first electronic device and the second electronic device.
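The handshake with the first preset time can be sketched as a simple wait-with-deadline. The five-second window and the polling interface are assumptions for illustration, not values from the patent:

```python
import time
from typing import Callable, Optional

def establish_connection(send_request: Callable[[], None],
                         poll_response: Callable[[], Optional[str]],
                         first_preset_time_s: float = 5.0,
                         poll_interval_s: float = 0.1) -> bool:
    """Send a wireless connection request and wait up to the first preset
    time for the second device's success response."""
    send_request()
    deadline = time.monotonic() + first_preset_time_s
    while time.monotonic() < deadline:
        if poll_response() == "success":
            return True                  # wireless connection established
        time.sleep(poll_interval_s)
    return False                         # no success response in time

# Usage with a stub second device that answers on the second poll
answers = iter([None, "success"])
print(establish_connection(lambda: print("request sent"),
                           lambda: next(answers, None)))
```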
Step 203: obtain display content, and control the projection unit of the first electronic device to project light corresponding to the display content; wherein, when the projection unit faces the supporting surface of the first electronic device, the projected image containing the display content can be formed on the supporting surface.
In this embodiment, the first electronic device comprises a projection unit. Fig. 4 is a schematic diagram of an application scenario of the information processing method according to Embodiment 2 of the present invention. As shown in Fig. 4, the projection unit 11 of the first electronic device 1 is arranged on the first surface of the first electronic device 1. For the notebook computer shown in Fig. 4, the first surface of the first electronic device 1 is the surface presented to the user when the first electronic device 1 is closed and laid flat on the supporting surface. When the first electronic device 1 is in use, with the plane of its display screen at a certain angle to the plane of its keyboard, the projection unit 11 of the first electronic device 1 faces the supporting surface of the first electronic device 1 and can form the projected image containing the display content on the supporting surface.
Preferably, the second electronic device 2 has the same structure as the first electronic device 1, that is, it comprises a projection unit 21, as shown in Fig. 4; the position of the projection unit 21 of the second electronic device 2 is as described above and is not repeated here.
Step 204: when the projected image is located in the detection area, detect an operation performed by an operating body on the projected image, and determine an operation object in the display content corresponding to the operation.
In this embodiment, the detection unit can also detect the operation performed by the operating body on the projected image of the projection unit, the projected image being located in the detection area.
In one implementation, when the detection unit is a depth image acquisition unit, the depth image acquisition unit can capture depth images within its depth image capture area. Specifically, the first electronic device captures depth image data through the depth image acquisition unit. When no operating body enters the depth image capture area (which can be understood as the operating body not operating on the projected image), the captured depth data shows no abrupt change; when the depth data in some region of the captured data changes abruptly and the changed data is recognized as presenting an irregular columnar shape, it is determined that an operating body is detected in the detection area.
Further, the first electronic device determines the distance range between the projected image and the electronic device based on the position of the projection unit on the first surface and the angle between the plane of the display screen and the plane of the keyboard of the first electronic device. When the depth image acquisition unit is arranged on the first surface of the first electronic device (in practice it may be arranged adjacent to the projection unit), the distance range is the distance from each pixel of the projected image to the projection unit (or the depth image acquisition unit) of the electronic device; when the depth image acquisition unit is arranged in the hinge of the first electronic device, the distance range is the distance from each pixel of the projected image to the depth image acquisition unit of the electronic device. When the depth image acquisition unit detects an operating body, the position of the operating body relative to the projected image is determined according to the depth value of the operating body, and the operation object in the projected image corresponding to that position is then determined based on the position of the projected image. The operation object may specifically be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file, or a picture.
In another implementation, when the detection unit is an image acquisition unit, the first electronic device captures image data through the image acquisition unit and analyzes the captured image data; when it is determined that the image data contains an irregular columnar shape, it can be determined that an operating body is detected in the detection area (the irregular columnar shape being the operating body).
Further, when the image acquisition unit captures image data containing the operating body, the image data also contains the projected image; by recognizing the image data, the operation object in the projected image targeted by the operating body can be determined. The operation object may specifically be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file, or a picture.
Step 205: detect the operation trajectory of the operation, and after determining that the operation trajectory meets a third preset condition, send first information to the second electronic device.
In one implementation, when the detection unit is a depth image acquisition unit, the operation trajectory of the operating body can be determined from the depth values of the operating body detected by the depth image acquisition unit; when the operation trajectory meets the third preset condition (for example, it indicates that the operating body moves from inside the projected image to outside the projected image), it can be determined that the operation trajectory of the operation meets the third preset condition. In another implementation, when the detection unit is an image acquisition unit, the trajectory of the operating point of the operating body is detected by the image acquisition unit; when the operating point meets the third preset condition (for example, it gradually moves from within the projection area toward the edge of the projection area, or even disappears from the projection area), it can be determined that the operation trajectory of the operation meets the third preset condition.
In this step, after the first electronic device determines that the operation trajectory meets the third preset condition, it sends the first information to the second electronic device, the first information being used to request sending the operation object to the second electronic device.
Step 206: after receiving first trigger information from the second electronic device within a second preset time, control the operation object to be sent to the second electronic device; wherein the first trigger information is triggered after the second electronic device detects an operation trajectory that meets a fourth preset condition.
Here, after the second electronic device receives the first information sent by the first electronic device, it responds to the first information within the second preset time and sends the first trigger information to the first electronic device. The first trigger information is triggered after the projection unit of the second electronic device, facing the supporting surface, projects light corresponding to second display content of the second electronic device so that a second projected image containing the second display content can be formed on the supporting surface, and a second operation performed by an operating body on the second projected image is detected.
Specifically, the second electronic device has the same structure as the first electronic device. As shown in Fig. 4, the second electronic device comprises a projection unit arranged on the first surface of the second electronic device. The projection unit of the second electronic device faces the supporting surface of the second electronic device and can form, on the supporting surface, a projected image containing the second display content of the second electronic device. The second electronic device also comprises a detection unit, which may be a depth image acquisition unit or an image acquisition unit; the second electronic device detects the operating body in its detection area through the detection unit and obtains the operation trajectory of the operating body. Thus, in this step, after the second electronic device receives the first information sent by the first electronic device, when it detects through its detection unit an operation trajectory meeting the fourth preset condition (for example, it detects that the operating body moves from outside the second projected image into the second projected image, or detects that the operating body gradually moves from the edge of the second projected image toward its center), it generates the first trigger information and sends the first trigger information to the first electronic device within the second preset time.
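The second-device side of this exchange can be sketched as follows, assuming the fourth preset condition is a drag from outside the second projected image into it (one of the examples above); names and message formats are illustrative only:

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def meets_fourth_preset_condition(trace: List[Point],
                                  width: float, height: float) -> bool:
    """Assumed fourth preset condition: the operating body moves from outside
    the second projected image into it (a drag-in gesture)."""
    def inside(p: Point) -> bool:
        return 0.0 <= p[0] <= width and 0.0 <= p[1] <= height
    return not inside(trace[0]) and inside(trace[-1])

def handle_first_information(trace: Optional[List[Point]],
                             width: float = 1.0,
                             height: float = 1.0) -> Optional[dict]:
    """On the second device: after the first information arrives, return the
    first trigger information if a qualifying trajectory was detected,
    otherwise return nothing."""
    if trace and meets_fourth_preset_condition(trace, width, height):
        return {"type": "first_trigger_information"}
    return None

# Usage: a drag from the left of the second projected image into its centre
print(handle_first_information([(-0.2, 0.5), (0.1, 0.5), (0.5, 0.5)]))
# -> {'type': 'first_trigger_information'}
```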
In another implementation of this embodiment, when no trigger information from the second electronic device is received within the second preset time, or when trigger information from the second electronic device that does not meet a preset requirement is received within the second preset time, a first instruction is generated and executed to delete the operation object.
Specifically, the first electronic device sets a timer after sending the first information to the second electronic device, the timer being set to the second preset time. When the timer expires without any trigger information from the second electronic device being received, or when trigger information from the second electronic device that does not meet the preset requirement is received within the second preset time, the first instruction is generated and executed, the first instruction being used to delete the operation object.
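A minimal sketch of the first-device side, combining the two branches above: start a timer after sending the first information, send the operation object if valid trigger information arrives in time, otherwise delete it. The timing values and callback interface are illustrative assumptions:

```python
import time
from typing import Callable, Optional

def transfer_or_delete(send_first_information: Callable[[], None],
                       poll_trigger: Callable[[], Optional[dict]],
                       send_object: Callable[[], None],
                       delete_object: Callable[[], None],
                       second_preset_time_s: float = 3.0,
                       poll_interval_s: float = 0.1) -> str:
    """First-device side of the exchange described in Step 205/206."""
    send_first_information()
    deadline = time.monotonic() + second_preset_time_s
    while time.monotonic() < deadline:
        trigger = poll_trigger()
        if trigger and trigger.get("type") == "first_trigger_information":
            send_object()               # second-instruction path: transfer
            return "sent"
        time.sleep(poll_interval_s)
    delete_object()                     # first-instruction path: delete
    return "deleted"

# Usage with a stub second device that never answers -> the object is deleted
print(transfer_or_delete(lambda: None, lambda: None,
                         lambda: print("object sent"),
                         lambda: print("object deleted"),
                         second_preset_time_s=0.3))
```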
With the technical solution of the embodiment of the present invention, on the one hand, the supporting surface of the first electronic device is used as the projection plane; on the other hand, the operation object in the projected image can be changed by operating on the projection plane, that is, the display content of the first electronic device can be changed by operating on the projection plane, where changing the display content means sending the operation object to the second electronic device or deleting the operation object. In particular, interaction between electronic devices is achieved through operations on the projection plane, which greatly improves the user's operating experience.
Embodiment 3
An embodiment of the present invention further provides an information processing method, which is applied to a first electronic device. Fig. 5 is a schematic flowchart of the information processing method according to Embodiment 3 of the present invention. As shown in Fig. 5, the information processing method comprises the following steps.
Step 301: detect an object in a detection area, and when the detection result shows that the object meets a second preset condition, initiate a wireless connection with a second electronic device.
The information processing method of this embodiment is applied to a first electronic device, which may specifically be a notebook computer. Thus, in this step, detecting the object in the detection area and initiating the wireless connection with the second electronic device when the detection result shows that the object meets the second preset condition comprises: the first electronic device detects the object in the detection area and, when the detection result shows that the object meets the second preset condition, initiates the wireless connection with the second electronic device.
In this step, the object meeting the second preset condition comprises: when the shape and/or size of the object meet a predetermined condition, determining that the object meets the second preset condition.
In this embodiment, the first electronic device comprises a detection unit, and the detection unit has a detection area.
In one implementation, the detection unit may be a depth image acquisition unit, and the detection area is then the depth image capture area of the depth image acquisition unit. The depth image acquisition unit can capture depth images within the depth image capture area. The depth image acquisition unit may be arranged on the first surface of the first electronic device or in the hinge of the first electronic device, facing outward from the first electronic device, to detect objects other than the first electronic device. Specifically, the first electronic device captures depth image data through the depth image acquisition unit. When no other object appears in the depth image capture area, the captured depth data shows no abrupt change. When the depth data in some region of the captured data changes abruptly, the changed data is recognized as presenting a regular trapezoid or rectangle, and/or the physical size characterized by the changed data matches the size of a typical second electronic device (such as a notebook computer), it is determined that the object described in this embodiment is detected in the detection area, the object being the second electronic device.
In another implementation, the detection unit may be an image acquisition unit, and the detection area is then the image capture area of the image acquisition unit. The image acquisition unit may be arranged on the first surface of the first electronic device to detect objects other than the first electronic device. Specifically, the first electronic device captures image data through the image acquisition unit and analyzes the captured image data; when it is determined that the image data contains a regular trapezoidal or rectangular object, and that the object carries a specific mark (the specific mark including a trademark and/or a model, such as LENOVO or Thinkx220), it can be determined that the object described in this embodiment is detected in the detection area, the object being the second electronic device.
In this embodiment, when the first electronic device determines that the object meets the second preset condition, that is, when it discovers the second electronic device, it initiates the wireless connection with the second electronic device; that is, it discovers the second electronic device using a preset communication mechanism and sends a wireless connection request to the second electronic device. The preset communication mechanism is not limited to any wireless communication protocol in the prior art, such as Bluetooth or Wireless Fidelity (Wi-Fi).
Step 302: after receiving a success response from the second electronic device within a first preset time, establish the wireless connection with the second electronic device.
Here, after the second electronic device receives the wireless connection request sent by the first electronic device, it accepts the wireless connection request within the first preset time and sends a success response to the wireless connection request to the first electronic device, so as to complete the wireless connection between the first electronic device and the second electronic device.
Step 303: obtain display content, and control the projection unit of the first electronic device to project light corresponding to the display content; wherein, when the projection unit faces the supporting surface of the first electronic device, the projected image containing the display content can be formed on the supporting surface.
In this embodiment, the first electronic device comprises a projection unit. As shown in Fig. 4, the projection unit 11 of the first electronic device 1 is arranged on the first surface of the first electronic device 1. For the notebook computer shown in Fig. 4, the first surface of the first electronic device 1 is the surface presented to the user when the first electronic device 1 is closed and laid flat on the supporting surface. When the first electronic device 1 is in use, with the plane of its display screen at a certain angle to the plane of its keyboard, the projection unit 11 of the first electronic device 1 faces the supporting surface of the first electronic device 1 and can form the projected image containing the display content on the supporting surface.
Preferably, the second electronic device 2 has the same structure as the first electronic device 1, that is, it comprises a projection unit 21, as shown in Fig. 4; the position of the projection unit 21 of the second electronic device 2 is as described above and is not repeated here.
Step 304: when the projected image is located in the detection area, detect an operation performed by an operating body on the projected image, and determine an operation object in the display content corresponding to the operation.
In this embodiment, the detection unit can also detect the operation performed by the operating body on the projected image of the projection unit, the projected image being located in the detection area.
In one implementation, when the detection unit is a depth image acquisition unit, the depth image acquisition unit can capture depth images within its depth image capture area. Specifically, the first electronic device captures depth image data through the depth image acquisition unit. When no operating body enters the depth image capture area (which can be understood as the operating body not operating on the projected image), the captured depth data shows no abrupt change; when the depth data in some region of the captured data changes abruptly and the changed data is recognized as presenting an irregular columnar shape, it is determined that an operating body is detected in the detection area.
Further, the first electronic device determines the distance range between the projected image and the electronic device based on the position of the projection unit on the first surface and the angle between the plane of the display screen and the plane of the keyboard of the first electronic device. When the depth image acquisition unit is arranged on the first surface of the first electronic device (in practice it may be arranged adjacent to the projection unit), the distance range is the distance from each pixel of the projected image to the projection unit (or the depth image acquisition unit) of the electronic device; when the depth image acquisition unit is arranged in the hinge of the first electronic device, the distance range is the distance from each pixel of the projected image to the depth image acquisition unit of the electronic device. When the depth image acquisition unit detects an operating body, the position of the operating body relative to the projected image is determined according to the depth value of the operating body, and the operation object in the projected image corresponding to that position is then determined based on the position of the projected image. The operation object may specifically be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file, or a picture.
In another implementation, when the detection unit is an image acquisition unit, the first electronic device captures image data through the image acquisition unit and analyzes the captured image data; when it is determined that the image data contains an irregular columnar shape, it can be determined that an operating body is detected in the detection area (the irregular columnar shape being the operating body).
Further, when the image acquisition unit captures image data containing the operating body, the image data also contains the projected image; by recognizing the image data, the operation object in the projected image targeted by the operating body can be determined. The operation object may specifically be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file, or a picture.
Step 305: the first operation trace detecting described operation, determine described first operation trace meet the 5th pre-conditioned after, identify described second electronic equipment send the second information; Described second information comprises the second operation trace that described second electronic equipment detects.
As a kind of embodiment, when described detecting unit is depth image collecting unit, detect that the depth value of described operating body can determine the operation trace of described operating body by described depth image collecting unit; When described operation trace meet predetermined condition (as operating body as described in characterizing by as described in move in projected image as described in projected image outer) time, can determine that the operation trace of described operation meets the 5th pre-conditioned.As another kind of embodiment, when described detecting unit is image acquisition units, the track of the operating point of described operating body detected by described image acquisition units; When described operating point meet predetermined condition (as by as described in view field gradually the described view field edge of trend even disappear in as described in view field) time, can determine that the operation trace of described operation is satisfied 5th pre-conditioned.
In this step, described first electronic equipment determine described operation trace meet the 5th pre-conditioned after, receive described second electronic equipment send the second information, described second information comprises the second operation trace that described second electronic equipment detects.
Here, described second information be based on described second electronic equipment projecting cell towards the projection light corresponding with the second displaying contents of described second electronic equipment during described supporting surface with can be formed at described supporting surface include described second displaying contents the second projected image, detect that operating body operates for second of described second projected image and trigger afterwards.Concrete, described second electronic equipment has the structure same with described first electronic equipment, and as shown in Figure 4, described second electronic equipment 2 comprises projecting cell 21, and described projecting cell arranges 21 on the first surface of described second electronic equipment 2.The projecting cell 21 of described second electronic equipment 2, towards the supporting surface of described second electronic equipment 2, can form the projected image of the second displaying contents including described second electronic equipment 2 at described supporting surface.Described second electronic equipment also comprises detecting unit, and described detecting unit can be depth image collecting unit or image acquisition units; Described second electronic equipment detects the operating body in surveyed area by described detecting unit, obtain the operation trace of described operating body.Then in this step, described second electronic equipment detects the second operation trace of operating body, and described second operation trace is directly sent to described first electronic equipment.Wherein, described second operation trace is sent to described first electronic equipment by image data mode.
Step 306: when it is determined that the second operation trajectory meets a sixth preset condition, or that the combination of the first operation trajectory and the second operation trajectory meets a seventh preset condition, controlling the operation object to be sent to the second electronic device.
In this embodiment, the first electronic device identifies the second operation trajectory. When it determines that the second operation trajectory meets the sixth preset condition (for example, the second operation trajectory is identified as moving from outside the projected image to inside it, or as trending gradually from the edge of the projected image toward its center), or when it determines that the combination of the first operation trajectory and the second operation trajectory meets the seventh preset condition (for example, the first operation trajectory is identified as moving from inside the projected image to outside it, or as trending gradually from the center of the projected image toward its edge, while the second operation trajectory is identified as moving from outside the projected image to inside it, or as trending gradually from the edge of the projected image toward its center), the first electronic device generates and executes a second instruction and sends the operation object to the second electronic device.
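The step-306 decision can be sketched by combining the trajectory checks, assuming (as in the examples just given) that the sixth preset condition is the mirror image of the fifth, i.e. an outside-to-inside or edge-to-center drag; the function names reuse the earlier illustrative sketch and are not defined by the embodiment.

```python
# Illustrative decision for step 306, reusing meets_fifth_condition from
# the earlier sketch. Assumption: the sixth condition is the fifth with
# the direction of travel reversed (an outside-to-inside drag).

def meets_sixth_condition(trajectory):
    """Outside-to-inside drag on the second projected image."""
    return meets_fifth_condition(list(reversed(trajectory)))

def should_send_operation_object(first_trajectory, second_trajectory) -> bool:
    """True if the operation object should be sent to the second device."""
    sixth = meets_sixth_condition(second_trajectory)
    seventh = (meets_fifth_condition(first_trajectory)
               and meets_sixth_condition(second_trajectory))
    return sixth or seventh
```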
With the technical solution of this embodiment, on one hand the supporting surface of the first electronic device serves as the projection plane; on the other hand, the operation object in the projected image can be changed by operating on that projection plane, which is to say the display content of the first electronic device can be changed by operating on the projection plane, the change consisting of controlling the operation object to be sent to the second electronic device or deleting it. In particular, interaction between electronic devices is achieved through operations on the projection plane, which greatly improves the user's operating experience.
Embodiment four
An embodiment of the present invention further provides an information processing method. Fig. 6 is a schematic flowchart of the information processing method of embodiment four of the present invention; as shown in Fig. 6, the information processing method includes:
Step 401: obtaining display content and controlling a projection unit of a first electronic device to project light corresponding to the display content; wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image including the display content can be formed on the supporting surface.
The information processing method of this embodiment is applied to a first electronic device, which may specifically be a notebook computer. The first electronic device includes a projection unit. As shown in Fig. 2, the projection unit 11 of the first electronic device is arranged on a first surface of the first electronic device; for the notebook computer shown in Fig. 2, the first surface is the surface presented to the user when the first electronic device is closed and laid flat on the supporting surface. When the first electronic device is in use, so that the plane of its display screen and the plane of its keyboard form a certain angle, the projection unit 11 faces the supporting surface of the first electronic device and a projected image including the display content can be formed on the supporting surface.
The executing body of the information processing method of this embodiment is the first electronic device; in this step, obtaining the display content and controlling the projection unit of the first electronic device to project the corresponding light means that the first electronic device obtains the display content and controls its own projection unit to project light corresponding to that display content.
Step 402: when the projected image is located in a detection area, detecting an operation of an operating body on the projected image and determining the operation object in the display content that corresponds to the operation.
In this embodiment the first electronic device also includes a detection unit that can detect operations of an operating body on the image projected by the projection unit; the detection unit has a detection area, and the projected image is located within that detection area.
In one embodiment, the detection unit may be a depth image acquisition unit, in which case the detection area is the depth image capture area of that unit, and the unit can collect depth images within this area. Taking Fig. 2 as an example, the depth image acquisition unit may be arranged on the first surface of the first electronic device or in its hinge, facing the projected image, so as to detect an operating body operating on the projected image. Specifically, the first electronic device collects depth image data through the depth image acquisition unit. When no operating body has entered the depth image capture area (which can be understood as the operating body not operating on the projected image), the collected depth data show no abrupt change; when the depth data of some region change abruptly and the abruptly changed data are recognized as presenting an irregular cylinder, it is determined that an operating body has been detected in the detection area.
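As an illustration only, the abrupt-change detection described above can be sketched as follows; it assumes a baseline depth frame of the empty supporting surface is available, that frames arrive as two-dimensional millimetre-valued arrays, and that the thresholds are arbitrary example values rather than values prescribed by this embodiment.

```python
# Illustrative sketch of depth-based operating-body detection. Assumptions:
# a baseline depth frame of the empty supporting surface, frames given as
# 2-D numpy arrays in millimetres, and arbitrary example thresholds.

import numpy as np

DEPTH_JUMP_MM = 30.0      # abrupt change relative to the empty surface
MIN_REGION_PIXELS = 400   # ignore small noise blobs

def detect_operating_body(baseline: np.ndarray, frame: np.ndarray):
    """Return a boolean mask of pixels whose depth changed abruptly (the
    candidate operating body), or None if no such region is present."""
    jump = baseline - frame   # an inserted hand is closer, so depth drops
    mask = jump > DEPTH_JUMP_MM
    if int(mask.sum()) < MIN_REGION_PIXELS:
        return None           # no abrupt change: nothing detected
    return mask
```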
Further, based on the position of the projection unit on the first surface and the angle between the plane of the display screen and the plane of the keyboard of the first electronic device, the first electronic device determines the distance range between the projected image and the electronic device. When the depth image acquisition unit is arranged on the first surface of the first electronic device (in practice it may be arranged adjacent to the projection unit), the distance range is the distance from each pixel of the projected image to the projection unit (or, equivalently, to the depth image acquisition unit) of the electronic device; when the depth image acquisition unit is arranged in the hinge of the first electronic device, the distance range is the distance from each pixel of the projected image to the depth image acquisition unit of the electronic device. When the depth image acquisition unit detects an operating body, the position of the operating body relative to the projected image is determined from the depth value of the operating body, and the operation object in the projected image corresponding to that position is then determined from the position of the projected image. The operation object may be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file or a picture.
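Continuing the illustration, the detected region can be reduced to a touch point and hit-tested against the display content; the centroid heuristic and the bounding-box representation of the operation objects are assumptions of this sketch, not details given by the embodiment.

```python
# Illustrative mapping of the detected region to a touch point, followed by
# a hit test against the operation objects (icons, files, pictures). The
# centroid heuristic and the bounding-box layout are assumptions.

from typing import Dict, Optional, Tuple
import numpy as np

Rect = Tuple[int, int, int, int]   # x, y, width, height in projected-image pixels

def locate_operating_point(mask: np.ndarray) -> Tuple[int, int]:
    """Use the centroid of the abrupt-change region as the touch point."""
    ys, xs = np.nonzero(mask)
    return int(xs.mean()), int(ys.mean())

def hit_test(point: Tuple[int, int], objects: Dict[str, Rect]) -> Optional[str]:
    """Return the operation object whose bounding box contains the point."""
    px, py = point
    for name, (x, y, w, h) in objects.items():
        if x <= px < x + w and y <= py < y + h:
            return name
    return None
```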
In another embodiment, the detection unit may be an image acquisition unit, in which case the detection area is the image capture area of that unit. Taking Fig. 2 as an example, the image acquisition unit may be arranged on the first surface of the first electronic device (in practice it may be arranged adjacent to the projection unit), facing the projected image, so as to detect an operating body operating on the projected image. Specifically, the first electronic device collects image data through the image acquisition unit and analyzes the collected image data; when the image data are determined to contain an irregular cylinder, it is determined that an operating body has been detected in the detection area (the irregular cylinder being the operating body).
Further, when the image acquisition unit collects image data that include the operating body, the image data also include the projected image; by recognizing the image data, the operation object targeted by the operating body within the projected image can be determined. The operation object may be any object that the first electronic device can present in the projected image (that is, in the display content of the first electronic device), such as an icon, a file or a picture.
Step 403: detecting the operation trajectory of the operation; after determining that the operation trajectory meets a ninth preset condition, judging whether the first electronic device is currently connected to a second electronic device, and obtaining a second judgment result.
In one embodiment, when the detection unit is a depth image acquisition unit, the operation trajectory of the operating body can be determined from the depth values of the operating body collected by that unit; when the trajectory meets a predetermined condition (for example, it indicates that the operating body moves from inside the projected image to outside it), the operation trajectory of the operation is determined to meet the ninth preset condition. In another embodiment, when the detection unit is an image acquisition unit, the track of the operating point of the operating body is detected by that unit; when the operating point meets a predetermined condition (for example, it moves from within the projection area gradually toward its edge, or even disappears from the projection area), the operation trajectory of the operation is determined to meet the ninth preset condition.
In this step, judging whether the first electronic device is currently connected to a second electronic device specifically means judging whether the first electronic device has established a wireless connection with the second electronic device, the wireless connection being based on a preset communication mechanism such as Bluetooth or wireless fidelity (Wi-Fi). Specifically, the first electronic device determines whether it is currently connected to the second electronic device from the connection state of, for example, its wireless network interface or Bluetooth module.
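Purely for illustration, the connectivity judgment can be reduced to querying whichever link-state probes the platform provides; the probes themselves are hypothetical stand-ins, not an API defined by the embodiment.

```python
# Illustrative connectivity judgment for step 403. `link_probes` is an
# iterable of zero-argument callables supplied by the platform layer
# (e.g. a Bluetooth-stack query and a Wi-Fi query); they are hypothetical
# stand-ins, not a real or claimed API.

from typing import Callable, Iterable

def connected_to_second_device(link_probes: Iterable[Callable[[], bool]]) -> bool:
    """Second judgment result: True if any preset communication mechanism
    currently reports an active link to the second electronic device."""
    return any(probe() for probe in link_probes)
```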
Step 404: when the second judgment result is that the first electronic device is not currently connected to the second electronic device, generating and executing a first instruction to delete the operation object; the first instruction is used to delete the operation object.
Step 405: when the second judgment result is that the first electronic device is connected to the second electronic device, generating and executing a second instruction to send the operation object to the second electronic device; the second instruction is used to send the operation object to the second electronic device based on the preset communication mechanism.
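Steps 403 to 405 can then be tied together as below; `send_object` and `delete_object` stand for the second and first instructions and, like the probe list, are illustrative stand-ins rather than parts of the claimed method.

```python
# Illustrative drag-out handling for steps 403-405, reusing the
# connectivity check above. `send_object` and `delete_object` stand for
# the second and first instructions respectively.

def handle_drag_out(operation_object, link_probes, send_object, delete_object):
    """Transfer the dragged-out operation object if a second device is
    connected; otherwise delete it."""
    if connected_to_second_device(link_probes):
        send_object(operation_object)     # second instruction: transfer it
    else:
        delete_object(operation_object)   # first instruction: delete it
```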
In this embodiment the second electronic device may have the same structure as the first electronic device, that is, it may also include a projection unit, as shown in Fig. 4; alternatively the second electronic device may differ from the first electronic device and have no projection unit.
With the technical solution of this embodiment, on one hand the supporting surface of the first electronic device serves as the projection plane; on the other hand, the operation object in the projected image can be changed by operating on that projection plane, which is to say the display content of the first electronic device can be changed by operating on the projection plane, the change consisting of controlling the operation object to be sent to a second electronic device or deleting it. In particular, interaction between electronic devices is achieved through operations on the projection plane, which greatly improves the user's operating experience.
Embodiment five
An embodiment of the present invention further provides an electronic device, the electronic device being a first electronic device. Fig. 7 is a first schematic structural diagram of the electronic device of the embodiment of the present invention; as shown in Fig. 7, the electronic device is a first electronic device and includes a processor 71, a projection unit 72 and a detection unit 73, wherein:
the processor 71 is configured to obtain display content and control the projection unit 72 to project light corresponding to the display content;
the projection unit 72 is arranged on a first surface of the first electronic device and is configured to project the light corresponding to the display content; when the projection unit 72 faces the supporting surface of the first electronic device, a projected image including the display content can be formed on the supporting surface;
the detection unit 73 is arranged on the first surface of the first electronic device and is configured to detect, when the projected image is located in a detection area, an operation of an operating body on the projected image;
the processor 71 is further configured to determine the operation object in the display content that corresponds to the operation, to judge at least whether the operation meets a first preset condition, and to control the operation object based on the judgment result.
In a first implementation, the processor 71 is configured, after a wireless connection has been established with a second electronic device, to detect the operation trajectory of the operation and, after determining that the operation trajectory meets a third preset condition, to send first information to the second electronic device; and, after receiving a first trigger message from the second electronic device within a second preset time, to control the operation object to be sent to the second electronic device. The first trigger message is triggered after the second electronic device detects an operation trajectory that meets a fourth preset condition; more specifically, it is triggered after the second electronic device, with its projection unit facing the supporting surface, projects light corresponding to its second display content so that a second projected image including the second display content is formed on the supporting surface, detects a second operation of an operating body on that second projected image, and determines that the operation trajectory of the second operation meets an eighth preset condition.
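For illustration, the first-information / first-trigger-message exchange of this implementation might look as follows; `outgoing` and `incoming` are assumed transport stand-ins (a send callable and a receive queue), and the timeout value is an arbitrary example for the second preset time.

```python
# Illustrative first-information handshake. Assumptions: `outgoing` is a
# callable that transmits a message over the established wireless
# connection, `incoming` is a queue.Queue fed by the receive path, and the
# timeout is an arbitrary example value for the second preset time.

import queue

SECOND_PRESET_TIME_S = 3.0

def drag_out_handshake(outgoing, incoming: queue.Queue,
                       operation_object, send_object) -> bool:
    """Send the first information; transfer the operation object only if
    the second device answers with its first trigger message in time."""
    outgoing({"type": "first_information"})
    try:
        reply = incoming.get(timeout=SECOND_PRESET_TIME_S)
    except queue.Empty:
        return False                      # no trigger message: do nothing
    if reply.get("type") != "first_trigger":
        return False
    send_object(operation_object)         # second instruction: transfer it
    return True
```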
In a second implementation, the processor 71 is configured, after a wireless connection has been established with a second electronic device, to detect a first operation trajectory of the operation; after determining that the first operation trajectory meets a fifth preset condition, to identify second information sent by the second electronic device, the second information including a second operation trajectory detected by the second electronic device; and, when it is determined that the second operation trajectory meets a sixth preset condition or that the combination of the first operation trajectory and the second operation trajectory meets a seventh preset condition, to control the operation object to be sent to the second electronic device. The second information is triggered after the second electronic device, with its projection unit facing the supporting surface, projects light corresponding to its second display content so that a second projected image including the second display content is formed on the supporting surface, and detects a second operation of an operating body on that second projected image.
In a third implementation, the processor 71 is configured to detect the operation trajectory of the operation; after determining that the operation trajectory meets a ninth preset condition, to judge whether the device is currently connected to a second electronic device and obtain a second judgment result; when the second judgment result is that it is not currently connected to the second electronic device, to generate and execute a first instruction to delete the operation object; and when the second judgment result is that it is connected to the second electronic device, to generate and execute a second instruction to send the operation object to the second electronic device.
Those skilled in the art will understand that the functions of the processing units in the electronic device of this embodiment can be understood with reference to the foregoing description of the information processing methods.
Embodiment six
An embodiment of the present invention further provides an electronic device, the electronic device being a first electronic device. Fig. 8 is a second schematic structural diagram of the electronic device of the embodiment of the present invention; as shown in Fig. 8, the electronic device is a first electronic device and includes a processor 71, a projection unit 72, a detection unit 73 and a wireless communication unit 74, wherein:
the detection unit 73 is arranged on a first surface of the first electronic device and is configured to detect an object in the detection area;
the processor 71 is configured to initiate, through the wireless communication unit 74, a wireless connection to a second electronic device when the detection result obtained by the detection unit 73 shows that the object meets a second preset condition, and to establish the wireless connection with the second electronic device after receiving a success response from it within a first preset time; it is also configured to obtain display content and control the projection unit 72 to project light corresponding to the display content;
the projection unit 72 is arranged on the first surface of the first electronic device and is configured to project the light corresponding to the display content; when the projection unit 72 faces the supporting surface of the first electronic device, a projected image including the display content can be formed on the supporting surface;
the detection unit 73 is further configured to detect, when the projected image is located in the detection area, an operation of an operating body on the projected image;
the processor 71 is further configured to determine the operation object in the display content that corresponds to the operation, to judge at least whether the operation meets a first preset condition, and to control the operation object based on the judgment result.
In a first implementation, the processor 71 is configured, after a wireless connection has been established with the second electronic device, to detect the operation trajectory of the operation and, after determining that the operation trajectory meets a third preset condition, to send first information to the second electronic device; and, after receiving a first trigger message from the second electronic device within a second preset time, to control the operation object to be sent to the second electronic device. The first trigger message is triggered after the second electronic device detects an operation trajectory that meets a fourth preset condition; more specifically, it is triggered after the second electronic device, with its projection unit facing the supporting surface, projects light corresponding to its second display content so that a second projected image including the second display content is formed on the supporting surface, detects a second operation of an operating body on that second projected image, and determines that the operation trajectory of the second operation meets an eighth preset condition.
In a second implementation, the processor 71 is configured, after a wireless connection has been established with the second electronic device, to detect a first operation trajectory of the operation; after determining that the first operation trajectory meets a fifth preset condition, to identify second information sent by the second electronic device, the second information including a second operation trajectory detected by the second electronic device; and, when it is determined that the second operation trajectory meets a sixth preset condition or that the combination of the first operation trajectory and the second operation trajectory meets a seventh preset condition, to control the operation object to be sent to the second electronic device. The second information is triggered after the second electronic device, with its projection unit facing the supporting surface, projects light corresponding to its second display content so that a second projected image including the second display content is formed on the supporting surface, and detects a second operation of an operating body on that second projected image.
In a third implementation, the processor 71 is configured to detect the operation trajectory of the operation and, after determining that the operation trajectory meets a third preset condition, to send first information to the second electronic device; and, when no trigger message from the second electronic device is received within a second preset time, or when a trigger message received within the second preset time does not meet a preset requirement, to generate and execute a first instruction to delete the operation object.
In a fourth implementation, the processor 71 is configured to detect the operation trajectory of the operation; after determining that the operation trajectory meets a ninth preset condition, to judge whether the device is currently connected to a second electronic device and obtain a second judgment result; when the second judgment result is that it is not currently connected to the second electronic device, to generate and execute a first instruction to delete the operation object; and when the second judgment result is that it is connected to the second electronic device, to generate and execute a second instruction to send the operation object to the second electronic device.
Those skilled in the art will understand that the functions of the processing units in the electronic device of this embodiment can be understood with reference to the foregoing description of the information processing methods.
In embodiments five and six, the processor 71 of the electronic device may in practice be implemented by a central processing unit (CPU, Central Processing Unit), a digital signal processor (DSP, Digital Signal Processor) or a field-programmable gate array (FPGA, Field-Programmable Gate Array) of the electronic device; the projection unit 72 may be implemented by a pico projector arranged on the first surface of the electronic device; the detection unit 73 may be implemented by a depth camera or a camera arranged on the first surface or in the hinge of the electronic device; and the wireless communication unit 74 may be implemented by a wireless communication module of the electronic device (such as a wireless network card or a Bluetooth antenna).
Embodiment seven
An embodiment of the present invention further provides an information processing system, the system including a first electronic device and a second electronic device;
the first electronic device is configured to detect an object in its detection area and, when the detection result shows that the object meets a second preset condition, to initiate a wireless connection to the second electronic device; it is also configured to obtain display content and control its own projection unit to project light corresponding to the display content, a projected image including the display content being formable on the supporting surface when the projection unit faces the supporting surface of the first electronic device; to detect, when the projected image is located in the detection area, an operation of an operating body on the projected image and determine the operation object in the display content that corresponds to the operation; and to judge at least whether the operation meets a first preset condition and, based on the judgment result, control the operation object to be sent to the second electronic device.
The second electronic device is configured, after detecting the wireless connection initiated by the first electronic device, to send a success response to the first electronic device within a first preset time so as to establish the wireless connection with the first electronic device.
Specifically, the first electronic device detects the object in its detection area and determines that the object meets the second preset condition when the shape and/or size of the object meet a predetermined condition.
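As an illustration of this shape/size test, a detected footprint might be compared against a rough notebook-sized model; the rectangular model and all thresholds below are assumptions of the sketch, not values given by the embodiment.

```python
# Illustrative second-preset-condition check: does the detected object's
# footprint look roughly like another notebook placed beside the first
# device? The rectangular model and all thresholds are assumptions.

def meets_second_condition(width_cm: float, depth_cm: float,
                           aspect_min: float = 1.2, aspect_max: float = 2.0,
                           min_area_cm2: float = 300.0) -> bool:
    """True if the footprint's area and aspect ratio fall in a plausible
    notebook-computer range, prompting the wireless-connection attempt."""
    if width_cm <= 0 or depth_cm <= 0:
        return False
    area = width_cm * depth_cm
    aspect = max(width_cm, depth_cm) / min(width_cm, depth_cm)
    return area >= min_area_cm2 and aspect_min <= aspect <= aspect_max
```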
In this embodiment, the second electronic device is configured to obtain second display content and to control its own projection unit, when it faces the supporting surface, to project light corresponding to the second display content so as to form a second projected image including the second display content on the supporting surface; it is further configured to detect, when the second projected image is located in a detection area, an operation of an operating body on the second projected image.
Specifically, in this embodiment the first electronic device and the second electronic device have the same or a similar structure: the first electronic device is provided with a first projection unit and a first detection unit, and the second electronic device is provided with a second projection unit and a second detection unit. Taking Fig. 4 as an example, where both the first and the second electronic device are notebook computers, the first projection unit is arranged on the first surface of the first electronic device and the second projection unit on the first surface of the second electronic device; the first surface is the surface presented to the user when the notebook computer is closed and laid flat on the supporting surface. When a notebook computer is in use, so that the plane of its display screen and the plane of its keyboard form a certain angle, the first projection unit of the first electronic device faces the supporting surface and can form on it a first projected image including the first display content of the first electronic device, and the second projection unit of the second electronic device faces the supporting surface and can form on it a second projected image including the second display content of the second electronic device. The first detection unit of the first electronic device has a first detection area and the second detection unit of the second electronic device has a second detection area; the first electronic device detects operations of an operating body on the first projected image through the first detection unit, and the second electronic device detects operations of an operating body on the second projected image through the second detection unit.
In a first implementation, the first electronic device is configured to detect the operation trajectory of the operation and, after determining that the operation trajectory meets a third preset condition, to send first information to the second electronic device; it is further configured, after receiving a first trigger message sent by the second electronic device, to control the operation object to be sent to the second electronic device;
the second electronic device is configured, after receiving the first information sent by the first electronic device, to send the first trigger message to the first electronic device within a second preset time, the first trigger message being triggered after the second electronic device detects an operation trajectory that meets a fourth preset condition; more specifically, the second electronic device triggers the first trigger message after its own projection unit, facing the supporting surface, projects light corresponding to the second display content so that the second projected image including the second display content is formed on the supporting surface, a second operation of an operating body on that second projected image is detected, and the operation trajectory of the second operation is determined to meet an eighth preset condition.
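For illustration, the second device's side of this exchange could be sketched as below, assuming the eighth preset condition is the inside-ward drag checked by `meets_sixth_condition` in the earlier sketch and reusing the same hypothetical transport stand-ins.

```python
# Illustrative second-device side of the first-information exchange.
# Assumptions: `incoming`/`outgoing` are the hypothetical transport
# stand-ins used earlier, `latest_trajectory()` returns the most recent
# trajectory detected on the second projected image (or None), and the
# eighth condition is taken to be the inside-ward drag checked by
# meets_sixth_condition in the earlier sketch.

import time

def answer_first_information(incoming, outgoing, latest_trajectory,
                             second_preset_time_s: float = 3.0) -> None:
    """After the first information arrives, watch the second projected
    image for a qualifying drag and answer with the first trigger message
    if one is seen within the preset time."""
    message = incoming.get()
    if message.get("type") != "first_information":
        return
    deadline = time.monotonic() + second_preset_time_s
    while time.monotonic() < deadline:
        trajectory = latest_trajectory()
        if trajectory and meets_sixth_condition(trajectory):
            outgoing({"type": "first_trigger"})
            return
        time.sleep(0.05)
```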
In a second implementation, the first electronic device is configured to detect a first operation trajectory of the operation and to receive second information sent by the second electronic device; after determining that the first operation trajectory meets a fifth preset condition, to identify the second information sent by the second electronic device; and, when it is determined that the second operation trajectory meets a sixth preset condition or that the combination of the first operation trajectory and the second operation trajectory meets a seventh preset condition, to control the operation object to be sent to the second electronic device;
the second electronic device is configured, upon detecting a second operation trajectory, to generate the second information based on the second operation trajectory and send it to the first electronic device; more specifically, the second electronic device triggers the second information after its own projection unit, facing the supporting surface, projects light corresponding to the second display content so that the second projected image including the second display content is formed on the supporting surface, and a second operation of an operating body on that second projected image is detected.
Those skilled in the art should understand that embodiments of the invention may be provided as a method, a system or a computer program product. Accordingly, the invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware. Moreover, the invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular way, so that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of protection of the present invention.

Claims (12)

1. An information processing method, the method comprising:
obtaining display content, and controlling a projection unit of a first electronic device to project light corresponding to the display content; wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image including the display content can be formed on the supporting surface;
when the projected image is located in a detection area, detecting an operation of an operating body on the projected image, and determining an operation object in the display content that corresponds to the operation;
at least judging whether the operation meets a first preset condition, and controlling the operation object based on the judgment result.
2. The method according to claim 1, wherein before detecting the operation of the operating body on the projected image, the method further comprises: detecting an object in the detection area and, when the detection result shows that the object meets a second preset condition, initiating a wireless connection to a second electronic device;
after receiving a success response from the second electronic device within a first preset time, establishing the wireless connection with the second electronic device.
3. The method according to claim 2, wherein the object meeting the second preset condition comprises: determining that the object meets the second preset condition when a shape and/or size of the object meets a predetermined condition.
4. The method according to claim 1, wherein at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
after the first electronic device has established a wireless connection with a second electronic device, detecting an operation trajectory of the operation and, after determining that the operation trajectory meets a third preset condition, sending first information to the second electronic device;
after receiving a first trigger message from the second electronic device within a second preset time, controlling the operation object to be sent to the second electronic device; wherein the first trigger message is triggered after the second electronic device detects an operation trajectory that meets a fourth preset condition.
5. The method according to claim 1, wherein at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
after the first electronic device has established a wireless connection with a second electronic device, detecting a first operation trajectory of the operation and, after determining that the first operation trajectory meets a fifth preset condition, identifying second information sent by the second electronic device, the second information including a second operation trajectory detected by the second electronic device;
when the first electronic device determines that the second operation trajectory meets a sixth preset condition, or that the combination of the first operation trajectory and the second operation trajectory meets a seventh preset condition, controlling the operation object to be sent to the second electronic device.
6. The method according to claim 4, wherein the first trigger message is triggered after the second electronic device, with its projection unit facing the supporting surface, projects light corresponding to second display content of the second electronic device so that a second projected image including the second display content is formed on the supporting surface, detects a second operation of an operating body on the second projected image, and determines that an operation trajectory of the second operation meets an eighth preset condition.
7. The method according to claim 2, wherein at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
detecting an operation trajectory of the operation and, after determining that the operation trajectory meets a third preset condition, sending first information to the second electronic device;
when no trigger message from the second electronic device is received within a second preset time, or when a trigger message from the second electronic device received within the second preset time does not meet a preset requirement, generating and executing a first instruction to delete the operation object.
8. The method according to claim 1, wherein at least judging whether the operation meets the first preset condition and controlling the operation object based on the judgment result comprises:
detecting an operation trajectory of the operation and, after determining that the operation trajectory meets a ninth preset condition, judging whether the first electronic device is currently connected to a second electronic device to obtain a second judgment result;
when the second judgment result is that the first electronic device is not currently connected to the second electronic device, generating and executing a first instruction to delete the operation object;
when the second judgment result is that the first electronic device is connected to the second electronic device, generating and executing a second instruction to send the operation object to the second electronic device.
9. An electronic device, the electronic device being a first electronic device, the electronic device comprising:
a processor, configured to obtain display content and control a projection unit to project light corresponding to the display content;
the projection unit, arranged on a first surface of the first electronic device and configured to project the light corresponding to the display content; wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image including the display content can be formed on the supporting surface;
a detection unit, arranged on the first surface of the first electronic device and configured to detect, when the projected image is located in a detection area, an operation of an operating body on the projected image;
the processor being further configured to determine an operation object in the display content that corresponds to the operation, to at least judge whether the operation meets a first preset condition, and to control the operation object based on the judgment result.
10. The electronic device according to claim 9, wherein the electronic device further comprises a wireless communication unit;
the detection unit is further configured to detect an object in the detection area;
the processor is further configured to initiate, through the wireless communication unit, a wireless connection to a second electronic device when the detection result obtained by the detection unit shows that the object meets a second preset condition, and to establish the wireless connection with the second electronic device after receiving a success response from the second electronic device within a first preset time.
11. An information processing system, the system comprising a first electronic device and a second electronic device;
the first electronic device being configured to detect an object in a detection area and, when the detection result shows that the object meets a second preset condition, to initiate a wireless connection to the second electronic device; further to obtain display content and control its own projection unit to project light corresponding to the display content, wherein, when the projection unit faces a supporting surface of the first electronic device, a projected image including the display content can be formed on the supporting surface; when the projected image is located in the detection area, to detect an operation of an operating body on the projected image and determine an operation object in the display content that corresponds to the operation; and to at least judge whether the operation meets a first preset condition and, based on the judgment result, control the operation object to be sent to the second electronic device;
the second electronic device being configured, after detecting the wireless connection initiated by the first electronic device, to send a success response to the first electronic device within a first preset time so as to establish the wireless connection with the first electronic device.
12. The system according to claim 11, wherein the second electronic device is configured to obtain second display content and to control its own projection unit, when facing the supporting surface, to project light corresponding to the second display content so as to form a second projected image including the second display content on the supporting surface; and further to detect, when the second projected image is located in a detection area, an operation of an operating body on the second projected image.