CN110618775B - Electronic device for interactive control - Google Patents


Info

Publication number
CN110618775B
CN110618775B (application CN201810632219.1A)
Authority
CN
China
Prior art keywords
electronic device
predetermined
image
display
relative position
Prior art date
Legal status
Active
Application number
CN201810632219.1A
Other languages
Chinese (zh)
Other versions
CN110618775A (en)
Inventor
柯杰斌
Current Assignee
Acer Inc
Original Assignee
Acer Inc
Priority date
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN201810632219.1A priority Critical patent/CN110618775B/en
Publication of CN110618775A publication Critical patent/CN110618775A/en
Application granted granted Critical
Publication of CN110618775B publication Critical patent/CN110618775B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a first electronic device and a second electronic device that can be applied to interactive control. The first electronic device includes a first force-sensitive component, a camera module, and a processing circuit; the second electronic device includes a second force-sensitive component, a display module, and a processing circuit. At least one of the first force-sensitive component and the second force-sensitive component is used to detect a contact event between the first electronic device and the second electronic device. The display module displays at least one predetermined image, and the camera module captures at least one local image of the at least one predetermined image, allowing the first electronic device to determine at least one relative position of the first electronic device with respect to the second electronic device for use by the second electronic device; in particular, the display module then displays display content corresponding to the at least one relative position. The invention has the advantage that an interaction mechanism between devices can be realized in a limited space.

Description

Electronic device for interactive control
Technical Field
The present invention relates to user interface control, and more particularly, to an electronic device for interactive control.
Background
According to the prior art, interaction mechanisms are found in some electronic devices, such as multifunctional mobile phones. Such interaction mechanisms typically provide human-machine interaction rather than machine-machine interaction. Some proposals have been made in the prior art to design interaction between machines; however, problems arise, such as high relative cost, a large space occupied by the overall structure, and the like. Therefore, a novel method and related architecture are needed to implement an interaction mechanism between devices with no side effects, or with a reduced likelihood of side effects.
Disclosure of Invention
An objective of the present invention is to disclose an electronic device applicable to interactive control, so as to solve the above-mentioned problems.
Another objective of the present invention is to disclose an electronic device applicable to interactive control, so as to realize an interaction mechanism between devices under the condition of no side effect or less possibility of side effect.
At least one embodiment of the present invention discloses a first electronic device, which can be applied to interactive control involving the first electronic device and a second electronic device. The first electronic device comprises: a first force-sensitive component; a camera module; and a processing circuit coupled to the first force-sensitive component and the camera module. The first force-sensitive component is operable to perform force-related sensing for the first electronic device, wherein at least one of the first force-sensitive component and a second force-sensitive component of the second electronic device is operable to detect a contact event between the first electronic device and the second electronic device. The camera module can be used to capture at least one local image (partial image) of at least one predetermined image, wherein a display module of the second electronic device displays the at least one predetermined image. The processing circuit can be used to control the operation of the first electronic device, wherein the processing circuit determines at least one relative position of the first electronic device with respect to the second electronic device according to the at least one local image, for use by the second electronic device, and the display module of the second electronic device displays display content corresponding to the at least one relative position.
At least one embodiment of the present invention discloses a second electronic device, which can be applied to interactive control involving a first electronic device and the second electronic device. The second electronic device comprises: a second force-sensitive component; a display module; and a processing circuit coupled to the second force-sensitive component and the display module. The second force-sensitive component is configured to perform force-related sensing for the second electronic device, wherein at least one of a first force-sensitive component of the first electronic device and the second force-sensitive component is configured to detect a contact event between the first electronic device and the second electronic device. The display module can be used to display at least one predetermined image, wherein a camera module of the first electronic device captures at least one local image of the at least one predetermined image to allow the first electronic device to determine at least one relative position of the first electronic device with respect to the second electronic device, for use by the second electronic device. The processing circuit can be used to control the operation of the second electronic device, wherein the processing circuit controls the display module to display the at least one predetermined image and controls the display module to display the display content corresponding to the at least one relative position.
One of the benefits of the present invention is that it can properly control the operations of the electronic devices and, in particular, can implement an interaction mechanism between the devices in a limited space, thereby avoiding various problems of the prior art. In addition, implementations according to the related embodiments of the invention do not add much extra cost. Therefore, the problems of the prior art can be solved without greatly increasing the overall cost.
Drawings
Fig. 1 is a schematic diagram of an interactive system (interaction system) according to an embodiment of the invention.
Fig. 2 is a flowchart illustrating a method for interactive control according to an embodiment of the present invention.
FIG. 3 shows details of an embodiment of the interactive system.
FIG. 4 shows details of an implementation of the interactive system in another embodiment.
FIG. 5 shows details of an implementation of the interactive system in another embodiment.
FIG. 6 shows details of an implementation of the interactive system in another embodiment.
FIG. 7 illustrates some examples of the predetermined image in the method.
FIG. 8 illustrates the workflow of the method in one embodiment.
Wherein the reference numerals are as follows:
100 interactive system
110, 110M, 110C, 110A, 120, 120T electronic devices
112, 122 processing circuits
114, 114F, 114B, 114C, 114A, 124 force-sensitive components
116, 126 communication modules
118 camera module
118F, 118B, 118C, 118A cameras
128 display module
200, 400 workflows
210 to 250, S11 to S19 steps
Detailed Description
Fig. 1 is a schematic diagram of an interactive system 100 according to an embodiment of the invention. The interactive system 100 may include electronic devices 110 and 120. Examples of electronic device 110 may include (but are not limited to): tablet computers, multi-function mobile phones, wearable devices, simple input gadgets (gadgets), toys, and the like; while examples of electronic device 120 may include (but are not limited to): interactive tables, all-in-one Personal Computers (PCs), notebook computers, tablet computers, multifunctional mobile phones, toys of larger size, and the like; the size of the electronic device 120 is usually larger than that of the electronic device 110, for example, the size may be more than 8 inches, but the invention is not limited thereto.
As shown in fig. 1, the electronic device 110 may include a processing circuit 112 (e.g., a processor, a microprocessor, an MCU, etc.), at least one force-sensitive component 114 (e.g., one or more force-sensitive components such as force sensors, pressure sensors, etc.), a communication module 116 (e.g., a wireless communication module conforming to a predetermined communication protocol such as Bluetooth or Wi-Fi), and a camera module 118, and the electronic device 120 may include a processing circuit 122 (e.g., a processor, a microprocessor, an MCU, etc.), at least one force-sensitive component 124 (e.g., one or more force-sensitive components such as force sensors, pressure sensors, etc.), a communication module 126 (e.g., a wireless communication module conforming to the predetermined communication protocol such as Bluetooth or Wi-Fi), and a display module 128, although the invention is not limited thereto. The architecture of the interactive system 100 may vary so long as it does not interfere with the practice of the present invention. For example: the number, type, arrangement, etc. of the components of either of the electronic devices 110 and 120 may vary. Another example is: some components of either of the electronic devices 110 and 120 may be integrated in the same module.
According to the present embodiment, the processing circuits 112 and 122 may be used to control operations of the electronic devices 110 and 120, respectively, and the force-sensitive components 114 and 124 may be used to perform force-related sensing for the electronic devices 110 and 120, respectively (e.g., sensing pressure, such as the pressure applied by one of the electronic devices 110 and 120 to the other). In addition, the communication modules 116 and 126 can be used for wireless communication between the electronic devices 110 and 120, to allow the electronic devices 110 and 120 to exchange information. Under the control of the processing circuit 112, the camera module 118 can capture one or more images for the electronic device 110, such as at least a portion (e.g., a part or all) of a predetermined image displayed by the display module 128, for use in interactive control. Under the control of the processing circuit 122, the display module 128 may display the predetermined image for the electronic device 120 for further use by the electronic device 110, or may display an image related to the interactive control (such as a modified image corresponding to the interactive control), so as to enhance the user experience.
Fig. 2 shows a workflow 200 of a method for interactive control according to an embodiment of the invention, wherein the method is applicable to the interactive system 100, the electronic devices 110 and 120, the processing circuits 112 and 122, and the other components shown in fig. 1. The processing circuits 112 and 122 may control the respective operations of the electronic devices 110 and 120 according to the method. For ease of understanding, the first and second electronic devices in the workflow 200 may be described as the electronic devices 110 and 120, respectively, and their respective components (e.g., the first and second force-sensitive components) may be described as the components of the electronic devices 110 and 120, respectively (e.g., the force-sensitive components 114 and 124).
In step 210, the interactive system 100 (e.g., at least one of the electronic devices 110 and 120, such as the electronic devices 110 and/or 120, and at least one of the processing circuits 112 and 122, such as the processing circuits 112 and/or 122) may detect a contact event between the electronic devices 110 and 120 by using at least one of the force-sensitive component 114 of the electronic device 110 and the force-sensitive component 124 of the electronic device 120 (e.g., the force-sensitive components 114 and/or 124). In particular, the interactive system 100 may sense a pressure between the electronic devices 110 and 120 using the at least one of the force-sensitive components 114 and 124, and confirm that the contact event occurs when at least one condition is satisfied, wherein the at least one condition may include that the pressure falls within a predetermined pressure range. For example, the at least one condition may further include that the time during which the pressure falls within the predetermined pressure range reaches a predetermined time threshold.
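The pressure-plus-duration check of step 210 can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the function name, the sample format, and the numeric bounds are assumptions (the description later mentions, for example, 50 grams per square centimeter and 3 seconds as possible predetermined values).

```python
# Hypothetical sketch of the step-210 check: a contact event is confirmed once
# the sensed pressure stays inside a predetermined range for a predetermined time.
# The (timestamp, pressure) sample format and the bounds below are assumptions.
def contact_event(samples, p_range=(40.0, 60.0), hold_s=3.0):
    """samples: iterable of (timestamp_s, pressure_g_per_cm2) pairs in time order."""
    start = None  # time at which the pressure first entered the range
    for t, p in samples:
        if p_range[0] <= p <= p_range[1]:
            if start is None:
                start = t
            if t - start >= hold_s:
                return True  # condition held long enough: contact confirmed
        else:
            start = None  # pressure left the range: restart the timer
    return False
```

For instance, a steady 50 g/cm² reading held from t=0 s to t=3 s would confirm the event, while a reading that drops out of range resets the timer.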
In step 220, the interactive system 100 can display at least one predetermined image (e.g., one or more predetermined images) by using the display module 128 of the electronic device 120. The processing circuit 122 may control the display module 128 to display the at least one predetermined image. For example, the at least one predetermined image may be stored in a memory (not shown) of the electronic device 120 in advance, and the at least one predetermined image (read from the memory) may be displayed in step 220, but the invention is not limited thereto.
In step 230, the interactive system 100 may capture at least one local image (e.g., one or more local images) of the at least one predetermined image by using the camera module 118 of the electronic device 110.
In step 240, the interactive system 100 may determine at least one relative position (e.g., one or more relative positions) of the electronic device 110 to the electronic device 120 by using the electronic device 110 (particularly, the processing circuit 112 therein) according to the at least one local image for the electronic device 120 to use. For example: the electronic device 110 may store the at least one predetermined image in a memory (not shown) of the electronic device 110 in advance, and may determine the at least one relative position according to a position of the at least one local image on the at least one predetermined image (read from the memory) in step 240. Another example is: based on a predetermined rule, the at least one local image carries position information to indicate the at least one relative position, and based on the predetermined rule, the electronic device 110 can extract the position information from the at least one local image to determine the at least one relative position according to the position information.
In step 250, the interactive system 100 can utilize the display module 128 of the electronic device 120 to display the display content corresponding to the at least one relative position, such as the modified image. The processing circuit 122 may control the display module 128 to display the display content corresponding to the at least one relative position, such as corresponding auxiliary information, icons (icon), symbols, and the like. For example, the operation of step 250 may be performed in response to one or more actions of the user placing the electronic device 110 at the at least one relative position.
According to some embodiments, one or more steps may be added, modified, or removed from the workflow 200. According to some embodiments, the operation(s) of one of the electronic devices 110 and 120 may be performed by the other of the electronic devices 110 and 120, as long as the implementation of the present invention is not affected.
FIG. 3 shows details of an embodiment of the interactive system 100. The electronic devices 110M and 120T may be used as examples of the electronic devices 110 and 120, respectively. For convenience of understanding, the electronic devices 110M and 120T may be implemented as a multifunctional mobile phone and an all-in-one personal computer, respectively, and may be provided with respective touch screens, which may have the force-sensitive components 114F and 124 built in, respectively, wherein the force-sensitive component 114F, the camera 118F (especially, a front camera), and the touch screen of the electronic device 120T may be taken as examples of the force-sensitive component 114, the camera module 118, and the display module 128, respectively, but the invention is not limited thereto. In some embodiments, the force-sensitive components 114F and 124 can be disposed above or below the respective touch screens. In some embodiments, the force-sensitive components 114F and 124 may be transparent.
FIG. 4 shows details of an implementation of the interactive system 100 in another embodiment. The force-sensitive component 114B and the camera 118B (especially, the rear camera) can be used as examples of the force-sensitive component 114 and the camera module 118, respectively, but the invention is not limited thereto. For brevity, the parts of the present embodiment that are similar to the above-mentioned embodiments are not repeated.
According to some embodiments, the force sensitive components 114 may include force sensitive components 114F and 114B, and the camera module 118 may include cameras 118F and 118B.
FIG. 5 shows details of an implementation of the interactive system 100 in another embodiment. The electronic device 110C may be an example of the electronic device 110. For ease of understanding, the electronic device 110C may be implemented as a toy (e.g., a chess piece), wherein the force-sensitive component 114C and the camera 118C may be used as examples of the force-sensitive component 114 and the camera module 118, respectively, and the force-sensitive component 114C may be transparent, but the invention is not limited thereto. For brevity, the parts of this embodiment that are similar to the above-mentioned embodiments are not repeated.
FIG. 6 shows details of an implementation of the interactive system 100 in another embodiment. The electronic device 110A may be an example of the electronic device 110. For ease of understanding, the electronic device 110A may be implemented as a simple input gadget, wherein the force-sensitive component 114A and the camera 118A may be taken as examples of the force-sensitive component 114 and the camera module 118, respectively, and the force-sensitive component 114A may be located around the camera 118A, but the invention is not limited thereto. For brevity, the parts of this embodiment that are similar to the above-mentioned embodiments are not repeated.
FIG. 7 illustrates some examples of the predetermined image in the method. For ease of understanding, the electronic device 110A, such as the simple input gadget, is illustrated in fig. 7 to indicate that different relative positions of the device may correspond to different position information carried by the predetermined image.
Example (a) includes a series of predetermined images that are used to sequentially display a horizontal black band (which may be shown shaded in fig. 7) that moves downward. When the occurrence of the contact event is confirmed, the processing circuit 122 may control the display module 128 to start displaying the series of predetermined images, such that the horizontal black band moves downward at a steady speed (e.g., a first speed). Since the display time of each predetermined image in the series and the total display time of the series can be known, when the horizontal black band is detected by the camera 118A, the electronic device 110A (e.g., the processing circuit 112) can determine a time length (such as the length of the time interval between the time point at which the occurrence of the contact event is confirmed and the current time point), and determine the Y coordinate value of the relative position of the electronic device 110A according to the ratio of this time length to the total display time.
Example (b) includes another series of predetermined images that are used to sequentially display a vertical black band (which may be shown shaded in fig. 7) that moves to the right. After the display of the first series of predetermined images is completed, the processing circuit 122 may control the display module 128 to start displaying this other series of predetermined images, so that the vertical black band moves to the right at a steady speed (e.g., a second speed). Since the display time of each predetermined image in this other series and the total display time of this other series can be known, when the vertical black band is detected by the camera 118A, the electronic device 110A (e.g., the processing circuit 112) can determine a time length (such as the length of the time interval between the time point at which this other series starts to be displayed and the current time point), and determine the X coordinate value of the relative position of the electronic device 110A according to the ratio of this time length to the total display time.
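Under the timing scheme of examples (a) and (b), each coordinate follows from the ratio of the elapsed time to the known total display time of the sweep. A minimal sketch, in which the function and parameter names are illustrative assumptions rather than terms from this disclosure:

```python
def axis_coordinate(t_detect, t_sweep_start, total_display_s, axis_length_px):
    """Map the moment the camera first sees the moving band to a coordinate.

    t_sweep_start is the moment the band sweep began (the contact-event time for
    the Y sweep of example (a); the start time of the second series for the X
    sweep of example (b)); total_display_s is the known total display time.
    """
    ratio = (t_detect - t_sweep_start) / total_display_s
    return ratio * axis_length_px
```

For instance, if the downward sweep lasts 3 seconds and the band is detected 1.5 seconds after the contact event on a display 1080 pixels tall, the Y coordinate would be axis_coordinate(1.5, 0.0, 3.0, 1080) = 540.0, i.e., halfway down the screen.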
Example (c) includes a multi-region image that can be divided into a plurality of regions, wherein both the distribution of the average brightness across the plurality of regions and the brightness distribution within each region follow the predetermined rule. For example: the average brightness values of the plurality of regions may differ from one another and may vary with the position of the region, in particular, increasing from bottom to top and from left to right; and the brightness within each of the plurality of regions may vary with position, in particular, decreasing from bottom to top and from left to right; but the invention is not limited thereto. In some examples, the predetermined rule may be changed and the multi-region image may be modified accordingly.
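If the predetermined rule of example (c) gives every region a distinct expected average brightness, a captured local image can be mapped to the region whose expected level is closest to the measured average. The following sketch assumes such a rule; the region-id/brightness table is an illustrative stand-in, not a rule defined by this disclosure:

```python
# Illustrative decoder for example (c): choose the region whose expected average
# brightness (under the assumed predetermined rule) is closest to the measurement.
def decode_region(measured_avg_brightness, region_levels):
    """region_levels: dict mapping region id, e.g. (col, row) -> expected brightness."""
    return min(region_levels,
               key=lambda region: abs(region_levels[region] - measured_avg_brightness))
```

For a 2x2 layout with expected levels {(0, 0): 32, (0, 1): 96, (1, 0): 160, (1, 1): 224}, a measured average of 100 would decode to region (0, 1).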
Example (d) includes a multi-region image that can be divided into a plurality of sets of regions, where two adjacent regions can be separated by black bars, and wherein the respective colors of the sets of regions (which may be represented by various types of shading in fig. 7) and the distribution of the average brightness of the regions within each set follow the predetermined rule. For example: the respective colors of the sets of regions may differ from one another; and the average brightness values of the regions within each set may differ from one another and may vary with the position of the region, in particular, increasing from left to right; but the invention is not limited thereto. In some examples, the predetermined rule may be changed and the multi-region image may be modified accordingly.
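Example (d) suggests a two-stage decoding: the measured color identifies the set of regions, and the measured average brightness identifies the region within that set. The following sketch assumes a hue-per-set palette and a fixed brightness step between adjacent regions; both are illustrative assumptions, not values from this disclosure:

```python
# Illustrative two-stage decoder for example (d): color selects the set,
# brightness selects the region within the set. set_hues and level_step are
# assumed parameters of the predetermined rule, not disclosed values.
def decode_set_and_region(measured_hue, measured_brightness, set_hues, level_step):
    # stage 1: nearest expected hue identifies the set of regions
    set_idx = min(range(len(set_hues)),
                  key=lambda i: abs(set_hues[i] - measured_hue))
    # stage 2: brightness increases left to right in steps of level_step
    region_idx = round(measured_brightness / level_step)
    return set_idx, region_idx
```

For example, with expected hues [0, 120, 240] and a brightness step of 32, a measurement of hue 118 and brightness 96 would decode to set 1, region 3.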
According to some embodiments, the number and the way of distinguishing the plurality of regions in example (c) may be varied. According to some embodiments, the number of sets and the way of distinguishing the plurality of sets of regions and/or the number of the plurality of regions and the way of distinguishing in example (d) may be varied.
According to some embodiments, examples of the at least one predetermined image may include (but are not limited to): images having one or a combination of various image characteristics, such as color, shape, size, and distribution, as well as actual images, such as a landscape photograph.
Fig. 8 shows a workflow 400 of the method in one embodiment. For ease of understanding, the electronic device 110 may be placed on the electronic device 120 by a user, causing pressure therebetween, wherein the electronic devices 110 and 120 may be regarded as an upper device and a lower device, respectively. In steps S11-S14, the electronic devices 110 and 120 (such as the upper device and the lower device) can sense the pressure through the force-sensitive components 114 and 124, respectively, in order to identify each other, and in steps S15-S18 they can perform positioning using the at least one predetermined image to determine the at least one relative position for further use in step S19.
In step S11, the electronic devices 110 and 120 may turn on the force- sensitive elements 114 and 124, respectively.
In step S12, the electronic devices 110 and 120 may sense the pressure by using the force- sensitive elements 114 and 124, respectively.
In step S13, at least one of the electronic devices 110 and 120 (e.g., the processing circuit 112 and/or 122) can determine, according to the magnitude and duration of the pressure, whether the two electronic devices 110 and 120 are in an up-down (stacked) relationship, and receive the device attribute. Each of the processing circuits 112 and 122 may determine that the at least one condition is satisfied; in particular, the pressure may fall within the predetermined pressure range (e.g., around 50 grams per square centimeter, or another predetermined value), and the time during which the pressure falls within the predetermined pressure range may reach the predetermined time threshold (e.g., 3 seconds, or another predetermined value). Through the wireless communication between the communication modules 116 and 126, the processing circuit 112 may inform the processing circuit 122 that the at least one condition is satisfied and transmit a device attribute of the electronic device 110, which the processing circuit 122 may receive to confirm the occurrence of the contact event.
In step S14, the electronic device 120 (e.g., the processing circuit 122) can define which of the two electronic devices 110 and 120 is the upper device and which is the lower device; in this example, the electronic devices 110 and 120 are defined as the upper device and the lower device, respectively.
In step S15, the lower device (which is the electronic device 120 in this example) may display a predetermined image (or images) for positioning, such as any one of the above-listed examples (a) to (d).
In step S16, the upper device (which is the electronic device 110 in this example) may capture a local image (or a plurality of local images) of the predetermined image (or of each of the plurality of predetermined images).
In step S17, the upper device may calculate its spatial information, such as the at least one relative position, according to the local image (or the plurality of local images).
In step S18, the upper device may output the spatial information of the upper device to the lower device.
In step S19, the lower device may display a corresponding user interface (e.g., display content corresponding to the at least one relative position, such as corresponding auxiliary information, icons, symbols, etc.) around the upper device.
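The S11-S19 exchange above can be summarized in code. The device objects and every method name below are illustrative placeholders for the roles described in the workflow, not an API defined by this disclosure:

```python
# Hypothetical sketch of workflow 400; `upper` and `lower` stand for the
# electronic devices 110 and 120, and all method names are assumed placeholders.
def run_interaction(upper, lower):
    upper.enable_force_sensing()                      # S11: turn on force sensing
    lower.enable_force_sensing()
    p_upper = upper.sense_pressure()                  # S12: both devices sense pressure
    p_lower = lower.sense_pressure()
    if not lower.confirm_contact(p_upper, p_lower):   # S13: magnitude/duration check
        return None
    lower.assign_roles()                              # S14: define upper vs lower device
    lower.show_positioning_image()                    # S15: display positioning image(s)
    local_image = upper.capture_local_image()         # S16: capture local image(s)
    position = upper.locate(local_image)              # S17: compute spatial information
    lower.receive_position(position)                  # S18: output over the wireless link
    lower.show_ui_around(position)                    # S19: display UI around upper device
    return position
```

A caller would supply two objects implementing these methods, one per device, and obtain the computed relative position (or None if no contact event is confirmed).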
According to some embodiments, one or more steps may be added, modified, or removed from the workflow 400.
According to some embodiments, an inter-device relationship between the electronic devices 110 and 120 may include the predetermined pressure range and the predetermined time threshold. For example, the electronic devices 110 and 120 may be connected to a network (such as the internet) in advance, and the inter-device relationship, such as a record that the devices have previously performed either of the workflows 200 and 400 together, may be obtained from a history, wherein the inter-device relationship may be uploaded to a database in advance for the electronic devices 110 and 120 to download when needed. According to the inter-device relationship, the electronic devices 110 and 120 can confirm the occurrence of the contact event and/or define which device is the upper device and which is the lower device. According to some embodiments, the inter-device relationship may further include device attributes, such as the size, weight, and model of each of the electronic devices 110 and 120. According to some embodiments, the inter-device relationship may be determined by means of other information, such as: user accounts, friendships (e.g., friends on a social platform), or wireless connection status (e.g., whether a wireless signal of the peer device can be detected).
According to some embodiments, the electronic device 110 (e.g., the processing circuit 112) may perform an image analysis according to the at least one local image to determine a relative height (Z-coordinate value), an inclination angle, a rotation angle, etc. of the electronic device 110 with respect to the electronic device 120, so that the electronic device 120 can correspondingly modify the user interface displayed by the display module 128.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A first electronic device for performing interactive control between the first electronic device and a second electronic device, comprising:
a first force-sensitive component for detecting a contact event between the first electronic device and the second electronic device;
a camera module for capturing at least one local image of at least one predetermined image, wherein a display module of the second electronic device displays the at least one predetermined image; and
a processing circuit, coupled to the first force-sensitive component and the camera module, for controlling an operation of the first electronic device, wherein the at least one predetermined image is a moving image comprising a series of predetermined images; the processing circuit determines at least one relative position of the first electronic device relative to the second electronic device for use by the second electronic device according to the at least one local image, a total display time of the series of predetermined images, and a time length between the contact event and a time when the camera module captures the at least one local image; and the display module of the second electronic device displays a display content corresponding to the at least one relative position.
2. The first electronic device of claim 1, wherein, in connection with detecting the contact event, the first force-sensitive component is configured to sense a pressure between the first electronic device and the second electronic device, and the occurrence of the contact event is confirmed when at least one condition is satisfied, wherein the at least one condition includes that the pressure falls within a predetermined pressure range.
3. The first electronic device of claim 2, wherein the at least one condition further comprises that a time for which the pressure falls within the predetermined pressure range reaches a predetermined time threshold.
4. The first electronic device of claim 1, wherein the first electronic device pre-stores the at least one predetermined image and determines the at least one relative position according to a position of the at least one local image on the pre-stored at least one predetermined image.
5. The first electronic device of claim 1, wherein the at least one local image carries position information indicating the at least one relative position based on a predetermined rule; and based on the predetermined rule, the first electronic device extracts the position information from the at least one local image so as to determine the at least one relative position according to the position information.
6. A second electronic device for performing interactive control between a first electronic device and the second electronic device, comprising:
a second force-sensitive component for detecting a contact event between the first electronic device and the second electronic device;
a display module for displaying at least one predetermined image, wherein a camera module of the first electronic device captures at least one local image of the at least one predetermined image, the at least one predetermined image being a moving image comprising a series of predetermined images, allowing the first electronic device to determine at least one relative position of the first electronic device relative to the second electronic device for use by the second electronic device according to the at least one local image, a total display time of the series of predetermined images, and a time length between the contact event and a time when the camera module captures the at least one local image; and
a processing circuit, coupled to the second force-sensitive component and the display module, for controlling an operation of the second electronic device, wherein the processing circuit controls the display module to display the at least one predetermined image and controls the display module to display a display content corresponding to the at least one relative position.
7. The second electronic device of claim 6, wherein, in connection with detecting the contact event, the second force-sensitive component is configured to sense a pressure between the first electronic device and the second electronic device, and the occurrence of the contact event is confirmed when at least one condition is satisfied, wherein the at least one condition includes that the pressure falls within a predetermined pressure range.
8. The second electronic device of claim 7, wherein the at least one condition further comprises that a time for which the pressure falls within the predetermined pressure range reaches a predetermined time threshold.
9. The second electronic device of claim 6, wherein the second electronic device pre-stores the at least one predetermined image for display by the display module; and the first electronic device pre-stores the at least one predetermined image, and determines the at least one relative position according to a position of the at least one local image on the at least one predetermined image pre-stored in the first electronic device.
10. The second electronic device of claim 6, wherein the at least one local image carries position information indicating the at least one relative position based on a predetermined rule; and based on the predetermined rule, the first electronic device extracts the position information from the at least one local image so as to determine the at least one relative position according to the position information.
CN201810632219.1A 2018-06-19 2018-06-19 Electronic device for interactive control Active CN110618775B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810632219.1A CN110618775B (en) 2018-06-19 2018-06-19 Electronic device for interactive control

Publications (2)

Publication Number Publication Date
CN110618775A CN110618775A (en) 2019-12-27
CN110618775B CN110618775B (en) 2022-10-14

Family

ID=68920258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810632219.1A Active CN110618775B (en) 2018-06-19 2018-06-19 Electronic device for interactive control

Country Status (1)

Country Link
CN (1) CN110618775B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101002455A (en) * 2004-06-04 2007-07-18 B. F. Ghassabian System to enhance data entry in mobile and fixed environment
CN101924817A (en) * 2009-06-10 2010-12-22 Inventec (Nanjing) Technology Co., Ltd. Electronic device and shooting method thereof
CN203554618U (en) * 2013-11-22 2014-04-16 Wistron Corporation Test equipment used for image test on electronic device
CN104765443A (en) * 2014-01-03 2015-07-08 Egismos Technology Corporation Image type virtual interaction device and implementation method thereof
US9244598B1 (en) * 2013-03-05 2016-01-26 Christina David Interactive centerpiece system
CN105988663A (en) * 2011-05-09 2016-10-05 Cho-Yi Lin Display apparatus, electronic apparatus, hand-wearing apparatus and control system
CN106104425A (en) * 2014-03-17 2016-11-09 Google Inc. Adjusting information depth based on user's attention
CN107391060A (en) * 2017-04-21 2017-11-24 Alibaba Group Holding Ltd. Method for displaying image, device, system and equipment, computer-readable recording medium
KR20180043605A (en) * 2016-10-20 2018-04-30 Samsung Electronics Co., Ltd. Method for providing content and electronic device thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9105221B2 (en) * 2012-03-02 2015-08-11 Realtek Semiconductor Corp. Multimedia interaction system and related computer program product capable of blocking multimedia interaction commands that against interactive rules


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant