CN112732158B - Wearable device and control method of wearable device - Google Patents

Wearable device and control method of wearable device

Info

Publication number
CN112732158B
CN112732158B (application number CN202011645241.3A)
Authority
CN
China
Prior art keywords
touch
sensor
display interface
projection display
skin
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011645241.3A
Other languages
Chinese (zh)
Other versions
CN112732158A (en)
Inventor
杨航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202011645241.3A
Publication of CN112732158A
Application granted
Publication of CN112732158B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/163: Wearable computers, e.g. on a belt
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a wearable device and a control method of the wearable device. The device includes: a body having a first side and a second side, the second side being for contacting the user's skin; a projection assembly arranged on the first side of the body and used for projecting the content to be displayed onto the skin of the user; a first sensor arranged on the body at the second side and used for acquiring blood pressure information of the user; a second sensor arranged on the body at the first side and used for acquiring position information of a touch object touching the skin of the user; and a processor connected to the projection assembly, the first sensor and the second sensor. The processor controls the projection assembly to operate according to the blood pressure information acquired by the first sensor and the position information of the touch object acquired by the second sensor, so that interaction between the user and the projection display interface can be realized.

Description

Wearable device and control method of wearable device
Technical Field
The application belongs to the technical field of electronic equipment control, and particularly relates to a wearable device and a control method of the wearable device.
Background
With the popularization of smart wearable devices, the content displayed on their screens has become richer. However, while smart wearable devices bring users more convenient service functions, the volume of such devices is limited, and so is the size of their display screens, so a better display experience cannot be provided to the user. For this reason, in the related art, the content displayed on the screen of the wearable device may be presented by projection.
However, the above scheme does not enable interaction between the user and the projection display interface.
Disclosure of Invention
The embodiments of the present application aim to provide a wearable device, which can solve the problem that a wearable device cannot realize interaction between the user and a projection display interface.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a wearable device, including:
a body having a first side and a second side for contacting a user's skin;
the projection assembly is arranged on the first side face of the body and is used for projecting and displaying content to be displayed onto the skin of a user;
the first sensor is arranged on the body, is positioned on the second side face and is used for acquiring blood pressure information of a user;
the second sensor is arranged on the body, is positioned on the first side face and is used for acquiring position information of a touch object touching the skin of a user;
the processor is arranged on the body and is respectively connected with the projection assembly, the first sensor and the second sensor, and the processor controls the projection assembly to work according to the blood pressure information acquired by the first sensor and the position information of the touch body acquired by the second sensor.
In a second aspect, an embodiment of the present application provides a method for controlling a wearable device, where the method is applied to the wearable device according to the first aspect, and the method includes:
acquiring blood pressure information acquired by a first sensor;
under the condition that the blood pressure information meets a preset condition, acquiring position information, acquired by a second sensor, of a touch object touching the skin of the user;
and controlling the projection assembly to work according to the position information of the touch object.
In a third aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the second aspect.
In a fourth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the second aspect.
In the embodiment of the application, by providing the projection assembly, the wearable device can project the content to be displayed onto the skin of the user, so that the content is presented enlarged and is convenient for the user to view and operate. Further, the processor may determine, according to the blood pressure information acquired by the first sensor, whether the projection display interface projected onto the user's skin is touched by a touch object, and, when it is, control the projection assembly to operate according to the position information of the touch object acquired by the second sensor. In this way, while the projection display interface is displayed enlarged on the user's skin, the user can operate it directly: the first sensor and the second sensor together determine the position at which the touch object touches the projection display interface. Interaction between the user and the projection display interface is thus realized, misoperation caused by the small screen of the wearable device is avoided, the accuracy of touch-operation recognition is improved, and the efficiency of touch operation on the wearable device is improved.
Drawings
Fig. 1 is a schematic structural diagram of a wearable device provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a system of a wearable device according to an embodiment of the present application;
fig. 3 is a schematic view of a usage state of a wearable device provided in an embodiment of the present application;
fig. 4 is a schematic diagram of determining a touch position in a wearable device according to an embodiment of the present disclosure;
fig. 5 is a schematic flowchart of a control method of a wearable device according to an embodiment of the present disclosure;
fig. 6 is a flowchart illustrating a control method of another wearable device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. It should be appreciated that data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second", and the like do not limit the number of elements; for example, a first element may be one element or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The wearable device provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
The embodiment of the application provides a wearable device, which may be a smart bracelet, a smart watch, or the like.
As shown in fig. 1 and 2, the wearable device includes a body 10, a projection assembly 20, a first sensor 30, a second sensor 40, and a processor 50.
A body 10, the body 10 having a first side 11 and a second side 12 for contacting the skin of a user.
In one embodiment of the present application, as shown in fig. 1, the body 10 includes a first side 11 and a second side 12, the first side 11 is adjacent to the second side 12, and the second side 12 may be a side contacting with an arm of a user.
Taking a smart bracelet as an example of the wearable device, as shown in fig. 1, the body 10 may include a main body structure and a wearable structure connected to it. The wearable structure is detachably connected to the main body structure, and its length can be adjusted. For example, the projection assembly 20 and the second sensor 40 can be disposed on the first side 11 of the main body structure, and the first sensor 30 can be disposed on the second side 12 of the main body structure.
The projection assembly 20 is arranged on the first side 11 of the body 10, and the projection assembly 20 is used for projecting and displaying the content to be displayed on the skin of a user.
In this embodiment, the projection assembly 20 is disposed on the first side of the body 10, and is used for projecting and displaying the content to be displayed of the wearable device onto the skin of the user, so as to present the content to be displayed of the wearable device in an enlarged manner, which is convenient for the user to view and operate.
For example, a physical button configured to turn on the projection operating mode may be provided on the wearable device, or an interactive control for turning on the projection operating mode may be presented on the display interface of the body 10. In response to the user's operation of a physical button or an interactive control for starting the projection operation mode, the processor 50 controls the projection assembly 20 to operate to project the contents to be displayed of the display interface of the body 10 onto the skin of the user.
Illustratively, when the wearable device establishes a communication connection with the external device, the processor 50 controls the projection component 20 to work to project the content to be displayed of the display interface of the external device onto the skin of the user. The external device may be a mobile terminal, a tablet computer, a notebook computer, or the like.
The first sensor 30 is disposed on the body 10 and located on the second side 12; it is used for obtaining blood pressure information of the user.
In this embodiment, when the skin of the user is touched by a touch object, that is, when the user's arm is pressed, pressure is exerted on the user's blood vessels, so that the user's arterial pressure value changes within a short time. In other words, the blood pressure information can reflect whether the user's skin is being touched, and thus whether the projection display interface projected onto the skin is being touched. Based on this, during use of the wearable device, the first sensor 30 is in contact with the user's arm and collects the user's blood pressure information in real time, so that whether the skin bearing the projection display interface is touched by a touch object can be judged from the blood pressure information acquired by the first sensor.
The touch object may be a finger of the user, or may be a stylus or the like.
The second sensor 40 is disposed on the body 10 and located on the first side 11; it is used for acquiring the position information of a touch object touching the skin of the user.
In the present embodiment, the second sensor 40 is disposed on the first side 11 of the body 10 and can be used to acquire the position information of a touch object touching the skin of the user. The touch position at which the touch object touches the projection display interface is then determined from this position information.
In one embodiment of the present application, the second sensor 40 may be a distance sensor, and may in particular be an array-type distance sensor. The second sensor 40 is disposed on the same side as the projection assembly 20. It should be noted that the skilled person can set the number, relative positions, and so on of the projection assemblies 20, the first sensors 30, and the second sensors 40 according to actual needs.
In one embodiment of the present application, as shown in fig. 1 and 2, the second sensor 40 includes a first distance sensor 41 and a second distance sensor 42, the first distance sensor 41 and the second distance sensor 42 are both disposed on the first side 11 of the body 10, and the first distance sensor 41 and the second distance sensor 42 are respectively located on two sides of the projection assembly 20.
In the embodiment of the application, the touch positions of the user in the projection display interface are determined by the first distance sensor and the second distance sensor which are respectively arranged on two sides of the projection component, so that the accuracy of identification can be improved.
It should be noted that the first distance sensor may be a ToF sensor, an infrared sensor, a laser ranging sensor, or an ultrasonic sensor, and the second distance sensor may likewise be a ToF sensor, an infrared sensor, a laser ranging sensor, or an ultrasonic sensor; the embodiment of the present application does not limit the types of the first distance sensor and the second distance sensor. When the first distance sensor and the second distance sensor are both ToF sensors, the recognition precision can be further improved, and with it the accuracy of touch operation on the wearable device.
The processor 50 is disposed on the body 10 and connected to the projection assembly 20, the first sensor 30, and the second sensor 40; it controls the projection assembly 20 to operate according to the blood pressure information acquired by the first sensor 30 and the position information of the touch object acquired by the second sensor 40.
In the embodiment of the application, by providing the projection assembly, the wearable device can project the content to be displayed onto the skin of the user, so that the content is presented enlarged and is convenient to view and operate.
Furthermore, the processor can determine, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by a touch object, and, when it is, control the projection assembly to operate according to the position information of the touch object acquired by the second sensor. Interaction between the user and the projection display interface can thus be realized, misoperation caused by the small screen of the wearable device is avoided, the accuracy of touch-operation recognition is improved, and the efficiency of touch operation on the wearable device is improved.
In an embodiment of the application, the processor controlling the projection assembly to operate according to the blood pressure information acquired by the first sensor and the position information of the touch object acquired by the second sensor specifically includes:
the processor determines, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by the touch object;
when the skin bearing the projection display interface is touched by the touch object, the processor determines, according to the position information of the touch object acquired by the second sensor, the touch position at which the touch object touches the projection display interface;
the processor controls the projection assembly to perform a target action according to that touch position, the target action corresponding to the touch position in the projection display interface.
In this embodiment, the processor may store a mapping relationship between the target motion and the touch position of the projection display interface in advance. The touch position of the projection display interface may be, for example, position information of an interaction control in the projection display interface. In specific implementation, the target action corresponding to the touch position can be determined according to a pre-stored mapping relationship between the target action and the touch position of the projection display interface and the touch position where the touch body touches the projection display interface, and the target action is executed.
In the embodiment of the application, the processor may determine, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by the touch object. When it is, the processor determines the touch position of the touch object on the projection display interface according to the position information acquired by the second sensor, and then controls the projection assembly to perform the target action corresponding to that touch position. Interaction between the user and the projection display interface can thus be realized, misoperation caused by the small screen of the wearable device is avoided, the accuracy of touch-operation recognition is improved, and the efficiency of touch operation on the wearable device is improved.
The blood pressure information includes a mean arterial pressure value. When the skin of the user is touched by a touch object, that is, when the user's arm is pressed, pressure is exerted on the user's blood vessels, so that the user's arterial pressure value changes within a short time. Based on this, whether the skin bearing the projection display interface is touched by the touch object can be determined from the arterial pressure value. The following describes, through a specific example, the process of determining whether the skin bearing the projection display interface is touched by a touch object.
The processor determining, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by the touch object specifically includes:
the processor determines the difference between the mean arterial pressure value and a preset initial mean arterial pressure value;
in the case that the difference is greater than a preset threshold, the processor determines that the skin bearing the projection display interface is touched by the touch object;
in the case that the difference is less than or equal to the preset threshold, the processor determines that the skin bearing the projection display interface is not touched by the touch object.
In this example, the initial mean arterial pressure value is the arterial pressure value when the user's arm is not subjected to external pressure; it may be determined from historical blood pressure information acquired by the first sensor. The mean arterial pressure value is the value acquired by the first sensor at the current moment.
During use of the wearable device, the first sensor may monitor the user's mean arterial pressure value at the current moment in real time, and this value is compared with the preset initial mean arterial pressure value. When the difference between the two is greater than the preset threshold, that is, when the abrupt change in the mean arterial pressure value exceeds the threshold, the user's arm is considered to be under external pressure, i.e. the skin bearing the projection display interface is touched by the touch object. When the difference is less than or equal to the preset threshold, the arm is considered not to be under external pressure, i.e. the skin bearing the projection display interface is not touched by the touch object.
In the embodiment of the application, before the projection assembly is controlled to operate according to the position information of the touch object acquired by the second sensor, whether the skin bearing the projection display interface is touched by the touch object is first determined according to the blood pressure information acquired by the first sensor, which can reduce the false recognition rate of the wearable device.
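To make the threshold check concrete, the following is a minimal Python sketch of the detection logic described above. The 10 mmHg threshold, the function names, and the way sensor readings are obtained are illustrative assumptions, not details taken from the patent, which specifies only a preset threshold on the change in mean arterial pressure.

```python
# Minimal sketch of blood-pressure-based touch detection, assuming the
# first sensor exposes mean arterial pressure (MAP) readings in mmHg.

PRESET_THRESHOLD_MMHG = 10.0  # assumed value; the patent only says "preset threshold"


def initial_map(history: list[float]) -> float:
    """Preset initial MAP: average of historical readings taken while
    the user's arm is under no external pressure."""
    return sum(history) / len(history)


def interface_touched(current_map: float, baseline_map: float) -> bool:
    """The projection display interface counts as touched when the abrupt
    change in MAP relative to the baseline exceeds the preset threshold."""
    return abs(current_map - baseline_map) > PRESET_THRESHOLD_MMHG
```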
Taking the second sensor 40 including the first distance sensor 41 and the second distance sensor 42 as an example, a process of determining a touch position where the touch object touches the projection display interface will be described.
In this example, the processor determining, according to the position information of the touch object acquired by the second sensor, the touch position at which the touch object touches the projection display interface specifically includes:
the processor determines the position information of the touch object according to first distance information acquired by the first distance sensor and second distance information acquired by the second distance sensor;
the processor determines, according to the position information of the touch object, the touch position at which the touch object touches the projection display interface.
For example, fig. 3 shows a schematic diagram of a user touching the projection display interface: 301 is the projection display interface and point C is the position of the user's finger. Fig. 4 shows the process of determining the position information of the touch object: the first distance information (x) is the distance from the first distance sensor 41 (point A) to the user's finger (point C), and the second distance information (y) is the distance from the second distance sensor 42 (point B) to the finger. The position of the finger (point C), i.e. the position information of the touch object, can be determined from the intersection of the two range circles of radii x and y. The touch position of the touch object on the projection display interface is then derived from this position information.
In the embodiment of the application, the position information of the touch object is determined by the first distance sensor and the second distance sensor arranged on either side of the projection assembly, which improves the recognition accuracy.
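The intersection construction of fig. 4 can be sketched in a few lines of Python. Placing the first distance sensor (point A) at the origin and the second distance sensor (point B) at (d, 0), where d is the known spacing across the projection assembly, point C follows from the two measured ranges; the coordinate frame and the helper name are assumptions made for illustration.

```python
import math


def locate_touch(x: float, y: float, d: float) -> tuple[float, float]:
    """Intersect the two range circles: sensor A at (0, 0) with radius x,
    sensor B at (d, 0) with radius y. Returns point C, taking the
    half-plane on the skin side of the sensor baseline."""
    cx = (x**2 - y**2 + d**2) / (2 * d)
    cy_squared = x**2 - cx**2
    if cy_squared < 0:
        raise ValueError("distances are inconsistent with the sensor spacing")
    return cx, math.sqrt(cy_squared)
```

For example, assuming the sensors sit 40 mm apart, `locate_touch(50.0, 60.0, 40.0)` returns the finger position in that frame, which can then be mapped into projection-display-interface coordinates.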
On the basis of any of the above embodiments, as shown in fig. 2, the wearable device further comprises a wireless communication unit 60. The wireless communication unit 60 is connected to the processor 50, and the wireless communication unit 60 is used to communicate with an external device. In the embodiment of the present application, the processor 50 may interact with an external device through the wireless communication unit 60, so that the user experience is better.
On the basis of any of the above embodiments, as shown in fig. 2, the wearable device further includes a power supply module 70. The power module 70 is used to provide power to the processor 50, the projection assembly 20, the first sensor 30, the second sensor 40, and the wireless communication unit 60.
On the basis of any of the above embodiments, as shown in fig. 2, the wearable device further includes a charging module 80. The charging module 80 is connected to the power module 70, and the charging module 80 is used for charging the power module 70.
The embodiment of the application provides a control method of wearable equipment. The method may be applied to a wearable device, and referring to the flowchart shown in fig. 5, the method includes the following steps S5100-S5300.
In step S5100, blood pressure information collected by the first sensor is obtained.
When the skin of the user is touched by a touch object, that is, when the user's arm is pressed, pressure is exerted on the user's blood vessels, so that the user's arterial pressure value changes within a short time. That is, the blood pressure information can reflect whether the user's skin, and thus the projection display interface projected onto it, is being touched.
In one embodiment of the present application, the blood pressure information includes a mean arterial pressure value.
In one embodiment of the present application, after acquiring the blood pressure information acquired by the first sensor, the method may further include the following steps S6100-S6300.
In step S6100, a difference between the mean arterial pressure value and a preset initial mean arterial pressure value is determined.
The initial mean arterial pressure value is the arterial pressure value when the arm of the user is not subjected to external pressure, and the initial mean arterial pressure value can be determined according to the historical blood pressure information acquired by the first sensor.
The mean arterial pressure value is the mean arterial pressure value acquired by the first sensor at the current moment.
Step S6200, determining that the blood pressure information meets the preset condition in the case that the difference is greater than a preset threshold.
Step S6300, determining that the blood pressure information does not meet the preset condition in the case that the difference is less than or equal to the preset threshold.
For example, during use of the wearable device, the first sensor may monitor the user's mean arterial pressure value at the current moment in real time, and this value is compared with the preset initial mean arterial pressure value. When the difference between the two is greater than the preset threshold, that is, when the abrupt change in the mean arterial pressure value exceeds the threshold, the user's arm is considered to be under external pressure, i.e. the skin bearing the projection display interface is touched by a touch object; in this case the blood pressure information is determined to meet the preset condition. When the difference is less than or equal to the preset threshold, the arm is considered not to be under external pressure, i.e. the skin bearing the projection display interface is not touched; in this case the blood pressure information is determined not to meet the preset condition.
In the embodiment of the application, the blood pressure information acquired by the first sensor is obtained, and the projection assembly is controlled to work according to the position information of the touch object acquired by the second sensor only when the blood pressure information meets the preset condition, which can reduce the false recognition rate of the wearable device.
In step S5200, when the blood pressure information satisfies the preset condition, the position information of the touch object touching the skin of the user, acquired by the second sensor, is acquired.
In an embodiment of the application, the step of acquiring the position information of the touch object touching the skin of the user, acquired by the second sensor, may further include: first distance information acquired by a first distance sensor and second distance information acquired by a second distance sensor are acquired.
In the embodiment of the application, the position information of the touch object is determined by the first distance sensor and the second distance sensor arranged on either side of the projection assembly, which improves the recognition accuracy.
Step S5300, controlling the projection assembly to operate according to the position information of the touch object.
In an embodiment of the application, the step of controlling the projection assembly to operate according to the position information of the touch object may further include steps S5310-S5320.
Step S5310, determining, according to the position information of the touch object, the touch position at which the touch object touches the projection display interface.
The step of determining the touch position according to the position information of the touch object may further include steps S5311-S5312.
Step S5311, determining the position information of the touch object according to the first distance information and the second distance information.
Step S5312, determining, according to the position information of the touch object, the touch position at which the touch object touches the projection display interface.
For example, fig. 3 shows a schematic diagram of a user touching the projection display interface: 301 is the projection display interface and point C is the position of the user's finger. Fig. 4 shows the process of determining the position information of the touch object: the first distance information (x) is the distance from the first distance sensor 41 (point A) to the user's finger (point C), and the second distance information (y) is the distance from the second distance sensor 42 (point B) to the finger. The position of the finger (point C), i.e. the position information of the touch object, can be determined from the intersection of the two distances, and the touch position of the touch object on the projection display interface is then derived from this position information.
Step S5320, controlling the projection assembly to execute a target action according to the touch position of the touch object on the projection display interface, where the target action corresponds to the touch position in the projection display interface.
In this embodiment, a mapping relationship between the target motion and the touch position of the projection display interface may be stored in advance. The touch position of the projection display interface may be, for example, position information of an interaction control in the projection display interface. In specific implementation, the target action corresponding to the touch position may be determined according to a pre-stored mapping relationship between the target action and the touch position of the projection display interface and the touch position at which the touch body touches the projection display interface, and the projection component is controlled to execute the target action.
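A minimal sketch of that pre-stored mapping is given below: each interactive control is represented by its projected bounding box together with a bound target action, and the touch position is hit-tested against the boxes. The Control class and the action callables are illustrative assumptions; the patent specifies only that a mapping between touch positions and target actions is stored in advance.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Control:
    """An interactive control in the projection display interface,
    described by its projected bounding box and its target action."""
    x: float
    y: float
    width: float
    height: float
    action: Callable[[], None]

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)


def dispatch_touch(controls: list[Control], px: float, py: float) -> bool:
    """Execute the target action of the control containing the touch
    position; returns False when the touch hits no control."""
    for control in controls:
        if control.contains(px, py):
            control.action()
            return True
    return False
```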
The following describes a control method of a wearable device provided in an embodiment of the present application with reference to the drawings. Referring to fig. 6, the control method includes: S601-S607.
S601, receiving a first input.
S602, responding to the first input, and controlling the projection assembly to project a projection display interface corresponding to the display interface of the body on the skin of the user.
S603, acquiring the mean arterial pressure value acquired by the first sensor.
S604, calculating the difference between the mean arterial pressure value and the initial mean arterial pressure value.
S605, judging whether the difference between the mean arterial pressure value and the initial mean arterial pressure value is greater than the preset threshold; if so, proceeding to S606, otherwise returning to S603.
S606, acquiring the position information, acquired by the second sensor, of the touch object touching the skin of the user.
S607, controlling the projection assembly to operate according to the position information of the touch object.
In the embodiment of the application, the processor can determine, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by the touch object, and, when it is, control the projection assembly to operate according to the position information of the touch object acquired by the second sensor. Interaction between the user and the projection display interface can thus be realized, misoperation caused by the small screen of the wearable device is avoided, the accuracy of touch-operation recognition is improved, and the efficiency of touch operation on the wearable device is improved.
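Putting S601-S607 together, the polling loop might be organized as below, reusing `interface_touched`, `locate_touch`, and `dispatch_touch` from the earlier sketches; `projector`, `read_map`, and `read_distances` are hypothetical stand-ins for device drivers that the patent does not specify.

```python
def control_loop(projector, read_map, read_distances,
                 baseline_map: float, sensor_spacing: float,
                 controls) -> None:
    """Hypothetical main loop for S601-S607: project the interface,
    poll MAP until a touch is detected, then locate and dispatch it."""
    projector.project_interface()                  # S602: show the interface on the skin
    while True:
        current_map = read_map()                   # S603: first sensor reading
        if not interface_touched(current_map, baseline_map):
            continue                               # S605: no touch, keep polling
        x, y = read_distances()                    # S606: second sensor (two ranges)
        px, py = locate_touch(x, y, sensor_spacing)
        dispatch_touch(controls, px, py)           # S607: act on the touch position
```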
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the control method embodiment of the wearable device, and can achieve the same technical effect, and in order to avoid repetition, the detailed description is omitted here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the control method embodiment of the wearable device, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order, depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (8)

1. A wearable device, comprising:
a body having a first side and a second side for contacting a user's skin;
the projection assembly is arranged on the first side face of the body and is used for projecting and displaying content to be displayed onto the skin of a user;
the first sensor is arranged on the body, is positioned on the second side face and is used for acquiring blood pressure information of the user;
the second sensor is arranged on the body, is positioned on the first side face and is used for acquiring position information of a touch object touching the skin of the user;
the processor is arranged on the body and is respectively connected with the projection assembly, the first sensor and the second sensor; the processor determines, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by the touch object, and, in the case that it is, controls the projection assembly to operate according to the position information of the touch object acquired by the second sensor;
wherein the blood pressure information comprises a mean arterial pressure value;
the processor determining, according to the blood pressure information acquired by the first sensor, whether the skin bearing the projection display interface is touched by the touch object specifically includes:
the processor determines the difference between the mean arterial pressure value and a preset initial mean arterial pressure value;
in the case that the difference is greater than a preset threshold, the processor determines that the skin bearing the projection display interface is touched by the touch object;
in the case that the difference is less than or equal to the preset threshold, the processor determines that the skin bearing the projection display interface is not touched by the touch object.
2. The wearable device according to claim 1, wherein, when the skin bearing the projection display interface is touched by the touch object, controlling the projection assembly to operate according to the position information of the touch object acquired by the second sensor specifically includes:
in the case that the skin bearing the projection display interface is touched by the touch object, the processor determines, according to the position information of the touch object acquired by the second sensor, the touch position at which the touch object touches the projection display interface;
the processor controls the projection assembly to perform a target action according to the touch position, the target action corresponding to the touch position in the projection display interface.
3. The wearable device according to claim 2, wherein the second sensor comprises a first distance sensor and a second distance sensor, the first distance sensor and the second distance sensor are both disposed on the first side of the body, and the first distance sensor and the second distance sensor are respectively located on both sides of the projection assembly.
4. The wearable device according to claim 3, wherein the processor determining, according to the position information of the touch object acquired by the second sensor, the touch position at which the touch object touches the projection display interface specifically includes:
the processor determines the position information of the touch object according to first distance information acquired by the first distance sensor and second distance information acquired by the second distance sensor;
and the processor determines, according to the position information of the touch object, the touch position at which the touch object touches the projection display interface.
5. A control method of a wearable device, which is applied to the wearable device of any one of claims 1 to 4, the method comprising:
acquiring blood pressure information acquired by a first sensor;
under the condition that the blood pressure information meets a preset condition, acquiring position information, acquired by a second sensor, of a touch object touching the skin of the user;
controlling a projection assembly to work according to the position information of the touch object;
wherein the blood pressure information comprises a mean arterial pressure value; the method further comprises the following steps:
determining a difference value between the mean arterial pressure value and a preset initial mean arterial pressure value;
determining that the blood pressure information meets a preset condition under the condition that the difference value is larger than a preset threshold value;
and under the condition that the difference value is less than or equal to a preset threshold value, determining that the blood pressure information does not meet a preset condition.
6. The method according to claim 5, wherein the controlling the projection assembly to work according to the position information of the touch object comprises:
determining, according to the position information of the touch object, the touch position at which the touch object touches the projection display interface;
and controlling the projection assembly to execute a target action according to the touch position, wherein the target action corresponds to the touch position in the projection display interface.
7. The method of claim 6, wherein the second sensor comprises a first distance sensor and a second distance sensor;
the acquiring position information of the touch body touching the skin of the user, which is acquired by the second sensor, includes:
acquiring first distance information acquired by the first distance sensor and second distance information acquired by the second distance sensor;
the determining the touch position of the touch projection display interface of the touch body according to the position information of the touch body comprises:
determining position information of the touch body according to the first distance information and the second distance information;
and determining the touch position of the touch projection display interface of the touch body according to the position information of the touch body.
8. A computer-readable storage medium, characterized in that it stores thereon a program which, when executed by a processor, carries out the steps of a method of controlling a wearable device according to any of claims 5-7.
CN202011645241.3A · Filed: 2020-12-31 · Priority: 2020-12-31 · Title: Wearable device and control method of wearable device · Status: Active · Granted as: CN112732158B (en)

Priority Applications (1)

Application Number: CN202011645241.3A · Priority Date: 2020-12-31 · Filing Date: 2020-12-31 · Title: Wearable device and control method of wearable device


Publications (2)

Publication Number Publication Date
CN112732158A CN112732158A (en) 2021-04-30
CN112732158B (en) 2023-01-31

Family

ID=75609408

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011645241.3A (Active, granted as CN112732158B) · Priority Date: 2020-12-31 · Filing Date: 2020-12-31 · Title: Wearable device and control method of wearable device

Country Status (1)

Country Link
CN (1) CN112732158B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103984437A (en) * 2014-04-28 2014-08-13 京东方科技集团股份有限公司 Wearable touch device and wearable touch method
CN108829294A (en) * 2018-04-11 2018-11-16 卡耐基梅隆大学 A kind of projection touch control method, device and projection touch control device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160015719A (en) * 2014-07-31 2016-02-15 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR101873842B1 (en) * 2015-03-11 2018-07-04 한양대학교 산학협력단 Apparatus for providing virtual input using depth sensor and method for using the apparatus
CN111124196B (en) * 2018-10-30 2023-04-21 奇酷互联网络科技(深圳)有限公司 Dynamic display method of display interface, intelligent wearable device and storage medium


Also Published As

Publication number Publication date
CN112732158A (en) 2021-04-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant