CN114449153A - Shooting control method of wearable device, wearable device and storage medium - Google Patents

Shooting control method of wearable device, wearable device and storage medium

Info

Publication number
CN114449153A
CN114449153A (application CN202011197335.9A)
Authority
CN
China
Prior art keywords
camera
touch
gesture
screen
host
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011197335.9A
Other languages
Chinese (zh)
Inventor
郭亮
欧阳山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd filed Critical Guangdong Genius Technology Co Ltd
Priority to CN202011197335.9A priority Critical patent/CN114449153A/en
Publication of CN114449153A publication Critical patent/CN114449153A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

A shooting control method for a wearable device, the wearable device, and a storage medium are provided. The method includes: when a host of the wearable device is flipped to a preset angle relative to a host bracket, starting a target camera corresponding to the preset angle, the target camera being a front camera or a rear camera; displaying an image acquired by the target camera on a screen; when a touch operation on the screen is detected, acquiring a first touch gesture corresponding to the touch operation; extracting a starting touch point and an ending touch point from the first touch gesture; judging whether the first touch gesture matches a camera switching gesture according to the starting touch point and the ending touch point; and if so, switching the target camera and displaying the image acquired by the switched target camera on the screen. With the embodiments of the present application, the wearable device can be controlled to switch between the front camera and the rear camera through a simple gesture, which improves the convenience of shooting with the wearable device.

Description

Shooting control method of wearable device, wearable device and storage medium
Technical Field
The present application relates to the field of electronic devices, and in particular, to a shooting control method for a wearable device, a wearable device, and a storage medium.
Background
At present, equipping wearable devices (such as phone watches and smart bands) with cameras has gradually become a mainstream design and is widely welcomed by users. In practice, however, the limited size of a wearable device often makes it difficult to control its camera accurately; in particular, on a wearable device with both a front and a rear camera, it is hard to switch between the cameras within the limited space available, which reduces the convenience of shooting with the wearable device.
Disclosure of Invention
The embodiments of the present application disclose a shooting control method for a wearable device, a wearable device, and a storage medium, with which the wearable device can be controlled to switch between its front and rear cameras through a simple gesture, improving the convenience of shooting with the wearable device.
A first aspect of the embodiments of the present application discloses a shooting control method for a wearable device, where the wearable device has a host that can be flipped relative to a host bracket, a front camera and a screen are arranged on the front of the host, and a rear camera is arranged on the back of the host, and the method comprises the following steps:
when the host is turned over to a preset angle relative to the host bracket, starting a target camera corresponding to the preset angle, wherein the target camera is the front camera or the rear camera;
displaying the image collected by the target camera on the screen;
when a touch operation on the screen is detected, acquiring a first touch gesture corresponding to the touch operation;
extracting a starting touch point and an ending touch point from the first touch gesture;
judging whether the first touch gesture is matched with a camera switching gesture or not according to the starting touch point and the ending touch point;
and if the gesture is matched with the camera switching gesture, switching the target camera, and displaying the image acquired by the switched target camera on the screen.
As an optional implementation manner, in the first aspect of the embodiment of the present application, the switching the target camera includes:
if the target camera is the front camera, switching the target camera to the rear camera;
and if the target camera is the rear camera, switching the target camera to the front camera.
As another optional implementation manner, in the first aspect of the embodiment of the present application, the determining, according to the starting touch point and the ending touch point, whether the first touch gesture matches a camera switching gesture includes:
acquiring a first distance value between the starting touch point and a boundary of the screen, and a second distance value between the ending touch point and a boundary of the screen;
and judging whether the first distance value and the second distance value are within a standard distance range corresponding to the camera switching gesture, and if so, determining that the first touch gesture matches the camera switching gesture.
As another optional implementation manner, in the first aspect of the embodiment of the present application, after determining whether the first distance value and the second distance value are within a standard distance range corresponding to a camera switching gesture, the method further includes:
if the first distance value or the second distance value is not within the standard distance range, calculating the sum of the first distance value and the second distance value;
determining the false touch probability of the first touch gesture according to the sum of the first distance value and the second distance value;
and when the false touch probability is higher than a probability threshold value, ignoring the first touch gesture.
As another optional implementation manner, in the first aspect of the embodiment of the present application, after the displaying the image acquired by the switched target camera on the screen, the method further includes:
and re-executing the step of acquiring, when a touch operation on the screen is detected, the first touch gesture corresponding to the touch operation, until a click operation on the screen is detected.
As another optional implementation manner, in the first aspect of this embodiment of the present application, after the detecting of the click operation on the screen, the method further includes:
shooting by using the target camera to obtain a shot image;
displaying the photographed image on the screen.
As another optional implementation manner, in the first aspect of the embodiment of the present application, after the acquiring, when the touch operation on the screen is detected, a first touch gesture corresponding to the touch operation, the method further includes:
acquiring a second touch gesture corresponding to the touch operation within a preset time length;
displaying a special effect image corresponding to the second touch gesture on the screen, wherein the special effect image is superposed on the image acquired by the target camera.
A second aspect of the embodiments of the present application discloses a wearable device, where the wearable device has a host that can be flipped relative to a host bracket, a front camera and a screen are arranged on the front of the host, and a rear camera is arranged on the back of the host, and the wearable device includes:
the starting unit is used for starting a target camera corresponding to a preset angle when the host is turned over to the preset angle relative to the host bracket, and the target camera is the front camera or the rear camera;
the display unit is used for displaying the image acquired by the target camera on the screen;
the first acquisition unit is used for acquiring a first touch gesture corresponding to the touch operation when the touch operation aiming at the screen is detected;
the second acquisition unit is used for extracting a starting touch point and an ending touch point from the first touch gesture;
the first judging unit is used for judging whether the first touch control gesture is matched with the camera switching gesture according to the starting touch control point and the ending touch control point;
the switching unit is used for switching the target camera if the target camera is matched with the camera switching gesture;
the display unit is further used for displaying the image acquired by the switched target camera on the screen.
A third aspect of the embodiments of the present application discloses another wearable device, including:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute all or part of the steps of the shooting control method of any one of the wearable devices disclosed in the first aspect of the embodiments of the present application.
A fourth aspect of the embodiments of the present application discloses a computer-readable storage medium storing a computer program, where the computer program enables a computer to execute all or part of the steps in any one of the shooting control methods for a wearable device disclosed in the first aspect of the embodiments of the present application.
A fifth aspect of embodiments of the present application discloses a computer program product, which, when running on a computer, causes the computer to execute all or part of the steps in the shooting control method for any one of the wearable devices in the first aspect of embodiments of the present application.
Compared with the prior art, the embodiment of the application has the following beneficial effects:
In the embodiments of the present application, the wearable device may be a phone watch, a smart band, or the like that has both a front and a rear camera, and the host of the wearable device can be flipped relative to its host bracket. When the host is flipped to a preset angle relative to the host bracket, the wearable device can start the target camera corresponding to the preset angle, where the target camera may be the front camera or the rear camera of the wearable device; the image acquired by the target camera can then be displayed on the screen. When a touch operation on the screen is detected, the wearable device may acquire a first touch gesture corresponding to the touch operation, and then extract a starting touch point and an ending touch point from the first touch gesture. Finally, according to the starting touch point and the ending touch point, the wearable device can judge whether the first touch gesture matches the camera switching gesture; if they match, the target camera can be switched and the image acquired by the switched target camera can be displayed on the screen, so that the front camera and the rear camera can be used interchangeably. It can be seen that, by implementing the embodiments of the present application, the wearable device can be controlled to switch between the front and rear cameras through a simple gesture, false touches are effectively avoided while the validity of the touch is ensured, and the convenience of shooting with the wearable device is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic structural diagram of a wearable device disclosed in an embodiment of the present invention;
fig. 2 is a schematic structural diagram of another wearable device disclosed in the embodiment of the invention;
FIG. 3 is a schematic structural diagram of another wearable device disclosed in the embodiments of the present invention;
fig. 4 is a schematic flowchart of a shooting control method of a wearable device disclosed in an embodiment of the present application;
fig. 5 is a schematic flowchart of another shooting control method for a wearable device disclosed in the embodiments of the present application;
fig. 6 is a schematic flowchart of a shooting control method of another wearable device disclosed in the embodiments of the present application;
fig. 7 is a modular schematic diagram of a wearable device disclosed in an embodiment of the present application;
fig. 8 is a modular schematic diagram of yet another wearable device disclosed in embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, in the embodiments of the present application, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the application discloses a shooting control method of wearable equipment, the wearable equipment and a storage medium, wherein front and rear cameras of the wearable equipment can be controlled to be switched through simple gestures, and the convenience of shooting by utilizing the wearable equipment is improved.
The following detailed description will be made in conjunction with embodiments and the accompanying drawings.
In order to better understand the shooting control method disclosed in the embodiments of the present application, a wearable device disclosed in an embodiment of the present application is first described below. Referring to figs. 1 to 3, the wearable device may include a host support 10, a host 20, and a strap 30, where the host support 10 is connected between the two ends of the strap 30. The host top side 20a of the host 20 is opposite the host bottom side 20b of the host 20; the host top side 20a is provided with a front camera 22, and the host bottom side 20b is provided with a rear camera 23. One end of the host 20 is rotatably connected with a first end of the host support 10 through a first rotating shaft 21, so that the host 20 can rotate by different angles relative to the host support 10 and the front camera 22 and the rear camera 23 can thereby obtain different shooting angles. One end of the strap 30 is also coupled to the first end of the host support 10 through the first rotating shaft 21; that is, this end of the strap 30, the first end of the host support 10, and the corresponding end of the host 20 rotate coaxially around the first rotating shaft 21, while the other end of the strap 30 is connected to a second end of the host support 10.
In the present embodiment, a screen 20e may be disposed on the host top side 20a of the host 20; optionally, a bottom-side display screen (not labeled) may also be disposed on the host bottom side 20b of the host 20. Typically, the host 20 may rest flat on the host support 10, i.e., the host bottom side 20b of the host 20 is attached to the upper surface of the host support 10; when the host 20 rotates around the first rotating shaft 21, the host top side 20a and the host bottom side 20b of the host 20 form an angle with the upper surface of the host support 10.
In the embodiment of the present application, the host 20 serves as a host of a wearable device whose shooting angle is adjustable. It includes not only a motherboard disposed inside, but also a screen 20e for implementing the touch and display functions, a battery for supplying power to the motherboard and the screen 20e, a front camera 22 and a rear camera 23 for implementing the shooting function of the host 20, a communication device (e.g., a wireless communication device, a Bluetooth communication device, or an infrared communication device) for implementing the communication function of the host 20, sensors (e.g., a gravity sensor, an acceleration sensor, a distance sensor, an air pressure sensor, an ultraviolet detector, or a water-play detection and identification module) for implementing the detection functions of the host 20, a heart rate detector for detecting the heart rate of the user, a timer for implementing the timing function of the host 20, elements for identifying the user such as a fingerprint module and a facial recognition module, and a microphone, a loudspeaker, and the like for audio input and/or output. It can be understood that all of these devices and functional modules are electrically connected with the motherboard and controlled by the motherboard so as to realize their corresponding functions. The host 20 in the embodiment of the present application is therefore different from a conventional watch dial that can only realize functions such as time display and timing.
Further, since the host 20 is rotatably connected to the host support 10 through the first rotating shaft 21, the host 20 may include a rotating end 20c and a free end 20d that are oppositely disposed: the rotating end 20c is the end of the host 20 connected to the first end of the host support 10 through the first rotating shaft 21, and the free end 20d is the end that rotates together with the rotating end 20c relative to the host support 10 and can form an angle with the host support 10. Specifically, in order not to restrict the shooting angles of the front camera 22 and the rear camera 23, the front camera 22 is preferably provided on the end top side of the free end 20d (part of the host top side 20a), and the rear camera 23 is preferably provided on the end bottom side of the free end 20d (part of the host bottom side 20b).
In the embodiment of the present application, the host support 10 may have a plate-shaped structure. When the host support 10 has a plate-shaped structure, its material is preferably a heat-insulating material such as plastic, which prevents the heat generated by the host 20 from scalding the wrist of the user when the wearable device is worn. The host support 10 therefore not only bears the rotation of the host 20 but also thermally insulates the host 20 from the wrist of the user. Further, when the host support 10 has a plate-shaped structure, one or more through holes 10a may be disposed in the host support 10, so that the host 20 can perform various physiological measurements, including detection of the user's heart rate, through the through holes 10a. The shape of a through hole 10a may be circular, square, or oval, which is not limited in the embodiments of the present application. It is understood that in other embodiments the host support 10 may instead be a closed-loop structure, and figs. 1-3 do not impose any specific limitation on the shape of the components in the embodiments of the present application.
When a user needs to adjust the angle of the host 20 relative to the host support 10, the user can manually rotate the host 20 relative to the host support 10. When the host 20 reaches the angle the user wants, the user stops adjusting and the host 20 stays at the current angle; at this point the front camera 22 and/or the rear camera 23 can be in a shooting state, and the user can select the front camera 22 and/or the rear camera 23 to perform the corresponding shooting operation according to actual shooting needs.
When the user wants to lay the host 20 back on the host support 10, the user can again manually rotate the host 20 relative to the host support 10 until the host 20 rests on the host support 10, and then stop adjusting it.
It can be understood that, in the wearable device disclosed in the embodiments of the present application, the front camera 22 and the rear camera 23 obtain different shooting angles as the host rotates, and they can also be turned on or off in response to that rotation, thereby satisfying the user's various needs for the cameras.
Referring to fig. 4, fig. 4 is a flowchart illustrating a shooting control method of a wearable device according to an embodiment of the present disclosure, where the shooting control method is applicable to the wearable device, the wearable device has a host that can be flipped with respect to a host bracket, a front side of the host may be provided with a front camera and a screen, and a back side of the host may be provided with a back camera. As shown in fig. 4, the photographing control method may include the steps of:
402. When the host of the wearable device is flipped to a preset angle relative to the host bracket, turn on the target camera corresponding to the preset angle, where the target camera is the front camera or the rear camera.
In the embodiments of the present application, the host of the wearable device can be flipped relative to the host bracket, either manually by the user or automatically after the host is woken up and the shooting function is triggered. In one embodiment, the host may be woken up and flipped by voice. For example, when the wearable device detects a voice signal from the user, the voice signal may be analyzed to determine whether it contains a specific keyword (e.g., "take a picture" or "group photo"); when it does, the host may be woken up and driven by a motor to flip relative to the host bracket. In another embodiment, the host can be woken up and flipped by a gesture. For example, when the wearable device detects that the user makes a specific gesture while wearing it (e.g., raising the arm wearing the device to a certain height or rotating that arm to a certain angle), the wearable device may likewise immediately wake up the host and flip it relative to the host bracket.
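For illustration only, the keyword check described above can be sketched as follows; the keyword list and the transcribe() and flip_host() placeholders are assumptions for the sketch and are not part of the disclosed embodiment:

    # Sketch of keyword-triggered wake-up (illustrative only).
    # transcribe() and flip_host() are hypothetical placeholders.
    WAKE_KEYWORDS = ("take a picture", "group photo")  # assumed keyword set

    def transcribe(voice_signal: bytes) -> str:
        """Stand-in for an on-device speech-to-text step."""
        return voice_signal.decode("utf-8", errors="ignore").lower()

    def flip_host(target_angle_deg: float) -> None:
        """Stand-in for driving the motor that flips the host."""
        print(f"flipping host to {target_angle_deg} degrees")

    def on_voice_signal(voice_signal: bytes) -> None:
        text = transcribe(voice_signal)
        if any(keyword in text for keyword in WAKE_KEYWORDS):
            flip_host(target_angle_deg=75.0)  # assumed target angle

    if __name__ == "__main__":
        on_voice_signal(b"please take a picture of us")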
Further, when the host is flipped to a preset angle relative to the host bracket, the target camera corresponding to the preset angle can be started. For example, when the preset angle falls within a first angle range, the front camera on the front of the host can be turned on; when the preset angle falls within a second angle range, the rear camera on the back of the host can be turned on. For instance, when the preset angle is small (e.g., less than 30° or less than 45°), i.e., the host is only slightly flipped relative to the host bracket, the rear camera on the back of the host has not yet obtained a sufficient shooting angle, so it can be assumed that the user wishes to turn on the front camera; this angle range can therefore be used as the first angle range, and when the flip angle of the host falls within it, the front camera is turned on by default. Conversely, when the preset angle is large (e.g., greater than 60° or greater than 75°), i.e., the host is flipped far relative to the host bracket, it can be assumed that the user wishes to shoot the scene that the back of the host then faces; this angle range can be used as the second angle range, and when the flip angle of the host falls within it, the rear camera is turned on by default. In this way, the target camera that is opened by default can be determined adaptively from the flip angle of the host, which minimizes the extra front/rear camera switching operations required of the user and improves the convenience of shooting with the wearable device.
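For illustration only, this angle-based default selection can be sketched as follows; the 30° and 60° thresholds are merely the example values mentioned above, and the function and constant names are assumed for the sketch:

    # Sketch: choose the default camera from the flip angle of the host.
    # The 30-/60-degree thresholds are the illustrative values from the text.
    from typing import Optional

    FRONT_MAX_ANGLE = 30.0  # first angle range: (0, 30] degrees -> front camera
    REAR_MIN_ANGLE = 60.0   # second angle range: [60, ...) degrees -> rear camera

    def select_default_camera(flip_angle_deg: float) -> Optional[str]:
        """Return 'front', 'rear', or None when no default applies."""
        if 0.0 < flip_angle_deg <= FRONT_MAX_ANGLE:
            return "front"
        if flip_angle_deg >= REAR_MIN_ANGLE:
            return "rear"
        return None  # intermediate angles: leave the choice to the user

    if __name__ == "__main__":
        for angle in (20.0, 45.0, 75.0):
            print(angle, "->", select_default_camera(angle))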
404. And displaying the image collected by the target camera on a screen.
Specifically, when the target camera is a front camera, a scene facing the front of the host of the wearable device can be previewed or photographed; when the target camera is a rear camera, the scene facing the back of the host of the wearable device can be previewed or photographed.
406. When a touch operation on the screen is detected, a first touch gesture corresponding to the touch operation is acquired.
In the embodiments of the present application, the front of the host of the wearable device has a screen, and the screen may be a touch screen, such as a capacitive or resistive touch screen, which is not specifically limited here. When the wearable device detects a touch operation on the screen, the touch operation can be recorded in real time, and the first touch gesture corresponding to the touch operation is acquired from the recorded result. For example, the first touch gesture may be represented by the trajectory of the corresponding touch operation, or by the lattice of coordinates of the touch points that the touch operation passes through on the screen.
As an optional implementation manner, before acquiring the first touch gesture corresponding to the touch operation, the wearable device may further detect whether the touch operation is a false touch. For example, the wearable device may record a touch duration corresponding to the touch operation, that is, a duration of the screen being touched, and when it is determined that the touch duration is less than a preset duration threshold (e.g., 0.1 second, 0.05 second, etc.), the wearable device may determine that the touch operation is touched by mistake and ignore the touch operation, so as to avoid obtaining a wrong first touch gesture, which is beneficial to ensuring reliability of touch.
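A minimal sketch of this duration-based filter follows; the 0.1-second threshold is one of the example values given above, and the function name is assumed for illustration:

    # Sketch: ignore touches held for less than a preset duration threshold.
    MIN_TOUCH_DURATION_S = 0.1  # assumed preset duration threshold (seconds)

    def is_false_touch(touch_down_ts: float, touch_up_ts: float) -> bool:
        """A touch shorter than the threshold is treated as accidental."""
        return (touch_up_ts - touch_down_ts) < MIN_TOUCH_DURATION_S

    if __name__ == "__main__":
        print(is_false_touch(10.00, 10.04))  # True: 40 ms, ignored
        print(is_false_touch(10.00, 10.35))  # False: 350 ms, processed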
408. And extracting a starting touch point and an ending touch point from the first touch gesture.
Specifically, when the first touch gesture is represented by the trajectory of the corresponding touch operation, the starting point of the trajectory is the starting touch point of the first touch gesture and the ending point of the trajectory is its ending touch point. When the first touch gesture is represented by the lattice of coordinates of the recorded touch points, the first recorded coordinate represents the starting touch point and the last recorded coordinate represents the ending touch point. It should be understood that these coordinates are defined in a plane coordinate system established on the plane of the screen, which may be a rectangular coordinate system or a polar coordinate system with the center of the screen as the origin, so as to facilitate subsequent calculations with the coordinates of the starting and ending touch points.
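For illustration, extracting the two endpoints can be sketched as follows, assuming the gesture is recorded as an ordered list of (x, y) samples, consistent with the trajectory/lattice representations described above:

    # Sketch: the starting and ending touch points are simply the first and
    # last recorded samples of the gesture.
    from typing import List, Tuple

    Point = Tuple[float, float]

    def extract_endpoints(gesture: List[Point]) -> Tuple[Point, Point]:
        """Return (starting touch point, ending touch point) of a gesture."""
        if not gesture:
            raise ValueError("empty gesture: no touch points were recorded")
        return gesture[0], gesture[-1]

    if __name__ == "__main__":
        swipe_up = [(120.0, 300.0), (121.0, 220.0), (119.0, 140.0)]
        start, end = extract_endpoints(swipe_up)
        print("start:", start, "end:", end)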
410. And judging whether the first touch control gesture is matched with the camera switching gesture or not according to the starting touch control point and the ending touch control point.
In this embodiment of the application, the wearable device may store one or more standard touch gestures in advance, such as a camera opening gesture, a camera switching gesture, an application wakeup gesture, and the like, and the standard touch gesture may be used to determine a first touch gesture acquired by the wearable device, so as to determine an operation instruction triggered by the first touch gesture correspondingly. For example, when the standard touch gesture is stored, a standard trajectory feature of the standard touch gesture may be stored, so that when it is determined whether the first touch gesture matches a certain standard touch gesture, a trajectory feature corresponding to the first touch gesture may be compared with the standard trajectory feature to determine whether the first touch gesture matches the certain standard touch gesture.
In the embodiments of the present application, the screen of the wearable device is very small, so overly complicated touch gestures are hard to distinguish. To address this, the starting touch point and ending touch point of the first touch gesture are extracted first, and whether the first touch gesture matches a standard touch gesture (e.g., the camera switching gesture) is judged from these two points alone. For example, the judgment may be based on the screen regions in which the starting and ending touch points fall (e.g., if the starting touch point lies in the lower half of the screen and the ending touch point lies in the upper half, the first touch gesture is deemed to match the camera switching gesture), or on the distances from the starting and ending touch points to the screen boundary (e.g., the gesture matches when those distances fall within a specific range). This reduces the complexity of the matching judgment, which is well suited to the small screen of a wearable device and helps reduce its touch latency.
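A minimal sketch of the region-based rule mentioned above follows; the 240x240 screen size and the downward-growing y axis are assumptions for the sketch:

    # Sketch: a gesture starting in the lower half of the screen and ending in
    # the upper half is treated as the camera switching gesture.
    SCREEN_HEIGHT = 240  # assumed screen height in pixels; y grows downward

    def matches_switch_by_region(start, end) -> bool:
        half = SCREEN_HEIGHT / 2
        starts_in_lower_half = start[1] >= half
        ends_in_upper_half = end[1] < half
        return starts_in_lower_half and ends_in_upper_half

    if __name__ == "__main__":
        print(matches_switch_by_region((120, 200), (118, 60)))  # True
        print(matches_switch_by_region((120, 60), (118, 200)))  # False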
412. And if the gesture is matched with the camera switching gesture, switching the target camera, and displaying the image acquired by the switched target camera on a screen.
In this embodiment of the application, when the first touch gesture is matched with the camera switching gesture, a target camera currently in use may be switched. For example, if the target camera currently in use is a front camera, the target camera may be switched to a rear camera, and an image acquired by the rear camera is displayed on a screen; if the target camera currently in use is a rear camera, the target camera can be switched to a front camera, and an image acquired by the front camera is displayed on a screen. Therefore, the front camera and the rear camera of the wearable device can be switched conveniently by implementing the method.
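The switching itself reduces to a simple toggle, sketched below; show_preview() is a hypothetical display callback rather than an actual device API:

    # Sketch: toggle the target camera and show its preview.
    def show_preview(camera: str) -> None:
        print(f"screen now shows the {camera} camera preview")

    def switch_target_camera(current: str) -> str:
        """Front -> rear and rear -> front, as described above."""
        switched = "rear" if current == "front" else "front"
        show_preview(switched)
        return switched

    if __name__ == "__main__":
        cam = switch_target_camera("front")  # -> rear
        cam = switch_target_camera(cam)      # -> front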
As an optional embodiment, the shooting control method described above may be applied to a scenario in which images shot by the front and rear cameras are composited. For example, after the host of the wearable device is flipped relative to the host bracket and the front or rear camera is started to shoot a first image, the wearable device may detect and acquire a first touch gesture on the screen; when the first touch gesture matches the camera switching gesture, the device quickly switches to the other camera to shoot a second image. The wearable device can then composite the first image and the second image, conveniently producing a combined image. This helps the user realize functions such as group photos and photo frames with the wearable device and increases its playability and fun.
As another optional implementation, when the user performs a mobile payment with the wearable device, the host of the wearable device may be lifted to a preset angle to start the rear camera for scanning a payment code; the user can then lightly swipe the screen so that the wearable device acquires a first touch gesture on the screen; finally, when the wearable device judges that this first touch gesture matches the camera switching gesture, it can quickly switch to the front camera to perform face recognition and complete the payment security verification, which improves both the convenience and the security of using the wearable device.
Therefore, by implementing the shooting control method described in the above embodiment, the wearable device can be controlled to switch between its front and rear cameras through a simple gesture; false touches are effectively avoided while the validity of the touch is ensured, and the convenience of shooting with the wearable device is improved.
Referring to fig. 5, fig. 5 is a flowchart illustrating a shooting control method of another wearable device according to an embodiment of the present disclosure, where the shooting control method may be applied to the wearable device, the wearable device includes a host that can be flipped with respect to a host bracket, a front side of the host may be provided with a front camera and a screen, and a back side of the host may be provided with a back camera. As shown in fig. 5, the photographing control method may include the steps of:
502. When the host of the wearable device is flipped to a preset angle relative to the host bracket, turn on the target camera corresponding to the preset angle, where the target camera is the front camera or the rear camera.
504. And displaying the image collected by the target camera on a screen.
506. When a touch operation on the screen is detected, a first touch gesture corresponding to the touch operation is acquired.
508. And extracting a starting touch point and an ending touch point from the first touch gesture.
Step 502, step 504, step 506, and step 508 are similar to step 402, step 404, step 406, and step 408, and are not described herein again.
510. And acquiring a first distance value between the starting touch point and a boundary of the screen, and a second distance value between the ending touch point and a boundary of the screen.
Specifically, a plurality of ranging points may be arranged along the screen boundary of the wearable device. To obtain the first distance value between the starting touch point and the screen boundary, the distance between the starting touch point and the ranging point closest to it can be taken as the first distance value; similarly, to obtain the second distance value between the ending touch point and the screen boundary, the distance between the ending touch point and the ranging point closest to it can be taken as the second distance value.
512. And judging whether the first distance value and the second distance value are in a standard distance range corresponding to the camera switching gesture, and if so, judging that the first touch gesture is matched with the camera switching gesture.
Illustratively, the standard distance range may be [A, B], i.e., there is a lower distance limit A and an upper distance limit B. When the first distance value is within the standard distance range [A, B], the distance between the starting touch point and the nearest screen boundary lies within [A, B], which helps screen out likely accidental swipe gestures (e.g., a swipe that starts right at the screen boundary, or a swipe that starts at the center of the screen). Optionally, the first distance value and the second distance value may be judged against the same standard distance range or against different standard distance ranges, which is not specifically limited in the embodiments of the present application.
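Steps 510 and 512 can be sketched together as follows; the placement of the ranging points, the 240x240 screen, and the [15, 60] pixel range standing in for [A, B] are assumptions for the sketch:

    # Sketch: measure each endpoint against the nearest ranging point on the
    # screen boundary, then require both distances to lie in [A, B].
    import math
    from typing import List, Tuple

    Point = Tuple[float, float]

    # Assumed ranging points: midpoints of the four edges of a 240x240 screen.
    RANGING_POINTS: List[Point] = [(120, 0), (240, 120), (120, 240), (0, 120)]
    STANDARD_RANGE = (15.0, 60.0)  # assumed [A, B] in pixels

    def distance_to_boundary(point: Point) -> float:
        """Distance from a touch point to its nearest ranging point."""
        return min(math.dist(point, rp) for rp in RANGING_POINTS)

    def matches_switch_by_distance(start: Point, end: Point) -> bool:
        a, b = STANDARD_RANGE
        return (a <= distance_to_boundary(start) <= b
                and a <= distance_to_boundary(end) <= b)

    if __name__ == "__main__":
        print(matches_switch_by_distance((120, 200), (118, 40)))  # both near an edge
        print(matches_switch_by_distance((120, 120), (118, 40)))  # start at center -> False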
As an optional implementation manner, when it is determined that the first distance value or the second distance value is not within the standard distance range, the sum of the first distance value and the second distance value may be further calculated. Then, the false touch probability of the first touch gesture can be determined according to the sum of the first distance value and the second distance value. Illustratively, the larger the sum value is, the farther the starting touch point or the ending touch point is from the screen boundary; the smaller the sum is, the closer the starting touch point and the ending touch point are to the screen boundary, so that the false touch probability of the first touch gesture can be determined accordingly. Finally, when the calculated false touch probability is higher than the probability threshold, the first touch gesture can be ignored.
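A minimal sketch of this fallback follows; the linear mapping from the distance sum to a probability and the 0.7 threshold are assumptions, since the embodiment does not fix a particular mapping:

    # Sketch: a larger sum of boundary distances (endpoints far from the
    # boundary) maps to a higher false-touch probability.
    MAX_PLAUSIBLE_SUM = 240.0    # assumed: roughly the screen width in pixels
    PROBABILITY_THRESHOLD = 0.7  # assumed probability threshold

    def false_touch_probability(d1: float, d2: float) -> float:
        return min(1.0, (d1 + d2) / MAX_PLAUSIBLE_SUM)

    def should_ignore(d1: float, d2: float) -> bool:
        return false_touch_probability(d1, d2) > PROBABILITY_THRESHOLD

    if __name__ == "__main__":
        print(should_ignore(110.0, 100.0))  # True: far from the boundary, ignored
        print(should_ignore(70.0, 20.0))    # False: close enough, kept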
514. And if the gesture is matched with the camera switching gesture, switching the target camera, and displaying the image acquired by the switched target camera on a screen.
Therefore, by implementing the shooting control method described in the above embodiment, the effectiveness of the first touch gesture can be accurately determined according to the start touch point and the end touch point corresponding to the first touch gesture, the influence of mistaken touch on controlling the shooting of the wearable device, especially the switching of front and back cameras of the wearable device, is reduced, and the reliability of controlling the wearable device to shoot is improved.
Referring to fig. 6, fig. 6 is a flowchart illustrating a shooting control method of yet another wearable device according to an embodiment of the present application, where the shooting control method can be applied to the wearable device, the wearable device has a host that can be flipped with respect to a host bracket, the front of the host can be provided with a front camera and a screen, and the back of the host can be provided with a rear camera. As shown in fig. 6, the shooting control method may include the following steps:
602. When the host of the wearable device is flipped to a preset angle relative to the host bracket, turn on the target camera corresponding to the preset angle, where the target camera is the front camera or the rear camera.
604. And displaying the image collected by the target camera on a screen.
606. When a touch operation on the screen is detected, a first touch gesture corresponding to the touch operation is acquired.
608. And extracting a starting touch point and an ending touch point from the first touch gesture.
610. And acquiring a first distance value between the starting touch point and a boundary of the screen, and a second distance value between the ending touch point and a boundary of the screen.
612. And judging whether the first distance value and the second distance value are in a standard distance range corresponding to the camera switching gesture, and if so, judging that the first touch gesture is matched with the camera switching gesture.
614. And if the gesture is matched with the camera switching gesture, switching the target camera, and displaying the image acquired by the switched target camera on a screen.
Step 602 to step 614 are similar to step 502 to step 514, and are not described herein again.
616. And re-executing steps 606 to 614 until a click operation on the screen is detected.
By executing step 616, the target camera of the wearable device can be switched repeatedly, so that the front camera or the rear camera can be flexibly selected for shooting according to the actual shooting needs of the user.
618. And after the click operation on the screen is detected, shooting with the target camera to obtain a shot image.
620. And displaying the shot image on a screen.
By implementing the above method, a shot can be taken with the currently selected target camera through a simple click operation, which keeps controlling the shooting on a small wearable device simple and effective; the click is also clearly distinguished from the first touch gesture that controls front/rear camera switching, reducing the possibility of misoperation.
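The overall loop of steps 606 to 620 can be sketched as follows; the event stream and the capture() and display() callbacks are hypothetical placeholders:

    # Sketch: matching swipes toggle the target camera until a click arrives,
    # which triggers a capture and ends the loop.
    from typing import Iterable, Tuple

    def capture(camera: str) -> str:
        return f"photo taken with the {camera} camera"

    def display(content: str) -> None:
        print("screen:", content)

    def run_session(camera: str, events: Iterable[Tuple[str, bool]]) -> None:
        """events yields (kind, matches_switch_gesture); kind is 'swipe' or 'click'."""
        for kind, matches in events:
            if kind == "swipe" and matches:
                camera = "rear" if camera == "front" else "front"
                display(f"{camera} camera preview")
            elif kind == "click":
                display(capture(camera))
                break  # the capture ends this control loop

    if __name__ == "__main__":
        run_session("front", [("swipe", True), ("swipe", False), ("click", False)])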
As an optional implementation, after the wearable device detects a touch operation on its screen and acquires the first touch gesture corresponding to that operation, it may acquire a second touch gesture within a preset time period and display a special-effect image corresponding to the second touch gesture (such as a facial effect or an ambient effect) on the screen, superimposed on the image acquired by the target camera. The second touch gesture thus makes shooting with the wearable device more fun.
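A minimal sketch of this second-gesture effect selection follows; the gesture-to-effect table and the two-second window are assumptions for the sketch:

    # Sketch: within a preset window after the first gesture, a follow-up
    # gesture selects a special effect to overlay on the preview.
    from typing import Optional

    PRESET_WINDOW_S = 2.0  # assumed preset time period (seconds)
    EFFECT_TABLE = {"circle": "facial effect", "zigzag": "ambient effect"}  # assumed

    def effect_for_second_gesture(first_ts: float, second_ts: float,
                                  second_gesture_name: str) -> Optional[str]:
        """Return the effect to overlay, or None if too late or unrecognized."""
        if second_ts - first_ts > PRESET_WINDOW_S:
            return None
        return EFFECT_TABLE.get(second_gesture_name)

    if __name__ == "__main__":
        print(effect_for_second_gesture(0.0, 1.2, "circle"))  # 'facial effect'
        print(effect_for_second_gesture(0.0, 3.5, "circle"))  # None: window expired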
As another optional embodiment, when adding a social friend face to face with the wearable device, the front and rear cameras of the wearable device can be switched to take head shots of the user and the friend in turn, improving the social function of the wearable device. For example, the user may flip the host of the wearable device to a preset angle to start the rear camera and then tap the screen to shoot the friend; next, the switching of the front and rear cameras can be controlled through the first touch gesture, the front camera is started facing the user, and the screen is tapped to shoot the user; finally, the wearable device can display the shots of the user and the friend on the screen at the same time, add them as social friends, and set the shots as their respective avatars. This further improves the playability and fun of the wearable device while remaining efficient and convenient.
Therefore, by implementing the shooting control method described in the above embodiment, the wearable device can be controlled to shoot with a simple gesture, which improves the playability and fun of the wearable device while keeping it efficient and convenient to use.
Referring to fig. 7, fig. 7 is a modular schematic diagram of a wearable device. The wearable device includes a host that can be flipped relative to a host bracket; a front camera and a screen may be disposed on the front of the host, and a rear camera may be disposed on the back of the host. As shown in fig. 7, the wearable device may include a starting unit 701, a display unit 702, a first obtaining unit 703, a second obtaining unit 704, a first determining unit 705, and a switching unit 706, where:
the starting unit 701 is configured to start a target camera corresponding to a preset angle when the host is turned over to the preset angle relative to the host bracket, where the target camera is the front camera or the rear camera;
a display unit 702, configured to display an image acquired by the target camera on a screen;
a first obtaining unit 703, configured to obtain, when a touch operation on the screen is detected, a first touch gesture corresponding to the touch operation;
a second obtaining unit 704, configured to extract a starting touch point and an ending touch point from the first touch gesture;
a first determining unit 705, configured to determine whether the first touch gesture matches a camera switching gesture according to the starting touch point and the ending touch point;
a switching unit 706, configured to switch the target camera when the first determining unit 705 determines that the first touch gesture matches the camera switching gesture;
the display unit 702 is further configured to display the image acquired by the switched target camera on the screen.
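Purely as an illustration of how these units cooperate, a Python skeleton is given below; the method bodies are stubs and the 30° threshold and upward-swipe rule are assumed, so this is not the actual device firmware:

    # Skeleton mirroring the units of fig. 7 (illustrative stubs only).
    class WearableDeviceUnits:
        def __init__(self) -> None:
            self.target_camera = None  # 'front' or 'rear'

        def starting_unit(self, flip_angle_deg: float) -> None:
            # Start the camera corresponding to the preset angle (assumed rule).
            self.target_camera = "front" if flip_angle_deg <= 30.0 else "rear"

        def display_unit(self, image: str) -> None:
            print("screen:", image)

        def first_obtaining_unit(self, touch_events) -> list:
            return list(touch_events)       # the first touch gesture

        def second_obtaining_unit(self, gesture: list) -> tuple:
            return gesture[0], gesture[-1]  # starting / ending touch points

        def first_determining_unit(self, start, end) -> bool:
            return start[1] > end[1]        # assumed: upward swipe = switch gesture

        def switching_unit(self) -> None:
            self.target_camera = "rear" if self.target_camera == "front" else "front"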
Therefore, the wearable device described in the above embodiment can switch between the front and rear cameras under the control of a simple gesture, effectively avoiding false touches while ensuring the validity of the touch, and improving the convenience of shooting with the wearable device.
Referring to fig. 8, fig. 8 is a schematic block diagram of another wearable device disclosed in the embodiments of the present application. As shown in fig. 8, the host of the wearable device may include:
a memory 801 in which executable program code is stored;
a processor 802 coupled with the memory 801;
the processor 802 calls the executable program code stored in the memory 801, and may execute all or part of the steps in the shooting control method of any one of the wearable devices described in the above embodiments.
In addition, the embodiment of the present application further discloses a computer-readable storage medium storing a computer program for electronic data exchange, where the computer program enables a computer to execute all or part of the steps in any one of the shooting control methods of a wearable device described in the above embodiments.
In addition, the embodiment of the present application further discloses a computer program product, which when running on a computer, enables the computer to execute all or part of the steps of any one of the shooting control methods of the wearable device described in the above embodiments.
It will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes a Read-Only Memory (ROM), a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-Time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, a magnetic disk memory, a tape memory, or any other computer-readable medium that can be used to carry or store data.
The shooting control method of the wearable device, the wearable device and the storage medium disclosed in the embodiment of the present application are described in detail above, and specific examples are applied in the present application to explain the principle and the implementation of the present application, and the description of the above embodiments is only used to help understand the method and the core idea of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A shooting control method of a wearable device, wherein the wearable device is provided with a host which can be overturned relative to a host bracket, a front camera and a screen are arranged on the front surface of the host, and a rear camera is arranged on the back surface of the host, and the method comprises the following steps:
when the host is turned over to a preset angle relative to the host bracket, starting a target camera corresponding to the preset angle, wherein the target camera is the front camera or the rear camera;
displaying the image collected by the target camera on the screen;
when touch operation aiming at the screen is detected, acquiring a first touch gesture corresponding to the touch operation;
extracting a starting touch point and an ending touch point from the first touch gesture;
judging whether the first touch gesture is matched with a camera switching gesture or not according to the starting touch point and the ending touch point;
and if the gesture is matched with the camera switching gesture, switching the target camera, and displaying the image acquired by the switched target camera on the screen.
2. The method of claim 1, wherein switching the target camera comprises:
if the target camera is the front camera, switching the target camera to the rear camera;
and if the target camera is the rear camera, switching the target camera to the front camera.
3. The method according to claim 1 or 2, wherein the determining whether the first touch gesture matches a camera switching gesture according to the starting touch point and the ending touch point comprises:
acquiring a first distance value between the starting touch point and a boundary of the screen, and a second distance value between the ending touch point and a boundary of the screen;
and judging whether the first distance value and the second distance value are within a standard distance range corresponding to the camera switching gesture, and if so, determining that the first touch gesture matches the camera switching gesture.
4. The method of claim 3, wherein after determining whether the first distance value and the second distance value are within a standard distance range corresponding to a camera switching gesture, the method further comprises:
if the first distance value or the second distance value is not within the standard distance range, calculating the sum of the first distance value and the second distance value;
determining the false touch probability of the first touch gesture according to the sum of the first distance value and the second distance value;
and when the false touch probability is higher than a probability threshold value, ignoring the first touch gesture.
5. The method of claim 1, wherein after the displaying the image captured by the switched target camera on the screen, the method further comprises:
and re-executing the step of acquiring, when a touch operation on the screen is detected, the first touch gesture corresponding to the touch operation, until a click operation on the screen is detected.
6. The method of claim 5, wherein after the detecting of the click operation on the screen, the method further comprises:
shooting by using the target camera to obtain a shot image;
displaying the photographed image on the screen.
7. The method according to claim 1, wherein after the acquiring of the first touch gesture corresponding to the touch operation when the touch operation on the screen is detected, the method further comprises:
acquiring a second touch gesture corresponding to the touch operation within a preset time length;
displaying a special effect image corresponding to the second touch gesture on the screen, wherein the special effect image is superposed on the image acquired by the target camera.
8. A wearable device, wherein the wearable device has a host that can be flipped relative to a host bracket, a front camera and a screen are arranged on the front of the host, and a rear camera is arranged on the back of the host, and the wearable device comprises:
the starting unit is used for starting a target camera corresponding to a preset angle when the host is turned over to the preset angle relative to the host bracket, and the target camera is the front camera or the rear camera;
the display unit is used for displaying the image acquired by the target camera on the screen;
the first acquisition unit is used for acquiring a first touch gesture corresponding to the touch operation when the touch operation aiming at the screen is detected;
the second acquisition unit is used for extracting a starting touch point and an ending touch point from the first touch gesture;
the first judging unit is used for judging whether the first touch control gesture is matched with the camera switching gesture according to the starting touch control point and the ending touch control point;
the switching unit is used for switching the target camera if the target camera is matched with the camera switching gesture;
the display unit is further used for displaying the image acquired by the switched target camera on the screen.
9. A wearable device, comprising:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, wherein the computer program causes a computer to perform the method of any one of claims 1 to 7.
CN202011197335.9A 2020-10-31 2020-10-31 Shooting control method of wearable device, wearable device and storage medium Pending CN114449153A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011197335.9A CN114449153A (en) 2020-10-31 2020-10-31 Shooting control method of wearable device, wearable device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011197335.9A CN114449153A (en) 2020-10-31 2020-10-31 Shooting control method of wearable device, wearable device and storage medium

Publications (1)

Publication Number Publication Date
CN114449153A true CN114449153A (en) 2022-05-06

Family

ID=81357522

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011197335.9A Pending CN114449153A (en) 2020-10-31 2020-10-31 Shooting control method of wearable device, wearable device and storage medium

Country Status (1)

Country Link
CN (1) CN114449153A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104182173A (en) * 2014-08-15 2014-12-03 小米科技有限责任公司 Camera switching method and device
KR20160022550A (en) * 2014-08-20 2016-03-02 엘지전자 주식회사 Mobile terminal
CN105072336A (en) * 2015-07-31 2015-11-18 小米科技有限责任公司 Control method, apparatus and device for adjusting photographing function
CN105824530A (en) * 2016-03-14 2016-08-03 乐卡汽车智能科技(北京)有限公司 Method and device for switching cameras
CN106231087A (en) * 2016-07-27 2016-12-14 努比亚技术有限公司 A kind of method and apparatus of positive and negative dual-screen display device false-touch prevention
CN110275734A (en) * 2019-06-20 2019-09-24 广东小天才科技有限公司 A kind of method, apparatus, wearable device and storage medium overturning startup function

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023246100A1 (en) * 2022-06-20 2023-12-28 湖北星纪魅族科技有限公司 Photographing processing method and apparatus for terminal, and electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN112911182B (en) Game interaction method, device, terminal and storage medium
US20170255767A1 (en) Identity Authentication Method, Identity Authentication Device, And Terminal
CN110177242B (en) Video call method based on wearable device and wearable device
CN108920202B (en) Application preloading management method and device, storage medium and intelligent terminal
CN108668080A (en) Prompt method and device, the electronic equipment of camera lens degree of fouling
TW201113743A (en) Method, electronic apparatus and computer program product for creating biologic feature data
CN109167877A (en) Terminal screen control method, device, terminal device and storage medium
CN110062171B (en) Shooting method and terminal
CN108549802A (en) A kind of unlocking method, device and mobile terminal based on recognition of face
CN108307106A (en) A kind of image processing method, device and mobile terminal
CN111352507A (en) Information prompting method and electronic equipment
CN108712612A (en) A kind of photographic method, terminal and computer readable storage medium
CN114449153A (en) Shooting control method of wearable device, wearable device and storage medium
CN108229420A (en) A kind of face identification method, mobile terminal
CN111415722A (en) Screen control method and electronic equipment
CN106598425A (en) Photographing method of smart mobile terminal
CN110177241B (en) Posture adjustment method of wearable device and wearable device
CN110135228B (en) Dictation proficiency evaluation method and wearable device
CN110519517B (en) Copy guiding method, electronic device and computer readable storage medium
CN108762641A (en) A kind of method for editing text and terminal device
CN108419007A (en) Clone method, photo taking, terminal and computer readable storage medium
CN111756960B (en) Shooting control method based on wearable device and wearable device
CN106791407A (en) A kind of self-timer control method and system
CN109635622A (en) Personal identification method, device and electronic equipment
CN110177236B (en) Navigation method based on wearable device and wearable device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination