CN113256767B - Bare-handed interactive color taking method and color taking device - Google Patents


Info

Publication number
CN113256767B
CN113256767B (application CN202110796734.5A)
Authority
CN
China
Prior art keywords
color
finger
gesture
event
gesture event
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN202110796734.5A
Other languages
Chinese (zh)
Other versions
CN113256767A (en)
Inventor
李铁萌
周维
李素雯
Current Assignee (listed assignees may be inaccurate)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications
Priority to CN202110796734.5A
Publication of CN113256767A
Application granted
Publication of CN113256767B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/80: Creating or modifying a manually drawn or painted image using a manual input device, e.g. mouse, light pen, direction keys on keyboard
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/56: Extraction of image or video features relating to colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a bare-handed interactive color sampling method and color sampling device. The method comprises the following steps: tracking hand movement with a somatosensory controller device to acquire hand motion data; identifying hand motion characteristics based on the hand motion data, the characteristics comprising wrist rotation angle, four-finger grip strength and palm-center depth; using the identified hand motion characteristics to respectively control the selection of hue, saturation and lightness in a color representation model of the three-dimensional HSV color space; recognizing gesture events based on the acquired hand motion data and interacting with a color sampling panel based on the recognized gesture events, the gesture events comprising a first gesture event and a second gesture event; switching the hue indication mode based on the recognized first gesture event; and confirming the color sampling based on the recognized second gesture event. The invention can adjust all three dimensions of a color simultaneously through bare-handed interaction, completing color selection quickly and with high accuracy.

Description

Bare-handed interactive color taking method and color taking device
Technical Field
The invention relates to the technical field of color sampling, and in particular to a bare-handed interactive color sampling method and color sampling device.
Background
A color picker is one of the most common controls in drawing applications, typically used to assist a user in selecting a color in a color space. Currently, in immersive and remote interactive environments, the most common solution for freehand color picking is to simulate a handheld device with the hand, emitting a ray and controlling a cursor on a virtual plane to select a color. For example, the color pickers of both A-Painter and Tilt Brush consist of two widgets, a color wheel and a slider, and use ray casting to select pixel points. Gravity Sketch, by contrast, offers a three-dimensional color picker: the circular portion displaying hue and saturation can be pushed in and pulled out to change lightness, effectively representing an HSV color-space cylinder.
Existing color pickers rely on hand motion to control ray-based selection and require repeated aiming at interactive controls, so this interaction mode is inefficient and error-prone. Most color pickers directly carry over the form of the WIMP-paradigm color picker and require repeated interaction with two sub-controls: a color wheel and a slider. Bare-handed interaction, however, lacks physical constraints and is less stable than mouse or touch interaction, which makes these drawbacks more prominent. Moreover, existing color pickers do not exploit the naturalness of bare-handed interaction, do not support three-dimensional selection in a color space, and cannot effectively improve the user's understanding of the color space.
Disclosure of Invention
In view of the problems in the prior art, an object of the present invention is to provide a method and an apparatus for selecting colors in a color space through bare-handed interaction, which can overcome one or more of the disadvantages of existing color pickers.
In one aspect of the present invention, a method for extracting colors through bare-handed interaction is provided, which comprises the following steps:
tracking hand movement by utilizing a somatosensory controller device, and acquiring hand movement data, wherein the hand movement data comprises a plurality of frame objects, and the frame objects comprise data of hands, fingers and joint points;
identifying hand motion characteristics based on the hand motion data, wherein the hand motion characteristics comprise wrist rotation angle, four-finger gripping strength and palm center depth;
selecting hue, saturation and lightness, respectively, in a color representation model of a three-dimensional HSV color space using the identified hand motion characteristics comprising wrist rotation angle, four-finger grip strength and palm-center depth;
recognizing gesture events based on the acquired hand motion data, and controlling the selection of hue, saturation and lightness in the color representation model using the hand motion characteristics based on the recognized gesture events, wherein the gesture events comprise a first gesture event and a second gesture event; the first gesture event triggers switching of the hue indication mode, and the second gesture event triggers color sampling confirmation;
if the first gesture event is recognized, switching the hue indication mode based on the recognized first gesture event; if the second gesture event is recognized, confirming the color sampling based on the recognized second gesture event.
In some embodiments of the present invention, the color representation model of the three-dimensional HSV color space is a conical representation model of the HSV color space.
In some embodiments of the invention, in the conical space of the conical representation model: the hues are arranged around the 360-degree circumference of the cone's base and are cyclically connected in a predetermined hue order; saturation varies radially along the base of the cone, decreasing toward the center and increasing toward the rim; and lightness varies axially along the height of the cone, decreasing toward the apex and increasing toward the base.
In some embodiments of the invention, the data structure of the hand motion data follows a right hand coordinate system, with the origin of coordinates being the center of the upper surface of the somatosensory controller device, the X-axis pointing to the long side of the device, and the right direction being the positive direction; the Y axis is vertical and upward is a positive direction; the Z axis is perpendicular to the screen and is in the positive direction when facing outwards.
In some embodiments of the invention, the method further comprises: the recognizing hand motion characteristics based on the hand motion data comprises: acquiring a wrist rotation angle, wherein the wrist rotation angle is expressed in radians; identifying the gripping force by calculating an included angle formed by the metacarpal bones and the proximal phalanges of each finger; the depth coordinates of the palm are calculated by acquiring the position of the palm with respect to the origin of coordinates.
In some embodiments of the invention, the color representation model further comprises a color wheel control and a lightness slider control, wherein the color wheel control is a projection of the conical color space along the longitudinal axis; on the color wheel control, the rotation angle of a pointer indicates the currently selected hue, and the length of the pointer corresponds to the currently selected saturation; on the lightness slider control, the change in lightness and its current value are reflected by the up-and-down movement of a cursor.
In some embodiments of the present invention, the first gesture event is an air-tap event of a specific finger, the specific finger being the index, middle or ring finger; the second gesture event is a thumb-touch event in which the thumb touches a designated finger, the designated finger being the index or middle finger.
In some embodiments of the present invention, the air-tap event of the specific finger is a middle-finger air-tap event, and the thumb-touch event is an event in which the thumb touches the index finger. The recognizing of the gesture event comprises: calculating the distance between the distal phalanges of the index finger and the middle finger, and if the distance is greater than a predetermined first threshold, determining that the current gesture event is an air-tap event; and calculating the distance between the distal phalanx of the thumb and the proximal phalanx of the index finger, and if the distance is less than a predetermined second threshold, determining that the current gesture event is a thumb-touches-index-finger event.
In another aspect of the present invention, a freehand interactive color-taking apparatus is provided, which includes a processor and a memory, the memory storing computer instructions, the processor being configured to execute the computer instructions stored in the memory, and the color-taking apparatus implementing the steps of the method when the computer instructions are executed by the processor.
In a further aspect of the invention, a computer storage medium is also provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method as set forth above.
The bare-handed interactive color taking method and the color taking device provided by the embodiment of the invention can simultaneously adjust three dimensions of colors through bare-handed interaction, quickly complete color selection and have high accuracy; further, the cognitive aspect of the color space can be provided for the user.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It will be appreciated by those skilled in the art that the objects and advantages that can be achieved with the present invention are not limited to the specific details set forth above, and that these and other objects that can be achieved with the present invention will be more clearly understood from the detailed description that follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
fig. 1 is a flowchart illustrating a freehand interaction method according to an embodiment of the present invention.
FIG. 2 illustrates three exemplary hand movements according to an embodiment of the present invention.
FIG. 3 is an exemplary cone representation model of HSV color space in accordance with an embodiment of the present invention.
Fig. 4 is an example of a color wheel control and a brightness slider control in a color sampling panel in an embodiment of the invention.
Fig. 5 is a schematic view of bones of each part of the hand.
FIG. 6 is a diagram illustrating a MiddleTap gesture and a thumb trigger gesture, in accordance with an embodiment of the present invention.
Fig. 7 is a flow of operations for controlling the switching of hue indication modes by MiddleTap gesture and selecting colors by thumb trigger gesture according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the structures and/or processing steps closely related to the scheme according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
In order to solve the problems that an existing freehand interaction color sampler is unstable in interaction and does not support simultaneous selection of three color dimensions in a three-dimensional color space, the invention provides an interaction color sampling technology for selecting colors in the color space by freehand, wherein the technology adopts a wrist rotation angle, four-finger gripping strength and palm center depth to simultaneously select hue, saturation and lightness.
Fig. 1 is a schematic flow chart of a freehand interactive color sampling method according to an embodiment of the present invention, which may be implemented by a color sampling device (e.g., a color picker). As shown in fig. 1, the method includes the following steps:
step S110, tracking hand motion by using a somatosensory controller device, and acquiring hand motion data, where the hand motion data includes a plurality of frame objects, and the frame objects include data of a hand, fingers, and joints.
As an example, a Leap Motion controller may be used as the somatosensory controller to track the user's hand movements, for example at over 200 frames per second, with each frame outputting a frame object; the resulting hand motion data thus comprises a number of frame objects, each of which may contain data on the hand, fingers and joints. By importing LeapJS, the JavaScript library for Leap Motion, into a web page on the user's computer, the Leap Motion device can be connected and the frame objects obtained. LeapJS includes the vector math library glMatrix, which can be used to compute angles, dot products and the like of vectors. Leap Motion is used here merely as an example; the invention is not limited thereto.
In one embodiment of the invention, the data structure of the hand motion data follows a right-hand coordinate system, the origin of coordinates of the hand motion data is the center of the upper surface of the somatosensory controller device, the X-axis is along the long side of the device, and the right direction is a positive direction; the Y axis is along the vertical direction, and the upward direction is the positive direction; the Z axis is vertical to the screen and is outward in the positive direction.
In step S120, hand motion characteristics are identified based on the hand motion data.
The hand motion characteristics may include wrist rotation angle, four-finger grip strength, and palm center depth, which are hand motion characteristics for controlling hue, saturation, and lightness changes, respectively.
Fig. 2 shows, from left to right, examples of the following hand motion features: the rotation angle of the wrist, the grasping force of the four fingers and the depth of the palm center.
For the wrist rotation angle feature, the `hand.roll()` method provided by the Leap Motion API can be used; the output is expressed in radians and, as shown in fig. 2, lies in the range [-π, π].
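As an illustrative sketch (not the patent's exact formula), the roll angle in [-π, π] returned by a call like `hand.roll()` could be mapped linearly onto the 360-degree hue circle; the function name and the linear mapping are assumptions for illustration:

```javascript
// Map a wrist roll angle (radians, [-PI, PI], as LeapJS hand.roll()
// would return) onto a hue value in [0, 360). Linear mapping assumed.
function rollToHue(rollRadians) {
  // Clamp to the documented range first.
  const r = Math.max(-Math.PI, Math.min(Math.PI, rollRadians));
  // Shift [-PI, PI] to [0, 2*PI], then scale to [0, 360).
  const hue = ((r + Math.PI) / (2 * Math.PI)) * 360;
  return hue % 360; // the hue circle wraps cyclically
}
```

Because hues are arranged cyclically on the color wheel, the two ends of the roll range (-π and π) land on the same hue.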
For the four-finger grip strength feature, the grip strength can be identified by calculating the angle formed by the metacarpal bone and the proximal phalanx of each finger. The Leap Motion device provides the direction vectors of the joints, and the angle between these direction vectors can be calculated with the `vec3.angle()` function provided by glMatrix. When the four fingers are open, the joint angle is 0; when the four fingers grip, the joint angle is approximately π/2 (as shown in fig. 2).
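A minimal sketch of this feature, reimplementing the angle computation that `vec3.angle()` performs and mapping the mean joint angle to saturation; the averaging over four fingers and the linear mapping (open hand, angle 0, gives saturation 1; full grip, angle π/2, gives saturation 0, matching the behavior described later) are illustrative assumptions:

```javascript
// Angle between two 3D direction vectors (what glMatrix vec3.angle does).
function angleBetween(a, b) {
  const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const len = (v) => Math.hypot(v[0], v[1], v[2]);
  const cos = Math.min(1, Math.max(-1, dot / (len(a) * len(b))));
  return Math.acos(cos); // radians
}

// Map per-finger metacarpal/proximal-phalanx angles to saturation:
// open (angle ~0) -> 1, gripped (angle ~PI/2) -> 0.
function gripToSaturation(jointAngles) {
  const mean = jointAngles.reduce((s, a) => s + a, 0) / jointAngles.length;
  const t = Math.min(mean / (Math.PI / 2), 1); // 0 = open, 1 = gripped
  return 1 - t;                                // saturation in [0, 1]
}
```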
For the palm-center depth feature, the stabilized palm position provided by Leap Motion can be used to obtain the position of the palm center relative to the origin of coordinates; the coordinates are then normalized to the [0, 1] interval via `frame.interactionBox.normalizePoint(position)`, and the third element of the resulting array is taken as the depth coordinate of the palm center.
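A sketch of just the z-axis part of that normalization; the interaction-box bounds used here are assumed values, not Leap Motion's actual calibrated box:

```javascript
// Normalize the palm-center z coordinate into [0, 1], as the z component
// of frame.interactionBox.normalizePoint(position) would. Box bounds in
// millimeters are assumed for illustration.
function normalizeDepth(z, zMin = -120, zMax = 120) {
  const t = (z - zMin) / (zMax - zMin);
  return Math.min(1, Math.max(0, t)); // clamp to [0, 1]
}
```

Per the description below, pulling the hand outward (toward the user, larger z) drives lightness toward its maximum of 1, and pushing it inward drives lightness toward 0.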
And S130, selecting hue, saturation and brightness in the color representation model of the three-dimensional HSV color space respectively by using the identified hand action characteristics comprising the wrist rotation angle, the four-finger gripping strength and the palm center depth.
More specifically, the color sampler may store in advance a correspondence relationship between changes in the wrist rotation angle, the four-finger grip strength, and the palm depth, and changes in hue, saturation, and lightness in the color representation model. After the wrist rotation angle, the four-finger gripping strength and the palm center depth are identified, the hue, saturation and brightness in the pre-established color representation model of the three-dimensional HSV color space can be controlled based on the changes of the wrist rotation angle, the four-finger gripping strength and the palm center depth respectively. The color representation model of the three-dimensional HSV color space is displayed on a display device of a user connected to a Leap Motion device.
In an embodiment of the present invention, the pre-established color representation model of the three-dimensional HSV color space is a cone representation model of the HSV color space, which uses a conical space to represent color information comprising hue, saturation and lightness. As shown in fig. 3, the hues are arranged around the 360-degree circle of the cone's base and are cyclically connected in the order red, orange, yellow, green, blue, violet (although fig. 3 is a grayscale figure, in the actual color representation model the hues connect cyclically in a predetermined order within the conical space). Saturation varies radially along the base of the cone: the closer to the center, the lower the saturation; the farther from the center, the higher the saturation. The saturation minimum 0.0 represents an achromatic color and the maximum 1.0 a fully saturated color. Lightness varies axially along the height of the cone: the closer to the apex, the lower the lightness; the closer to the base, the higher the lightness. The lightness minimum 0.0 represents black and the maximum 1.0 white. In fig. 3, H denotes hue, S saturation and V lightness. The arrangement of the cone representation model shown in fig. 3 is merely an example and the invention is not limited thereto; in alternative embodiments, saturation may vary axially along the height of the cone and lightness radially along its base.
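The cone geometry above can be sketched as a direct coordinate mapping: a point given as (angle on the base circle, radial distance, height above the apex) reads off H, S and V directly. The normalization parameters are illustrative assumptions:

```javascript
// Map a point in the HSV cone to (h, s, v): hue from the base-circle
// angle (cyclic), saturation from the radial distance (center -> 0,
// rim -> 1), lightness from the height (apex -> 0, base -> 1).
function coneToHSV(angleDeg, radius, height, maxRadius = 1, maxHeight = 1) {
  const h = ((angleDeg % 360) + 360) % 360;  // hues wrap cyclically
  const s = Math.min(radius / maxRadius, 1); // radial: saturation
  const v = Math.min(height / maxHeight, 1); // axial: lightness
  return { h, s, v };
}
```

The double modulo keeps negative angles on the circle, reflecting the cyclic connection of hues around the base.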
In the embodiment of the present invention, the color representation model of the three-dimensional HSV color space further includes a color sampling panel containing a color wheel control and a lightness slider control, wherein the color wheel control is the projection of the conical color space along the longitudinal (Y) axis, and the lightness slider control is a bar indicating the change of lightness along the axial (height) direction of the cone. Fig. 4 illustrates an example of the color wheel control and lightness slider control in a color sampling panel presented on the user's display device. On the color wheel control, the currently selected hue value is indicated by the rotation angle of a pointer from a preset initial position, and the length of the pointer corresponds to the currently selected saturation value. On the lightness slider control, to conveniently display the change in lightness and its current value, the change can be reflected by moving a marker, such as a circular cursor, up and down on the slider.
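A sketch of how such a pointer might be rendered: rotation encodes hue, length encodes saturation. The angle convention (0 degrees pointing right, counter-clockwise positive) and the function name are assumptions for illustration:

```javascript
// Compute the color-wheel pointer tip in wheel-local coordinates:
// the pointer's angle encodes the hue, its length the saturation.
function pointerTip(hueDeg, saturation, wheelRadius) {
  const theta = (hueDeg * Math.PI) / 180;
  const len = saturation * wheelRadius; // pointer length ~ saturation
  return { x: len * Math.cos(theta), y: len * Math.sin(theta) };
}
```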
In addition, in the embodiment of the present invention, a metaphorical design is applied to the three-dimensional adjustment of the color representation model, as shown above the gesture features on the right side of fig. 4. The color is likened to an elastic sponge ball filled with pigment liquid and illuminated by a white point light source: the amount of pigment liquid in the ball depends on how much it is squeezed, the illumination of the color depends on the distance to the light source, and the light source is located on the user's side along the depth axis. The hue value of the ball is controlled by rotating the wrist. The more tightly the four fingers grip, the more the pigment liquid is squeezed out and the lower the ball's saturation, and vice versa. The closer the ball is pulled toward the user, the closer it is to the light source and the more strongly it is lit, so the higher its lightness, and vice versa. In line with this metaphor, the invention uses the three hand motion features of wrist rotation angle, four-finger grip strength and palm-center depth to control hue, saturation and lightness. As shown in fig. 4, rotating the wrist selects the hue, specifically by controlling the rotation of the pointer on the color wheel control: when the target color lies on the upper semicircle, the pointer points upward; when it lies on the lower semicircle, the pointer points downward. The four-finger grip strength controls the saturation: with the four fingers extended, the saturation takes its maximum value 1; with the four fingers gripped, its minimum value 0. Moving the palm forward and backward controls the lightness: pushing the hand inward gives the lightness minimum 0, and pulling the hand outward gives the lightness maximum 1.
And step S140, recognizing gesture events based on the acquired hand motion data, and controlling selection of hue, saturation and lightness in the color representation model by using hand motion characteristics (wrist rotation angle, four-finger gripping strength and palm center depth) based on the recognized gesture events.
In the embodiment of the invention, the gesture event is customized. Switching hue indication mode and confirming color-taking operation by recognizing gesture event. More specifically, in the embodiment of the present invention, the gesture event may include a first gesture event and a second gesture event, where the first gesture event is used to trigger the switching of the hue indication mode, the second gesture event is used to trigger the color sampling confirmation, and the first gesture and the second gesture corresponding to the first gesture event and the second gesture event may be static gestures.
As an example, the first gesture event may be an air-tap (MiddleTap) event of a specific finger, which may be, for example, the index finger, middle finger or ring finger; the second gesture event may be a thumb-touch (ThumbTrigger) event in which the thumb touches a designated finger, such as the index finger or middle finger. Fig. 6 illustrates the MiddleTap gesture and the ThumbTrigger gesture in one embodiment.
As an example, the air-tap event of the specific finger is a middle-finger air-tap event, and the thumb-touch event is an event in which the thumb touches the index finger. In this case, the step of recognizing a gesture event based on the acquired hand motion data may comprise:
calculating the distance between the distal phalanges of the index finger and the middle finger; if the distance is greater than a predetermined first threshold, the user's current gesture is determined to be an air-tap gesture and the corresponding event an air-tap gesture event. The predetermined first threshold may be the minimum distance generally achievable between the distal phalanges of the index and middle fingers when the middle finger performs the air-tap action. The distance can be obtained as the Euclidean distance between the coordinates of the index-finger and middle-finger distal phalanx position points in the hand data. If, after the air tap, the distance between the distal phalanges of the index and middle fingers becomes less than the predetermined first threshold, the air-tap event is deemed to have ended. The positions of the bones of the hand are shown in fig. 5.
calculating the distance between the distal phalanx of the thumb and the proximal phalanx of the index finger; if the distance is less than a predetermined second threshold, the current gesture is determined to be a thumb-touch gesture in which the thumb touches the index finger, and the corresponding event a thumb-touch gesture event. The predetermined second threshold may be a preset maximum distance between the distal phalanx of the thumb and the proximal phalanx of the index finger when the thumb touches, or appears to touch, the index finger. The distance can be obtained as the Euclidean distance between the coordinates of the thumb distal phalanx and index-finger proximal phalanx position points in the hand data. In an embodiment of the present invention, after the thumb-touch gesture, if the distance between the distal phalanx of the user's thumb and the proximal phalanx of the index finger becomes greater than the second threshold, the thumb is considered to have left the index finger.
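The two distance tests above can be sketched as follows; joint positions are [x, y, z] points (in millimeters in Leap Motion's coordinate system), and the threshold values here are illustrative assumptions, not the patent's calibrated constants:

```javascript
// Euclidean distance between two joint position points.
const dist = (a, b) => Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);

// MiddleTap: index/middle distal phalanges spread apart beyond the
// first threshold (assumed 40 mm here).
function isMiddleTap(indexDistal, middleDistal, threshold = 40) {
  return dist(indexDistal, middleDistal) > threshold;
}

// ThumbTrigger: thumb distal phalanx within the second threshold
// (assumed 15 mm here) of the index-finger proximal phalanx.
function isThumbTrigger(thumbDistal, indexProximal, threshold = 15) {
  return dist(thumbDistal, indexProximal) < threshold;
}
```

In a live recognizer these predicates would be evaluated per frame, with the event considered ended once the predicate stops holding, as described above.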
In embodiments of the present invention, gesture events may be used to interact with the color fetching panel and generate color fetching events. In the embodiment of the present invention, two gesture events, namely MiddleTap and ThumbTrigger events, are respectively used for hue indication mode (or hue selection mode) switching and color confirmation.
Once the program recognizes a gesture, an interaction event is generated and the hand motion data is transmitted to the color sampling panel; the event's attributes include the wrist rotation angle, four-finger grip strength, palm-center depth and so on. The event is dispatched to the target element, and based on the instruction corresponding to the event, hue selection or confirmation is controlled through the wrist rotation angle, four-finger grip strength and palm-center depth.
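A sketch of this event plumbing: a recognized gesture becomes an event object carrying the three hand-motion features, which is routed to the panel's handler (in a browser this would typically be `element.dispatchEvent` with a `CustomEvent`). Event type names and the handler-table shape are assumptions for illustration:

```javascript
// Build a gesture event carrying the hand-motion features in its detail.
function makeGestureEvent(type, features) {
  return {
    type, // e.g. "middletap" or "thumbtrigger" (assumed names)
    detail: {
      roll: features.roll,   // wrist rotation angle (radians)
      grip: features.grip,   // four-finger grip strength in [0, 1]
      depth: features.depth, // normalized palm-center depth in [0, 1]
    },
  };
}

// Route the event to the color-sampling panel's handler for its type.
function dispatchTo(panelHandlers, event) {
  const handler = panelHandlers[event.type];
  if (handler) handler(event.detail);
}
```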
In an embodiment of the present invention, after the gesture and the corresponding gesture event are recognized, the selection of hue, saturation and lightness in the color representation model using the hand motion characteristics may be controlled based on the recognized gesture event. As described in the following steps.
Step S150, under the condition that the first gesture event is recognized, switching the hue indication mode based on the recognized first gesture event; when the second gesture event is recognized, the color sampling is confirmed based on the recognized second gesture event.
For example, when selecting a hue, if an air-tap gesture event is recognized, a switch of the hue indication mode (i.e., a transition from one hue indication mode to the other) may be triggered based on that event. Rotating the wrist selects the hue, and during hue selection the direction of the pointer on the color wheel control can be switched with the MiddleTap gesture. When a MiddleTap event is recognized, the pointer direction may be reversed, switching the color representation model of the three-dimensional HSV color space from one hue indication mode to the other. In one hue indication mode, the target color lies on the upper semicircle and the pointer points upward; if a MiddleTap event switches to the other hue indication mode, the pointer points downward and the target color lies on the lower semicircle. In either hue indication mode, when the target color lies on the upper semicircle the pointer points upward, and when it lies on the lower semicircle the pointer points downward.
When the user decides to select the currently indicated color, the selection may be confirmed through a thumb touch gesture (ThumbTrigger) event.
In the example shown in fig. 7, the direction of the pointer on the color wheel control switches from downward to upward upon recognition of the MiddleTap gesture. Further, the rotation of the pointer on the color wheel control can be controlled by rotating the wrist, so that the pointer can be moved to the position of the hue to be selected; after the hue is selected, the currently indicated color can be confirmed by making a thumb trigger gesture.
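The interaction flow described above (wrist rotation moves the pointer, MiddleTap flips its direction, ThumbTrigger confirms the pick) can be sketched as a small state object. The class name, the degree-based hue mapping, and the final HSV-to-RGB conversion are assumptions added for illustration:

```python
import colorsys

class ColorPicker:
    """Minimal state sketch of the fig. 7 interaction (assumed flow)."""
    def __init__(self) -> None:
        self.hue = 0.0          # degrees, driven by wrist rotation
        self.saturation = 1.0   # would be driven by four-finger grip strength
        self.value = 1.0        # would be driven by palm-center depth
        self.pointer_up = True  # current hue indication mode
        self.picked = None      # confirmed color, as an RGB triple

    def rotate_wrist(self, angle_deg: float) -> None:
        # The pointer indicates the hue directly in the upward mode,
        # and the opposite semicircle in the downward mode.
        base = angle_deg % 360.0
        self.hue = base if self.pointer_up else (base + 180.0) % 360.0

    def middle_tap(self) -> None:
        # MiddleTap: reverse the pointer, moving it to the opposite semicircle.
        self.pointer_up = not self.pointer_up
        self.hue = (self.hue + 180.0) % 360.0

    def thumb_trigger(self) -> None:
        # ThumbTrigger: confirm the currently indicated HSV color.
        self.picked = colorsys.hsv_to_rgb(self.hue / 360.0,
                                          self.saturation, self.value)
```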
The freehand interactive color-picking method and color-picking device of the embodiments of the present invention allow a user to select and adjust the three dimensions of hue, saturation and lightness simultaneously, with feedback presented in real time. User studies have shown that color-selection efficiency increases as user proficiency grows. Moreover, the metaphorical design helps users remember the meaning of the gestures and better understand the color selection. Compared with a traditional color picker, the bare-handed interactive color-picking method is more convenient, faster, more stable and easier to perform, and because it gives a stronger sense of control, the interaction is smoother and more natural.
Corresponding to the above method, the present invention further provides a bare-handed interactive color-picking device (color picker) comprising a processor and a memory, wherein the memory stores computer instructions and the processor is configured to execute the computer instructions stored in the memory; when the computer instructions are executed by the processor, the device implements the steps of the above bare-handed interactive color-picking method.
Embodiments of the present invention further provide a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the foregoing steps of the bare-handed interactive color-picking method. The computer-readable storage medium may be a tangible storage medium such as an optical disc, a USB flash drive, a floppy disk, or a hard disk.
It is to be understood that the invention is not limited to the specific arrangements and instrumentality described above and shown in the drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present invention are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications and additions or change the order between the steps after comprehending the spirit of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems, and methods described in connection with the embodiments disclosed herein may be implemented as hardware, software, or combinations of both. Whether this is done in hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, plug-in, function card, or the like. When implemented in software, the elements of the invention are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link.
It should also be noted that the exemplary embodiments mentioned in this patent describe some methods or systems based on a series of steps or devices. However, the present invention is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments in the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A freehand interactive color-picking method is characterized by comprising the following steps:
tracking hand movement by utilizing a somatosensory controller device, and acquiring hand movement data, wherein the hand movement data comprises a plurality of frame objects, and the frame objects comprise data of hands, fingers and joint points;
identifying hand motion characteristics based on the hand motion data, wherein the hand motion characteristics comprise wrist rotation angle, four-finger gripping strength and palm center depth;
selecting hue, saturation and brightness in a color representation model of a three-dimensional HSV color space by using the identified hand action characteristics comprising wrist rotation angle, four-finger gripping strength and palm center depth;
recognizing gesture events based on the acquired hand motion data, and controlling selection of hue, saturation and brightness in the color representation model by using the hand motion characteristics based on the recognized gesture events, wherein the gesture events comprise a first gesture event and a second gesture event, the first gesture event is used for triggering mode switching of hue indication, and the second gesture event is used for triggering color taking confirmation;
switching a hue indication mode based on the recognized first gesture event if the first gesture event is recognized; and confirming the color picking based on the recognized second gesture event if the second gesture event is recognized.
2. The method of claim 1, wherein the color representation model of the three-dimensional HSV color space is a conical representation model of an HSV color space.
3. The method of claim 2, wherein in the cone space of the cone representation model:
the hues are arranged on the circumference of 360 degrees of the bottom surface of the cone and are circularly connected according to a preset hue arrangement sequence;
the saturation degree changes along the radial direction of the bottom surface of the cone, the closer to the circle center, the lower the saturation degree, and the farther from the circle center, the higher the saturation degree;
lightness varies axially along the height of the cone: the closer to the top of the cone, the lower the lightness; and the closer to the bottom of the cone, the higher the lightness.
4. The method of claim 1, wherein the data structure of the hand motion data follows a right hand coordinate system with origin of coordinates at the center of the upper surface of the somatosensory controller device, X-axis along the long side of the device, Y-axis along the vertical, and Z-axis perpendicular to the screen.
5. The method of claim 1, wherein the identifying hand motion features based on the hand motion data comprises:
acquiring a wrist rotation angle, wherein the wrist rotation angle is expressed in radians;
identifying the grip strength by calculating the included angle formed between the metacarpal bone and the proximal phalanx of each finger; and
calculating the depth coordinate of the palm by acquiring the position of the palm center relative to the origin of coordinates.
6. The method of claim 4, wherein the color representation model further comprises:
a color wheel control and a lightness slider control, wherein the color wheel control is a projection of the conical color space along the longitudinal axis;
on the color wheel control, the rotation angle of the pointer indicates the currently selected hue, and the length of the pointer corresponds to the currently selected saturation;
on the lightness slider control, the change in lightness and its current value are reflected by the up-and-down movement of a cursor.
7. The method of claim 1,
the first gesture event is an air-tap event of a specific finger, and the specific finger is an index finger, a middle finger or a ring finger;
the second gesture event is a thumb touch event that a thumb touches a designated finger, and the designated finger is an index finger or a middle finger.
8. The method of claim 7, wherein:
the air-tap event of the specific finger is a middle-finger air-tap event, and the thumb touch event of the thumb touching the designated finger is a thumb-touching-index-finger event;
the recognizing the gesture event comprises:
calculating the distance between the distal phalanges of the index finger and the middle finger, and determining that the current gesture event is an air-tap event if the distance is greater than a preset first threshold; and
calculating the distance between the distal phalanx of the thumb and the proximal phalanx of the index finger, and determining that the current gesture event is a thumb-touching-index-finger event if the distance is smaller than a preset second threshold.
9. A colour-fetching apparatus comprising a processor and a memory, wherein the memory has stored therein computer instructions for executing the computer instructions stored in the memory, the apparatus implementing the steps of the method of any one of claims 1 to 8 when the computer instructions are executed by the processor.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
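The distance-threshold tests described in claim 8 can be sketched as follows. The threshold values and joint-position representation are illustrative assumptions; the patent specifies only that the distances are compared against preset thresholds:

```python
import math

def dist(a, b):
    """Euclidean distance between two 3-D joint positions (x, y, z), in mm."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# Threshold values are illustrative assumptions, not values from the patent.
AIR_TAP_THRESHOLD_MM = 40.0      # the "preset first threshold"
THUMB_TOUCH_THRESHOLD_MM = 15.0  # the "preset second threshold"

def is_middle_air_tap(index_distal, middle_distal):
    """Middle-finger air-tap: distal phalanges of index and middle finger separate."""
    return dist(index_distal, middle_distal) > AIR_TAP_THRESHOLD_MM

def is_thumb_trigger(thumb_distal, index_proximal):
    """Thumb touch: distal phalanx of thumb approaches proximal phalanx of index."""
    return dist(thumb_distal, index_proximal) < THUMB_TOUCH_THRESHOLD_MM
```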
CN202110796734.5A 2021-07-14 2021-07-14 Bare-handed interactive color taking method and color taking device Active CN113256767B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110796734.5A CN113256767B (en) 2021-07-14 2021-07-14 Bare-handed interactive color taking method and color taking device


Publications (2)

Publication Number Publication Date
CN113256767A CN113256767A (en) 2021-08-13
CN113256767B true CN113256767B (en) 2021-11-09

Family

ID=77191242

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110796734.5A Active CN113256767B (en) 2021-07-14 2021-07-14 Bare-handed interactive color taking method and color taking device

Country Status (1)

Country Link
CN (1) CN113256767B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625255B (en) * 2022-03-29 2023-08-01 北京邮电大学 Freehand interaction method oriented to visual view construction, visual view construction device and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104063059B (en) * 2014-07-13 2017-01-04 华东理工大学 A kind of real-time gesture recognition method based on finger segmentation
US11106273B2 (en) * 2015-10-30 2021-08-31 Ostendo Technologies, Inc. System and methods for on-body gestural interfaces and projection displays
CN109919039B (en) * 2019-02-14 2023-07-25 上海磐启微电子有限公司 Static gesture recognition method based on palm and finger characteristics



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant