CN109118538A - Image presentation method, system, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN109118538A
CN109118538A (application CN201811045431.4A)
Authority
CN
China
Prior art keywords
image
wearable device
target user
shape information
wearing position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811045431.4A
Other languages
Chinese (zh)
Inventor
陈建桥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Zhangmen Science and Technology Co Ltd
Original Assignee
Shanghai Zhangmen Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Zhangmen Science and Technology Co Ltd filed Critical Shanghai Zhangmen Science and Technology Co Ltd
Priority to CN201811045431.4A priority Critical patent/CN109118538A/en
Publication of CN109118538A publication Critical patent/CN109118538A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 — ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 — ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/63 — ICT specially adapted for the operation of medical equipment or devices for local operation
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04B — TRANSMISSION
    • H04B 1/00 — Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38 — Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/3827 — Portable transceivers
    • H04B 1/385 — Transceivers carried on the body, e.g. in helmets

Abstract

The embodiments of the present application disclose an image presentation method. One specific embodiment of the method includes: in response to detecting a wearing demonstration request of a target user regarding a wearable device, acquiring a first image of the target user, the wearing demonstration request including a device identifier of the wearable device; determining body-shape information of the target user based on the first image; determining, based on the body-shape information of the target user and the device identifier of the wearable device, a wearing position of the wearable device that matches the target user; marking the determined wearing position in the first image, and determining the marked first image as a second image; and presenting the second image. This embodiment improves the accuracy of the wearing position of the wearable device.

Description

Image presentation method, system, electronic equipment and computer readable storage medium
Technical field
Embodiments of the present application relate to the field of computer technology, and in particular to an image presentation method, system, electronic device, and computer-readable storage medium.
Background technique
With the development of artificial intelligence technology, artificial intelligence has brought various conveniences to people's lives. Wearable devices based on artificial intelligence are also increasingly popular. Existing artificial-intelligence-based wearable devices include smartwatches, smart glasses, smart running shoes, and the like, and may also include intelligent detection instruments with specific functions that can acquire body data.
Existing intelligent detection instruments generally include multiple patches. A professional attaches the patches to specific locations on the user's body so that leads are formed between the patches and the detector, allowing the instrument to measure the user's body data (such as blood pressure data, blood glucose data, electrocardiogram data, etc.). In existing schemes, placing the lead patches depends on a professional, and an ordinary user cannot perform the operation accurately.
Summary of the invention
Embodiments of the present application provide an image presentation method to improve the accuracy with which an ordinary user operates a wearable device.
In a first aspect, an embodiment of the present application provides an image presentation method, the method comprising: in response to detecting a wearing demonstration request of a target user regarding a wearable device, acquiring a first image of the target user, the wearing demonstration request including a device identifier of the wearable device; determining body-shape information of the target user based on the first image; determining, based on the body-shape information of the target user and the device identifier of the wearable device, a wearing position of the wearable device that matches the target user; marking the determined wearing position in the first image, and determining the marked first image as a second image; and presenting the second image.
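The first-aspect flow above can be sketched as follows. The helper callables (`capture_image`, `estimate_body_shape`, `lookup_position`, `mark`, `display`) are hypothetical stand-ins for the capture, model, and presentation components the method describes, not APIs defined by the application:

```python
from dataclasses import dataclass

@dataclass
class WearRequest:
    """Wearing demonstration request; carries the wearable device's identifier."""
    device_id: str

def present_wearing_position(request, capture_image, estimate_body_shape,
                             lookup_position, mark, display):
    # Step 1: acquire the first image of the target user
    first_image = capture_image()
    # Step 2: determine the target user's body-shape information
    body_shape = estimate_body_shape(first_image)
    # Step 3: determine the wearing position matching body shape and device id
    position = lookup_position(body_shape, request.device_id)
    # Step 4: mark the position; the marked image is the second image
    second_image = mark(first_image, position)
    # Step 5: present the second image
    display(second_image)
    return second_image
```

Keeping each step behind a callable mirrors the claim structure, where each operation is a separately replaceable unit (local or server-side).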
In some embodiments, after the second image is presented, the method further includes: acquiring a third image of the target user in real time; detecting whether the third image includes an image of the wearable device; in response to detecting that the third image includes an image of the wearable device, determining whether the wearable device presented in the third image is at the determined wearing position; and, in response to determining that the wearable device presented in the third image is at the determined wearing position, presenting prompt information indicating that the wearable device has been worn successfully.
In some embodiments, determining whether the wearable device presented in the third image is at the determined wearing position includes: performing a similarity comparison between the third image and the second image, and determining whether the similarity value between the third image and the second image is greater than or equal to a preset threshold; and determining that the wearable device presented in the third image is at the determined wearing position includes: in response to determining that the similarity value between the third image and the second image is greater than or equal to the preset threshold, determining that the wearable device presented in the third image is at the determined wearing position.
In some embodiments, the method further includes: in response to determining that the similarity value between the third image and the second image is less than the preset threshold, determining that the wearable device presented in the third image is not at the determined wearing position, and presenting prompt information indicating that the wearable device has not been worn successfully.
In some embodiments, after the prompt information indicating that the wearable device has not been worn successfully is presented, the method further includes: determining the wearing position of the wearable device presented in the third image; determining the difference between the wearing position of the wearable device presented in the third image and the determined wearing position; and, based on the determined difference, marking a correction indication for the wearing position of the wearable device in the third image.
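A minimal sketch of this verification-and-correction loop, under assumptions not stated in the application: images are flattened pixel lists, similarity is the fraction of matching pixel positions (a real system would use something like template matching or SSIM), and the 0.8 threshold and pixel coordinates are illustrative:

```python
def image_similarity(img_a, img_b):
    """Toy similarity: fraction of pixel positions with equal values."""
    assert len(img_a) == len(img_b)
    matches = sum(1 for a, b in zip(img_a, img_b) if a == b)
    return matches / len(img_a)

def check_wearing(third_image, second_image, threshold=0.8):
    """Success when the similarity reaches the preset threshold (>=),
    failure otherwise (<), matching the claimed comparison."""
    if image_similarity(third_image, second_image) >= threshold:
        return "wearing successful"
    return "wearing not successful"

def correction_offset(actual, target):
    """Difference used to mark the correction indication: the offset
    moving the device from its actual to the target wearing position."""
    return (target[0] - actual[0], target[1] - actual[1])
```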
In some embodiments, determining the body-shape information of the target user based on the first image includes: sending the first image to a server, so that the server inputs the first image into a pre-trained figure detection model to determine the body-shape information of the target user, where the figure detection model is used to characterize the correspondence between input images and body-shape information; and receiving the body-shape information of the target user returned by the server.
In some embodiments, the first image includes depth-of-field information and contour information of the presented target user; and determining the body-shape information of the target user based on the first image includes: determining, based on the depth-of-field information, distance information between the target user and the capture device used to shoot the first image; and determining the body-shape information of the target user based on the determined distance information and the contour information.
In some embodiments, determining, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position of the wearable device that matches the target user includes: sending the body-shape information of the target user to a server, so that the server selects, from a preset set of reference body-shape information, reference body-shape information that matches the body-shape information of the target user; receiving the selected reference body-shape information returned by the server; and determining the wearing position of the wearable device according to the selected reference body-shape information and the device identifier of the wearable device, based on the correspondence between reference body-shape information, device identifiers, and wearing positions.
In a second aspect, an embodiment of the present application provides an image presentation apparatus, the apparatus comprising: a first image acquisition unit configured to, in response to detecting a wearing demonstration request of a target user regarding a wearable device, acquire a first image of the target user, the wearing demonstration request including a device identifier of the wearable device; a body-shape information determination unit configured to determine body-shape information of the target user based on the first image; a wearing position determination unit configured to determine, based on the body-shape information of the target user and the device identifier of the wearable device, a wearing position of the wearable device that matches the target user; a marking unit configured to mark the determined wearing position in the first image and determine the marked first image as a second image; and an image presentation unit configured to present the second image.
In some embodiments, the image presentation apparatus further includes: a third image acquisition unit configured to acquire a third image of the target user in real time; a detection unit configured to detect whether the third image includes an image of the wearable device; a first determination unit configured to, in response to detecting that the third image includes an image of the wearable device, determine whether the wearable device presented in the third image is at the determined wearing position; and a first presentation unit configured to, in response to determining that the wearable device presented in the third image is at the determined wearing position, present prompt information indicating that the wearable device has been worn successfully.
In some embodiments, the first determination unit is further configured to: perform a similarity comparison between the third image and the second image, and determine whether the similarity value between the third image and the second image is greater than or equal to a preset threshold; and the first presentation unit is further configured to: in response to determining that the similarity value between the third image and the second image is greater than or equal to the preset threshold, determine that the wearable device presented in the third image is at the determined wearing position.
In some embodiments, the image presentation apparatus further includes: a second presentation unit configured to, in response to determining that the similarity value between the third image and the second image is less than the preset threshold, determine that the wearable device presented in the third image is not at the determined wearing position, and present prompt information indicating that the wearable device has not been worn successfully.
In some embodiments, the image presentation apparatus further includes: a second determination unit configured to determine the wearing position of the wearable device presented in the third image; a difference determination unit configured to determine the difference between the wearing position of the wearable device presented in the third image and the determined wearing position; and a correction marking unit configured to mark, based on the determined difference, a correction indication for the wearing position of the wearable device in the third image.
In some embodiments, the body-shape information determination unit is further configured to: send the first image to a server, so that the server inputs the first image into a pre-trained figure detection model to determine the body-shape information of the target user, where the figure detection model is used to characterize the correspondence between input images and body-shape information; and receive the body-shape information of the target user returned by the server.
In some embodiments, the first image includes depth-of-field information and contour information of the presented target user; and the body-shape information determination unit is further configured to: determine, based on the depth-of-field information, distance information between the target user and the camera used to shoot the first image; and determine the body-shape information of the target user based on the determined distance information and the contour information.
In some embodiments, the wearing position determination unit is further configured to: send the body-shape information of the target user to a server, so that the server selects, from a preset set of reference body-shape information, reference body-shape information that matches the body-shape information of the target user; receive the selected reference body-shape information returned by the server; and determine the wearing position of the wearable device according to the selected reference body-shape information and the device identifier of the wearable device, based on the correspondence between reference body-shape information, device identifiers, and wearing positions.
In a third aspect, an embodiment of the present application provides an image presentation system including a terminal device and a wearable device, where the terminal device and the wearable device are communicatively connected. The terminal device is configured to: in response to detecting a wearing demonstration request of a target user regarding the wearable device, acquire a first image from a capture device, the wearing demonstration request including a device identifier of the wearable device; determine body-shape information of the target user based on the first image; determine, based on the body-shape information of the target user and the device identifier of the wearable device, a wearing position of the wearable device that matches the target user; mark the determined wearing position in the first image, and determine the marked first image as a second image; and present the second image.
In a fourth aspect, an embodiment of the present application provides an electronic device including: one or more processors; and a storage device for storing one or more programs, where, when the one or more programs are executed by the one or more processors, the one or more processors implement the method described in any implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the method described in any implementation of the first aspect.
The image presentation method provided by the embodiments of the present application determines the body-shape information of a target user by acquiring a first image of the target user, then determines the wearing position of a wearable device that matches the target user based on the determined body-shape information, and finally marks the wearing position of the wearable device in the first image, determines the marked first image as a second image, and presents the second image. In this way, the wearing position of the wearable device that matches the user can be determined according to the user's body shape, improving the accuracy with which the wearable device is worn and helping improve the accuracy of the user's body data acquired by the wearable device.
Detailed description of the invention
Other features, objects, and advantages of the present application will become more apparent by reading the following detailed description of non-restrictive embodiments with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which the present application may be applied;
Fig. 2 is a flowchart of one embodiment of the image presentation method according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the image presentation method according to the present application;
Fig. 4 is a flowchart of another embodiment of the image presentation method according to the present application;
Fig. 5 is a structural schematic diagram of one embodiment of the image presentation apparatus according to the present application;
Fig. 6 is a structural schematic diagram of one embodiment of the image presentation system according to the present application;
Fig. 7 is a structural schematic diagram of a computer system suitable for implementing the electronic device of the embodiments of the present application.
Specific embodiment
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention and are not a limitation of the invention. It should also be noted that, for ease of description, only the parts relevant to the related invention are shown in the drawings.
It should be noted that, in the absence of conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the image presentation method or image presentation apparatus of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102 and 103, a network 104 and a server 105. The network 104 serves as a medium providing communication links between the terminal devices 101, 102 and 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, or fiber-optic cables.
A user 110 may use the terminal devices 101, 102 and 103 to shoot images of the user and send the shot images to the server 105 through the network 104. The terminal devices 101, 102 and 103 may be terminal devices with various functions, such as a photographing function or a video-recording function. The terminal devices 101, 102 and 103 may analyze a received image of the target user, determine the wearing position of the wearable device on the body of the target user based on the analysis result, and then mark the wearing position of the wearable device on the received image of the target user for presentation.
The terminal devices 101, 102 and 103 may be hardware or software. When the terminal devices 101, 102 and 103 are hardware, they include, but are not limited to, smartphones, tablet computers, and laptop portable computers equipped with an image acquisition device (such as a camera). When the terminal devices 101, 102 and 103 are software, they may be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or software modules (for example, for providing distributed services), or as a single piece of software or a software module. No specific limitation is imposed here.
The server 105 may be a server providing various services. For example, the server 105 may be a background server that processes information sent by the terminal devices 101, 102 and 103 and returns the processing result. The background server may detect the received image of the target user, determine the body-shape information of the user according to the detection result, and then return the detected body-shape information of the user to the terminal devices 101, 102 and 103. The server 105 may also select reference body-shape information that matches the received body-shape information of the user and return the reference body-shape information to the terminal devices 101, 102 and 103.
It should be noted that the server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules (for example, for providing distributed services), or as a single piece of software or a software module. No specific limitation is imposed here.
It should be noted that the image presentation method provided by the embodiments of the present application is generally executed by the terminal devices 101, 102 and 103, and correspondingly, the image presentation apparatus is generally disposed in the terminal devices 101, 102 and 103.
It should be understood that the numbers of terminal devices, networks and servers in Fig. 1 are merely illustrative. There may be any number of terminal devices, networks and servers according to implementation needs.
With continued reference to Fig. 2, a flow 200 of one embodiment of the image presentation method according to the present application is shown. The image presentation method includes the following steps:
Step 201: in response to detecting a wearing demonstration request of a target user regarding a wearable device, acquiring a first image of the target user.
In this embodiment, the executing body of the image presentation method (such as the terminal device shown in Fig. 1) may detect in real time whether there is a wearing demonstration request of the target user regarding the wearable device. For example, after the user clicks a wearing demonstration request, a request identifier may be generated. After acquiring the request identifier, the executing body may determine that a wearing demonstration request of the target user regarding the wearable device has been detected. Here, the wearable device includes, but is not limited to, a smartwatch, a smart bracelet, a magnetic therapy patch for massaging the body's acupuncture points, an electrocardiograph device for detecting the user's blood pressure data and electrocardiogram data, and an electrocardiogram patch for detecting electrocardiogram data. The wearing demonstration request may also include the device identifier of the wearable device, so that by detecting the device identifier of the wearable device, the executing body can determine the model, type, and so on of the device the target user is about to wear.
In this embodiment, the executing body may also acquire the first image of the target user after detecting the wearing demonstration request. Here, the first image of the target user includes the region where the wearable device is to be worn. The executing body of the image presentation method may acquire the first image of the target user through a camera disposed on it, or may acquire the first image of the target user locally. Here, the first image of the target user may be a still image, a dynamic image, real-time video or the like acquired by a mobile phone, a camera, a video camera, a webcam, etc.
Step 202: determining the body-shape information of the target user based on the first image.
In this embodiment, after acquiring the first image of the target user, the executing body may determine the body-shape information of the target user from the acquired first image.
As one implementation, the first image may include depth-of-field information and contour information of the presented target user. The executing body may determine, based on the depth-of-field information, distance information between the target user and the capture device used to shoot the first image, and then determine the body-shape information of the target user based on the determined distance information and the contour information.
Specifically, the first image may be a depth image synthesized after simultaneous shooting by an infrared camera and a visible-light camera, so the depth image may include depth-of-field information and three-primary-color (RGB) values. The executing body may determine the distance between the target user and the infrared camera according to the depth-of-field information of the first image. Then, the executing body may extract the body contour information of the target user from the first image. In this way, the executing body may determine the body-shape information of the user from the determined distance between the target user and the infrared camera and the user's body contour information. The body-shape information includes, but is not limited to, bust data, waist data, shoulder-width data, leg-length data, etc.
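The distance-plus-contour computation above can be sketched under a pinhole-camera approximation, where a contour extent measured in pixels scales to a real-world extent in proportion to the subject's distance. The focal length in pixels, the use of the median depth as the subject distance, and the measurement names are illustrative assumptions, not values given by the application:

```python
def median_depth(depth_map):
    """Median of the non-zero depth samples (mm), used as the subject distance."""
    samples = sorted(d for row in depth_map for d in row if d > 0)
    return samples[len(samples) // 2]

def estimate_body_shape(depth_map, contour_extents_px, focal_length_px=1400.0):
    """Convert pixel-space contour extents to real-world measurements (mm):
    real_extent = pixel_extent * distance / focal_length."""
    distance_mm = median_depth(depth_map)
    return {name: px * distance_mm / focal_length_px
            for name, px in contour_extents_px.items()}
```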
As another implementation, when the executing body detects the wearing demonstration request of the user regarding the wearable device, the executing body may control the camera to acquire the first image of the target user when it detects that the user is within a preset range of the executing body. Acquiring the first image of the target user with the distance between the target user and the camera so constrained ensures that the user image acquired by the camera includes the body parts used to determine the target user's figure, while also ensuring the clarity of the acquired image, which helps improve the accuracy of the determined body shape of the user. Then, the executing body may input the first image to a communicatively connected server (such as the server shown in Fig. 1). The server may then input the first image into a pre-trained figure detection model to obtain the body-shape information of the target user. Here, the figure detection model is used to characterize the correspondence between input images and body-shape information. The server may obtain the figure detection model by training a preset initial figure detection model. Here, the initial figure detection model may, for example, include an artificial neural network (such as a convolutional neural network or a recurrent neural network). The artificial neural network may have any of various existing neural network structures (such as DenseBox, VGGNet, ResNet, SegNet, etc.). The figure detection model may be obtained by technicians through supervised training based on an existing artificial neural network. Specifically, in this embodiment, the figure detection model may be trained as follows:
First, a training sample set is acquired. Here, a training sample may include a sample image shot with a preset camera at a preset distance and annotation information corresponding to the sample image. The annotation information includes each body part presented in the sample image and data information corresponding to each part; the data information may include, for example, waist, height, leg length, bust, etc.
Then, the sample images of at least one training sample in the training sample set are separately input into the initial figure detection model to obtain the data information corresponding to each of the sample images. The obtained data information is then compared with the annotated data information to determine whether the difference between the obtained data information and the annotated data information is less than or equal to a preset threshold. In response to determining that the difference between the obtained data information and the annotated data information is less than or equal to the preset threshold, it may be determined that the training of the initial figure detection model is complete, and the trained initial figure detection model is used as the figure detection model.
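The stopping criterion above (training ends once every predicted measurement is within the preset threshold of its annotation) might be sketched as follows; the per-sample measurement vectors are illustrative, not a format specified by the application:

```python
def training_converged(predicted, annotated, threshold):
    """True when every predicted measurement differs from its annotated
    value by no more than the preset threshold."""
    return all(
        abs(p - a) <= threshold
        for pred_sample, ann_sample in zip(predicted, annotated)
        for p, a in zip(pred_sample, ann_sample)
    )
```

In practice this check would sit inside the training loop, gating whether another round of gradient updates is needed.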
Step 203: determining, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position of the wearable device that matches the target user.
In this embodiment, after determining the body-shape information of the target user, the executing body may further determine, according to the device identifier of the wearable device, the wearing position of the wearable device that matches the target user. Specifically, the executing body may determine relative coordinate information between the wearable device and the target user.
In some optional implementations of this embodiment, the executing body may send the body-shape information of the target user to the communicatively connected server. The communicatively connected server may then select, from a preset set of reference body-shape information, reference body-shape information that matches the body-shape information of the target user. Then, the executing body may receive the selected reference body-shape information returned by the server. Finally, the executing body may determine the wearing position of the wearable device according to the selected reference body-shape information and the device identifier of the wearable device, based on the correspondence between reference body-shape information, device identifiers, and wearing positions.
Specifically, the server communicatively connected to the executing body may store a preset set of reference body-shape information and a mapping table or correspondence diagram between reference figures and the wearing positions of wearable devices. From the reference body-shape information, the wearing position of the wearable device on the body can be determined. The server connected to the executing body may match the determined body-shape information of the target user against the body-shape information in the reference set, that is, compare the data characterizing the user's figure with the data characterizing figures in the reference set, select from the reference set the reference body-shape information whose difference from the body-shape information of the target user is smallest, and return that reference body-shape information to the executing body. The executing body may then determine the reference figure corresponding to the reference body-shape information, and determine the wearing position of the wearable device matching the device identifier on the target user's body according to the wearing position of that wearable device on the reference figure.
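The nearest-reference selection and the (reference figure, device identifier) → wearing position lookup described above might look like this; the measurement keys, reference names, and coordinate values are hypothetical, and the L1 distance is one plausible choice for "smallest difference":

```python
def match_reference(body_shape, references):
    """Select the reference whose measurements differ least from the
    user's body-shape information (L1 distance over the user's keys)."""
    def l1(ref):
        return sum(abs(body_shape[k] - ref[k]) for k in body_shape)
    return min(references, key=l1)

# Correspondence between (reference figure, device identifier) and a
# wearing position on that figure; entries are illustrative.
WEARING_POSITIONS = {
    ("ref_small", "ecg_patch_01"): (105, 78),
    ("ref_medium", "ecg_patch_01"): (120, 85),
}

def wearing_position(body_shape, device_id, references):
    ref = match_reference(body_shape, references)
    return WEARING_POSITIONS[(ref["name"], device_id)]
```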
Step 204: mark the determined wearing position of the wearable device in the first image, and determine the marked first image as a second image.
In the present embodiment, after determining the wearing position of the wearable device on the body of the target user, the executing entity may mark the determined wearing position in the first image and determine the marked first image as the second image.
Step 205: present the second image.
In the present embodiment, the executing entity may present the second image obtained in step 204 on a display screen.
Continuing to refer to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the image presentation method of the present application. In Fig. 3, the terminal device is a computer 301 and the wearable device is a device comprising an electrocardiogram (ECG) lead patch. In this application scenario, the user clicks a wearing demonstration request for the ECG lead patch in an application installed on the computer 301. After receiving the wearing demonstration request, the application on the computer 301 controls a camera provided on the computer 301 to capture a first image of the user. After obtaining the first image of the user, the computer 301 analyzes the body shape of the user presented in the first image to determine the user's body-shape information. Next, the computer 301 determines the wearing position of the ECG lead patch on the user according to the determined body-shape information. Finally, the computer 301 marks the wearing position of the ECG lead patch in the captured first image, determines the marked first image as a second image, and presents the second image on the display screen of the computer 301. Fig. 3 also shows a specific textual annotation of the wearing position.
The image presentation method provided by the embodiments of the present application obtains a first image of a target user to determine the target user's body-shape information, determines the wearing position of the wearable device on the target user based on the determined body-shape information, marks that wearing position in the first image, determines the marked first image as a second image, and presents the second image. The wearing position of the wearable device on the user can thus be determined according to the user's body shape, which improves the accuracy with which the wearable device is worn and, in turn, helps improve the accuracy of the body data the wearable device collects from the user.
With further reference to Fig. 4, a process 400 of another embodiment of the image presentation method is shown. The process 400 of the image presentation method comprises the following steps:
Step 401: in response to detecting a wearing demonstration request of a target user regarding a wearable device, obtain a first image of the target user.
In the present embodiment, the executing entity of the image presentation method (for example, the terminal device shown in Fig. 1) may detect in real time whether a wearing demonstration request of the target user regarding the wearable device exists. For example, when the user clicks a wearing demonstration request, a request identifier may be generated; after obtaining the request identifier, the executing entity may determine that it has detected the target user's wearing demonstration request regarding the wearable device. Here, the wearable device includes, but is not limited to, a smartwatch, a smart bracelet, a magnetic therapy patch for massaging body acupoints, an ECG device for detecting human blood-pressure data and ECG data, and an ECG patch for detecting ECG data. The wearing demonstration request may also include the device identifier of the wearable device, so that by detecting the device identifier the executing entity can determine the model, type, and so on of the device the target user intends to wear. After detecting the wearing demonstration request, the executing entity may obtain the first image of the target user.
Step 402: based on the first image, determine the body-shape information of the target user.
Step 403: based on the body-shape information of the target user and the device identifier of the wearable device, determine the wearing position at which the wearable device matches the target user.
Step 404: mark the determined wearing position in the first image, and determine the marked first image as a second image.
Step 405: present the second image.
For the specific implementations of steps 401, 402, 403, 404, and 405 of the present embodiment and the beneficial effects they bring, refer to steps 201, 202, 203, 204, and 205 of the embodiment corresponding to process 200; details are not repeated here.
Step 406: obtain a third image of the target user in real time.
In the present embodiment, after presenting the second image, the executing entity may further obtain a third image of the target user in real time. By obtaining the third image of the target user, the executing entity can detect whether the target user has worn the wearable device correctly, that is, whether the target user has placed the wearable device at the accurate position on the body.
Step 407: detect whether the third image includes an image of the wearable device.
In the present embodiment, the executing entity may inspect the third image obtained in real time to determine whether the third image includes an image of the wearable device.
Specifically, an image of the wearable device may be stored in the executing entity. The executing entity may traverse the pixel values of the third image and determine whether the third image contains pixel values whose difference from the pixel values of the stored image of the wearable device is less than a preset threshold. If the third image contains pixel values whose difference from the pixel values of the image of the wearable device is less than the preset threshold, it can be determined that the third image includes an image of the wearable device.
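The pixel-level check just described can be sketched as a simple template match: slide the stored device image over the third image and report a detection when some window's mean pixel difference falls below a preset threshold. The toy grayscale matrices, the threshold value, and all names below are illustrative assumptions:

```python
# Minimal sketch of the device-presence check described above.
# Images are tiny grayscale matrices purely for demonstration.

def contains_template(image, template, threshold=10.0):
    """Return True if some window of `image` matches `template`."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            # Mean absolute pixel difference over the current window.
            diff = sum(
                abs(image[y + dy][x + dx] - template[dy][dx])
                for dy in range(th) for dx in range(tw)
            ) / (th * tw)
            if diff < threshold:
                return True
    return False

# Stored image of the wearable device (assumed 2x2 bright patch).
device = [[200, 200],
          [200, 200]]

# Third image containing a near-identical patch at position (1, 1).
frame = [[10, 10, 10, 10],
         [10, 198, 201, 10],
         [10, 202, 199, 10],
         [10, 10, 10, 10]]

found = contains_template(frame, device)
```

A production system would more plausibly use a robust detector (for example normalized cross-correlation or a learned model) rather than raw pixel differences, but the sketch follows the threshold comparison stated in the text.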
Step 408: in response to detecting that the third image includes an image of the wearable device, determine whether the wearable device presented in the third image is at the determined wearing position.
In the present embodiment, after detecting that the third image includes an image of the wearable device, the executing entity may further determine whether the wearable device presented in the third image is at the determined wearing position.
Specifically, the executing entity may compare the obtained third image of the target user with the second image generated in step 404 for similarity, thereby determining whether the similarity value between the third image and the second image is greater than a preset threshold. Here, the method of determining the similarity between the third image and the second image may, for example, include determining the Euclidean distance between the third image and the second image: when the Euclidean distance is less than the preset threshold, the third image is determined to be similar to the second image, from which it can be determined that the wearable device presented in the third image is at the determined wearing position; when the Euclidean distance is greater than the preset threshold, the third image is determined to be dissimilar to the second image, from which it can be determined that the wearable device presented in the third image is not at the determined wearing position. Here, the Euclidean distance refers to the actual distance between two points in m-dimensional space. Since an image is composed of pixels, the executing entity may use the Euclidean distance to calculate the difference between the gray-level histogram of the pixels in the second image and the gray-level histogram of the pixels in the third image, and determine that difference as the Euclidean distance between the second image and the third image. Here, the second image and the third image may share the same camera parameters and be images of the same target user. Since the wearing position of the wearable device presented in the second image is generated by calculation, comparing the similarity between the second image and the third image reveals the difference between the wearing position of the wearable device presented in the third image and that presented in the second image, from which it can be determined whether the wearable device presented in the third image is at the determined wearing position. By determining whether the similarity value between the third image and the second image exceeds a preset threshold, whether the wearing position of the wearable device on the target user's body is correct can be detected quickly, improving detection speed.
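The histogram-based comparison described above can be sketched as follows: compute a gray-level histogram for each image and take the Euclidean distance between the two histograms, treating a distance below a preset threshold as "similar". The bin count, threshold, and pixel data are illustrative assumptions, and real images would of course be far larger:

```python
# Sketch of the step-408 similarity test (illustrative values only).

def grayscale_histogram(pixels, bins=8, max_value=256):
    """Count pixel intensities into `bins` equal-width buckets."""
    hist = [0] * bins
    for p in pixels:
        hist[p * bins // max_value] += 1
    return hist

def histogram_distance(pixels_a, pixels_b, bins=8):
    """Euclidean distance between the two images' gray-level histograms."""
    ha = grayscale_histogram(pixels_a, bins)
    hb = grayscale_histogram(pixels_b, bins)
    return sum((a - b) ** 2 for a, b in zip(ha, hb)) ** 0.5

# Flattened grayscale pixel values of the second and third images (assumed).
second = [10, 20, 30, 200, 210, 220, 100, 110]
third_ok = [12, 18, 33, 205, 208, 223, 98, 112]        # near-identical scene
third_bad = [250, 250, 250, 250, 250, 250, 250, 250]   # very different scene

THRESHOLD = 3.0  # assumed preset threshold
worn_correctly = histogram_distance(second, third_ok) < THRESHOLD
```

Note that histogram distance captures overall intensity distribution, not spatial position; the text's assumption that the two images share camera parameters and subject is what makes this cheap comparison meaningful.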
Step 409: in response to determining that the wearable device presented in the third image is at the determined position, present prompt information indicating that the wearable device has been worn successfully.
In the present embodiment, according to whether the wearable device presented in the third image is at the determined wearing position, as determined in step 408, when the executing entity determines that the wearable device presented in the third image is at the determined position, it may present, on the display screen, prompt information indicating that the wearable device has been worn successfully. Here, the prompt information may be, for example, voice information, text information, picture information, or the like.
Specifically, in response to determining that the similarity value between the third image and the second image is greater than or equal to the preset threshold, the executing entity may determine that the wearable device presented in the third image is at the determined wearing position.
In some optional implementations, when the executing entity determines that the wearable device presented in the third image is not at the determined position, it may present prompt information indicating that the device has not been worn successfully. Specifically, when the executing entity determines that the similarity value between the third image and the second image is less than the preset threshold, it may determine that the wearable device presented in the third image is not at the determined wearing position, and may therefore present, on the display screen, prompt information indicating that the wearable device has not been worn successfully.
In some optional implementations, after presenting the prompt information indicating that the wearable device has not been worn successfully, the method may further comprise: determining the wearing position of the wearable device presented in the third image; determining the difference between the wearing position of the wearable device presented in the third image and the determined wearing position; and, based on the determined difference, identifying in the third image a correction indication for the wearing position of the wearable device.
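This correction step can be sketched as computing the offset between the detected and the determined wearing positions and phrasing it as a movement instruction. The pixel coordinates, axis conventions, and message wording are illustrative assumptions:

```python
# Sketch of the correction indication described above (illustrative only).
# Positions are (x, y) pixel coordinates in the third image; y grows downward.

def correction_instruction(detected, determined):
    """Describe how to move the device from `detected` toward `determined`."""
    dx = determined[0] - detected[0]
    dy = determined[1] - detected[1]
    parts = []
    if dx:
        parts.append(f"move {abs(dx)} px {'right' if dx > 0 else 'left'}")
    if dy:
        parts.append(f"move {abs(dy)} px {'down' if dy > 0 else 'up'}")
    return ", ".join(parts) if parts else "position correct"

msg = correction_instruction(detected=(120, 80), determined=(100, 95))
```

The resulting string (or an equivalent arrow overlay) would be rendered on the third image as the correction indication the implementation describes.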
As can be seen from Fig. 4, unlike the embodiment shown in Fig. 2, the present embodiment highlights the step of detecting whether the target user has worn the wearable device successfully and the step of presenting a correction indication when the wearable device is detected as not worn successfully, so that the wearing position of the wearable device on the user's body is more accurate.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an image presentation apparatus. This apparatus embodiment corresponds to the method embodiment shown in Fig. 2, and the apparatus may specifically be applied in various electronic devices.
As shown in Fig. 5, the image presentation apparatus 500 of the present embodiment may include: a first image acquisition unit 501, a body-shape information determination unit 502, a wearing position determination unit 503, a marking unit 504, and an image presentation unit 505. The first image acquisition unit 501 is configured to, in response to detecting a wearing demonstration request of a target user regarding a wearable device, obtain a first image of the target user, the wearing demonstration request including a device identifier of the wearable device. The body-shape information determination unit 502 is configured to determine the body-shape information of the target user based on the first image. The wearing position determination unit 503 is configured to determine, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position at which the wearable device matches the target user. The marking unit 504 is configured to mark the determined wearing position in the first image and determine the marked first image as a second image. The image presentation unit 505 is configured to present the second image.
In the present embodiment, for the specific processing of the first image acquisition unit 501, the body-shape information determination unit 502, the wearing position determination unit 503, the marking unit 504, and the image presentation unit 505 in the image presentation apparatus 500, and the beneficial effects they bring, refer to the description of the implementations of steps 201, 202, 203, 204, and 205 in the embodiment corresponding to Fig. 2; details are not repeated here.
In some optional implementations of the present embodiment, the image presentation apparatus 500 further includes: a third image acquisition unit (not shown), configured to obtain a third image of the target user in real time; a detection unit (not shown), configured to detect whether the third image includes an image of the wearable device; a first determination unit (not shown), configured to, in response to detecting that the third image includes an image of the wearable device, determine whether the wearable device presented in the third image is at the determined wearing position; and a first display unit (not shown), configured to, in response to determining that the wearable device presented in the third image is at the determined wearing position, present prompt information indicating that the wearable device has been worn successfully.
In some optional implementations of the present embodiment, the first determination unit (not shown) is further configured to: compare the third image with the second image for similarity, and determine whether the similarity value between the third image and the second image is greater than a preset threshold. The first display unit is further configured to: in response to determining that the similarity value between the third image and the second image is greater than or equal to the preset threshold, determine that the wearable device presented in the third image is at the determined wearing position.
In some embodiments, the image presentation apparatus 500 further includes: a second display unit (not shown), configured to, in response to determining that the similarity value between the third image and the second image is less than the preset threshold, determine that the wearable device presented in the third image is not at the determined wearing position, and present prompt information indicating that the wearable device has not been worn successfully.
In some optional implementations of the present embodiment, the image presentation apparatus 500 further includes: a second determination unit (not shown), configured to determine the wearing position of the wearable device presented in the third image; a difference determination unit (not shown), configured to determine the difference between the wearing position of the wearable device presented in the third image and the determined wearing position; and a correction indication unit (not shown), configured to, based on the determined difference, identify in the third image a correction indication for the wearing position of the wearable device.
In some optional implementations of the present embodiment, the body-shape information determination unit 502 is further configured to: send the first image to a server, so that the server inputs the first image into a pre-trained body-shape detection model to determine the body-shape information of the target user, the body-shape detection model being used to characterize the correspondence between input images and body-shape information; and receive the body-shape information of the target user returned by the server.
In some optional implementations of the present embodiment, the first image includes depth-of-field information and contour information of the presented target user; and the body-shape information determination unit 502 is further configured to: determine, based on the depth-of-field information, distance information between the target user and the camera used to capture the first image; and determine the body-shape information of the target user based on the determined distance information and the contour information.
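One plausible reading of this depth-plus-contour computation can be sketched with a simple pinhole-camera model: at distance d, a contour of p pixels corresponds to roughly p times d divided by f in real-world units, where f is the focal length in pixels. The focal length, the measurement values, and the model itself are illustrative assumptions, since the application does not fix a particular computation:

```python
# Sketch of estimating body-shape measurements from depth and contour size,
# under an assumed pinhole-camera model (illustrative only).

FOCAL_LENGTH_PX = 1000.0  # assumed camera focal length, in pixels

def estimate_body_shape(depth_m, contour_height_px, contour_width_px):
    """Estimate (height_m, width_m) from contour pixel size and distance."""
    scale = depth_m / FOCAL_LENGTH_PX  # metres per pixel at that depth
    return contour_height_px * scale, contour_width_px * scale

height_m, width_m = estimate_body_shape(
    depth_m=2.0, contour_height_px=850, contour_width_px=220
)
```

With these assumed numbers, a user standing 2 m from the camera whose contour spans 850 by 220 pixels is estimated at about 1.70 m tall and 0.44 m wide, which could then be matched against the reference body shapes.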
In some optional implementations of the present embodiment, the wearing position determination unit 503 is further configured to: send the body-shape information of the target user to a server, so that the server selects, from a preset set of reference body-shape information, reference body-shape information matching the body-shape information of the target user; receive the selected reference body-shape information returned by the server; and determine the wearing position of the wearable device based on the correspondence between reference body-shape information, device identifiers, and wearing positions, according to the selected reference body-shape information and the device identifier of the wearable device.
Referring now to Fig. 6, a structural diagram of an image presentation system provided by the present application is shown.
As shown in Fig. 6, the image presentation system 600 includes a terminal device 601 and a wearable device 602. Here, the terminal device 601 may be a terminal device with various functions, including but not limited to a smartphone equipped with an image acquisition apparatus, a tablet computer, a laptop portable computer, and the like. The terminal device 601 may be provided with a camera, a display screen, and so on, enabling it to perform functions such as image acquisition and image presentation.
The terminal device 601 may also be provided with an application that supports the wearable device, so that the terminal device 601 can be communicatively connected with the wearable device 602. Here, the communication connection may include, for example, a wireless network connection or a Bluetooth connection, through which the terminal device can receive the user body data collected by the wearable device. The user body data may include, for example, heart data, physical condition data, exercise data, and the like. The wearable device 602 includes, but is not limited to, a smartwatch, a smart bracelet, a magnetic therapy patch for massaging body acupoints, an ECG device for detecting human blood-pressure data and ECG data, and an ECG patch for detecting ECG data.
In the present embodiment, the terminal device is specifically configured to: in response to detecting a wearing demonstration request of a target user regarding the wearable device, obtain a first image from a capture apparatus, the wearing demonstration request including a device identifier of the wearable device; determine the body-shape information of the target user based on the first image; determine, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position at which the wearable device matches the target user; mark the determined wearing position in the first image and determine the marked first image as a second image; and present the second image.
Referring now to Fig. 7, a structural schematic diagram of a computer system 700 of an electronic device suitable for implementing the embodiments of the present application is shown. The electronic device shown in Fig. 7 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in Fig. 7, the computer system 700 includes a central processing unit (CPU) 701, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage section 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data required for the operation of the system 700. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input section 706 including a keyboard, a mouse, and the like; an output section 707 including a cathode-ray tube (CRT), a liquid crystal display (LCD), a loudspeaker, and the like; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card or a modem. The communication section 709 performs communication processing via a network such as the Internet. A driver 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, is mounted on the driver 710 as needed, so that a computer program read therefrom is installed into the storage section 708 as needed.
In particular, according to the embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 709 and/or installed from the removable medium 711. When the computer program is executed by the central processing unit (CPU) 701, the functions defined in the methods of the present application are executed. It should be noted that the computer-readable medium described herein may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program, which program may be used by, or in combination with, an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to: wireless, wire, optical cable, RF, or any suitable combination of the above.
The computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk, C++, and Python, and also conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In cases involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to the various embodiments of the present application. In this regard, each box in a flowchart or block diagram may represent a module, a program segment, or a part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that indicated in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each box in the block diagrams and/or flowcharts, and combinations of boxes in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor; for example, a processor may be described as including a first image acquisition unit, a body-shape information determination unit, a wearing position determination unit, a marking unit, and an image presentation unit. The names of these units do not, under certain conditions, constitute a limitation on the units themselves; for example, the first image acquisition unit may also be described as "a unit that, in response to detecting a wearing demonstration request of a target user regarding a wearable device, obtains a first image of the target user".
As another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to detecting a wearing demonstration request of a target user regarding a wearable device, obtain a first image of the target user, the wearing demonstration request including a device identifier of the wearable device; determine the body-shape information of the target user based on the first image; determine, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position at which the wearable device matches the target user; mark the determined wearing position in the first image and determine the marked first image as a second image; and present the second image.
The above description is only a preferred embodiment of the present application and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, but should also cover, without departing from the above inventive concept, other technical solutions formed by any combination of the above technical features or their equivalent features, for example, technical solutions formed by mutually replacing the above features with (but not limited to) technical features of similar functions disclosed in the present application.

Claims (11)

1. An image presentation method, characterized by comprising:
in response to detecting a wearing demonstration request of a target user regarding a wearable device, obtaining a first image of the target user, the wearing demonstration request including a device identifier of the wearable device;
determining the body-shape information of the target user based on the first image;
determining, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position at which the wearable device matches the target user;
marking the determined wearing position in the first image, and determining the marked first image as a second image; and
presenting the second image.
2. The method according to claim 1, characterized in that, after presenting the second image, the method further comprises:
obtaining a third image of the target user in real time;
detecting whether the third image includes an image of the wearable device;
in response to detecting that the third image includes an image of the wearable device, determining whether the wearable device presented in the third image is at the determined wearing position; and
in response to determining that the wearable device presented in the third image is at the determined wearing position, presenting prompt information indicating that the wearable device has been worn successfully.
3. The method according to claim 2, characterized in that determining whether the wearable device presented in the third image is at the determined wearing position comprises:
comparing the third image with the second image for similarity, and determining whether the similarity value between the third image and the second image is greater than a preset threshold; and
determining that the wearable device presented in the third image is at the determined wearing position comprises:
in response to determining that the similarity value between the third image and the second image is greater than or equal to the preset threshold, determining that the wearable device presented in the third image is at the determined wearing position.
4. The method according to claim 3, characterized in that the method further comprises:
in response to determining that the similarity value between the third image and the second image is less than the preset threshold, determining that the wearable device presented in the third image is not at the determined wearing position, and presenting prompt information indicating that the wearable device has not been worn successfully.
5. The method according to claim 4, wherein, after the presenting the prompt information indicating that the wearable device has not been worn successfully, the method further comprises:
determining a wearing position of the wearable device presented in the third image;
determining a difference between the wearing position of the wearable device presented in the third image and the determined wearing position; and
presenting, based on the determined difference, a correction indication for the wearing position of the wearable device in the third image.
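Claim 5 leaves the form of the correction indication open. One hypothetical realization, assuming wearing positions are 2-D pixel coordinates and a small tolerance below which no correction is needed:

```python
def correction_instruction(presented_position: tuple,
                           determined_position: tuple,
                           tolerance: float = 5.0) -> str:
    """Build a human-readable correction hint from the offset between the
    wearing position presented in the third image and the determined
    wearing position. Coordinates are (x, y) in pixels, illustrative only."""
    dx = determined_position[0] - presented_position[0]
    dy = determined_position[1] - presented_position[1]
    hints = []
    if abs(dx) > tolerance:
        hints.append(f"move {'right' if dx > 0 else 'left'} by {abs(dx):.0f} px")
    if abs(dy) > tolerance:
        hints.append(f"move {'down' if dy > 0 else 'up'} by {abs(dy):.0f} px")
    # Within tolerance in both axes: no correction needed.
    return "; ".join(hints) if hints else "position ok"
```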
6. The method according to any one of claims 1 to 5, wherein the determining the body-shape information of the target user based on the first image comprises:
sending the first image to a server, so that the server inputs the first image into a pre-trained body-shape detection model to determine the body-shape information of the target user, wherein the body-shape detection model is used to characterize a correspondence between input images and body-shape information; and
receiving the body-shape information of the target user returned by the server.
7. The method according to any one of claims 1 to 5, wherein the first image includes depth-of-field information and contour information of the target user presented therein; and
the determining the body-shape information of the target user based on the first image comprises:
determining, based on the depth-of-field information, distance information between the target user and a photographing device used to capture the first image; and
determining the body-shape information of the target user based on the determined distance information and the contour information of the target user.
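The two steps of claim 7 can be sketched under a simple pinhole-camera assumption: a silhouette spanning h_px pixels at distance Z covers roughly h_px · Z / f in world units. All names and the specific metrics below are hypothetical; the claim only requires deriving distance from depth and body shape from distance plus contour.

```python
import numpy as np

def estimate_body_shape(depth_map: np.ndarray,
                        contour_mask: np.ndarray,
                        focal_length_px: float) -> dict:
    """Illustrative body-shape estimate from a per-pixel depth map and a
    binary mask of the target user's contour."""
    ys, xs = np.nonzero(contour_mask)
    # Distance information: mean depth over the user's silhouette.
    distance = float(depth_map[ys, xs].mean())
    # Pixel extents of the contour's bounding box.
    height_px = ys.max() - ys.min() + 1
    width_px = xs.max() - xs.min() + 1
    # Pinhole scaling converts pixel extents to world-unit extents.
    return {
        "distance": distance,
        "height": height_px * distance / focal_length_px,
        "width": width_px * distance / focal_length_px,
    }
```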
8. The method according to any one of claims 1 to 5, wherein the determining, based on the body-shape information of the target user and the device identifier of the wearable device, the wearing position of the wearable device matching the target user comprises:
sending the body-shape information of the target user to a server, so that the server selects, from a preset set of reference body-shape information, reference body-shape information matching the body-shape information of the target user;
receiving the selected reference body-shape information returned by the server; and
determining, based on a correspondence among reference body-shape information, device identifiers, and wearing positions, the wearing position of the wearable device matching the target user according to the selected reference body-shape information and the device identifier of the wearable device.
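The selection and lookup of claim 8 can be sketched as a nearest-match over a preset reference set followed by a table lookup. The matching rule (closest nominal height) and the table contents are hypothetical; the claim only prescribes the correspondence structure.

```python
def select_reference_shape(user_height: float, reference_shapes: dict) -> str:
    """Pick the reference body shape whose nominal height is closest to the
    target user's (an illustrative matching rule)."""
    return min(reference_shapes,
               key=lambda label: abs(reference_shapes[label] - user_height))

# Hypothetical correspondence: (reference shape, device identifier) -> wearing position.
WEARING_POSITIONS = {
    ("medium", "watch-01"): "left wrist, just above the ulnar head",
    ("tall", "watch-01"): "left wrist, slightly higher on the forearm",
}

def match_wearing_position(user_height: float, device_id: str,
                           reference_shapes: dict) -> str:
    """Determine the wearing position from the selected reference
    body-shape information and the device identifier."""
    ref = select_reference_shape(user_height, reference_shapes)
    return WEARING_POSITIONS[(ref, device_id)]
```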
9. An image presentation system, comprising a terminal device and a wearable device, the terminal device being communicatively connected to the wearable device; wherein the terminal device is configured to: in response to detecting a wearing demonstration request of a target user regarding the wearable device, acquire a first image from a photographing device, the wearing demonstration request including a device identifier of the wearable device; determine body-shape information of the target user based on the first image; determine, based on the body-shape information of the target user and the device identifier of the wearable device, a wearing position of the wearable device matching the target user; mark the determined wearing position in the first image, and take the marked first image as a second image; and present the second image.
10. An electronic device, comprising: one or more processors; and
a memory having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1 to 8.
11. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 8.
CN201811045431.4A 2018-09-07 2018-09-07 Image presentation method, system, electronic equipment and computer readable storage medium Pending CN109118538A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811045431.4A CN109118538A (en) 2018-09-07 2018-09-07 Image presentation method, system, electronic equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN109118538A true CN109118538A (en) 2019-01-01

Family

ID=64858929



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023065866A1 (en) * 2021-10-20 2023-04-27 华为技术有限公司 Wearable device and wearable system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413229A (en) * 2013-08-30 2013-11-27 郝晓伟 Method and device for showing baldric try-on effect
CN104866103A (en) * 2015-06-01 2015-08-26 联想(北京)有限公司 Relative position determining method, wearable electronic equipment and terminal equipment
CN104881526A (en) * 2015-05-13 2015-09-02 深圳彼爱其视觉科技有限公司 Article wearing method and glasses try wearing method based on 3D (three dimensional) technology
CN105139248A (en) * 2015-09-07 2015-12-09 深圳创维数字技术有限公司 Method and apparatus for displaying wearable article
US20160157718A1 (en) * 2014-12-09 2016-06-09 WiseWear Corporation Choosing a heart rate monitor for a wearable monitoring device
US20160189431A1 (en) * 2014-12-25 2016-06-30 Kabushiki Kaisha Toshiba Virtual try-on system, virtual try-on terminal, virtual try-on method, and computer program product
CN106250938A (en) * 2016-07-19 2016-12-21 易视腾科技股份有限公司 Method for tracking target, augmented reality method and device thereof
CN107103513A (en) * 2017-04-23 2017-08-29 广州帕克西软件开发有限公司 A kind of virtual try-in method of glasses
CN107615214A (en) * 2015-05-21 2018-01-19 日本电气株式会社 Interface control system, interface control device, interface control method and program
US20180082479A1 (en) * 2016-09-22 2018-03-22 Boe Technology Group Co., Ltd. Virtual fitting method, virtual fitting glasses and virtual fitting system
CN108289620A (en) * 2015-11-13 2018-07-17 皇家飞利浦有限公司 Equipment, system and method for sensing station guiding




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190101)