CN116027908A - Color acquisition method, device, electronic equipment and storage medium - Google Patents

Color acquisition method, device, electronic equipment and storage medium

Info

Publication number
CN116027908A
Authority
CN
China
Prior art keywords
virtual
electronic device
input
color
display area
Prior art date
Legal status
Pending
Application number
CN202310135722.7A
Other languages
Chinese (zh)
Inventor
吴彪
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202310135722.7A priority Critical patent/CN116027908A/en
Publication of CN116027908A publication Critical patent/CN116027908A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a color acquisition method and apparatus, an electronic device, and a storage medium, belonging to the field of smart glasses. The method is applied to a first electronic device that is in wireless communication connection with a second electronic device; the first electronic device can display a virtual display area, and the virtual display area can display a virtual picture and a real picture at the same time. The method includes: receiving a first input; in response to the first input, identifying a first indication position of an indicator in the virtual display area; and acquiring color information at the first indication position and sending the color information at the first indication position to the second electronic device.

Description

Color acquisition method, device, electronic equipment and storage medium
Technical Field
The application belongs to the field of smart glasses, and in particular relates to a color acquisition method and apparatus, an electronic device, and a storage medium.
Background
With the continuous development of technology, various electronic products such as smart glasses, for example common VR and AR glasses, have entered consumers' lives, and consumers have adapted to them quickly, so smart glasses are gradually being applied to many aspects of daily life such as entertainment, study, and drawing. At present, after wearing smart glasses, a user can see a picture in which real content is superimposed with virtual content, and can practice drawing on a terminal such as a tablet computer or a mobile phone. During such drawing practice, however, the user can only sketch the outline of what is seen; when coloring the picture, the user has to go back and forth to a color palette and set colors manually. The operation is cumbersome, and some colors are so close to one another that they are hard to tell apart, so the user cannot accurately reproduce the color of the observed object, and the drawing deviates from what is seen.
Disclosure of Invention
Embodiments of the application aim to provide a color acquisition method and apparatus, an electronic device, and a storage medium, which can solve the problem that a color in the picture seen through smart glasses cannot be accurately determined.
In a first aspect, an embodiment of the present application provides a color acquisition method applied to a first electronic device, where the first electronic device is in wireless communication connection with a second electronic device, the first electronic device can display a virtual display area, and the virtual display area can display a virtual picture and a real picture at the same time. The method includes: receiving a first input; in response to the first input, identifying a first indication position of an indicator in the virtual display area; and acquiring color information at the first indication position and sending the color information at the first indication position to the second electronic device.
In a second aspect, an embodiment of the present application provides a color acquisition apparatus applied to a first electronic device, where the first electronic device is in wireless communication connection with a second electronic device, the first electronic device can display a virtual display area, and the virtual display area can display a virtual picture and a real picture at the same time. The apparatus includes: a first receiving module, configured to receive a first input; a first response module, configured to identify, in response to the first input, a first indication position of an indicator in the virtual display area; and a first acquisition module, configured to acquire color information at the first indication position and send the color information at the first indication position to the second electronic device.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor, a memory and a program or instruction stored on the memory and executable on the processor, the program or instruction implementing the steps of the method according to the first aspect when executed by the processor.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, the first electronic device may respond to the first input, identify the indication position of the indicator on the virtual display area, acquire the color information of the indication position, and send the color information of the indication position to an electronic device communicatively connected with the first electronic device. By the method, when a user wants to use the color of a certain position in the AR image, the user can simply, conveniently and accurately determine the color information of the position and send the color information to the second device, so that the user can use the color to perform subsequent operations such as drawing, color filling and the like on the second device, and the interactive experience of the user is improved.
Drawings
Fig. 1 is a flowchart of a color acquisition method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a stylus according to an embodiment of the present application;
fig. 3 is a schematic diagram of an example of a color acquisition method provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a color acquisition device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar objects and are not necessarily used to describe a particular sequential or chronological order. It is to be understood that the terms so used may be interchanged where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. In addition, objects distinguished by "first", "second", and the like are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The color acquisition method provided by the embodiment of the application is described in detail below by means of specific embodiments and application scenes thereof with reference to the accompanying drawings.
Please refer to fig. 1, which is a flowchart of a color acquisition method according to an embodiment of the present application. The method can be applied to a first electronic device. In one example, the first electronic device may be a head-mounted display device, smart glasses, or the like, and specifically may be AR glasses, VR glasses with an AR function, or the like. As shown in fig. 1, the method may include steps S11 to S13, which are described in detail below.
Step S11, a first input is received.
Step S12, in response to the first input, identifying a first indication position of the pointer in the virtual display area.
In an example of this embodiment, the first electronic device and the second electronic device may be communicatively connected via Bluetooth, Wi-Fi, UWB, or similar signals, and the second electronic device may be a device used for drawing, such as a tablet computer or a mobile phone.
In one example of this embodiment, the first input may be an input in which the user clicks a virtual or physical button on the second electronic device, an input in which the user clicks a stylus button, an input in which the first electronic device is controlled by voice, or the like.
In an example of this embodiment, the first electronic device may display a virtual display area. The virtual display area is the picture obtained by superimposing the virtual picture, generated by the first electronic device on its optical display, onto the real picture that directly enters the user's field of view through a lens or a camera.
In one example of this embodiment, the virtual picture includes a virtual object, and the real picture includes a real object. The virtual object is an object in the virtual picture generated by the first electronic device, for example an AR-generated dinosaur, football, or chair. Generally, in the virtual display area seen by the user, a virtual object covers the real picture at the location where it is displayed. A real object is an object that exists in the real world and directly enters the user's field of view through a lens or camera.
In one example of this embodiment, the indicator may be a finger, a stylus, or another object that can indicate a position. The first electronic device may identify the position of the fingertip or the pen tip in the virtual display area and take that position as the indication position.
In one example of this embodiment, the indicator is a stylus, the stylus is communicatively connected to the first electronic device, and receiving the first input includes: receiving, based on the communication connection, a first input sent by the stylus.
In an example of this embodiment, as shown in fig. 2, the stylus 400 may be communicatively connected to the first electronic device via Bluetooth, Wi-Fi, or the like. The user may pick a color by clicking the button 410 on the stylus, after which the stylus sends a color-acquisition signal to the first electronic device based on the communication connection. The signal received by the first electronic device from the stylus is the first input, and the first electronic device performs the subsequent operations in response to this input.
In one example of this embodiment, the first electronic device is provided with a camera. Identifying, in response to the first input, a first indication position of the indicator in the virtual display area includes: turning on the camera of the first electronic device in response to the first input; and identifying the first indication position of the indicator in the virtual display area based on image information acquired by the camera of the first electronic device and a SLAM (Simultaneous Localization and Mapping) algorithm.
In one example of this embodiment, a camera is further provided in the first electronic device. In this example, when the first electronic device receives an input for acquiring color information, it may turn on the camera to capture images of the surroundings, and the position of the indicator in the virtual display area is then determined based on the image information acquired by the camera and a SLAM algorithm.
In one example of the present embodiment, identifying a first indication position of an indicator in a virtual display area based on image information acquired by a camera of a first electronic device and a SLAM algorithm includes: based on image information acquired by a camera of the first electronic device and a SLAM algorithm, constructing an environment map and determining the position and the view angle of the camera of the first electronic device in the environment map; and identifying a first indication position of the indicator in the virtual display area according to the position and the visual angle of the camera of the first electronic device in the environment map, the position of the indicator in the image information acquired by the camera of the first electronic device, and the position parameter and the depth parameter of the virtual object in the virtual picture.
In one example of this embodiment, because the position and viewing angle of the camera differ from those of the human eye, the indication position of the indicator in the image acquired by the camera may differ from the indication position actually seen by the human eye, that is, the indication position in the virtual display area; therefore, the position of the indicator in the virtual display area can be determined using a SLAM algorithm. Specifically, images captured by the first electronic device are used to construct an environment map, and the position and viewing angle of the camera are determined by the algorithm. Once the position and viewing angle of the camera are known, the position of the indicator in the image information captured by the camera is converted into the indication position as seen by the user, that is, the indication position in the virtual display area. In addition, because the image captured by the camera does not contain the virtual objects, when determining the indication position of the indicator in the virtual display area, the position of each virtual object in space needs to be determined from the position parameter and the depth parameter of the virtual object, so that it can be judged whether the indication position falls on a virtual object.
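To make the coordinate conversion above concrete, the following is a minimal sketch in Python, not the implementation described in this application: it assumes the SLAM module has already estimated the camera pose, that the pen tip has been detected in the camera image with a known depth, that intrinsics are available for both the camera and the virtual display, and that a virtual object can be approximated by a center point and a radius; all names and values are hypothetical.

    import numpy as np

    def pixel_to_world(u, v, depth, K, T_world_cam):
        """Back-project the pen-tip pixel (u, v), with known depth, from the camera image
        into world coordinates, using the camera pose estimated by SLAM."""
        p_cam = np.linalg.inv(K) @ np.array([u, v, 1.0]) * depth   # point in the camera frame
        return (T_world_cam @ np.append(p_cam, 1.0))[:3]           # point in the world frame

    def world_to_display(p_world, K_display, T_world_eye):
        """Project a world point into the virtual display area as seen from the user's eye."""
        p_eye = (np.linalg.inv(T_world_eye) @ np.append(p_world, 1.0))[:3]
        uvw = K_display @ p_eye
        return uvw[:2] / uvw[2]                                    # 2D indication position

    def on_virtual_object(p_world, obj_center_world, obj_radius):
        """Check whether the indicated point falls on a virtual object, using the object's
        position and depth parameters (simplified here to a bounding sphere)."""
        return np.linalg.norm(p_world - obj_center_world) <= obj_radius

    # Hypothetical example values.
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    T_world_cam = np.eye(4)   # camera pose from SLAM
    T_world_eye = np.eye(4)   # eye/display pose (calibrated relative to the camera in practice)
    tip_world = pixel_to_world(350, 260, depth=0.6, K=K, T_world_cam=T_world_cam)
    indication_pos = world_to_display(tip_world, K, T_world_eye)

In an actual device the eye/display pose would be calibrated relative to the camera rather than assumed identical to it, as it is in this sketch.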
Step S13, color information at the first indication position is acquired, and the color information at the first indication position is sent to the second electronic equipment.
In one example of the present embodiment, acquiring color information at a first indicated location includes: and acquiring color information of the virtual object under the condition that the first indication position is positioned on the virtual object.
In one example of the present embodiment, first image information is acquired by a camera of a first electronic device in response to a first input; acquiring color information at a first indicated location, comprising: and acquiring color information of the real object through image information of the real object under the condition that the first indication position is located on the real object, wherein the image information of the real object is acquired from the first image information.
In this example, if the indication position is located on a virtual object generated by the first electronic device, the first electronic device may directly read the color information used when generating the corresponding part of the virtual object and use it as the color information at the indication position. If the indication position is located on a real object, the color information can be determined from the image information captured by the camera of the first electronic device; specifically, the color of the pixel corresponding to the indication position in the first image information may be used as the color information at the indication position.
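As a rough illustration of the two branches above (virtual object versus real object), the following sketch assumes that the rendered virtual picture, a mask of where virtual objects cover the display, and the camera frame are all available as arrays already aligned to virtual-display coordinates; these helper inputs are assumptions for illustration, not part of the application.

    def color_at_indication(indication_pos, virtual_frame, virtual_mask, camera_frame):
        """Return (r, g, b) at the indication position.

        virtual_frame: RGB array of the virtual picture rendered by the first electronic device
        virtual_mask:  boolean array, True where a virtual object covers the display
        camera_frame:  RGB array captured by the camera, aligned to display coordinates
        """
        x, y = int(round(indication_pos[0])), int(round(indication_pos[1]))
        if virtual_mask[y, x]:
            # On a virtual object: read the color used when generating the virtual picture.
            return tuple(int(c) for c in virtual_frame[y, x])
        # On a real object: read the pixel from the first image information.
        return tuple(int(c) for c in camera_frame[y, x])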
After the color information at the indicated location is acquired, the color information may be sent to the second electronic device based on the communication connection, so that the user may use the color information on the second electronic device, for example, to draw or paint using the color information.
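The application only requires that the color information be sent over the existing wireless connection; the transport and payload below are illustrative assumptions (JSON over a plain TCP socket), not the actual protocol used between the two devices.

    import json
    import socket

    def send_color_info(color_rgb, host, port):
        """Send the acquired color to the second electronic device.
        The payload shape and the transport are assumptions for illustration only."""
        payload = json.dumps({"type": "color", "rgb": list(color_rgb)}).encode("utf-8")
        with socket.create_connection((host, port)) as conn:
            conn.sendall(payload)

    # e.g. send_color_info((182, 45, 45), "192.168.1.20", 9000)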
In one example of this embodiment, the first electronic device is a pair of smart glasses. The virtual display area 200 seen by the user after wearing the smart glasses is shown in fig. 3, where the football 201 is a virtual object generated by the smart glasses and the bicycle 203 is a real object. When the user wants to acquire a color in the virtual display area, the user points the tip of the stylus at the desired position and presses the color-acquisition button on the stylus. The stylus then sends a color-acquisition signal, that is, the first input, to the smart glasses. In response to this input, the smart glasses turn on the camera and determine the position indicated by the stylus, for example the position of the pen tip, based on the image captured by the camera. When the pen tip is at position 202 on the virtual object, the smart glasses can directly obtain the color information of position 202 from the information used to generate the virtual picture. When the pen tip is at position 204 in the real picture, the smart glasses can recognize the color information at the corresponding position from the image captured by the camera. After the color information at the indication position is obtained, it can be sent to the tablet computer so that the user can draw with this color on the tablet computer.
In this example, the first electronic device, in response to the first input, recognizes an indication position of the indicator on a virtual display area where the virtual screen and the real screen can be simultaneously displayed, acquires color information at the indication position, and transmits the color information at the indication position to an electronic device communicatively connected to the first electronic device. By the method, when a user wants to use the color of a certain position in the seen AR image, the color of the certain position in the picture can be simply, conveniently and accurately determined, so that the user can use the color to conduct subsequent operations such as drawing, and interaction experience of the user is improved.
In one example of this embodiment, the method further includes: in the case where the first indication position is located on a real object, generating, in the virtual picture, a virtual object corresponding to the real object from the image information of the real object.
In one example of the present embodiment, in the case where the first indication position is located on the real object, a specific category of the real object, such as a bicycle, a football, a chair, etc., may be identified from a picture photographed by the camera through an image recognition algorithm, and after the specific category of the real object is identified, a corresponding virtual object is generated according to image information, such as color information, pixel size information, etc., of the real object, and displayed in the virtual display area. Specifically, the position of the corresponding virtual object may be set according to the actual situation. For example, it may be fixed in a certain position of the virtual display area.
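A minimal sketch of this step might look as follows; the image-recognition callback and the scene handle for the virtual picture are hypothetical stand-ins, since the application does not specify a particular recognition algorithm or rendering interface.

    def create_virtual_copy(camera_frame, bbox, classify, scene):
        """Generate a virtual object corresponding to the indicated real object.

        camera_frame: RGB image containing the real object
        bbox:         (x0, y0, x1, y1) of the real object in that image
        classify:     hypothetical image-recognition callback returning a category name
        scene:        hypothetical handle to the virtual picture rendered by the glasses
        """
        x0, y0, x1, y1 = bbox
        crop = camera_frame[y0:y1, x0:x1]                 # image information of the real object
        category = classify(crop)                         # e.g. "bicycle", "football", "chair"
        mean_color = tuple(int(c) for c in crop.reshape(-1, 3).mean(axis=0))
        # Fix the generated object at a set position of the virtual display area so that it
        # remains visible after the user switches the viewing angle.
        scene.add_object(category=category, texture=crop, color=mean_color,
                         size=(x1 - x0, y1 - y0), anchor="display_fixed", position=(0.8, 0.2))
        return category, mean_color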
This example thus also provides a way to generate a corresponding virtual object when the user indicates a real object, which adds interest. Moreover, when the generated virtual object is fixed at a certain position of the virtual display area, the user can still see it after switching the viewing angle, so the user does not need to frequently move the head to switch views while drawing and can draw directly from the virtual object.
In one example of the present embodiment, after acquiring the color information at the first indicated position, the method includes: receiving a second input; responsive to the second input, determining a target virtual object indicated by the pointer; the color of the target virtual object is changed to a color corresponding to the color information at the first indicated position.
In one example of this embodiment, the second input may be an input in which the user clicks a virtual or physical button on the second electronic device, clicks a stylus button, controls the first electronic device by voice, or the like. When the user wants to change the color of a certain virtual object, the virtual object indicated by the indicator can be determined through the second input; in one example, the second input is the user clicking a button on the stylus. In response to this input, the first electronic device may turn on the camera to capture an image and thereby determine the virtual object indicated by the indicator. After the indicated virtual object is determined, its color can be changed according to the color information acquired by the foregoing method.
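The handling of the second input could be sketched as below; the event and scene objects and their methods are hypothetical stand-ins for the glasses' input and rendering interfaces, not APIs named by the application.

    def handle_second_input(event, scene, acquired_color):
        """On the second input, determine the target virtual object indicated by the indicator
        and change its color to the color acquired at the first indication position."""
        if acquired_color is None:
            return  # no color has been acquired yet
        indication_pos = event.indication_position      # from the camera + SLAM step above
        target = scene.pick_object(indication_pos)      # virtual object under the indicator, if any
        if target is not None:
            target.set_base_color(acquired_color)       # re-render the object with the new color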
In one example of this embodiment, a method includes: receiving a third input; in response to the third input, identifying a second indication location of the pointer in the virtual display area; and under the condition that the second indication position is located in the virtual object, acquiring the contour information of the virtual object, and transmitting the contour information of the virtual object to the second electronic equipment.
In one example of this embodiment, the third input may be an input in which the user clicks a virtual or physical button on the second electronic device, clicks a stylus button, controls the first electronic device by voice, or the like. In response to this input, the first electronic device can identify the second indication position of the indicator in the virtual display area; when the indication position is located on a virtual object, it can directly acquire the contour information of that virtual object and send the information to the second electronic device, so that the contour of the virtual object can be displayed directly on the second electronic device.
In one example of this embodiment, a method includes: responding to the third input, and acquiring second image information through a camera of the first electronic device; and under the condition that the second indication position is located in the real object, acquiring the outline information of the real object through the image information of the real object, and sending the outline information of the real object to the second electronic equipment, wherein the image information of the real object is acquired from the second image information.
In an example of this embodiment, when the second indication position is located in the real object, the contour information of the real object may be obtained through the image information of the real object acquired by the camera, and the information may be sent to the second electronic device, so that the contour of the real object may be directly displayed in the second electronic device according to the contour information of the real object.
In this example, the user may determine, through the third input, contour information of the virtual object or the real object indicated by the user, and send the information to the second electronic device, where the second electronic device may display, according to the contour information, a contour of the object, so that the user performs operations such as secondary creation, painting, drawing, and the like on the contour.
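For the real-object branch, contour extraction from the camera image can be sketched with standard OpenCV calls, as below; the bounding box, the Otsu thresholding strategy, and the JSON-over-TCP transport to the second electronic device are illustrative assumptions rather than details given in the application.

    import json
    import socket

    import cv2

    def real_object_contour(camera_frame, bbox):
        """Extract the outline of a real object from the second image information
        (Otsu thresholding inside an assumed bounding box, then the largest contour)."""
        x0, y0, x1, y1 = bbox
        gray = cv2.cvtColor(camera_frame[y0:y1, x0:x1], cv2.COLOR_RGB2GRAY)
        _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        largest = max(contours, key=cv2.contourArea)
        # Offset back to full-image coordinates and flatten to (x, y) pairs.
        return [(int(p[0][0]) + x0, int(p[0][1]) + y0) for p in largest]

    def send_contour(points, host, port):
        """Send the contour information to the second electronic device
        (same illustrative JSON-over-TCP transport as the color payload above)."""
        payload = json.dumps({"type": "contour", "points": points}).encode("utf-8")
        with socket.create_connection((host, port)) as conn:
            conn.sendall(payload)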
The color acquisition method provided by the embodiments of the present application may be performed by a color acquisition apparatus. In the embodiments of the present application, the color acquisition apparatus provided by the embodiments is described by taking the color acquisition apparatus performing the color acquisition method as an example.
Corresponding to the above embodiment, referring to fig. 4, the embodiment of the present application further provides a color obtaining apparatus 100, which is applied to a first electronic device, where the first electronic device is connected to a second electronic device in a wireless communication manner, the first electronic device may display a virtual display area, and the virtual display area may display a virtual screen and a real screen at the same time, where the apparatus includes: a first receiving module 101 for receiving a first input; a first response module 102 for identifying a first indication position of the pointer in the virtual display area in response to the first input; the first obtaining module 103 is configured to obtain color information at the first indicated location, and send the color information at the first indicated location to the second electronic device.
Optionally, the first electronic device is provided with a camera; the first response module is specifically configured to: responsive to a first input, turning on a camera of a first electronic device; and identifying a first indication position of the indicator in the virtual display area based on the image information acquired by the camera of the first electronic device and a SLAM algorithm.
Optionally, the virtual picture comprises a virtual object, and the real picture comprises a real object; the first acquisition module is specifically configured to: acquire color information of the virtual object in the case where the first indication position is located on the virtual object.
Optionally, the first electronic device is provided with a camera, the virtual picture comprises a virtual object, and the real picture comprises a real object; the device comprises: the first response module is also used for responding to the first input and acquiring first image information through a camera of the first electronic equipment; acquiring color information at a first indicated location, comprising: and acquiring color information of the real object through image information of the real object in the case that the first indication position is located on the real object, wherein the image information of the real object is acquired from the first image information.
Optionally, the apparatus comprises: and the generation module is used for generating a virtual object corresponding to the real object in the virtual picture through the image information of the real object under the condition that the first indication position is positioned on the real object.
Optionally, the virtual screen comprises a virtual object; after acquiring the color information at the first indicated location, the apparatus includes: a second receiving module for receiving a second input; a second response module for determining a target virtual object indicated by the pointer in response to the second input; and the color changing module is used for changing the color of the target virtual object into the color corresponding to the color information at the first indication position.
Optionally, the virtual picture comprises a virtual object, and the real picture comprises a real object; the device comprises: a third receiving module for receiving a third input; a third response module for identifying a second indication location of the pointer in the virtual display area in response to a third input; and the second acquisition module is used for acquiring the contour information of the virtual object and transmitting the contour information of the virtual object to the second electronic equipment under the condition that the second indication position is located in the virtual object.
Optionally, the first electronic device is provided with a camera; the device comprises: the acquisition module is used for responding to the third input and acquiring second image information through a camera of the first electronic equipment; and the third acquisition module is used for acquiring the contour information of the real object through the image information of the real object and transmitting the contour information of the real object to the second electronic equipment under the condition that the second indication position is located in the real object, wherein the image information of the real object is acquired from the second image information.
Optionally, identifying the first indication position of the indicator in the virtual display area based on the image information collected by the camera of the first electronic device and the SLAM algorithm includes: based on image information acquired by a camera of the first electronic device and a SLAM algorithm, constructing an environment map and determining the position and the view angle of the camera of the first electronic device in the environment map; and identifying a first indication position of the indicator in the virtual display area according to the position and the visual angle of the camera of the first electronic device in the environment map, the position of the indicator in the image information acquired by the camera of the first electronic device, and the position parameter and the depth parameter of the virtual object in the virtual picture.
In this example, a device is further provided, so that when the user wants to use the color of a certain position in the seen AR image, the user can simply, conveniently and accurately determine the color information of the position and send the color information to the second device, so that the user can use the color to perform subsequent operations such as drawing, color filling and the like on the second device, and the interactive experience of the user is improved.
The color acquisition apparatus in the embodiments of the present application may be an electronic device, or may be a component in the electronic device, for example an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (Mobile Internet Device, MID), augmented reality (AR)/virtual reality (VR) device, robot, wearable device, ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), netbook or personal digital assistant (Personal Digital Assistant, PDA), and may also be a server, network attached storage (Network Attached Storage, NAS), personal computer (Personal Computer, PC), television (TV), teller machine, self-service machine, or the like; this is not specifically limited in the embodiments of the present application.
The color acquisition device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an ios operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
The color acquisition device provided in the embodiment of the present application can implement each process implemented by the above method embodiment, and in order to avoid repetition, details are not repeated here.
In correspondence to the above embodiment, optionally, as shown in fig. 5, the embodiment of the present application further provides an electronic device 800, including a processor 801 and a memory 802, where a program or an instruction that can be executed on the processor 801 is stored in the memory 802, and the program or the instruction when executed by the processor 801 implements each step of the above color acquisition method embodiment, and the same technical effect can be achieved, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 900 includes, but is not limited to: radio frequency unit 901, network module 902, audio output unit 903, input unit 904, sensor 905, display unit 906, user input unit 907, interface unit 908, memory 909, and processor 910.
Those skilled in the art will appreciate that the electronic device 900 may also include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 910 by a power management system to perform functions such as managing charge, discharge, and power consumption by the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than shown, or may combine certain components, or may be arranged in different components, which are not described in detail herein.
Wherein the processor 910 is configured to receive a first input; in response to the first input, identifying a first indication location of the pointer in the virtual display area; and acquiring the color information at the first indication position and transmitting the color information at the first indication position to the second electronic equipment.
Optionally, in response to the first input, identifying a first indication position of the pointer in the virtual display area includes: responsive to a first input, turning on a camera of a first electronic device; and identifying a first indication position of the indicator in the virtual display area based on the image information acquired by the camera of the first electronic device and a SLAM algorithm.
Optionally, the virtual picture comprises a virtual object, and the real picture comprises a real object; acquiring color information at a first indicated location, comprising: and acquiring color information of the virtual object under the condition that the first indication position is positioned on the virtual object.
Optionally, the virtual picture comprises a virtual object, and the real picture comprises a real object; the processor 910 is configured to acquire, in response to a first input, first image information through a camera of a first electronic device; acquiring color information at a first indicated location, comprising: and acquiring color information of the real object through image information of the real object in the case that the first indication position is located on the real object, wherein the image information of the real object is acquired from the first image information.
Alternatively, the processor 910 is configured to generate a virtual object corresponding to the real object in the virtual screen by image information of the real object in a case where the first indication position is located on the real object.
Optionally, the processor 910 is configured to receive a second input after acquiring the color information at the first indicated location; responsive to the second input, determining a target virtual object indicated by the pointer; the color of the target virtual object is changed to a color corresponding to the color information at the first indicated position.
Optionally, the processor 910 is configured to receive a third input; in response to the third input, identifying a second indication location of the pointer in the virtual display area; and under the condition that the second indication position is located in the virtual object, acquiring the contour information of the virtual object, and transmitting the contour information of the virtual object to the second electronic equipment.
Optionally, the processor 910 is configured to acquire, in response to the third input, second image information by a camera of the first electronic device; and under the condition that the second indication position is located in the real object, acquiring the outline information of the real object through the image information of the real object, and sending the outline information of the real object to the second electronic equipment, wherein the image information of the real object is acquired from the second image information.
Optionally, identifying the first indication position of the indicator in the virtual display area based on the image information collected by the camera of the first electronic device and the SLAM algorithm includes: based on image information acquired by a camera of the first electronic device and a SLAM algorithm, constructing an environment map and determining the position and the view angle of the camera of the first electronic device in the environment map; and identifying a first indication position of the indicator in the virtual display area according to the position and the visual angle of the camera of the first electronic device in the environment map, the position of the indicator in the image information acquired by the camera of the first electronic device, and the position parameter and the depth parameter of the virtual object in the virtual picture.
In this example, an electronic device is provided, so that when a user wants to use a color at a certain position in a seen AR image, color information of the position can be simply, conveniently and accurately determined and sent to a second device, so that the user can use the color to perform subsequent operations such as drawing, color filling and the like on the second device, and interactive experience of the user is improved.
It should be appreciated that in embodiments of the present application, the input unit 904 may include a graphics processor (Graphics Processing Unit, GPU) 9041 and a microphone 9042, with the graphics processor 9041 processing image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 907 includes a touch panel 9071 and other input devices 9072. Touch panel 9071, also referred to as a touch screen. The touch panel 9071 may include two parts, a touch detection device and a touch controller. Other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function or an image playing function), and the like. In addition, the memory 909 may include volatile memory or nonvolatile memory, or the memory 909 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 909 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 910 may include one or more processing units; optionally, the processor 910 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 910.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the processes of the embodiment of the color acquisition method are implemented, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium such as a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, the processor is used for running a program or an instruction, implementing each process of the above color acquisition method embodiment, and achieving the same technical effect, so as to avoid repetition, and no redundant description is provided herein.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
The embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the embodiments of the color acquisition method described above, and achieve the same technical effects, and are not repeated herein.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, and may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solutions of the present application may be embodied essentially or in a part contributing to the prior art in the form of a computer software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (10)

1. A color acquisition method, applied to a first electronic device, where the first electronic device is connected to a second electronic device in a wireless communication manner, and the first electronic device can display a virtual display area, where the virtual display area can display a virtual picture and a real picture at the same time, the method comprising:
receiving a first input;
in response to the first input, identifying a first indication location of an indicator in the virtual display area;
and acquiring the color information at the first indication position, and transmitting the color information at the first indication position to the second electronic equipment.
2. The method of claim 1, wherein the first electronic device is provided with a camera;
the identifying, in response to the first input, a first indication location of an indicator in the virtual display area, comprising:
responsive to the first input, turning on a camera of the first electronic device;
and identifying a first indication position of the indicator in the virtual display area based on image information acquired by a camera of the first electronic device and a SLAM algorithm.
3. The method of claim 1, wherein the virtual picture comprises a virtual object and the real picture comprises a real object;
the acquiring the color information at the first indication position includes:
and acquiring color information of the virtual object under the condition that the first indication position is positioned on the virtual object.
4. The method according to claim 1, wherein the first electronic device is provided with a camera, the virtual picture comprises a virtual object, and the real picture comprises a real object;
the method comprises the following steps:
responsive to the first input, acquiring first image information by a camera of the first electronic device;
the acquiring the color information at the first indication position includes:
and acquiring color information of the real object through image information of the real object under the condition that the first indication position is located in the real object, wherein the image information of the real object is acquired from the first image information.
5. The method of claim 1, wherein the virtual picture comprises a virtual object;
after the acquiring the color information at the first indicated location, the method includes:
receiving a second input;
responsive to the second input, determining a target virtual object indicated by the pointer;
and changing the color of the target virtual object into a color corresponding to the color information at the first indication position.
6. The method of claim 1, wherein the virtual picture comprises a virtual object and the real picture comprises a real object;
the method comprises the following steps:
receiving a third input;
in response to the third input, identifying a second indication location of an indicator in the virtual display area;
and under the condition that the second indication position is located in the virtual object, acquiring the contour information of the virtual object, and sending the contour information of the virtual object to the second electronic equipment.
7. The method of claim 6, wherein the first electronic device is provided with a camera;
the method comprises the following steps:
responsive to the third input, acquiring second image information by a camera of the first electronic device;
and acquiring contour information of the real object through image information of the real object and sending the contour information of the real object to the second electronic equipment under the condition that the second indication position is located in the real object, wherein the image information of the real object is acquired from the second image information.
8. A color acquisition apparatus, characterized in that it is applied to a first electronic device, the first electronic device is connected with a second electronic device in a wireless communication manner, the first electronic device can display a virtual display area, and the virtual display area can display a virtual picture and a real picture at the same time, the apparatus comprising:
a first receiving module for receiving a first input;
a first response module for identifying a first indication location of an indicator in the virtual display area in response to the first input;
the first acquisition module is used for acquiring the color information at the first indication position and transmitting the color information at the first indication position to the second electronic equipment.
9. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the color acquisition method of any one of claims 1-7.
10. A computer readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the color acquisition method according to any one of claims 1-7.
CN202310135722.7A 2023-02-17 2023-02-17 Color acquisition method, device, electronic equipment and storage medium Pending CN116027908A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310135722.7A CN116027908A (en) 2023-02-17 2023-02-17 Color acquisition method, device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116027908A true CN116027908A (en) 2023-04-28

Family

ID=86076072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310135722.7A Pending CN116027908A (en) 2023-02-17 2023-02-17 Color acquisition method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116027908A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination