CN105005379A - Integrated audio playing device and audio playing method thereof - Google Patents


Info

Publication number
CN105005379A
Authority
CN
China
Prior art keywords
cpu
user
audio
electrically connected
control unit
Prior art date
Legal status
Granted
Application number
CN201510342359.1A
Other languages
Chinese (zh)
Other versions
CN105005379B (en)
Inventor
朱华明
武巍
宁洲
Current Assignee
Beijing Jinruidelu Technology Co Ltd
Original Assignee
Beijing Jinruidelu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jinruidelu Technology Co Ltd filed Critical Beijing Jinruidelu Technology Co Ltd
Priority to CN201510342359.1A
Publication of CN105005379A
Application granted
Publication of CN105005379B
Legal status: Active
Anticipated expiration

Landscapes

  • Circuit For Audible Band Transducer (AREA)

Abstract

The invention discloses an integrated audio playing device. The device comprises a head beam part, a first ear part and a second ear part, the head beam part being connected between the first ear part and the second ear part. The invention further discloses an audio playing method of the device. With a single device, the integrated audio playing device can perform functions such as audio selection, audio playback and playback control, integrating all of the listening, selecting and playing operations.

Description

Integrated audio playing device and audio playing method thereof
Technical Field
The present invention relates to audio processing technologies, and in particular, to an integrated audio playing device and an audio playing method thereof.
Background
Existing music playing devices are either dedicated music players, such as the Sony Walkman or the Apple iPod, or music players embedded in a smart phone as a music application. To listen to songs with such a device, the user must additionally connect an earphone or the like. Especially when listening to music outdoors, the user has to carry the music player, the earphone and the audio cable, which is cumbersome to wear and inconvenient to operate.
Disclosure of Invention
To solve the above technical problems, the invention provides an integrated audio playing device and an audio playing method thereof.
To achieve this purpose, the technical solution of the invention is realized as follows:
an integrated audio playback device, the device comprising: a first ear part, a second ear part, and a head beam part connected between the first ear part and the second ear part;
the head beam part is internally provided with a routing wire;
the first ear comprises a first ear shell, a first loudspeaker, an audio signal processing circuit, a CPU, a memory and a wireless communication circuit which are wrapped in the first ear shell, and a touch screen positioned on the outer side of the first ear shell; the touch screen, the memory, the wireless communication circuit and the audio signal processing circuit are respectively electrically connected with the CPU, and the first loudspeaker is electrically connected with the audio signal processing circuit;
the second ear part comprises a second ear shell, a second loudspeaker, a power supply device, a touch pad supporting circuit and a touch pad, wherein the second loudspeaker, the power supply device, the touch pad supporting circuit and the touch pad are wrapped inside the second ear shell; the second loudspeaker is electrically connected with the audio signal processing circuit through the wiring inside the head beam part; the power supply device is electrically connected with all other devices of the second ear part and is electrically connected with all the devices of the first ear part through the wiring in the head beam part.
The audio signal processing circuit and the touch pad are electrically connected with the CPU through an Inter-Integrated Circuit (I2C) interface.
Wherein the first ear further comprises an inertial sensor, and/or a positioning device; the inertial sensor is electrically connected with the CPU and is connected with the power supply device through a routing wire in the head beam part; the positioning device is electrically connected with the CPU.
Wherein the inertial sensor comprises: acceleration sensors, and/or gyroscopes; the acceleration sensor and/or the gyroscope adopt a two-in-one 6-axis chip and are electrically connected with the CPU through I2C.
Wherein the wireless communication circuit comprises: a cellular communication unit, and/or a wifi communication unit, and/or a bluetooth communication unit; the cellular communication unit and/or the wifi communication unit and/or the Bluetooth communication unit are electrically connected with the CPU respectively.
Wherein the second ear further comprises: the heart rate sensor is electrically connected with the CPU through a wiring in the head beam part; and/or the second ear further comprises: a contact sensor; the contact sensor is electrically connected with the CPU through the routing in the head beam part.
Wherein the heart rate sensor and/or the contact sensor are electrically connected with the CPU through an I2C interface.
Wherein the first ear further comprises: a first sound receiving device electrically connected with the CPU; and/or, the second ear further comprises: a second sound receiving device electrically connected with the CPU through the wiring in the head beam part.
The first sound receiving device and/or the second sound receiving device respectively comprise two microphones, one microphone is used for collecting ambient noise, and the other microphone is used for collecting voice serving as a target signal.
Wherein the first ear further comprises: a control unit that is an operating system stored in a computer readable medium of the CPU;
the control unit includes: a human-computer interaction module and an execution control module; the human-computer interaction module is used for providing a user interface presented by the touch screen and receiving user instructions; the execution control module is used for controlling the CPU to execute corresponding actions based on the user instructions received by the user interface on the touch screen, and for converting the gesture data transmitted by the touch pad into a play control instruction and controlling the CPU to execute it.
The audio signal processing circuit comprises a digital-analog conversion chip and a power amplifier chip, one end of the digital-analog conversion chip is electrically connected with the CPU, the other end of the digital-analog conversion chip is electrically connected with one end of the power amplifier chip, and the other end of the power amplifier chip is electrically connected with the first loudspeaker and the second loudspeaker.
The touch pad is a capacitive matrix touch pad.
An audio playing method of an all-in-one audio playing device, the method comprising: the CPU acquires an audio source from the memory or through a wireless communication circuit; the CPU converts the audio source into a digital signal and sends the digital signal to an audio signal processing circuit; the audio signal processing circuit converts the digital signal into an analog signal, amplifies the analog signal and sends the amplified analog signal to a first loudspeaker and a second loudspeaker; and the first loudspeaker and the second loudspeaker convert the amplified analog signals into sound for playing.
Wherein the method further comprises: a user interface on the touch screen receives a user instruction and transmits the user instruction to the control unit; the control unit controls the CPU to read an audio file from the memory or acquire an audio stream through a wireless communication circuit according to the user instruction, so that the CPU acquires an audio source.
Wherein the method further comprises: the method comprises the steps that a touchpad detects operation gestures of a user in real time and sends obtained gesture data to a control unit; the control unit converts the gesture data into a play control instruction and controls the CPU to execute a corresponding audio play control action; the play control instruction comprises any one or more of the following instructions: audio file switching instructions, volume control instructions, play stop/pause instructions, voice control on/off instructions.
Wherein the method further comprises: the heart rate sensor detects heart rate data of a user in real time and transmits the heart rate data to the control unit; and/or the inertial sensor detects the motion behavior of the user in real time and transmits the obtained motion data to the control unit; and/or, the positioning device acquires the position of the user in real time and transmits the position data of the user to the control unit; the control unit analyzes the heart rate data and/or the motion data and/or the position data, searches a matched audio source from a local memory or searches the matched audio source in an online searching mode through a wireless communication circuit, and presents related information of the audio source on a user interface of the touch screen.
Wherein the method further comprises: the contact sensor senses the contact of a user on the equipment in real time; when a user is sensed to touch the equipment, the touch data is sent to the control unit; and the control unit analyzes the contact data, generates a wake-up instruction and controls the CPU to wake up the equipment.
Wherein the method further comprises: the first sound receiving device and/or the second sound receiving device collects voice serving as a target signal; the control unit acquires and analyzes the voice data, generates a corresponding instruction, and controls the CPU or the user interface on the touch screen to execute the corresponding action; the instructions include: user instructions that the user interface presented on the touch screen can receive, and/or play control instructions identical to those of the touch pad.
Wherein the method further comprises: the first sound receiving device and/or the second sound receiving device collects environmental noise as well as the voice serving as the target signal; the audio signal processing circuit performs noise reduction on the voice serving as the target signal based on the environmental noise to obtain effective voice data, and sends the effective voice data to the control unit.
According to the integrated audio playing device provided by the embodiment of the invention, a single device can realize functions such as audio selection, audio playback and playback control, integrating all of the listening, selecting and playing operations; the device is simple to operate and convenient to carry.
The embodiment of the invention integrates the music player, the audio processor, the human-computer interaction module, the sound producing device and the like, so that a user needs only this single device to listen to songs.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having different letter suffixes may represent different examples of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed herein.
Fig. 1 is a schematic overall structure diagram of an integrated audio playing device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a structure of an integrated audio playback device according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a user interface on a touch screen of an integrated audio playback device according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating an operation of a touchpad of an integrated audio playback device according to an embodiment of the present invention;
fig. 5 is a flowchart illustrating an audio playing method of an integrated audio playing device according to an embodiment of the present invention.
Detailed Description
Preferably, as shown in fig. 1, the overall structure of the device may be a structure of a headphone, or may adopt other similar structures, which is not described herein again.
The apparatus comprises a left ear part 11, a right ear part 12 and a head beam part 13; the left ear part 11 and the right ear part 12 are connected through the head beam part 13, which lies between them.
Wiring is provided inside the head beam part 13 for electrically connecting the electronic devices of the left ear part 11 with those of the right ear part 12.
Specifically, as shown in fig. 2, the left ear part 11 includes a left ear shell; a CPU, an audio signal processing circuit, a first speaker, a wireless communication circuit, a memory, a positioning device and an acceleration sensor, all wrapped inside the left ear shell; and a touch screen located on the outer side of the left ear shell, where it is convenient for the user to operate. The touch screen, the memory, the wireless communication circuit and the audio signal processing circuit are respectively electrically connected with the CPU, and the first speaker is electrically connected with the audio signal processing circuit.
The right ear part 12 includes a right ear shell, a second speaker, a power supply device, a touch pad supporting circuit and a touch pad. The touch pad supporting circuit, the power supply device and the second speaker are all wrapped inside the right ear shell, while the touch pad is located on the outer side of the right ear shell, where it is convenient for the user to operate. The touch pad is electrically connected with the touch pad supporting circuit, and the touch pad supporting circuit is electrically connected with the CPU through the wiring inside the head beam part; the second speaker is electrically connected with the audio signal processing circuit through the wiring inside the head beam part; the power supply device is electrically connected with all other devices of the right ear part and, through the wiring inside the head beam part, with all devices of the left ear part.
The CPU encodes and decodes the audio source to obtain a digital audio signal; the audio signal processing circuit converts the digital signal into an analog audio signal and amplifies it; the amplified analog signal is sent to the first speaker and the second speaker, which convert it into sound for playback.
Specifically, the wireless communication circuit may include: a cellular communication unit, and/or a wifi communication unit, and/or a Bluetooth communication unit, each electrically connected with the CPU. The cellular communication unit can support various cellular networks such as 3G and 4G; the wifi communication unit may perform network communication based on different network protocols, such as the Real-time Transport Protocol (RTP), the Microsoft Media Server (MMS) protocol and the Real Time Streaming Protocol (RTSP); the Bluetooth communication unit supports Bluetooth data interaction between the device and nearby devices.
The CPU also includes a computer readable medium on which a custom operating system is stored; the operating system executes and performs its computations on the CPU. The operating system can be regarded as a control unit capable of controlling the CPU to perform audio playback operations according to user instructions. The operating system comprises a human-computer interaction module, which provides a user interface for human-computer interaction; the user interface is presented on the touch screen of the left ear part, receives user instructions and transmits them to the CPU. The operating system further includes an execution control module, which generates CPU control instructions and controls the CPU to execute the corresponding actions; these instructions may include audio source acquisition instructions generated from user instructions, as well as play control instructions such as volume control, audio file switching, voice control on/off, and play pause/start instructions. Specifically, the execution control module is used for controlling the CPU to execute corresponding actions based on the user instructions received by the user interface on the touch screen, and for converting the gesture data transmitted by the touch pad into play control instructions and controlling the CPU to execute them.
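For illustration only, the following sketch shows how such an execution control module could dispatch touch-screen instructions and touch-pad gestures to playback actions. It is not the patented implementation; all class, command and gesture names are hypothetical.

```python
class ExecutionControlModule:
    """Maps UI instructions and touch-pad gestures to playback actions run "on the CPU"."""

    # Assumed gesture-to-command mapping (the patent leaves the exact mapping to fig. 4).
    GESTURE_TO_COMMAND = {
        "slide_forward": "next_track",
        "slide_backward": "previous_track",
        "slide_up": "volume_up",
        "slide_down": "volume_down",
    }

    def __init__(self, cpu_actions):
        # cpu_actions: command name -> callable standing in for a CPU action
        self.cpu_actions = cpu_actions

    def handle_ui_instruction(self, instruction, **kwargs):
        """User instructions received by the user interface on the touch screen."""
        self.cpu_actions[instruction](**kwargs)

    def handle_gesture(self, gesture):
        """Gesture data transmitted by the touch pad."""
        command = self.GESTURE_TO_COMMAND.get(gesture)
        if command:
            self.cpu_actions[command]()


if __name__ == "__main__":
    module = ExecutionControlModule({
        "play_local": lambda path: print(f"CPU reads {path} from memory and plays it"),
        "next_track": lambda: print("CPU switches to the next audio file"),
        "previous_track": lambda: print("CPU switches to the previous audio file"),
        "volume_up": lambda: print("CPU increases the volume"),
        "volume_down": lambda: print("CPU decreases the volume"),
    })
    module.handle_ui_instruction("play_local", path="/music/track01.mp3")
    module.handle_gesture("slide_forward")
```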
The following is a detailed description of the selection of audio sources and the user interface of the touch screen.
The audio source may be an audio file stored in the memory or an audio stream acquired from the network side through the wireless communication network. The audio source can be selected by the user as desired, in particular through the user interface on the touch screen. For example, if the user selects local music, the touch screen transmits a user instruction for selecting local music to the CPU, and the CPU reads the corresponding audio file from the memory according to the instruction and plays it; if the user selects online music, the touch screen transmits a user instruction for selecting online music to the CPU, and the CPU acquires an audio stream from the network side through the wireless communication circuit and plays it.
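A minimal sketch of this local-versus-online source selection, under the assumption that the online source is reachable over plain HTTP; the path and URL parameters are placeholders.

```python
import io
import urllib.request


def acquire_audio_source(selection, local_path=None, stream_url=None):
    """Return a file-like object holding raw audio data for the selected source."""
    if selection == "local":
        # local music: the CPU reads the corresponding audio file from the memory
        return open(local_path, "rb")
    if selection == "online":
        # online music: the audio stream is fetched via the wireless communication
        # circuit (whole resource read at once here, purely for simplicity)
        return io.BytesIO(urllib.request.urlopen(stream_url).read())
    raise ValueError(f"unknown selection: {selection}")
```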
In particular, the user interface on the left ear touchscreen presents an audio source selection page on which a plurality of user interface elements are presented for the user to perform operations for audio source selection. For example, as shown in fig. 3, four elements, i.e., "search", "play history", "local music", and "my song list", each corresponding to one of the audio source options, are presented on the audio source selection page.
After the user taps "search", the touch screen presents an online-music search page. Once the user performs a search on this page, the touch screen presents a search result page listing at least one selectable music file or music file list. The user selects the music file or list to be played, thereby inputting a user instruction to play the music; the CPU receives the instruction and, based on the network address of the file or list selected on the search result page, acquires the corresponding audio source over the wireless network and plays it.
"Play history" records the list of files the user played recently. After the user taps "play history", a play history page is presented showing at least one history file list. The user selects the history list to be played, thereby inputting a user instruction to play the music; the CPU receives the instruction and either reads the corresponding audio source from the memory according to the selected history list and plays it, or acquires the corresponding audio source over the wireless network, based on the network address recorded in the selected list, and plays it.
"My song list" records user-defined music file lists. After the user taps "my song list", the touch screen presents a my-song-list page showing at least one user-defined music file list. The user selects the list to be played, thereby inputting a user instruction to play the music; the CPU receives the instruction and either reads the corresponding audio source from the memory according to the selected music file list and plays it, or acquires the corresponding audio source over the wireless network, based on the network address recorded in the selected list, and plays it.
During playback, the user interface on the touch screen presents a playing page showing information about the currently playing music file, such as the artist and the track name, and may also present elements such as a playback progress bar and a volume control bar. The user interface of the embodiment of the invention also supports other similar pages, which are not described again.
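An illustrative (not patented) model of the page flow described above: the source selection page offers four elements, and selecting one yields the list of playable items the next page would present. The catalog structure and element names are assumptions for the example.

```python
PAGES = {
    "source_selection": ["search", "play_history", "local_music", "my_song_list"],
}


def select_element(element, catalog):
    """Return the list of items the page reached from `element` would present."""
    if element == "search":
        query = catalog.get("pending_query", "")
        return [t for t in catalog["online"] if query.lower() in t.lower()]
    if element == "play_history":
        return catalog["history"]
    if element == "local_music":
        return catalog["local"]
    if element == "my_song_list":
        return catalog["playlists"]
    raise ValueError(element)


catalog = {
    "online": ["Song A", "Song B"],
    "history": ["Song B"],
    "local": ["Song C"],
    "playlists": ["Workout mix"],
    "pending_query": "song a",
}
print(PAGES["source_selection"])          # elements on the selection page
print(select_element("search", catalog))  # -> ['Song A']
```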
In practical applications, the touch screen of the left ear part may be a 3.2-inch LCD display with a multi-point touch panel arranged over it.
Playback control is mainly performed jointly by the touch pad at the right ear part and the CPU: the touch pad senses a gesture and transmits the related information to the CPU to realize playback control. Specifically, the touch pad is connected to the CPU using the Inter-Integrated Circuit (I2C) bus protocol. The touch pad is a capacitive matrix touch pad, and gestures such as forward slide, backward slide, upward slide, downward slide, single tap and double tap are recognized by tracking the trajectory of capacitance changes that the finger causes on the capacitive matrix. The touch pad senses the user's operation gesture and transmits the obtained gesture data to the CPU over I2C; the control unit on the CPU converts the gesture data into a play control instruction based on a preset mapping and controls the CPU to execute the corresponding audio playback control action. The play control instructions may include switching audio files (e.g., next, previous), controlling the volume (e.g., volume up, volume down), pausing/starting playback, turning voice control on/off, and so on. Fig. 4 shows an example of the mapping between gestures and play control commands: a forward/backward slide corresponds to the next/previous command, an up/down slide corresponds to increasing or decreasing the volume, and a rotating slide corresponds to quickly browsing the song titles, the artist list or other list information.
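As a toy illustration of reducing a touch trajectory to one of the gestures listed above (the patent does not disclose its recognition algorithm, so this heuristic and its threshold are assumptions), a swipe direction can be classified from the net displacement of the sampled touch points:

```python
def classify_gesture(points, tap_threshold=2.0):
    """points: list of (x, y) touch coordinates sampled over one contact."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) < tap_threshold and abs(dy) < tap_threshold:
        return "single_tap"            # negligible movement -> tap
    if abs(dx) >= abs(dy):
        return "slide_forward" if dx > 0 else "slide_backward"
    return "slide_up" if dy > 0 else "slide_down"


print(classify_gesture([(0, 0), (1, 0.5), (9, 1)]))  # slide_forward -> next track
print(classify_gesture([(0, 0), (0.5, 6)]))          # slide_up -> volume up
```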
The following describes preferred configurations of the integrated audio playing device according to the embodiment of the present invention.
As shown in fig. 2, the integrated audio playing device according to the embodiment of the present invention may further include sensors, for example inertial sensors, contact sensors and heart rate sensors.
In particular, the inertial sensor may be provided at the left ear part, wrapped inside the left ear shell; it comprises an acceleration sensor and/or a gyroscope and is connected to the CPU through an I2C interface. After the user selects and starts the motion detection function of the device, the inertial sensor detects the user's motion in real time and transmits the obtained motion data to the CPU. An algorithm that generates play control instructions based on the motion data is implemented by the operating system on the CPU, i.e. the control unit: the control unit analyzes the motion data to obtain a motion rhythm, then searches the local memory for audio files matching that rhythm, or searches for them over the network through the wireless communication circuit, and displays the found audio files through the user interface on the touch screen so that they can be pushed to the user. This process may be controlled by the user through the user interface or by other means such as buttons. In practical applications, the acceleration sensor and/or the gyroscope may be a two-in-one 6-axis chip connected to the CPU through an I2C interface, detecting in real time the motion of the user wearing the audio playing device. The inertial sensor can also sense the user's body language (nodding, shaking the head, etc.) to control the playback content and/or the playback process: the control unit analyzes the motion data, converts the body language into a control instruction and controls the CPU to execute the corresponding action.
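A hedged sketch of the motion-matching idea: estimate a step cadence from accelerometer magnitudes by simple peak counting and pick library tracks whose tempo is close to it. The peak-counting heuristic, the thresholds and the `bpm` field are invented for illustration; they are not the patented algorithm.

```python
def estimate_cadence(magnitudes, sample_rate_hz, threshold=1.2):
    """Steps per minute estimated from a list of acceleration magnitudes (in g)."""
    peaks = 0
    for prev, cur, nxt in zip(magnitudes, magnitudes[1:], magnitudes[2:]):
        if cur > threshold and cur >= prev and cur > nxt:
            peaks += 1                      # local maximum above threshold = one step
    duration_min = len(magnitudes) / sample_rate_hz / 60.0
    return peaks / duration_min if duration_min else 0.0


def match_tracks(library, cadence_spm, tolerance=10):
    """library: list of dicts like {'title': ..., 'bpm': ...}."""
    return [t for t in library if abs(t["bpm"] - cadence_spm) <= tolerance]


samples = [1.0, 1.4, 1.0, 0.9, 1.5, 1.0] * 10        # two peaks per 0.6 s pattern
cadence = estimate_cadence(samples, sample_rate_hz=10)  # ~200 steps/min
library = [{"title": "Uptempo run", "bpm": 195}, {"title": "Chill mix", "bpm": 90}]
print(match_tracks(library, cadence))                 # -> the 195 bpm track
```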
The contact sensor may be provided at the right ear part, wrapped inside the right ear shell, and connected with the CPU through the wiring inside the head beam part. When the contact sensing function of the device is enabled, the contact sensor senses the user's contact with the device in real time and transmits the detected contact data to the CPU. The operating system on the CPU, i.e. the control unit, analyzes the contact data, judges whether the audio playing device is currently being worn, generates through a preset software algorithm a play control instruction to continue or pause the currently playing music, and controls the CPU to execute the corresponding instruction.
Furthermore, the control unit can also switch the operating state of the device according to the contact data. Specifically, when the contact sensor senses that the user touches the device, the contact state data are sent to the control unit; the control unit analyzes the data and wakes the device, switching it from the dormant state to the working state so that the device starts automatically. When the contact sensor senses that the user has taken the device off, the corresponding contact data are sent to the control unit, which puts the device to sleep, i.e. switches it from the working state to the dormant state so that the device shuts down automatically. Thus the device can enter a dormant state when not in use, and the user can wake it by lightly touching the outside of the left or right ear shell.
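A minimal state machine, given as an assumption rather than the device's actual logic, for the contact-sensor behaviour just described: gaining contact wakes the device and resumes playback, losing contact pauses playback and lets the device sleep.

```python
class WearStateController:
    def __init__(self):
        self.worn = False
        self.state = "sleep"

    def on_contact_data(self, in_contact):
        """Called with each contact-sensor reading; returns the triggered events."""
        events = []
        if in_contact and not self.worn:
            self.worn, self.state = True, "working"
            events += ["wake_device", "resume_playback"]
        elif not in_contact and self.worn:
            self.worn, self.state = False, "sleep"
            events += ["pause_playback", "sleep_device"]
        return events


ctrl = WearStateController()
print(ctrl.on_contact_data(True))   # ['wake_device', 'resume_playback']
print(ctrl.on_contact_data(False))  # ['pause_playback', 'sleep_device']
```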
The heart rate sensor may be provided at the right ear part; it is connected with the CPU through the wiring inside the head beam part, detects the user's heart rate data in real time and transmits them to the CPU. Specifically, in the embodiment of the present invention the heart rate sensor is connected to the CPU through an I2C interface and optically detects the periodic variation in the intensity of light reflected from the blood in the capillaries of the user's ear, from which it calculates the heart rate of the user wearing the device. The obtained heart rate data are transmitted to the CPU; the operating system on the CPU, i.e. the control unit, analyzes the heart rate data with a related algorithm, estimates the user's emotional state, finds audio files matching the heart rate data, and presents the found audio file information through the user interface on the touch screen, so as to recommend music matching the current heart rate to the user.
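A rough illustration, not the patented recommendation algorithm, of matching music to heart rate: bucket the measured heart rate into a coarse state and look up tracks tagged with a matching mood. The bucket boundaries and mood tags are invented for the example.

```python
def heart_rate_bucket(bpm):
    if bpm < 70:
        return "relaxed"
    if bpm < 110:
        return "moderate"
    return "intense"


def recommend(library, bpm):
    bucket = heart_rate_bucket(bpm)
    return [t["title"] for t in library if t.get("mood") == bucket]


library = [
    {"title": "Slow ballad", "mood": "relaxed"},
    {"title": "Running anthem", "mood": "intense"},
]
print(recommend(library, 128))  # -> ['Running anthem']
```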
In the apparatus shown in fig. 2, the positioning device comprises the circuitry and sensors that provide the location determination capability of the Global Positioning System (GPS) or another positioning system. The power supply device may comprise a power supply unit connected to an external power source, or a battery; the battery may be a secondary battery, i.e. a rechargeable battery.
The integrated audio playing device of the embodiment of the invention may also include sound receiving devices: one may be provided at each of the left and right ear parts, or only at one of them. A sound receiving device is electrically connected with the CPU; in particular, it may be connected with the CPU through an I2C interface, or it may be connected to the audio signal processing circuit. A sound receiving device may be a single microphone or may comprise two microphones; in the two-microphone case, one microphone collects ambient noise and the other collects the voice serving as the target signal. In practical applications, the sound receiving device collects the ambient noise and the target voice and transmits them to the audio signal processing circuit, which performs noise reduction on the target voice based on the ambient noise, obtains effective voice data and sends it to the control unit on the CPU. The user can thus control the device by voice through the sound receiving device. Specifically, the sound receiving device collects the voice serving as the target signal; the control unit acquires and analyzes the voice data, generates the corresponding instruction, and controls the CPU or the user interface on the touch screen to execute the corresponding action. Such instructions include user instructions that the user interface presented on the touch screen can receive, and/or play control instructions identical to those of the touch pad. For example, the user can simply say "play Liu Dehua's 'Ice Rain'" to the sound receiving device; the device collects the user's voice and sends the voice data to the CPU, and the control unit generates a voice command based on the voice data and controls the CPU to play the specified song directly. As another example, when the device has entered the sleep state, the user may also wake it up with a voice command.
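The patent does not specify the noise-reduction algorithm, so the following is only a sketch of one common two-microphone approach, spectral subtraction: subtract the noise microphone's magnitude spectrum from the voice microphone's, keep the voice phase, and apply a spectral floor to limit artefacts.

```python
import numpy as np


def spectral_subtract(voice, noise, floor=0.05):
    """voice, noise: 1-D float arrays of equal length (one frame from each microphone)."""
    V = np.fft.rfft(voice)
    N = np.fft.rfft(noise)
    mag = np.abs(V) - np.abs(N)                # subtract the estimated noise magnitude
    mag = np.maximum(mag, floor * np.abs(V))   # spectral floor to limit artefacts
    return np.fft.irfft(mag * np.exp(1j * np.angle(V)), n=len(voice))


# Toy demo: a 5 Hz "voice" contaminated by a 50 Hz hum captured by the noise mic.
t = np.linspace(0, 1, 512, endpoint=False)
hum = 0.3 * np.sin(2 * np.pi * 50 * t)
clean = spectral_subtract(np.sin(2 * np.pi * 5 * t) + hum, hum)
```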
In practice, the audio signal processing circuit may comprise a digital-to-analog converter (DAC) chip and a power amplifier chip: one end of the DAC is electrically connected with the CPU, the other end of the DAC is electrically connected with one end of the power amplifier chip, and the other end of the power amplifier chip is electrically connected with the first speaker and the second speaker. For example, a song or a voice file is encoded and decoded by the CPU into a digital signal, which is transmitted to the DAC chip through the I2S interface; the DAC chip converts the digital signal into an analog signal, which is finally amplified by the power amplifier chip and sent to the left and right speakers for playback.
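A purely illustrative numerical model of this chain (CPU codec output -> DAC -> power amplifier -> speakers); the 16-bit sample width, reference voltage, gain and clipping rails are assumptions, not values from the patent.

```python
def dac_convert(samples_16bit, vref=1.0):
    """Map signed 16-bit PCM samples to an analog voltage in [-vref, vref]."""
    return [s / 32768.0 * vref for s in samples_16bit]


def power_amplify(analog, gain=4.0, rail=2.5):
    """Apply the amplifier gain, clipping at the assumed supply rails."""
    return [max(-rail, min(rail, v * gain)) for v in analog]


pcm = [0, 8192, 16384, -16384]            # samples produced by the CPU codec
speaker_signal = power_amplify(dac_convert(pcm))
print(speaker_signal)                     # [0.0, 1.0, 2.0, -2.0]
```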
In practical applications, ear cushions may be provided on the outside of the left and right ear shells to ensure a good fit between the device and the user's ears and to improve comfort. The left and right ear parts can be interchanged and worn either way during use; the implementation is similar to that described above and is not repeated. In addition, the integrated playing device of the embodiment of the present invention may further include keys for convenient operation, for example a switch key for the touch screen and/or a switch key for the touch pad. For example, in the device shown in fig. 1, a switch key for the touch screen is arranged on the outer side of the left ear shell; it is electrically connected with the touch screen and controls turning the touch screen on and off.
An embodiment of the present invention further provides an audio playing method of an integrated audio playing device, as shown in fig. 5, the method includes:
step 501, a CPU acquires an audio source from the memory or through a wireless communication circuit;
step 502, the CPU converts the audio source into a digital signal and sends the digital signal to an audio signal processing circuit;
step 503, the audio signal processing circuit converts the digital signal into an analog signal, amplifies the analog signal and sends the amplified analog signal to a first speaker and a second speaker;
step 504, the first speaker and the second speaker convert the amplified analog signal into sound for playing.
Wherein the method may further comprise: a user interface on the touch screen receives a user instruction and transmits the user instruction to the control unit; the control unit controls the CPU to read an audio file from the memory or acquire an audio stream through a wireless communication circuit according to the user instruction, so that the CPU acquires an audio source.
Wherein the method may further comprise: the method comprises the steps that a touchpad detects operation gestures of a user in real time and sends obtained gesture data to a control unit; the control unit converts the gesture data into a play control instruction and controls the CPU to execute a corresponding audio play control action; the play control instruction comprises any one or more of the following instructions: audio file switching instructions, volume control instructions, play stop/pause instructions, voice control on/off instructions.
Wherein the method may further comprise: the heart rate sensor detects heart rate data of a user in real time and transmits the heart rate data to the control unit; and/or the inertial sensor detects the motion behavior of the user in real time and transmits the obtained motion data to the control unit; and/or, the positioning device acquires the position of the user in real time and transmits the position data of the user to the control unit; the control unit analyzes the heart rate data and/or the motion data and/or the position data, searches a matched audio source from a local memory or searches the matched audio source in an online searching mode through a wireless communication circuit, and presents related information of the audio source on a user interface of the touch screen.
Wherein the method may further comprise: the contact sensor senses the contact of a user on the equipment in real time; when a user is sensed to touch the equipment, the touch data is sent to the control unit; and the control unit analyzes the contact data, generates a wake-up instruction and controls the CPU to wake up the equipment.
Wherein the method may further comprise: the first sound receiving device and/or the second sound receiving device collects voice serving as a target signal; the control unit acquires and analyzes the voice data, generates a corresponding instruction, and controls the CPU or the user interface on the touch screen to execute the corresponding action; the instructions include: user instructions that the user interface presented on the touch screen can receive, and/or play control instructions identical to those of the touch pad. Here, the first sound receiving device and/or the second sound receiving device collects ambient noise as well as the voice serving as the target signal; the audio signal processing circuit performs noise reduction on the target voice based on the ambient noise, obtains effective voice data and sends it to the control unit.
According to the integrated audio playing device provided by the embodiment of the invention, a single device can realize functions such as audio selection, audio playback and playback control, integrating all of the listening, selecting and playing operations; the device is simple to operate and convenient to carry. The whole device runs an operating system with a highly customized user interface which, together with the capacitive touch pad, lets the user conveniently complete common playback control operations such as selecting and playing music by touch, voice, gestures and other means. Based on feedback from the user's behavior and physiological parameters such as heart rate, the integrated audio playing device interacts better with the user and offers a more humanized and personalized experience. The embodiment of the invention integrates the music player, the audio processor, the human-computer interaction module, the sound producing device and the like, so that a user needs only this single device to listen to songs.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (19)

1. An integrated audio playback device, the device comprising: a first ear part, a second ear part, and a head beam part connected between the first ear part and the second ear part;
the head beam part is internally provided with a routing wire;
the first ear comprises a first ear shell, a first loudspeaker, an audio signal processing circuit, a CPU, a memory and a wireless communication circuit which are wrapped in the first ear shell, and a touch screen positioned on the outer side of the first ear shell; the touch screen, the memory, the wireless communication circuit and the audio signal processing circuit are respectively electrically connected with the CPU, and the first loudspeaker is electrically connected with the audio signal processing circuit;
the second ear part comprises a second ear shell, a second loudspeaker, a power supply device, a touch pad supporting circuit and a touch pad, wherein the second loudspeaker, the power supply device, the touch pad supporting circuit and the touch pad are wrapped inside the second ear shell; the second loudspeaker is electrically connected with the audio signal processing circuit through the wiring inside the head beam part; the power supply device is electrically connected with all other devices of the second ear part and is electrically connected with all the devices of the first ear part through the wiring in the head beam part.
2. The device of claim 1, wherein the audio signal processing circuit and the touch pad are electrically connected to the CPU through an inter-integrated circuit I2C interface.
3. The apparatus of claim 1, wherein the first ear further comprises an inertial sensor, and/or a positioning device; the inertial sensor is electrically connected with the CPU and is connected with the power supply device through a routing wire in the head beam part; the positioning device is electrically connected with the CPU.
4. The apparatus of claim 3, wherein the inertial sensor comprises: acceleration sensors, and/or gyroscopes; the acceleration sensor and/or the gyroscope adopt a two-in-one 6-axis chip and are electrically connected with the CPU through I2C.
5. The device of claim 1, wherein the wireless communication circuitry comprises: a cellular communication unit, and/or a wifi communication unit, and/or a bluetooth communication unit; the cellular communication unit and/or the wifi communication unit and/or the Bluetooth communication unit are electrically connected with the CPU respectively.
6. The apparatus of claim 1, wherein the second ear further comprises: the heart rate sensor is electrically connected with the CPU through a wiring in the head beam part;
and/or the second ear further comprises: a contact sensor; the contact sensor is electrically connected with the CPU through the routing in the head beam part.
7. The device of claim 6, wherein the heart rate sensor and/or contact sensor is electrically connected to the CPU through an I2C interface.
8. The apparatus of claim 1, wherein the first ear further comprises: a first sound receiving device electrically connected with the CPU;
and/or, the second ear further comprises: and the second sound receiving device is electrically connected with the CPU through the routing in the head beam part.
9. The apparatus according to claim 8, wherein the first sound receiving device and/or the second sound receiving device comprises two microphones, one microphone for collecting ambient noise and the other microphone for collecting voice as a target signal.
10. The apparatus of claim 1, wherein the first ear further comprises: a control unit that is an operating system stored in a computer readable medium of the CPU;
the control unit includes: the system comprises a human-computer interaction module and an execution control module; wherein,
the human-computer interaction module is used for providing a user interface presented by the touch screen and receiving a user instruction;
the execution control module is used for controlling the CPU to execute corresponding actions based on the user instructions received by the user interface on the touch screen, and for converting the gesture data transmitted by the touch pad into a play control instruction and controlling the CPU to execute it.
11. The apparatus according to claim 1, wherein the audio signal processing circuit comprises a digital-to-analog conversion chip and a power amplifier chip, one end of the digital-to-analog conversion chip is electrically connected to the CPU, the other end of the digital-to-analog conversion chip is electrically connected to one end of the power amplifier chip, and the other end of the power amplifier chip is electrically connected to the first speaker and the second speaker.
12. The device of claim 1, wherein the touch pad is a capacitive matrix touch pad.
13. An audio playing method of an integrated audio playing device, the method comprising:
the CPU acquires an audio source from the memory or through a wireless communication circuit;
the CPU converts the audio source into a digital signal and sends the digital signal to an audio signal processing circuit;
the audio signal processing circuit converts the digital signal into an analog signal, amplifies the analog signal and sends the amplified analog signal to a first loudspeaker and a second loudspeaker;
and the first loudspeaker and the second loudspeaker convert the amplified analog signals into sound for playing.
14. The method of claim 13, further comprising:
a user interface on the touch screen receives a user instruction and transmits the user instruction to the control unit;
the control unit controls the CPU to read an audio file from the memory or acquire an audio stream through a wireless communication circuit according to the user instruction, so that the CPU acquires an audio source.
15. The method of claim 13, further comprising:
the method comprises the steps that a touchpad detects operation gestures of a user in real time and sends obtained gesture data to a control unit;
the control unit converts the gesture data into a play control instruction and controls the CPU to execute a corresponding audio play control action;
the play control instruction comprises any one or more of the following instructions: audio file switching instructions, volume control instructions, play stop/pause instructions, voice control on/off instructions.
16. The method of claim 13, further comprising:
the heart rate sensor detects heart rate data of a user in real time and transmits the heart rate data to the control unit; and/or the inertial sensor detects the motion behavior of the user in real time and transmits the obtained motion data to the control unit; and/or, the positioning device acquires the position of the user in real time and transmits the position data of the user to the control unit;
the control unit analyzes the heart rate data and/or the motion data and/or the position data, searches a matched audio source from a local memory or searches the matched audio source in an online searching mode through a wireless communication circuit, and presents related information of the audio source on a user interface of the touch screen.
17. The method of claim 13, further comprising:
the contact sensor senses the contact of a user on the equipment in real time;
when a user is sensed to touch the equipment, the touch data is sent to the control unit;
and the control unit analyzes the contact data, generates a wake-up instruction and controls the CPU to wake up the equipment.
18. The method of claim 13, further comprising:
the first sound receiving device and/or the second sound receiving device collects voice serving as a target signal;
the control unit acquires and analyzes the voice data, generates a corresponding instruction, and controls the CPU or the user interface on the touch screen to execute the corresponding action;
the instructions include: user instructions that the user interface presented on the touch screen can receive, and/or play control instructions identical to those of the touch pad.
19. The method according to claim 13 or 18, characterized in that the method further comprises:
the first sound receiving device and/or the second sound receiving device collects environmental noise and collects voice as a target signal;
and the audio signal processing circuit performs noise reduction processing on the voice serving as the target signal based on the environmental noise to obtain effective voice data and sends the effective voice data to the control unit.
CN201510342359.1A 2015-06-18 2015-06-18 Integral type audio-frequence player device and its audio frequency playing method Active CN105005379B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510342359.1A CN105005379B (en) 2015-06-18 2015-06-18 Integral type audio-frequence player device and its audio frequency playing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510342359.1A CN105005379B (en) 2015-06-18 2015-06-18 Integral type audio-frequence player device and its audio frequency playing method

Publications (2)

Publication Number Publication Date
CN105005379A true CN105005379A (en) 2015-10-28
CN105005379B CN105005379B (en) 2018-11-27

Family

ID=54378075

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510342359.1A Active CN105005379B (en) 2015-06-18 2015-06-18 Integral type audio-frequence player device and its audio frequency playing method

Country Status (1)

Country Link
CN (1) CN105005379B (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100303258A1 (en) * 2008-07-14 2010-12-02 Yang Pan Portable media delivery system with a media server and highly portable media client devices
CN204046782U (en) * 2014-09-25 2014-12-24 深圳市埃微信息技术有限公司 The portable intelligent earphone of physical function and motor pattern can be monitored
CN204833155U (en) * 2015-06-18 2015-12-02 北京金锐德路科技有限公司 Integral type audio playback equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
91.com (mobile internet platform): "Experience of the VOW smart headset with 3G/WiFi connectivity", <http://pj.91.com/review/audio/140703/21708566.html> *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105992088A (en) * 2015-01-05 2016-10-05 致伸科技股份有限公司 Earphone device with control function
CN105992088B (en) * 2015-01-05 2019-05-10 惠州迪芬尼声学科技股份有限公司 Earphone device with control function
CN106162412A (en) * 2016-08-31 2016-11-23 乐视控股(北京)有限公司 A kind of method of sound collection and earphone
CN109644299A (en) * 2017-06-27 2019-04-16 深圳市柔宇科技有限公司 Touch-control speaker and its control method
CN109729459A (en) * 2017-10-27 2019-05-07 北京金锐德路科技有限公司 The heart rate device of formula interactive voice earphone is worn for neck
CN109729469A (en) * 2017-10-27 2019-05-07 北京金锐德路科技有限公司 The inertial measuring unit of formula interactive voice earphone is worn for neck
CN109729473A (en) * 2017-10-27 2019-05-07 北京金锐德路科技有限公司 Neck wears formula interactive voice earphone
CN107896355A (en) * 2017-11-13 2018-04-10 北京小米移动软件有限公司 The control method and device of AI audio amplifiers
CN107995552A (en) * 2018-01-23 2018-05-04 深圳市沃特沃德股份有限公司 The control method and device of bluetooth headset
WO2019144464A1 (en) * 2018-01-23 2019-08-01 深圳市沃特沃德股份有限公司 Control method and apparatus for bluetooth headset
CN110750722A (en) * 2019-10-21 2020-02-04 出门问问信息科技有限公司 Method for pushing audio content through earphone, computing equipment and pushing system
CN113938781A (en) * 2021-08-27 2022-01-14 北京声智科技有限公司 Earphone-based holographic projection method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN105005379B (en) 2018-11-27

Similar Documents

Publication Publication Date Title
CN105005379B (en) Integral type audio-frequence player device and its audio frequency playing method
US10466961B2 (en) Method for processing audio signal and related products
CN203075421U (en) Music playing system based on emotion change
US10162593B2 (en) Coordinated hand-off of audio data transmission
US20160198319A1 (en) Method and system for communicatively coupling a wearable computer with one or more non-wearable computers
EP3139261A1 (en) User terminal apparatus, system, and method for controlling the same
KR102067019B1 (en) Apparatus and method for controlling charging path of mobile terminal
US10687142B2 (en) Method for input operation control and related products
CN104484045B (en) Audio play control method and device
US10628119B2 (en) Sound effect processing method and mobile terminal
WO2017181365A1 (en) Earphone channel control method, related apparatus, and system
JP2016502723A (en) Method of controlling terminal with earphone cable, apparatus, device, program and recording medium for controlling terminal with earphone cable
CN110568926B (en) Sound signal processing method and terminal equipment
WO2018223837A1 (en) Music playing method and related product
CN107291242B (en) Intelligent terminal control method and intelligent terminal
CN107633853B (en) Control method for playing audio and video files and user terminal
WO2019105376A1 (en) Gesture recognition method, terminal and storage medium
WO2017088527A1 (en) Audio file re-recording method, device and storage medium
CN103634717A (en) Method, device and terminal equipment utilizing earphone to control
CN109067965A (en) Interpretation method, translating equipment, wearable device and storage medium
CN105824424A (en) Music control method and terminal
CN204971260U (en) Head -mounted electronic equipment
CN204833155U (en) Integral type audio playback equipment
CN112256135A (en) Equipment control method and device, equipment and storage medium
CN105049995B (en) Ear multimedia playing equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant