Question
I've been reading about the potential for sensors to be integrated into hearing instruments and I think it's very exciting. What do you think is a realistic timeline for integrating this type of technology into future hearing aids?
Answer
Simon Carlile: Some people may know that Starkey has a partnership with the German company Bragi. Bragi has released a hearable referred to as The Dash. The interesting thing about The Dash is that it already has all of those sensors built into it. The Dash is a Bluetooth headset you stick in your ears, and it will measure your heart rate and your galvanic skin response, and it has all the motion-sensing and activity-tracking capabilities of, say, your Fitbit. So, technically, the task has already been achieved. It is really a matter now of looking at ways we can integrate this into a very robust and highly reliable package that also doesn't draw a lot of current. That is going to be a big challenge, of course, because hearing aid wearers want to wear their hearing aids for more than three or four hours a day. I would like to say we could get this type of device out really soon, within a couple of years. But, frankly, the research still has to be done on how we can build these such that someone can still wear a hearing aid for, say, 10 or 12 hours and then pop it into a recharger at the end of the day. So, there is still a bit of research to do, and it is always hard to judge how long that will take. But if they came out in the 3-5 year time frame, I wouldn't be surprised.
Do you foresee any negatives to incorporating sensor technology into hearing aids? For example, might people rely too much on the technology, such as thinking their hearing aids will monitor their entire health all the time? Or do you think the technology will evolve smoothly and be well integrated, so that we're not over-reliant on it?
I think you have put your finger on some very important questions. We would need to be very careful about how we use the data that these sensors and this technology would collect. If we are simply using the data to index into our models of auditory ecology, then I don't think there's a risk, because we're not telling people anything about their health; we are just using it to help us control the listening instrument in the particular environment the person finds themselves in. On the other hand, it would seem a lost opportunity if we didn't also provide some other capabilities. The Fitbit, for example, does provide you with information about your activity, your heart rate, your sleep patterns, and that sort of thing. But I think we need to be very cautious about how we do that, because we wouldn't position this as medical-grade technology on which people would depend to monitor their health. We might, for instance, use the motion sensors as fall detectors. If we have the ability to detect when a fall has happened, we could send a signal by Bluetooth from the hearing aid to the person's mobile phone and connect to an emergency response center or a caretaker. There is a range of things we could do. I think we need to be cautious about how we roll those out, but there is great opportunity.
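The fall-detection idea mentioned above can be sketched very simply. The thresholds and function below are illustrative assumptions, not clinically validated values or any actual product implementation: many detectors look for a brief near-free-fall reading followed shortly by a sharp impact spike in the accelerometer magnitude.

```python
import math

# Hypothetical thresholds, in units of g (illustrative only):
# magnitude well below 1 g suggests free fall; well above 1 g suggests impact.
FREE_FALL_G = 0.4
IMPACT_G = 2.5

def detect_fall(samples, window=10):
    """Return True if a near-free-fall reading is followed by an impact
    reading within `window` samples. `samples` is a sequence of (x, y, z)
    accelerometer readings in units of g."""
    free_fall_at = None
    for i, (x, y, z) in enumerate(samples):
        mag = math.sqrt(x * x + y * y + z * z)
        if mag < FREE_FALL_G:
            free_fall_at = i  # remember the most recent free-fall sample
        elif mag > IMPACT_G and free_fall_at is not None \
                and i - free_fall_at <= window:
            return True  # impact shortly after free fall: likely a fall
    return False
```

A real device would run this on streaming sensor data and only then trigger the Bluetooth alert to the phone; the detection logic itself stays this cheap so it draws little current.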
The other important consideration is that this data is personal. We need to be aware that individuals have to build up trusted relationships with these sorts of technologies, and with the way the data is used. If people think their hearing aid is spying on them, or has that capability, then who would wear one? We don't want to put another barrier in the way, so we need to be very explicit about how we are managing the data.
As the development of hearing aids with sensors moves forward, how closely linked is the development of new mobile apps to integrate with these new hearing aids?
In the early stages of sensor integration, it is likely that much of the data will be uploaded to a mobile phone and the cloud for tracking and processing. Later, as new generations of mobile telecommunications become more widespread, the mobile phone may well disappear as the repository. I would expect the cloud and the listening devices to have a direct connection.
Along the lines of mobile apps, with the new capabilities of future hearing aids, do you think hearing aids will become even more “automatic” or do you think the patient will be interacting with them more?
It depends on what you mean by interaction. Brain-sensing technologies based around EEG will mean that the listening device develops the capability to read listener intent. This will become a control signal to enhance the source that has the listener's attention. I guess that qualifies as an interaction between the listener and the device, but it will appear as a more automatic interaction from the listener's perspective. Another possible form of interaction will be the listener talking to the device. Using advanced speech recognition, the device will change its operations based on spoken commands. I think we will also begin to see a much more sophisticated auditory display rendered by the device. There will be virtual and augmented rendering of spoken and non-spoken sounds as a basis for menu selections and other functions. Listeners could interact with these by gestures, such as nodding towards the spatial location into which a virtual sound source has been rendered, or by simply focusing attention on a desired item as a means of selecting it.
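As a toy illustration of the spoken-command control described above, the step after speech recognition is essentially a dispatch from recognized phrases to device actions. The command phrases and the `DeviceState` fields here are invented for the example; real recognition and device control would be far more involved.

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    """Minimal hypothetical hearing-device state for the example."""
    volume: int = 5          # 0 (mute) to 10 (max)
    program: str = "default" # active listening program

def handle_command(state: DeviceState, command: str) -> DeviceState:
    """Apply a recognized command phrase to the device state."""
    if command == "volume up":
        state.volume = min(10, state.volume + 1)
    elif command == "volume down":
        state.volume = max(0, state.volume - 1)
    elif command.startswith("switch to "):
        # e.g. "switch to restaurant" selects the restaurant program
        state.program = command[len("switch to "):]
    return state
```

For example, `handle_command(state, "switch to restaurant")` would move the device into a hypothetical restaurant program; the speech recognizer's job is only to produce the phrase, and the device logic stays a simple table of actions.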
Will the form factor of future technology be driven by that technology? In other words, will the integration of sensors and wireless connectivity dictate hearing aid styles, or will cosmetics always drive the technology?
I think both will play a role. The real wild card here will be the rise of hearables and their convergence with hearing aids. Headphones are the most accepted form of wearable technology other than eyeglasses, and this has the potential to negate the current stigma associated with hearing aids. When that happens, the technology will be able to support listening in all kinds of difficult circumstances for people with hearing loss, or for anyone listening in a noisy environment, communicating over a distance, or even across language barriers. Here I think the drivers will likely be fashion and utility.
This Ask the Expert is an excerpt from the Starkey CEU webinar, Leading in Hearing Science - register here to view the course in its entirety.