Dasher - originally a Cambridge University Inference Group project - is an awesome text input method for people with disabilities. Here is a quick video: https://www.youtube.com/watch?v=QOzmX2WpPZY. You can try a quick demo here: https://dasher.acecentre.net/ (select "Try the latest beta" and choose PPM - it's very basic word prediction, but you get the idea).
It’s super fast for eye gaze, head mouse, or stylus users - people who rely on input methods like this because of a physical disability that may also make speaking difficult. We are on a long path to remake it in web tech under a new MIT license. We have a great team working on the language model, but a big backlog of issues to work through. If you are interested in working on something genuinely unique and literally helping people to speak, then step right up - we need some more coding help to fill in the gaps.
I’m starting to look for funding for a lead developer and some UI/UX work, as well as user testing. If anyone has ideas, I’d love to hear them.
An aside - the project was originally GPLv3, and we really struggled to get commercial entities involved. It’s now MIT.