Visual Studio Code Accessibility is getting better and better
After a long hiatus, I recently returned to using Visual Studio Code. I’m retired nowadays, but I feel the itch to flex my muscles a bit again and maybe properly learn one of the modern programming languages like Python or Rust. After some research, I decided that VS Code was the editor to return to. Here’s what I found.
A bit of history
I became involved with VS Code in the late 2010s, when I was still working at Mozilla and searching for a comprehensive IDE that could deal with the mixed bag that is the Mozilla code base. As with many projects, the early versions of VS Code were largely inaccessible to screen reader users. But a very vocal community, and some very capable and dedicated people at Microsoft, gradually introduced more and more accessibility support with each version. This included better keyboard support, screen reader feedback, and proper mapping of UI concepts onto something screen readers could understand.
In early 2021, I got ill with my third and hopefully final burnout episode, which led to my retirement in late 2022. I also withdrew from engagement in any accessibility-related projects, including VS Code. There was simply no energy in me to do any testing, reporting of bugs, or other activities that reminded me so much of the work I had to leave behind.
Fast-forward to early 2025. I feel a bit better, have started to contribute some time to a local blindness organisation, and, as mentioned in the introduction, feel a bit of an itch to return to programming as a hobbyist. It has been a hobby of mine since adolescence, and I later made it my profession. But as with anything accessibility-related, you quickly become an expert in many other things as well.
I also found that during my working years, I never had the time to really get into one of the modern programming languages like Python, Rust, or Go. The only programming language I “speak” very well is Turbo Pascal, and later Borland Delphi, which was originally conceived in the 1980s. I can read Python code to some extent, and I know my way around JavaScript and TypeScript well enough to have fixed the occasional accessibility bug in the past. But that’s about it.
New discoveries
Since the last time I used VS Code, all the computers that originally ran it have been replaced with newer ones in my household, so a fresh installation was in order. I decided to go with the release builds, not the beta or Insiders versions.
Accessible Views
The first thing that greeted me after installing VS Code on my Mac was, as expected, a welcome wizard. However, as I navigated, I found hints like “press Option+F2 to open this in accessible view”. What this does is take the current piece of information and put it into a read-only text field that can be navigated line by line, word by word, and character by character. It also displays well on a Braille display. As I found out later, this accessible view is available in many areas, such as hover output, debug output, terminal output, and others. Yes, it is a plain-text version of what’s displayed, but in most cases, that is exactly what’s needed to get the full picture. If there are focusable elements like links, one can go back to the original view and navigate with the keyboard (Tab or VoiceOver commands) to get to those and activate them.
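For those who like to tweak things in settings.json rather than the settings UI, here is a minimal sketch of the kind of settings involved. The names are my assumptions from memory (the editor.accessibilitySupport setting and the accessibility.verbosity family, which controls whether those “press Option+F2” hints are announced); double-check the exact keys in your version of VS Code.

```jsonc
// settings.json – a sketch only; setting names are assumptions from memory
// and may differ slightly between VS Code versions.
{
  // Force screen-reader-optimised behaviour instead of relying on auto-detection
  "editor.accessibilitySupport": "on",

  // Whether hints such as "press Option+F2 to open this in accessible view"
  // are announced in the corresponding areas
  "accessibility.verbosity.hover": true,
  "accessibility.verbosity.terminal": true,
  "accessibility.verbosity.notification": true
}
```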
Context-sensitive accessibility help
The second thing I found was that the onboarding experience now includes a whole section on accessibility features, including many hints for screen reader users. One of those is the availability of accessibility help. It works whether you’re in an editor, a terminal console, the debugger, or elsewhere. Pressing Option+F1 (on the Mac) brings up an explainer in the same accessible view. It tells you about the current context and which keyboard shortcuts you can use to perform certain tasks quickly. It also allows you to assign keyboard shortcuts to currently unassigned functions you may need, straight from that accessibility help screen. Users of the JAWS screen reader for Windows may find this somewhat familiar. JAWS has had a feature called screen-sensitive help since literally version 1, 30 years ago, which explains how to operate most common controls and app-specific screens. If desired, this can either be just spoken or brought up in an accessible view (JAWS calls it the virtual viewer) to navigate and read on a Braille display. For a UI as complex as a full IDE like Visual Studio Code, this is definitely a great help in familiarising oneself with it.
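If you prefer editing keybindings.json directly instead of using the accessibility help screen, the assignments might look something like the sketch below. The command IDs here are assumptions from my memory and may be named differently in your build, so verify them in the Keyboard Shortcuts editor first.

```jsonc
// keybindings.json – a sketch; the command IDs are assumptions from memory
// and should be confirmed via the Keyboard Shortcuts editor.
[
  {
    // Open the context-sensitive accessibility help (Option+F1 by default on macOS)
    "key": "alt+f1",
    "command": "editor.action.accessibilityHelp"
  },
  {
    // Open the accessible view (Option+F2 by default on macOS)
    "key": "alt+f2",
    "command": "editor.action.accessibleView"
  }
]
```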
Sounds
One more thing hinted at in the accessibility onboarding was the availability of sounds for certain actions or states of various UI elements. This was just getting started when I last used VS Code, and it has now been expanded to numerous events and actions. If, for example, you have set a breakpoint for the debugger and navigate to that line in a code file, a sound is played to alert you to the fact that there is a breakpoint here. Sighted folks see an indicator in the left gutter of the editor. Those indicators usually cannot be recognised by screen readers, and there is no easy way to communicate them via the screen reader itself. However, since VS Code knows all of its internal workings, the developers have no problem implementing mechanisms to play a sound when such an event occurs. Likewise, if something is in progress, that is indicated via a different sound.
If you’d like to play around with the sounds and hear what each event sounds like, bring up the command palette and search for the command “Help: List Signal Sounds”. Arrow up and down through the options to hear the sound associated with each event. You can also configure the sounds here and create your very own sound scheme that suits you best.
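To give a rough idea of what that configuration can look like in settings.json, here is a sketch. The setting names follow the accessibility.signals family as I remember it (these replaced the older audio cue settings), so treat them as assumptions and confirm them against the settings UI of your VS Code build.

```jsonc
// settings.json – a sketch; the accessibility.signals.* names are assumptions
// from memory (they superseded the older audioCues.* settings) and may differ.
{
  // Play a sound when the cursor lands on a line with a breakpoint
  "accessibility.signals.lineHasBreakpoint": { "sound": "on" },

  // Play a sound when the cursor lands on a line with an error
  "accessibility.signals.lineHasError": { "sound": "on" },

  // Silence the in-progress sound if it becomes distracting
  "accessibility.signals.progress": { "sound": "off" }
}
```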
I find the sounds a very welcome addition. They create a level of efficiency that I have not seen in other IDEs. Simply hearing, via a sound, that there is a breakpoint or an error on a line while the actual line is being read to me is hugely valuable.
Now, there may be those who argue that these are cues that should be provided by the screen reader instead of the application. And for many general-purpose scenarios, I agree. However, these are very specific events that only pertain to IDEs like Visual Studio Code and some others. Especially since these are also cross-platform, giving screen readers the ability to make those kinds of announcements would mean introducing these specific events into three, maybe four different accessibility APIs: IAccessible2 and UI Automation on Windows, AT-SPI on Linux, and Apple’s accessibility APIs on macOS. And then JAWS, NVDA, Narrator, Orca and VoiceOver would all have to support those new events. That would mean years of tedious standards work. And some APIs, like UIA and Apple’s, are not exactly known to be open and transparent. So implementing these internally in VS Code is not only pragmatic, it also means that these features can get to users faster and be consistent across platforms. It wouldn’t matter whether I used VS Code on Windows, Mac or Linux; the experience would be similar, if not identical.
The only other editor I know of that perhaps offers this kind of access is Emacs with Emacspeak. Emacspeak is an extension that becomes part of Emacs and turns it into a self-voicing application. Like VS Code, it has to know the internals of Emacs to be able to provide this kind of access. However, I have never actually set it up and tried to use it, because Emacs is too far out of my comfort zone and would have a very steep learning curve for me.
Conclusion
After I found out about all the accessibility features in VS Code, some of which I knew but many of which were new to me, I went through some initial tutorials to get started on programming languages. For example, I stepped through the Python tutorial without any problems and with a great accessibility experience. I haven’t yet decided which programming language it will be, but Python is certainly high on the list. Rust is also a close candidate.
I’m thrilled to have made my return to VS Code. I’ve tried some other editors on the Mac as well, but none of them give me this level of access. As a blind person, I want to focus on my actual tasks like all my sighted friends, not struggle with too many accessibility issues in the IDE itself. And I wish to be able to use the same tools as they do, including auto-complete, code hints, syntax help and others. And as of right now, VS Code gives me the best experience.
Furthermore, the fact that virtually every set of release notes contains a section on accessibility improvements is very encouraging. Let’s hope it stays this way given the new political climate in the U.S. and at many of its companies. But as long as the team is willing and allowed to work on it, I’m sure they’ll do great work in the future.