One Week with NVDA: A JAWS User’s Immersion Journey

What started as a seven-day experiment ended with a new primary screen reader.

I’ll be honest: I didn’t expect this to go the way it did. On February 14th, 2026, I set myself a challenge — use NVDA exclusively on my personal computer for one full week, switching back to JAWS only if my work required it. I’ve been a longtime JAWS user, and NVDA has always been on my radar as the powerful, free, open-source alternative. But radar is different from reality. So I dove in.

One week later — and several days beyond that — I’m still running NVDA. It has become my primary Windows screen reader. I won’t be abandoning JAWS entirely; both tools have their place. But if you’ve been on the fence about giving NVDA a serious try, read on. Here’s everything that happened.

Day 1 (February 14): First Impressions and the Punctuation Problem

The very first thing that tripped me up was punctuation. NVDA defaults to “some” punctuation, while I was accustomed to “most” in JAWS. The practical effect: symbols like the underscore were being silently skipped. I switched to “most” punctuation right away, and that helped — but it opened its own can of worms.

In “most” mode, NVDA announces the underscore as “line.” I found that maddening. The colon inside timestamps (Insert+F12 for the time) was also being spoken aloud, which felt odd. These were small things, but they added up quickly.

I also explored the NVDA Add-on Store. It’s a great concept, but I found the execution a bit rough — many addons lack solid documentation, and reading user reviews means navigating away to an external website. There’s room to grow here.

One more early grievance: common commands like Control+C and Control+S are completely silent in NVDA. You press copy or save and hear… nothing. The option to speak command keys does exist, but it makes everything chatty — tab, arrows, all of it. That’s not what I wanted either.

Day 2 (February 15): Muscle Memory Wars and Customization Overload

Day two was the most turbulent. My JAWS muscle memory fought me at every turn, and I spent a significant portion of the day not doing productive work but rather reconfiguring NVDA to survive.

Browse Mode and Focus Mode were a constant source of confusion. In JAWS, Semi Auto Forms Mode handles a lot of this context-switching behind the scenes. With NVDA, I found myself stuck in the wrong mode repeatedly. A simple example: after submitting a prompt to Gemini and hearing its reply, I pressed H to navigate to the heading where the response started. NVDA just said “h” and sat there. I was still in Focus Mode. Insert+Space toggled Browse Mode on and then everything worked — but I had to consciously remember to do that. This will likely get easier with time, but on day two, it was genuinely frustrating.

I remapped a fistful of commands to save my sanity. The NVDA find command in Browse Mode is Control+NVDA+F — not Control+F — which felt deeply wrong. I added Control+F, F3, and Shift+F3 under Preferences > Input Gestures. I also kept repeatedly bumping into Insert+Q being the command to exit NVDA rather than announcing the active application, which nearly gave me a heart attack the first time it happened. I enabled exit confirmation in Preferences > General, then later reassigned Insert+Q to announce the focused app and moved the exit command to Insert+F4.

The underscore-as-“line” issue got its resolution today. The fix wasn’t in NVDA’s speech dictionaries as I first expected — it was in Preferences > Punctuation/Pronunciation. Problem solved. I also tackled the exclamation mark, which sits in the “all” punctuation tier rather than “most.” I mapped it to announce as “bang” when it appears mid-sentence.
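To make the tiering concrete, here is a toy Python model of how a symbol’s punctuation level and its spoken replacement interact. This is my own sketch, not NVDA’s actual implementation, and the table contents are only illustrative.

```python
# Illustrative sketch only: a model of punctuation levels, NOT NVDA's code.
# Each symbol maps to (spoken_form, minimum_level_at_which_it_is_spoken).
LEVELS = ["none", "some", "most", "all"]

SYMBOLS = {
    "_": ("line", "most"),   # the default "most"-tier reading I wanted changed
    "!": ("bang", "all"),    # exclamation mark sits in the "all" tier
    ":": ("colon", "most"),
}

def speak(text, level, overrides=None):
    """Return text as a synthesizer would receive it at a punctuation level."""
    table = {**SYMBOLS, **(overrides or {})}
    out = []
    for ch in text:
        if ch in table:
            spoken, min_level = table[ch]
            if LEVELS.index(level) >= LEVELS.index(min_level):
                out.append(f" {spoken} ")
            # below the symbol's tier, it is skipped entirely
        else:
            out.append(ch)
    return "".join(out).strip()
```

At the “some” level, `speak("foo_bar", "some")` drops the underscore silently, which is exactly the Day 1 surprise; raising the level speaks it as “line,” and a per-symbol override (the fix I found under Punctuation/Pronunciation) swaps in a saner word.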

There was also a frustrating addon conflict: the NVDA+Shift+V keystroke, officially assigned to announce an app’s version number, was instead being intercepted by the Vision Assistant Pro addon to open its command layer. Addon keystrokes can silently override core NVDA functionality — something worth knowing. I ended up assigning Control+NVDA+V to get version info.
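A toy model of why this happens (my own sketch, not NVDA’s actual gesture handler): if bindings are resolved by checking maps in load order, a later-loaded addon shadows a core script with no warning, and a user remap in yet another layer is the workaround.

```python
# Toy model of gesture-binding shadowing; NOT NVDA's gestureHandler.
CORE_GESTURES = {
    "NVDA+shift+v": "reportAppVersion",   # core: announce app version
    "NVDA+q": "exitNVDA",
}

ADDON_GESTURES = {
    "NVDA+shift+v": "visionAssistantLayer",  # addon grabs the same keystroke
}

USER_GESTURES = {
    "control+NVDA+v": "reportAppVersion",    # my Input Gestures workaround
}

def resolve(gesture, layers):
    """Return the script bound to a gesture; later layers silently win."""
    bound = None
    for layer in layers:
        if gesture in layer:
            bound = layer[gesture]   # no conflict warning is ever raised
    return bound
```

With all three layers active, `NVDA+shift+v` resolves to the addon’s layer while the core version command survives only under my remapped keystroke, which is the situation I ended up in.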

One gap I noticed that NVDA doesn’t yet fill: quickly reading the current page’s URL without shifting focus to the address bar. JAWS handles this with Insert+A. NVDA doesn’t have an equivalent. Alt+D works, but it moves focus, which isn’t always what I want.

Day 3 (February 16): The Good, The Annoying, and a Genuine Win

By day three — Presidents’ Day — I was settling into something like a rhythm, though NVDA was still throwing surprises at me.

One thing I couldn’t crack was typing echo. In JAWS, I run character-level echo at a much higher speech rate than everything else. This gives me fast, confident confirmation of each keystroke without slowing down general speech. NVDA doesn’t appear to support different speech rates per context, so typed characters come through at the same rate as everything else. I know I can’t be the only person who relies on this, so I kept digging — but no solution yet.
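If such a feature existed, I imagine it looking something like this purely hypothetical sketch. Every name here is invented for illustration; nothing like it ships in NVDA today.

```python
# Hypothetical sketch of per-context speech rates; this feature does NOT
# exist in NVDA, and all names here are invented.
DEFAULT_RATE = 60  # think of a synthesizer rate slider, in percent

CONTEXT_RATES = {
    "typedCharacters": 95,  # fast keystroke echo, JAWS-style
    "sayAll": 60,
}

def rate_for(context):
    """Pick a rate for a speech event, falling back to the global rate."""
    return CONTEXT_RATES.get(context, DEFAULT_RATE)
```

The idea is simply that keystroke echo would consult its own rate while everything else falls back to the global setting.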

I also noticed a recurring issue: NVDA going silent after focus changes. Closing Excel or Word and returning to File Explorer? Silence. Switching browser tabs with Control+Tab? Sometimes silence. This felt like potential bug territory.

PDFs were another pain point. I work with many poorly tagged PDFs, and NVDA with Adobe Reader exposes every formatting flaw without mercy. JAWS has historically done more smoothing and pre-processing before those errors reach the user. I’m withholding final judgment here — there are third-party PDF tools that work well with NVDA, and I planned to test them.

I experimented briefly with turning off automatic say-all on page load to reduce repetitive speech on websites. Bad idea. After I activated a control, nothing was announced — I had to navigate manually just to figure out where I had ended up. I turned it back on immediately.

The genuine win of the day: the Vision Assistant Pro addon. While working on a freelance project that required a visual description of a web page’s layout, I pressed NVDA+Alt+V then O for an on-screen description. Within seconds I had exactly what I needed. A follow-up question was answered just as quickly. Cross-checking with other tools confirmed the accuracy. This was an impressive moment and a real argument for NVDA’s addon ecosystem.

Day 4 (February 17): The 32-Bit Revelation and Eloquence Arrives

I learned something on day four that genuinely surprised me: NVDA 2025.3.3, the current stable release, is 32-bit. I had assumed for years that I was running a 64-bit screen reader. This discovery came about through an unexpected path.

I came across a link to a 64-bit version of the Eloquence speech synthesizer built for NVDA. Excited, I installed it and restarted — only to find NVDA using Windows OneCore voices with no trace of Eloquence. After posting about it on Mastodon, the community quickly pointed out the 32-bit issue. The 64-bit Eloquence addon requires a 64-bit NVDA, which only exists in the 2026 beta builds. I grabbed the beta, installed everything, and was finally running Eloquence on NVDA. The 64-bit upgrade is coming in the official 2026.1 release — well worth watching for.

I also continued searching for an NVDA equivalent to JAWS’s Shift+Insert+F1, which gives a detailed browser-level view of an element’s tags, attributes, roles, and IDs. This is invaluable for accessibility work. I hadn’t found a satisfying answer by end of day.

Day 5 (February 18): Discovering NVDA in Microsoft Word

I don’t often think of Browse Mode as a Word feature, so I was pleasantly surprised to learn — after reading some documentation — that NVDA supports a version of it in Word, allowing quick navigation by headings using the H key. This made my document work much more manageable.

I also received another update to 64-bit Eloquence, which fixed bugs I hadn’t even noticed. As for the work computer, I decided against installing the NVDA beta there — my employer deserves results from the stable release. That upgrade will wait for the official 2026.1 launch.

Day 6 (February 19): The Quiet Day

Day six was uneventful in the best possible way. I used my computer heavily and NVDA just worked. No major incidents, no emergency remappings. I noticed I was reaching for JAWS less and less in my thoughts. That felt significant.

Day 7 (February 20): Amateur Radio and a Happy Ending

The final day of the official challenge coincided with the start of the ARRL International DX CW (Morse Code) contest — one of the bigger amateur radio events of the year. I was curious how N3FJP’s contest logging software would hold up with NVDA, since this is specialized, legacy-adjacent software that doesn’t rely on standard accessibility APIs.

The answer: it worked great — and actually felt snappier than with JAWS. The one wrinkle was reviewing the call log. The standard screen review commands on the numpad didn’t yield useful information at first. The solution was object navigation. By pressing NVDA+Numpad 8 to climb to the parent object (“call window”), I found that each column in the log is its own object. Navigating with NVDA+Numpad 4, 5, and 6 moved between objects at the same level, announcing “Rec Window,” “PWR Window,” “Country Window,” “Call Window,” and so on. From there, Numpad 9 and 7 moved through the log in reverse chronological order. Once I understood the structure, it worked beautifully.
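The structure I found can be modeled as a small tree. This is a sketch of the concept only, not NVDA’s object model; the window names are the ones announced in the log above.

```python
# Conceptual sketch of Day 7's object navigation; NOT NVDA's object model.
class TreeObject:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        if parent:
            parent.children.append(self)

    # NVDA+Numpad 8 analogue: climb to the containing object
    def to_parent(self):
        return self.parent

    # NVDA+Numpad 4 / 6 analogues: move among siblings at the same level
    def sibling(self, offset):
        sibs = self.parent.children
        i = sibs.index(self) + offset
        return sibs[i] if 0 <= i < len(sibs) else None

log = TreeObject("call window")
columns = [TreeObject(n, log) for n in
           ("Rec Window", "PWR Window", "Country Window", "Call Window")]
```

From any column object, stepping sideways reaches the neighboring column and stepping up returns to the parent “call window,” which is exactly the mental map that made the log reviewable.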

My two radio control apps — JJRadio and Kenwood’s ARCP software — also worked flawlessly. Just when I was expecting NVDA to hit its limits, it didn’t.

What NVDA Does Really Well

After a week of intensive use, here’s what impressed me most:

  • Speed and responsiveness. NVDA frequently felt faster than JAWS, especially in applications like the N3FJP logging software.
  • Deep customizability. The Input Gestures system makes it relatively easy to remap commands. Preferences > Punctuation/Pronunciation gives granular control.
  • The addon ecosystem. Despite rough edges, the Vision Assistant Pro addon alone demonstrated real power. The 64-bit Eloquence support is also a significant upgrade.
  • Object navigation. Once I understood NVDA’s object model, navigating legacy and non-standard interfaces became genuinely manageable.
  • Cost. NVDA is free, actively developed, and open source. The value proposition is extraordinary.

Where NVDA Still Has Room to Grow

  • Silent focus changes. NVDA going quiet after closing apps or switching tabs is disorienting and may be a bug worth filing.
  • PDF handling. Poorly tagged PDFs hit differently with NVDA than with JAWS, which smooths many errors before they reach the user.
  • Typing echo speech rate. The inability to set a faster speech rate specifically for typed characters is a real productivity gap for fast typists.
  • Element inspection. JAWS’s Shift+Insert+F1 for examining element attributes has no obvious NVDA equivalent — a real gap for accessibility work, where I often need a quick-and-dirty answer before digging deeper into the code.
  • URL reporting without focus change. A read-only way to hear the current page address — without moving focus to the address bar — is missing.
  • Addon documentation and conflict resolution. Keystroke conflicts between addons and core NVDA aren’t surfaced clearly enough.

The Verdict: One Week Became the New Normal

I went in expecting to survive a week and then gratefully return to JAWS. Instead, I’m writing this article as an NVDA user. The first two days were genuinely hard — partly NVDA’s rough edges, partly years of JAWS muscle memory fighting back. But by day six, NVDA was simply humming along, and I wasn’t thinking about JAWS at all.

For experienced JAWS users considering a serious NVDA trial, my main advice is this: budget real time for reconfiguration in the first two days. The defaults won’t feel right. But the tools to make NVDA feel right are mostly there — they just require some digging. Preferences > Punctuation/Pronunciation and Input Gestures will be your best friends.

JAWS isn’t going anywhere in my toolkit. For professional accessibility auditing, PDF work, and certain specialized contexts, it remains the gold standard. But for day-to-day use on my personal computer? NVDA has earned the top spot.

The 2026.1 release — bringing official 64-bit support — is going to be a milestone worth watching. If you’ve been waiting for a good moment to give NVDA a real chance, that moment is here, now.

Sources

This article is primarily a firsthand account based on my direct experience. The following resources document or corroborate the specific factual claims made in the article.

  • NV Access: NVDA 2025.3.3 Released — Official release announcement for the stable version of NVDA tested throughout this article, confirming it is a 32-bit build.
  • NV Access: In-Process, 10th February 2026 — NV Access’s own blog post confirming that NVDA 2026.1 is the first 64-bit release, and discussing the scope of that transition.
  • NV Access: NVDA 2026.1 Beta 3 Available for Testing — The beta release announcement for the 64-bit version of NVDA referenced in the Day 4 entry.
  • NVDA 2025.3.3 User Guide — The official NVDA documentation covering Browse Mode, Focus Mode, Input Gestures, object navigation, Punctuation/Pronunciation settings, and the Add-on Store — all features discussed throughout the article.
  • Switching from JAWS to NVDA — A community-maintained transition guide for experienced JAWS users switching to the free, open-source NVDA screen reader, covering key differences in keyboard commands, terminology, cursors, navigation, synthesizers, settings, add-ons, and common troubleshooting scenarios.
  • N3FJP’s ARRL International DX Contest Log — The official page for the N3FJP contest logging software tested with NVDA on Day 7.
  • ARRL International DX Contest — The American Radio Relay League’s official page for the ARRL International DX CW contest referenced in the Day 7 entry.

Blind Access Journal Launches Community Effort to Improve WSJT-X Accessibility for Aging and Disabled Amateur Radio Operators

FOR IMMEDIATE RELEASE

Peoria, Arizona — December 20, 2025 — Darrell Hilliker, NU7I, a totally blind Amateur Radio operator and accessibility professional, is spearheading a community initiative to improve the accessibility of WSJT-X (and WSJT-X Improved) for blind, low-vision, and mobility-impaired hams. The work is being organized and documented through Blind Access Journal, the blog Hilliker publishes to advance practical accessibility and inclusion in technology.

Digital weak-signal protocols like FT8 have become a core part of modern Amateur Radio. Yet many hams—especially those who are aging or who acquire disabilities—are finding it harder to participate fully when widely used software lacks accessible user interface foundations.

“A month doesn’t go by where I don’t hear at least one conversation on the bands where an older ham is contemplating giving up or curtailing their activities due to a physical disability like arthritis or a visual impairment,” said Hilliker. “We can do better as a community—and we can do it together.”

Recognizing Existing Innovation and Building an Inclusive Future

This initiative is not a critique of existing community solutions, nor is it intended to replace them. Blind Access Journal recognizes and commends the developers of alternative tools such as QLog, whose efforts have helped many operators. Instead, Hilliker’s project aims to broaden inclusion by improving accessibility in the widely adopted WSJT-X ecosystem so that more hams can participate using the tools their clubs, friends, and on-air communities already rely on.

“The entire Amateur Radio community benefits from all efforts to adapt,” Hilliker added, “especially in situations where disabled hams are not fully included from the beginning.”

Goal: Full and Equitable Access to Digital Operating

The initiative’s objective is nothing less than full and equitable access to Amateur Radio digital communication protocols and the software that enables them. Key accessibility goals include:

  • Expected keyboard navigation throughout the application
  • Strong compatibility with screen readers such as JAWS and NVDA (NonVisual Desktop Access)
  • UI that can reflow and resize for operators using magnification
  • Support for dark mode, high contrast, and other visual accommodations that many aging hams depend on

Highest Priority Technical Need

The most critical improvement—especially for blind screen-reader users—centers on the Band Activity and Rx Frequency tables. Today, these areas are widely experienced as inaccessible because the data is effectively “painted” to the screen or presented as unstructured text, rather than implemented using the underlying Qt5 UI structures that expose information to accessibility interfaces.

The initiative seeks a redesign and implementation approach that ensures these tables are true, semantically structured UI components—so assistive technologies can reliably read, navigate, and interact with them.
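As a rough illustration of the difference (a toy sketch, not WSJT-X or Qt code, and the sample row is invented): painted text gives an assistive technology nothing but characters, while a semantic table model can answer cell-level queries.

```python
# Toy contrast between "painted" text and a semantic table model.
# This is NOT WSJT-X or Qt code; the decode row below is invented.

# Painted presentation: one flat string. A screen reader sees characters,
# not cells, so "read the current column" has no answer.
painted = "UTC    dB  DT  Freq  Message\n120815 -7  0.2 1432  CQ NU7I DM33"

# Structured presentation: rows and named columns an assistive technology
# can query, the way a real table widget exposes cells through
# accessibility interfaces.
class TableModel:
    def __init__(self, headers, rows):
        self.headers = headers
        self.rows = rows

    def cell(self, row, col_name):
        """The kind of question an AT asks: 'the Freq cell of row 0'."""
        return self.rows[row][self.headers.index(col_name)]

band_activity = TableModel(
    ["UTC", "dB", "DT", "Freq", "Message"],
    [["120815", "-7", "0.2", "1432", "CQ NU7I DM33"]],
)
```

With the structured model, a screen reader can announce a single cell in context; with the painted string, it can only read the raw text and leave the user to count spaces.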

Call for Volunteer Developers

Blind Access Journal is calling on a small group of experienced Amateur Radio software builders and tinkerers—especially those who:

  • Have deep experience with Qt5 user interfaces
  • Can build and compile WSJT-X or WSJT-X Improved from source with confidence
  • Are willing to collaborate with disabled hams in an open, test-driven, user-centered process

Familiarity with accessibility design and standards such as WCAG (Web Content Accessibility Guidelines) is welcome but not required. Disabled hams involved in the effort are prepared to lead the process, define needs, perform testing, write documentation, and support the work in every way outside of the core design and coding tasks.

Volunteers will gain the satisfaction of delivering long-sought, meaningful accessibility improvements to a widely used mainstream Amateur Radio application—work that can make a real difference for thousands of fellow hams.

Looking Toward 2026

Blind Access Journal thanks the Amateur Radio community for its time, creativity, and tradition of public service. The initiative’s organizers hope to make 2026 a year of digital accessibility and inclusion for all radio amateurs.

To volunteer or learn more:
Email editor@blindaccessjournal.com and follow updates via Blind Access Journal.

Media Contact

Darrell Hilliker, NU7I
Blind Access Journal
Email: editor@blindaccessjournal.com

Amateur Radio Field Day 2017

In this approximately 10-minute podcast, Darrell Hilliker demonstrates high-frequency (HF) amateur radio operation and thanks Gary (AC7R) and his crew for the chance to spend some time on the air during Field Day weekend.

We love hearing from our listeners! Please feel free to talk with us in the comments. What do you like? How could we make the show better? What topics would you like us to cover on future shows?

If you use Twitter, let’s get connected! Please follow Allison (@AlliTalk) and Darrell (@darrell).

World Radio Day: What Radio Means in a Technology World

The United Nations Educational, Scientific and Cultural Organization (UNESCO) has declared Feb. 13 World Radio Day to emphasize the ongoing value radio contributes to an ever-changing technological world. Despite the proliferation of the Internet, radio remains the single most important medium for communication and information access to the widest possible audience. Radio still goes many places the Internet infrastructure can’t, especially in many of the world’s developing nations. So, why do we need to give special emphasis to radio and what does the technology mean to us?

Have we taken radio for granted in our high-tech world? I think the answer is an emphatic “yes!” We may not realize this, but many of us are constantly on the air nowadays. It’s no longer just about the DJ on the broadcast radio airwaves, the ham radio operator keying Morse Code on a primitive transmitter or the pilot talking with her air traffic controllers to ensure a safe flight.

The world now comprises an uncountable number of tiny radios found in many electronic devices we have come to enjoy and use every day. We know, for instance, that an iPhone 4S contains at least six distinct radios: a radio capable of receiving and transmitting signals on the 2.4 GHz Wi-Fi band, a radio that can talk to Bluetooth devices such as headsets and keyboards, two different radios for talking on the CDMA and GSM cellular frequencies, another radio to facilitate Internet access through the cellular data networks and, finally, a GPS receiver. When you use your iPhone, it is safe to say you are probably using at least two, if not more, different radios all at the same time!

How is it we have come to forget about radio and take it for granted in our highly developed technological society? I think the answer is that radios are not as obvious as they were once upon a time. In the not-too-distant past, if you listened to the radio, you were looking at a separate box with buttons, dials and switches and a set of headphones or a pair of speakers. If you were a radio star, you held a microphone and faced a bewildering panel of carts and controls. If you talked on a two-way radio, you probably had a special license or it was part of your job and you either held a small walkie-talkie type box or you sat in front of a bunch of equipment with lots of buttons, dials, knobs, meters and switches. In any case, the radio part of the task you were performing was front and center. That’s not so now.

When do you think about radio today? Perhaps most of us only give it serious thought when we’re riding in our cars or listening to our stereos at home. Otherwise, although the radios in our lives are present, they’re usually buried. When I was talking with a friend about the radios in the iPhone, she thought I was referring to all the radio apps out there for listening to broadcast stations streamed on the Internet. Despite the shrinking of radios into tiny chips on circuit boards hidden inside our favorite electronic devices, we’re using them more often today than we ever have at any time in the past. When we talk on a cordless or cell phone, we’re talking on a radio. When we use a laptop computer to go online from our favorite coffee shop, we’re on the air. Believe it or not, we are all radio stars!

What does all this mean for the world? I think we’re slowly forgetting about radio’s past and, in the process, we may be leaving many people in disadvantaged populations and developing nations behind. The advent of Internet streaming and satellite radio has been cited as justification for massive cutbacks in the availability of programming on the shortwave radio broadcast bands, despite the fact that these radios are the only way hundreds of millions of people may be able to gain timely access to entertainment and important information about their world.

The long-time switch from the inherently non-visual radio medium to television and, now, streaming video on the Internet has meant that it can be more challenging for blind people to enjoy many forms of entertainment that were once more accessible. This is probably a significant reason for the resurgence of old-time-radio listening in the blind community.

How about emergency communication? What happens when the cell towers are blown down in a hurricane? What would happen if a significant number of the satellites we rely on for communication and navigation suddenly became unavailable? What would the world’s survivors do in the event of a massive electromagnetic pulse or nuclear war? The uber-geeky amateur (ham) radio operators have the enthusiasm, innovative spirit, qualifications and access to older equipment it would take to communicate during an emergency and coordinate the reorganization of the world when our high-tech gadgets and infrastructure become useless.

Unfortunately, the world’s governments continue to deemphasize radio. Shortwave broadcasts to many parts of the world are cut every year. Fewer and fewer people are interested in ham radio and there’s no longer a Morse Code requirement for any class of amateur radio license in America and many other countries. Morse Code can cut through radio noise like no other mode of radio transmission, but who is going to know how to use it when it is needed most?

How can we continue to move forward into the bright future of a technology-driven world while ensuring our safety and promoting stability and security? I think one small thing we can do is to keep radio in our minds and think about it a little each day. When you’re checking your email, talking or tweeting on your iPhone, remember that you are using several tiny radios to make it all happen. When you’re listening to satellite radio or streaming your favorite station through ooTunes, think about all the people in the developing world who don’t have access to this content and remember that an older technology called shortwave radio can reach them if we ensure its continued existence. Finally, think about those of us who have passed numerous qualification exams and learned Morse Code to earn our ham radio licenses, which we may someday need to use as a means of providing life-saving communications services in the event of a disaster.

I’d love to hear from readers. What does radio mean to you? Please feel free to post your story in the comments or mention me (@darrell) on Twitter.