At the intersection of the Salesforce ecosystem and the accessibility community, it has long been known that Experience Builder contains task-blocking accessibility issues that prevent many disabled people from performing important job duties, including site administration and content management. While Salesforce continues its efforts to improve the accessibility of Experience Builder, disabled administrators, content managers and site developers who rely on keyboard-only navigation and screen readers are finding ways to work around these barriers thanks to new tools based on artificial intelligence (AI).

A new AI-powered Windows app called Guide accepts a user’s prompt and performs the mouse clicks, double-clicks, right-clicks, drags, drops and similar pointer actions required to get work done in inaccessible apps that would otherwise present insurmountable barriers, locking disabled people out of education and employment.

Users of this tool have found they can click the inaccessible attestation checkboxes for agreeing to privacy policies and terms of service found on many forms, select options from inaccessible dropdown controls, and even drag and drop components onto canvases in builder apps that offer no affordances for keyboard-only navigation.

One such app is Experience Builder, which Salesforce administrators and developers use to create and manage Experience Cloud sites. The lack of keyboard alternatives for many functions of this web app means that blind people, people with physical disabilities affecting hand movement, and others who use alternative pointing devices, keyboards, screen readers and other assistive technologies can find themselves locked out of job opportunities when they cannot perform such basic tasks as adding and rearranging components on a page canvas. Now in comes Guide, which can manipulate the mouse pointer to accessify these tasks based on the user’s plain-language prompt. Want to add a component to a canvas in an inaccessible builder? Just press Guide’s keyboard shortcut, enter a prompt such as “Please drag the headline component to the content section of the page canvas,” and listen as Guide gets the job done, explaining its reasoning and describing its actions along the way. Does this seem unbelievable? Stay tuned for a demonstration of this new reality in which we find ourselves, and prepare to have your mind blown.

Now it’s time to accept the challenge. Please watch the demo video and think about how AI-powered assistive technologies like Guide might change the game. Do software designers and developers still need to build for accessibility, or is it becoming acceptable for them to foist the problem onto AI tools? Who pays for these tools, not only in monetary terms but also in other costs such as cognitive load? What sort of middle ground might exist where product owners, designers and developers still do the right, accessible thing while assistive technologies themselves incorporate more and more advanced features?