From Blind Confidential:
Yesterday, I engaged in an email conversation with an old buddy from the blindness community with whom I hadn't communicated in well over a year. We got a bit into the old Windows v. GNU/Linux/Macintosh discussion and which may emerge as the next accessibility leader.
We agreed that, today, with an excellent collection of AT products in all categories, Windows had a substantial lead. We then commiserated over the recent announcement that UI Automation (UIA) would not make it into the first Vista release and that AT products must continue to rely on MSAA. As new applications will use various Vista enhancements for which there will be no MSAA, the first year after the Vista release could be pretty rocky for those of us who depend upon AT to do our jobs, get an education or just enjoy computing.
The GNU/Linux discussion went a bit differently. We agreed that the gnome accessibility API certainly could provide an excellent amount of information to an AT product, but as few applications exist to really exercise the framework, how will we know if it is usable? Another chicken and egg problem. We also questioned why it seems that, at every CSUN, the open source people have a few new demos of AT for gnome but never seem to release anything beyond an alpha test version.
This year, both IBM and Sun showed off new alpha test screen readers for the GNU/Linux platform. Sun has ORCA and IBM has a program described by three initials which I can't recall at the moment. Neither talked about gnopernicus so I guess that project died on the vine. This leaves me with the question, "Because both programs are open source and both are targeting the same platform, why do we have two alphas and zero betas?" Why can't we all just get along? How many more years must we wait before we hear something described as a "released" screen reader for the gnome desktop? How many roads must a man walk down before they call him a man?
The open source world seems to have more screen readers than users.
Finally, we get to Apple. I really like some of the people working on their screen reader very much and don't want to trash it as I don't want to continue to stamp on the toes of old friends. I will just suggest that anyone interested in it read Jay Leventhal's article in Access World (I think it appeared in the December edition) and try to give it a whirl at an Apple salon shop at your local mall before committing to using it.
Thus, the near future seems pretty murky. Personally, I'll stick with JAWS on Windows because it will not require me to learn a whole new platform and the idiosyncrasies associated with it. I know which applications I can use and I know who I can call if I'm in a bind.
The discussion of the major platforms led us to talking about handhelds and, specifically, the "no blind person need apply" iPod. With a variety of different accessible portables ranging from talking cell phones to the iPAQ to PAC Mate, BrailleNote and some others with accessible interfaces that can play most, if not all, multi-media formats, why does Apple remain so completely bigoted against us blinks? Don't the hipster blind kids have the right to destroy their hearing by playing 50 Cent at an ear shattering volume?
So, why is the iPod Inaccessible?
Let's start by looking at some of the highlights of Apple's history. In 1984, Steve Jobs walked out onto the stage at a Boston Computer Society (BCS) meeting. He placed an original, 64K, single floppy disk Macintosh on a table, clicked a few things and then stood back. Although I lived in Boston at the time, I did not attend this event but I've seen it on video many times.
"Hello, I am Macintosh," said the robotic speech synthesizer inside this oversized lunchbox with a screen. The Macintosh, through what we later learned was the MacinTalk synthesizer, continued to describe itself as Jobs stood proudly on the stage next to his baby at its first public performance.
The attendees at this general meeting of the BCS sat silently, awed by a computer who could describe itself. Jobs went on to show the audience a WYSIWYG word processor, a paint package and a few other little doodads that he could launch by swapping a few floppies and clicking his mouse. The 1984 audience found his performance riveting and, by the following day, the buzz about Jobs' new miracle machine had conquered the entire Boston/Cambridge nerd scene and the gossip grew louder each day until a few people got their hands on actual first run Macs.
To those of us with an interest in accessibility, Steve Jobs' performance at the BCS meeting made an entirely different impression. The Macintosh that Steve showed the world that night included the first standard issue software speech synthesizer. This, we thought, would rock the world. The earthquake of excitement slowly dwindled to a mild vibration and then to silence. While the Mac had a major screen reader component built in, it exposed so little information as to render the synthesizer useless for most real blindness applications. I know, outSPOKEN for the Mac came along, but that screen reader, later acquired by Alva and, more recently, permitted to die a lonely death, felt like using JAWS with only the JAWS cursor or Window-Eyes with its mouse cursor.
Later on, as my vision deteriorated, I didn't know about programs like JAWS and the accessibility on Windows, but I did remember that the Macintosh had a built in magnifier (CloseView) and a synthesizer. So, with the help of a Mac hacker friend of mine, I set out to create my own screen reader-like utility with which, alongside CloseView running at 10-16X magnification, I could actually use (very inefficiently) the Internet, WordPerfect and Eudora. My utility wouldn't win any technology awards as it simply copied selected text to the clipboard and then spouted it out through the synthesizer. This solution, crufty as it may seem, provided me with good enough computer access to take creative writing classes at Harvard University and to keep in touch with friends and family via email.
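For the curious, the guts of that copy-to-clipboard utility can be sketched in a few lines of modern code. This is a conceptual mock, not the original Mac code: `read_clipboard` and `speak` are hypothetical stand-ins for the platform clipboard API and whatever synthesizer is available, so the loop's logic is the only part that mirrors what the real utility did.

```python
import time

def read_clipboard(source):
    """Hypothetical clipboard read; 'source' stands in for the system
    clipboard, which would be read via a platform-specific API."""
    return source.get("text", "")

def speak(text):
    """Hypothetical synthesizer hook; a real version would hand the
    text to the speech engine instead of printing it."""
    print(f"[speech] {text}")

def clipboard_speaker(clipboard, polls=3, interval=0.0):
    """Poll the clipboard and speak any newly copied text, as the
    original utility did after the user selected and copied it."""
    spoken = []
    last = None
    for _ in range(polls):
        text = read_clipboard(clipboard)
        if text and text != last:  # only announce a new selection once
            speak(text)
            spoken.append(text)
            last = text
        time.sleep(interval)
    return spoken
```

The crufty part is exactly what the post describes: the user does the selecting and copying by hand, and the tool does nothing cleverer than noticing the clipboard changed.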
Then, a friend of my family who also lost his vision to RP, told my dad about JAWS, Window-Eyes and the Windows solutions. Bob (my dad) bought me a Gateway laptop, a copy of Window-Eyes and sent it up to our house in Cambridge. My wife struggled, with the excellent assistance of Mike Lollar on the telephone, for about three hours to get Dec Talk Access 32 installed without bothering the pre-installed virus protection too badly. I thought I had found heaven. Within six months, it was bye-bye Harvard and hello Henter-Joyce and my full time pursuit of access technology.
So what happened to Apple between the time it showed off the first computer to ship with a standard speech synthesizer and the release of its iPod?
If you have followed the business side of the computer industry, you probably have noticed that Steve Jobs got fired and replaced by that guy from Pepsi. The soda guy got fired and was replaced by Gil who, in turn, got fired and replaced by Steve Jobs. Throughout all of this, Apple would create some really innovative concepts and then kill them before letting them hit the market. They built things like the Newton about a decade before the technology had matured to a point it could be commercially viable and they floundered listlessly without a real leader at the heart of the organization. Thus, the return of Steve meant joy in Macville, ding dong the corporate witch was dead and the dreamer had returned. The rainbow colored Macintosh logo glowed brightly once again.
Steve Jobs, though, had learned a lot about business while in exile at NeXT Corporation and other disasters. He had learned about saving money, cost cutting and not going too far from the path to relatively certain dollars.
One of Steve's first moves upon his return was to furlough the speech team. Some of the most talented people in speech technology lost their jobs (none had trouble finding employment elsewhere) because, according to an official statement issued by Apple on that day, "Speech technology is superfluous to our mission." I remember reading this article and feeling my heart fall into my stomach.
More recently, in a move typical of Apple, they reversed direction and reconstituted the speech team, and the synthesizer and voice command control in OS X are really quite good.
Why, then, can't an iPod talk?
Because Apple doesn't want it to.
Why doesn't Apple want the iPod to talk?
Is it technically feasible for an iPod to talk?
At last, the crux of the biscuit: from the very first iPod released a few years ago to the fanciest one out there today, all have had more than enough compute power and storage (with zillions of bytes left over) to run a speech synthesizer. Having walked through the iPod interface with a sighted guide, I can also state quite clearly that offering the interface as a self voicing application would not challenge the talented Apple engineers too much. Including a full talking interface would definitely add to the "cool factor" of the device, as sighted and blind users alike could keep the iPod in their pocket and navigate to their Led Zeppelin or Pink Floyd folder quickly and easily without diverting their mind numbed, 120 decibel charged gaze away from whatever they had been staring at.
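To see how little engineering a self voicing menu actually demands, here is a minimal sketch. Everything in it is hypothetical (the class name, the `speak` hook, the menu items); the point is only that announcing the current item on every navigation event is a few lines of logic, not a research project.

```python
class TalkingMenu:
    """A minimal self-voicing menu: every navigation action announces
    the newly focused item through a caller-supplied speak() hook."""

    def __init__(self, items, speak=print):
        self.items = items
        self.index = 0
        self.speak = speak
        self.speak(items[0])  # announce the initial position

    def down(self):
        """Move to the next item (stopping at the end) and announce it."""
        self.index = min(self.index + 1, len(self.items) - 1)
        self.speak(self.items[self.index])
        return self.items[self.index]

    def up(self):
        """Move to the previous item (stopping at the top) and announce it."""
        self.index = max(self.index - 1, 0)
        self.speak(self.items[self.index])
        return self.items[self.index]

    def select(self):
        """Activate the focused item and announce the action."""
        choice = self.items[self.index]
        self.speak(f"Playing {choice}")
        return choice
```

Wire the `speak` hook to a synthesizer instead of `print` and you have, in miniature, the talking click-wheel interface the post is asking for.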
Effectively, the iPod has no accessibility features because Apple thinks of accessibility well after anything else they design into their products. Speech in an iPod would have been relatively cheap and easy, but Apple thinks of "cool" first and nerdy ideas like universal design just aren't cool.
So, I cringe every time I hear the term "Pod cast" on a blind person's web site. Well before the iPod, an Apple trademark, we blinks enjoyed all kinds of streaming audio on the Microsoft platform using Windows Media Player, Real Player, WinAmp and other programs. Today, we have the PAC Mate, Braille Note, iPAQ, a whole pile of cell phones on which screen readers run and probably other products I'm forgetting, all of which we use to listen to music, books and other information while out and about. Why then do we insist on giving Apple a free advertisement for a product that might as well have a sign saying, "No Blacks, No Dogs, No Irish" hanging on it as far as we blinks are concerned?
I'm also dubious of anything containing the word "pod" that doesn't refer directly to food. This comes from the classic Sci-Fi thriller, "Invasion of the Body Snatchers," not the remake but the 1950s original. In the movie, the townspeople disappeared one at a time to be replaced by replicants (who had that zoned out look of an iPod user on their faces) who, perhaps not coincidentally, grew out of giant pea pods. Are Steve Jobs and Apple snatching the portable music lovers of the world and replacing them with mindless servants of their corporate goals? Am I one of the last townspeople left running around to spread the information that Apple employees come from outer space and intend to conquer our planet?
Sorry for the fairly lame posts the past two days. I had little time to write so I depended heavily on material I could draw from other news items. I do think both items described important events but I didn't do much to add any color or useful commentary to improve on their value.
From what I'm seeing, the latest RC of Windows Vista and Windows Presentation Foundation include UI Automation. What concerns me is the lack of AT support for UIA, and with Window-Eyes having released their public beta of version 6.0 (which, in all probability, will be the version that first claims to have Vista support), and with "first public beta" usually being a codeword for "feature lockdown", I fear what will pass for "Vista support".

For grins, I downloaded the Vista developer tools, created a test application in XAML using the Windows Presentation Foundation SDK, and launched it. The app was utterly pointless, containing a single button. JAWS found the button. Window-Eyes did not. JAWS did not, however, find the text on the screen with the JAWS cursor, meaning that WPF uses some arcane way of writing text to the screen that may require that a chicken be sacrificed to the display driver gods in order to get the text.

Now, having said this, and having made a cursory examination of the UI Automation APIs, I wonder how feasible it would be to write a UIA client that just feeds the information to /* pick your favorite screen reader */ as needed (I haven't looked at the framework for extracting text via UIA to determine if some kind of "virtual buffer" would need to be implemented for reviewing text, similar to what JAWS is doing with Java at the moment).

But I am rather worried, as the buzz around the office is that Vista is getting awfully close to shipping and … other buzz I'm hearing is enough to make me worry on the AT front … that we may find ourselves at least somewhat left behind when Vista ships (now, bear in mind as I say this that, as I work at a certain large software company in Redmond, and my department is starting to test things on Vista, lack of AT support for UIA right this second poses more of an immediate problem for me than it might for Joe Q. Blink of Podunk, Iowa). Still … shouldn't we at least be seeing something?
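The "UIA client that feeds a screen reader" idea can be sketched in outline. To be clear, this is a conceptual mock only: the real UIA client API is a set of Windows COM interfaces (IUIAutomation and friends), and the `UIAEvent`/`UIABridge` names and the event shapes here are invented purely to model the bridge pattern, not to reproduce that API.

```python
from collections import deque

class UIAEvent:
    """Hypothetical stand-in for an automation event: what kind of
    event fired, plus the element's accessible name and control type."""
    def __init__(self, kind, name, control_type):
        self.kind = kind                  # e.g. "focus_changed"
        self.name = name                  # accessible name of the element
        self.control_type = control_type  # e.g. "button"

class UIABridge:
    """Queues (mock) automation events and forwards the interesting
    ones to whatever screen reader callback the caller supplies."""

    def __init__(self, announce):
        self.announce = announce  # screen reader's speech entry point
        self.queue = deque()

    def on_event(self, event):
        """Event handler the automation framework would invoke."""
        self.queue.append(event)

    def pump(self):
        """Drain the queue, announcing focus changes in a
        'name, control type' form familiar from screen readers."""
        announced = []
        while self.queue:
            e = self.queue.popleft()
            if e.kind == "focus_changed":
                line = f"{e.name}, {e.control_type}"
                self.announce(line)
                announced.append(line)
        return announced
```

The open question raised above still stands: an event bridge like this handles focus announcements, but reviewing static text would likely need some kind of virtual buffer built from the automation tree, which this sketch does not attempt.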