Monday, 15 March 2010

accessibility - Is there any documentation on how screen readers should act?


I'm reviewing and recommending changes/improvements to a small web application whose accessibility has recently become a priority.

The problem I'm having is that there doesn't seem to be any documentation on how screen readers should actually behave. For example, if you look at the specification for a tab panel, the ARIA Authoring Practices Guide gives a basic definition and describes how it works, but it doesn't really answer questions like "should a screen reader speak a tab panel's content when it appears?"

In that particular example, my understanding of the requirement is that it should not speak, yet nothing actually says so one way or the other. (The best I can do is observe that the Authoring Practices Guide examples do not speak.)

For this, and a half-dozen other issues like it, is there a guide that describes how screen readers are supposed to behave?

There are some very simple principles:

  1. A screen reader starts at the beginning of the page in DOM order and will read through to the end. First it announces some basic data about the page, such as the title and the number of links, headings, and so on. However, users will not usually let the screen reader read the entire page; they will interrupt the reading and navigate instead.
  2. If a user knows the page, they will use the common navigation mechanisms - headings, form controls, landmarks, links, tables, etc. - to jump straight to the things they know are there. If they do not know the page, they will use those same mechanisms to skim it, much the way a sighted user scans a page with their eyes.
  3. As the user navigates, they move a virtual cursor. Normally focus follows the cursor, jumping from focusable element to focusable element as the cursor passes over them (this is configurable). The screen reader reads whatever the cursor lands on, which is similar to a sighted user scanning the page as they read. The key point is that the user controls what is read by navigating. One caveat: if the user activates a control and some other part of the page updates, a sighted user would notice the update immediately, so a screen reader user should be told about it too. In that case, the application should announce the update using aria-live.
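The aria-live behaviour described in point 3 can be sketched with markup like the following; the role, region, and example message are illustrative and not from the original answer:

```html
<!-- A polite live region: screen readers announce content inserted
     here automatically, without the user moving their cursor. -->
<div role="status" aria-live="polite">
  <!-- e.g. setting this region's text to "Search returned 3 results"
       causes it to be spoken once the screen reader is idle. -->
</div>
```

aria-live="polite" waits until the screen reader is idle; aria-live="assertive" interrupts immediately and should be reserved for updates the user must hear right away.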

As you will notice, it is with this last point that we cross over from the purely technical into usability.

  1. "You need to make everything tab-focusable for screen readers": no, the screen reader can read everything even without tab focus.
  2. "You need to announce every update on the page": no, you do not. If a user is interacting with a tab, they know from experience that selecting it will reveal its contents, and that there are keyboard commands to move into that content. You do not even have to tell them that the panel is shown; you only have to update the selected state of the tab.
  3. "You do not have to announce anything": no, you need to decide which information is important enough to announce automatically. For example, if you are implementing a chat app, it would be silly to make the user navigate just to hear messages from their friends; those should be announced automatically.
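To illustrate point 2, here is a minimal sketch of a tab widget; only the aria-selected state changes when the user picks a tab, and nothing is announced explicitly (the ids and labels are made up for illustration):

```html
<div role="tablist">
  <!-- Selecting a tab just flips aria-selected; the screen reader
       reports the new state without any extra announcement. -->
  <button role="tab" id="tab-1" aria-selected="true" aria-controls="panel-1">Details</button>
  <button role="tab" id="tab-2" aria-selected="false" aria-controls="panel-2">Reviews</button>
</div>
<div role="tabpanel" id="panel-1" aria-labelledby="tab-1">Details content</div>
<div role="tabpanel" id="panel-2" aria-labelledby="tab-2" hidden>Reviews content</div>
```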

I strongly suggest that you bring a blind screen reader user into your organization and have them demonstrate how they work; watching them in action will make these points much easier to understand.

