GIAN WILD:
Hi, and thank you to everyone joining me remotely and not so remotely at MidCamp this year. I am really pleased to be there in spirit. So thank you very much for coming to this session. There are opportunities for questions at the end, and you can always reach out to me via Twitter if you would like to, or via email; that's very, you know, 20th-century technology. So today I will be talking about mobile accessibility: building accessible mobile sites and native apps. But before I get into that, I just wanna talk about the contribution day on Friday from 10am to 4pm. Be aware you don't have to know code to give back, which is good. I'm not really a very strong coder. In fact, I'm not really a coder at all. And there's new contribution training from 10am to noon with AmyJune Hineline from opensource.com. And there's also something I'm very jealous of; I have my coffee right here because I'm presenting from Australia, where it is currently 5:30am: the Drupal Coffee Exchange. So bring a bag of beans, leave with a bag of beans.
I am so, so jealous. I so wish I was there with you. I love Chicago, but maybe next year or the year after. I'd just like to acknowledge the Yuggera people as the traditional owners of the land that I'm currently on. And also to let you know that you can access this presentation via the MidCamp website, which has a PowerPoint, but also at pz.tt/mobile23. And that's also something I wanna say: I say "mobile", you guys say something like "mo-bile", I don't know, I can't say it. That's why I say mobile23, so pz.tt/mobile23, and that will be put in the chat. So before we get started, I just wanna introduce you to my team; that's about two-thirds of it. And that photo was from 2018, which seems like last year, but really was five years ago. But let's not talk about that. When I started AccessibilityOz, and I'm proud to say we just turned 12, we wanted to support people with disabilities by providing employment opportunities as well as just, you know, making websites accessible and things like that. And so we actively hire people with disabilities.
About 20% of the population has some kind of significant disability, not that you would necessarily know; they're often hidden in plain sight. At that point, about 60% of my staff had some kind of significant disability, and we're at 65% now. So some of the people with disabilities that we have employed include people with dyslexia, moderate vision impairment, severe vision impairment, epilepsy, migraines, physical impairments, fibromyalgia, multiple sclerosis, Crohn's disease, PTSD, and Asperger's. So one thing, if you're new to the accessibility world, is that it's not just about vision impairments, and you aren't necessarily going to know if someone has a disability just by interacting or talking with them or looking at them. So a little bit about me. I started in 1998, a long, long, long time ago; of course, I started when I was five years old. I worked on the first accessible website in Australia and I created Australia's first automated accessibility testing tool. I was an invited expert to the W3C WCAG2 Working Group.
I worked with them for six years, and then I spent some time working on the Melbourne 2006 Commonwealth Games. Then I spent five years managing usability and accessibility services at Monash University. So I've been on both sides: I have been on the private industry side and I've been on the actual, you know, engaging-people side, so I know what it's like. And then 12 years ago I founded AccessibilityOz. We released OzPlayer, our accessible video player, in 2013, and OzART, our automated accessibility testing tool, in 2014. I spoke at the United Nations on the importance of web accessibility in 2015, was nominated for Australian of the Year in 2016, and was inducted into the Centre for Accessibility Hall of Fame as the Accessibility Person of the Year in 2019. And the one thing that is missing from that, 'cause as we know, the world ended in 2020, is that these guidelines I'm gonna talk to you about were released in early 2020, January 2020. I say that they had no relation at all to the spread of COVID, but you'll have to take my word on it.
So a little background on these guidelines. Why did we develop this methodology? We need to go back to basically 2017, where we got together at what we call the ICT Accessibility Testing Symposium, which is really aimed at accessibility testers. And at the end of this conference every year we have what is called a town hall, where we talk as an industry about what we feel is missing from the industry. And in 2017 we'd been waiting for several years for WCAG2.1; WCAG is the Web Content Accessibility Guidelines that you need to follow to make your sites accessible. And we really were struggling with the fact that basically mobile accessibility wasn't really being addressed. And so what we did is we got together as a committee, and all the different accessibility companies had their own mobile guidelines, and we basically amalgamated them all into one set of standards and released them in 2018. And we thought that would be the end of it, because we thought WCAG2.1 would answer all our questions, all our problems.
But, you know, that was not to be the case. So before I go into that, I just wanna talk a little bit about why mobile is different. Mobile is different to desktop. And when I say desktop, I mean PC, laptop, Mac, anything that basically is not a mobile or tablet device; you know, what we're used to sort of sitting at a desk and working on. Mobile has things like native screen readers: TalkBack on Android and VoiceOver on iOS. VoiceOver is native to Macs as well, but when it comes to Windows there really is a variety of different screen readers, so there's a whole range of different ways that people can access content, and how those screen readers interpret content is very different. There's also things like volume control, which of course exists on desktop as well, but it's used a lot more on mobile devices. You have a haptic, or vibration, keyboard. You have visual as well as auditory as well as vibration notifications, and not just those really annoying sounds when you get emails on your desktop; there's really a reliance on those types of non-visual notifications that you don't have on desktop.
There's screen rotation: as you will know, we all look at screens on desktop in a landscape orientation, unless you are my father, who decides to turn it so it's in a portrait orientation. But on mobile, it's a portrait orientation. You have mono audio, so listening to audio on one side only. Voice Control, so you can actually control your mobile device by saying, 'go to the search bar.' Increased text and display size, reduction of motion, zoom, and things like reader view and simplified view. Now desktop often has some kind of iteration or type of these things, but what's really different is that these features are very commonly used by the general population and not just by people with disabilities. So people who would never identify as having some kind of disability would be quite happy to increase text and display size. My stepmother, for example, you know, she's getting older and she's losing her sight. And she is an avid reader, and she reads all her books on her iPhone now because she can't read them in print.
Now, she could get large print books, but they're few and far between. And instead, she reads everything on her mobile phone by increasing the text size so it's something that she can read. Now, she would never identify as someone with a disability and would never perhaps seek out those assistive technologies on desktop. But it's basically something that's commonly used on mobile by the general population. So let's talk about WCAG2.1. Before we do, I just wanna say WCAG2, the Web Content Accessibility Guidelines, was released in 2008, and 2.1 was an iteration that was released in 2018. So basically one of the things that really was problematic with WCAG2 was that it didn't really address mobile accessibility. And, you know, that's because it was released in 2008 and the first iPhone was released in 2007. And I am watching the show 'Revenge', please don't judge me. And it's set in 2011, 2012, and everyone's using BlackBerrys, or, as they were known, CrackBerrys, because they were as addictive as crack, and they were really the precursor of the iPhone.
And I must say, I did not actually have one. I'm very proud of that fact. But the thing is, you know, even in 2011, 2012, people weren't all using iPhones or Android devices; it really took a while. So, you know, there really wasn't this concept of how we use mobile devices as we do today, things like native apps. And the best example, if someone really corners you and says, give me an example, is keyboard accessibility. So WCAG2 says that sites must be accessible to the keyboard user. However, it does not specify that they should also be accessible to the touch screen user or to the mouse user, for example. So that's something that, you know, we didn't really even think about. We thought that people were only ever gonna interact with the web through a desktop, through a keyboard, and through a mouse. And touch screens were for people who had disabilities. And that's actually really interesting: if you look at a lot of the features that we take for granted and that are, you know, basically improving everyone's lives, they often started as assistive technologies.
Even Alexander Graham Bell, who invented the telephone, did so so that he could talk to people who were partially deaf. So this is something that, you know, is a very interesting aspect of accessibility. So WCAG2.1, which was an iteration of WCAG2, definitely built on that and did include criteria related to touchscreens: pointer gestures, sensors, small screen devices, orientation, et cetera. But it still didn't cover all the user needs related to mobile accessibility. And the best example for you, once again, if you get cornered by that very annoying person, is touch targets. So in all the methodologies that we merged for the first iteration of the mobile accessibility guidelines, every single company that had their own mobile guidelines had a requirement around touch target size. Touch target size is, for example, hopefully you can see this, basically the touch target of an actionable item. And what we found was that a requirement for the touch target to be an adequate size was in WCAG2.1, but it was at level triple-A. There are three levels to WCAG2.1: A, double-A, and triple-A.
Almost all governments require level double-A, and as I like to say, level triple-A is where success criteria go to die. So this is something that the industry itself was quite annoyed at. And, you know, there was a bit of a furor. So we got together again at the 2018 ICT Accessibility Testing Symposium and we were like, OK, we can't rely on the W3C providing this information, so let's properly develop a set of guidelines: start from the start, not just amalgamate the guidelines that are out there, but go through things systematically and write a serious set of guidelines. And we decided we would split it into two sets: a mobile site set of guidelines and a native app set of guidelines. And so that's how this methodology came about. So please note that this methodology does not include those errors already included in WCAG2. The methodology does not say things like you need to have alt attributes on your images or alternative text for your non-text content, because that's in WCAG2.
But it does include the things that are in WCAG2.1, because we thought there were organizations who were interested in mobile accessibility who perhaps hadn't moved to WCAG2.1. So basically it does include things like orientation, and we do require adequate touch target size, even though that's at level triple-A. So the methodology is slightly different to WCAG2.1, but it's very clear when we are referencing a WCAG2.1 error and how our methodology differs. And of course, there's a lot more than just what's in WCAG2.1. You can download the guidelines at bit.ly/3AdDjjP. That's bit.ly/3AdDjjP. Now, this is actually hosted on the AccessibilityOz website, even though a range of different accessibility companies were involved, all the large accessibility companies in the world. That's because we use the guidelines a lot and I keep finding typos. And so I kept emailing the ICT Accessibility Testing Symposium saying, "Hey, can you just, you know, update the guidelines? I found this missing comma." And they just got sick of me and they're like, just host it on your own website and we'll link to you.
So that's why it's on our website. I absolutely assure you it is not just the AccessibilityOz guidelines; a lot of people are using them now. OK. So let's talk about some other mobile issues. One of the things that you need to be aware of with WCAG2 is zooming to 200%, and that should already be included in regular testing. However, people who use the zoom function inherent in the desktop browser are often restricted to a mobile view of the site on their desktop. So it is absolutely essential that functionality is not removed due to what we call a variation of the page. What we really mean is, if you have different content or lesser content on your mobile site than you do on your desktop site, say you hide content, then you really are going to fall foul of these users. So it's absolutely essential that the same content that's on your desktop is the same content that's on your mobile version of the site. And we're talking about responsive websites; if you don't know what responsive means, I don't know what you're doing at a developer conference, but I will explain it anyway.
So this is what I mean. This has been fixed, by the way; it's a very old example, but it's a very, very good one. This is what YouTube looked like about four years ago on desktop at 100% zoom, you know, how anyone would look at it when loading it in a browser. And you can see you've got your upload button and your notifications button in the top right-hand corner. But if you increase the text size, say by using Control-plus, which people who have low vision do because they need to increase text size, those features, the upload and notifications, disappear. Now why do they disappear? Because YouTube is assuming that you are now looking at this on a mobile device, and they don't want you to upload your videos or check your notifications on the YouTube website; they want you to use the YouTube native app. So this is a perfect example of having different content depending on the variation of the page, and this is something that you should absolutely not do. Another thing that you should be really aware of when it comes to testing mobile is what we call accessibility support, which requires that implementation techniques that support assistive technology are used.
So that means if you have a PDF, you need to use tagging features, et cetera. But what it really means for mobile is that when it comes to testing, you have to use the assistive technologies and the mobile features, and you need to test with them to a much greater extent than you do on desktop. And the reason is that on desktop you can access the code; on a mobile site, unless you view it on desktop or you have some fancy app, you can't view the code, and you absolutely can't view the code of a native app unless you're the one coding the system. So it's really hard to determine how things will interact. So even if you are coding a native app, you really need to do this testing, because there's just not enough information out there about how the coding will interact with these features. So you have to test with these features, and do not use simulators. But I'll talk about that in a moment. So what assistive technologies and features do we say that you should test with?
For Android, it's TalkBack, keyboard, keyboard and switch, magnification, remove animations, invert colors, grayscale, increase display size, increase text size with Chrome, and simplified view with Chrome. For the iPhone it's VoiceOver, keyboard, keyboard and switch, zoom, reduce motion, invert colors, grayscale, and reader view for mobile sites. And the iPad is the same as the iPhone. Unfortunately, you need to test on both the iPad and the iPhone, because the iPad now has a different operating system, which does make things different. And I know you're looking at that and going, grayscale? How on earth do you test with grayscale? I can tell you right now that we have copious information in that documentation about how you turn on these features, how you use these features, examples of failures, examples of passes, and, you know, quite detailed information on testing. So there's a lot of information in the documentation about what it is that you should be looking for. So we have specific testing methods, which differ depending on whether you're testing a mobile site or a native app.
On the mobile site, there are four main testing methods: devices, which is testing on mobile and tablet devices; devices with assistive technologies, which is testing on mobile and tablet devices with assistive technologies; responsive window, which is testing in a responsively sized window on desktop; and desktop, which is testing on desktop. The native app methodology only has two main testing methods: testing with devices and testing with devices with assistive technologies. We did look at some automated testing tools for testing mobile sites and native apps, and they really weren't mature enough for us to recommend them back in 2020. I mean, now it's three years later; things may have improved, but back then we didn't feel that the automated tools available could really be relied on. The other thing to be aware of is you absolutely cannot use simulators. This is an example of someone using a simulator. I started flying to the States regularly in 2014. Actually, my first presentation in the States was for Future of Web Design, on mobile accessibility, and it was in 2014.
And back then we did not have Wi-Fi on airplanes, and it's a 15-hour flight from Australia to LAX; that's a long time without Internet. But the good thing is that when we landed on the tarmac, we could access the LAX Wi-Fi. However, it did give us this interesting message, which is basically this text that says, 'This page will redirect,' and content that doesn't really make sense to have there. And all of a sudden I don't really feel like giving you my contact details or my credit card details. Now, how does this happen? This happens because someone's not testing it on location with an actual device; they're testing it on a simulator. And you see this a lot of the time: you know, text being too small to read, a whole lot of different features. And this is why you cannot rely on simulators. If you take one thing from this presentation, it is: no simulators. So let's talk about the mobile site and native app testing methodologies. There are five overarching steps. Of course, the actual things you need to do
under those steps are different depending on which methodology you choose, but for the mobile site testing methodology there's step one, identify devices; step two, identify site type and variations; step three, test critical issues; step four, test mobile-specific issues; and step five, test mobile assistive technology and feature support. Now, the native app testing methodology has the same overarching steps, with the difference of step two, which is define application functionality. And I do wanna say, I understand that this is a development camp and this is a testing methodology, but it absolutely can be used as a development methodology, and that's something that we're hoping we'll be able to do: convert it into a development methodology. But there's still a lot of information that you can get there. In fact, we'll be asking for volunteers, so anyone who wants to turn it into a dev methodology, please contact me. So the first step is identify devices. You need to determine which devices to test on.
So in the United States, Australia, and other Western countries, iOS devices are the most popular. In Asia and other Eastern countries, Android devices are most popular. You know, if you have any analytics system, have a look at who's using your site and what devices they're on. But that's something just to go by. Be aware that there's a huge number of Android systems; there are different combinations of Android operating systems and browsers, and it's not possible to test on all of them. And one of the things that we found in our testing is that if you're testing with a Samsung device, you shouldn't test with the Internet browser app that comes pre-packaged with Samsung phones; it's better to download Chrome and test with that. So whatever you do, whatever system you're using when it comes to Android, it's best to download Chrome and test with that. Be aware that even if the site is a desktop site, people will still use the site on mobile. The devices have a way to say 'show desktop site', and people, especially if they think content is missing, will actually choose that particular option.
So be aware that if you have a site that you think no one's ever gonna use on mobile, because you have your own separate mobile site, they're still gonna use that desktop site, and you still need to test it and develop it appropriately. Also be aware that there are assistive technologies that operate on desktop and mobile that are called the same thing, such as VoiceOver. But if you've tested VoiceOver on Mac for your mobile site, you cannot assume that you have appropriately tested VoiceOver on iOS. You have to test both. And there's some really great information about screen readers, if you're interested, in the WebAIM Screen Reader survey. Also be aware that Samsung likes to do things differently. They have an additional screen reader called Voice Assistant. However, TalkBack is still available, and people who really know about screen readers are likely to keep using TalkBack. But people who are maybe sort of new to the devices or new to needing a screen reader may use Voice Assistant.
So you may want to look at testing with that as well if you have a lot of Samsung users. The Amazon Fire uses a different screen reader called VoiceView, and that's something else that you might wanna test with if you've got a lot of users who are using a Kindle or something like that. So what we recommend is that you test with an iPhone, an iPad, and an Android phone. And if you're testing a mobile site, that's Safari, Safari, and Chrome respectively. Other devices to consider are things like the Android tablet and alternative devices such as the Kindle. We recommend testing on the latest version of iOS and iPadOS and the latest two versions of Android. However, if you're really stuck for time, it's worth testing just with the latest version of Android. And be aware that when a site is directly aimed at people with a particular kind of disability, say a site on acquired brain injury, then you really should test with the assistive technologies used by those potential users.
So, people who have an acquired brain injury are likely to perhaps use Voice Control on iOS or Dragon NaturallySpeaking, so actually testing with those is really important. And just to reiterate, you need to meet WCAG2 and this mobile testing methodology if you want to make sure your mobile sites and native apps are fully accessible on development. So step two is identify the site type and variations of the page when it comes to mobile sites. And this is really from a development perspective: how is it that you're going to develop this? Are you going to develop a desktop site, an m.dot site, or a responsive site? Most people develop responsive sites. If it is responsive, are there variations of a page? For those who perhaps are a bit newer to the IT world: desktop websites are sites that have only one display, whether viewed on desktop, mobile, or tablet. M.dot sites have a particular display for mobile, and then there's a separate site, a www site, that you see on desktop; that's basically two different websites, and you have to test those two different websites on desktop and those two different websites on mobile.
So that's a lot of work. And then responsive websites change depending on the screen size or another feature as determined by the developer. Now, most sites are responsive, and WCAG2.1 actually requires that sites are responsive. The methodology also requires that when you are determining whether there is a variation of a page, you have to do so via screen size. You can't do it via, you know, sniffing the browser or anything else like that; it has to be by screen size. Now, if all of this is double Dutch to you, once again, there's a lot of information in the documentation about this. That was for sites, and this is for native apps. Step two for native apps is define application functionality. So from your understanding of the purpose of the native app, define which functionality is critical to its purpose and use and must be tested for efficacy, operability, and workflow from a user experience perspective. So ask the question: how would the experience be impacted if the functionality failed, the content could not be reached, and/or the experience caused a barrier to the user?
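The screen-size rule for mobile site variations, mentioned above, can be sketched in code. This is a minimal sketch under assumptions of mine: the breakpoint values and the function name are hypothetical illustrations, not from the methodology, which only requires that any variation be determined by screen size.

```typescript
// Minimal sketch: choosing a variation of the page from screen width
// alone, never from user-agent sniffing. The breakpoints here are
// hypothetical examples; use whatever widths your design defines.
type Variation = "mobile" | "tablet" | "desktop";

function variationFor(viewportWidthPx: number): Variation {
  if (viewportWidthPx < 768) return "mobile";  // e.g. phones
  if (viewportWidthPx < 1024) return "tablet"; // e.g. small tablets
  return "desktop";
}
```

However the breakpoints are chosen, the point the methodology makes is that every user at the same screen size gets the same variation, and the same content, whatever their browser or device.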
And then prioritize. All functionality should be accessible within the native app; however, it's important to define and include the critical functionality for each individual app to be prioritized in testing. And then there are common elements that always need to be tested, such as navigation, landing screens, emergency sections, login flows, settings, account and profile, contact us, real-time updates, privacy policy, terms and conditions, interactional functionality, help sections, widgets (calendars, et cetera), geolocation maps, and high-traffic areas. Those methodologies are available on the AccessibilityOz website under the Resources section. So let's get into some more detail. Step three is test critical issues. Now, what we call critical issues are basically traps. A trap is when a user is trapped within a component and cannot escape without closing the browser or the app. And there are many more traps in mobile sites and native apps than on desktop. So we have identified five traps: the exit trap, swipe/scroll trap, text-to-speech trap, headset trap, and layer trap.
So the exit trap: ensure there is always an accessible actionable item, e.g. a close button that meets color contrast requirements and has an accessible name, that closes any feature that overlays the current page, such as a full-page ad. This applies to all users and is in both methodologies. So this is an example here. This is Facebook; there's an overlaying ad for the HP EliteBook Folio, and the only way to actually remove that ad is to hit the tiny area of white below. The URL isn't editable and the back button doesn't work. So this is an exit trap: the only way to close this is to close the app and start again. This is another very common example, basically a dark UX pattern. So there's a little pop-up saying, "Hey, we're having a fall sale," and its close button doesn't meet color contrast requirements or touch target size requirements. And you might say, oh, but you can just tap outside the pop-up to close the pop-up.
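The color contrast requirement on that close button can be checked numerically. Below is a sketch of the contrast ratio calculation WCAG2 defines (relative luminance of each color, then the ratio of the lighter to the darker); the function names are my own, but the formula is WCAG2's.

```typescript
// Sketch: WCAG2 contrast ratio between two sRGB colors given as
// [r, g, b] values in 0-255. Feed in a close button's icon color
// and its background color to check the contrast requirement.
function luminance(rgb: number[]): number {
  const [r, g, b] = rgb.map((v) => {
    const c = v / 255; // linearize each sRGB channel
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: number[], bg: number[]): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05); // from 1:1 (identical) up to 21:1
}
```

White on black comes out at 21:1, the maximum; a pale grey close icon on a white pop-up lands well under the 4.5:1 that WCAG2 requires for normal-size text, which is exactly the failure in the fall-sale example above.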
But of course, there are some users that can't do that. So this is an example, once again, of an exit trap, where a user has to close the browser and start again. This is the swipe/scroll trap: ensure you do not override standard mobile touch functions, such as swiping, scrolling, et cetera, on the majority of the page. This applies to touch users and is in both methodologies. We call this one the zoom of doom. This is actually an example from my very first mobile presentation, you know, in New York. This has definitely been fixed now, and we don't actually see this issue all that often, but it still happens. Basically, you have a map that takes up almost the entire page, and the only way to scroll the page is to hit these very small areas of white. Otherwise, you end up finding that you are scrolling the map. And then we have the text-to-speech trap. If the app has an ability to provide content via text-to-speech, the screen reader user must be able to pause or stop the app speaking in a simple manner, for example, by performing a swipe on the screen.
So this applies to screen reader users, and it's only found, so far, in the native app methodology. So this is Pocket. I believe they've fixed this now, but basically, Pocket saves all your articles, and you can press play on an article and it will read that article to you. But once activated, screen reader users cannot stop the text-to-speech easily. Why is that important? Well, screen reader users rely on the audio of a screen reader to navigate, so if they're hearing audio about an article, then they need to have a very simple way to stop that audio so they can listen to their screen reader and navigate through the site. If there's no simple way to stop it, then they basically have to close the app and start again. And the headset trap: headset users must always be able to pause media, audio, or video content by using the pause/play control on the headset. This applies to screen reader users and headset users, and it's in both methodologies. So this is an example here where you've got a pop-up video at the bottom of a website.
So I don't know who thought that was a good idea, and you have an ability to mute the video with a little mute button. However, once again, as I was saying, the screen reader audio is how people who are visually impaired navigate through the site. So if there is audio overlapping that, then they can't easily mute the content. And lastly, the layer trap: the user should not be trapped on a non-visible layer. This applies to all users, but it's mostly encountered by screen reader users and sometimes keyboard users, and it's in both methodologies. So this is an example here: you've got a website and you open the menu, which overlays the page. But screen reader users and keyboard users are stuck on the underlying page. They can't access the menu content; keyboard users can't do anything at all, they can't see where their keyboard focus is, and they can't close the menu. Screen reader users, you know, can't access the menu content. So that's another trap. And then we have mobile-specific issues, and they're broken into a variety of categories.
So the first category is alternatives, with nine requirements: motion, interaction and gesture; touch gestures; geolocation; change of state; audio cues; status messages; abbreviations; summary of content; and ambiguous text. So let's see an example from the document. This is the touch gestures requirement: any touch gesture must have an alternative accessible actionable item. And this is very similar to 2.5.1: Pointer Gestures in WCAG2.1. So examples of touch gestures are things like swiping up and down or left and right, dragging up and down or left and right, double tapping, tap and hold, tap and swipe, two-finger pinch zoom, and press and long hold. And examples of an alternative accessible actionable item are a link, a button, a dropdown, or a separate page with the same functionality. Then all the requirements have an 'About this requirement' section, which explains why we have included it. And the 'About this requirement' for touch gestures says this requirement is particularly important for screen reader users.
For example, if you require your user to swipe right to complete a purchase: when the screen reader is on, the swipe right gesture moves you to the next focus item and doesn't complete the purchase. You must be able to perform the same action by using a link, an up or down swipe, or some other gesture. Please note that this requirement is similar to the exit trap requirement. A failure of the exit trap requirement is that a user cannot escape from content or a page. A failure of the touch gestures requirement is that the user cannot access content or a page, i.e. they are not trapped. And then we have a how-to-test section: identify any site controls that require any of the following gestures, and check whether an accessible actionable item is provided as an alternative. So that's swiping up and down or left and right, dragging up and down or left and right, double tapping, two-finger pinch zoom, tap and hold, tap and swipe, and press and long hold. So this is an example. So here you have information under Top Stories.
And you can see there's additional information, because the items are cut off on the right-hand side. And this is a very common way to indicate that additional information is available on swipe. However, there is a link that says "see more" at the bottom of those examples, which gives you all that information on one page in a linear format. So that's a way that the alternative is provided on another page. Then there's another example here where you can drag a pointer along a timeline and it will give you the expected weather at that particular time. But you can actually tap on those particular points as well. So they're actually just simple links. So you can use the dragging gesture, but you can also use an ordinary link. So the display category also has nine requirements: three flashes, changes on request, target size, inactive space, fixed size containers, justified text, color contrast, orientation, and animation. So the target size requirement is that the size of touch targets is at least 44 by 44 CSS pixels, approximately 7 to 10 millimeters.
And for more information, see WCAG2.1 Success Criterion 2.5.5: Target Size. Please note that this differs from WCAG2.1, as Success Criterion 2.5.5 is a level triple A requirement, but in this methodology it's a mandatory requirement. So when it comes to target size, most people use touch as the form of interaction on mobile and tablet devices, and touch is not as granular as mouse interaction and can depend on the size of the person's fingers. And people with certain physical disabilities, such as Parkinson's, may find it difficult to activate very small touch targets. So there's a number of requirements around touch target sizes, which are detailed in the document. But here's some examples. So when visiting the Airbnb website, a pop-up appears at the top asking if you'd like to install the Airbnb native app. The only way to close this pop-up is to hit this tiny little close item which you almost can't see. It's right up next to the actual icon for Airbnb, and the close item is very small and doesn't meet touch target size requirements.
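The 44-pixel minimum is easy to check programmatically. Here is a minimal sketch in JavaScript, assuming rects shaped like the browser's `getBoundingClientRect()` result; the helper name is my own, not from the methodology:

```javascript
// Minimum touch target size from the methodology's target size
// requirement (compare WCAG 2.1 SC 2.5.5), in CSS pixels.
const MIN_TARGET_PX = 44;

// Returns true if a rect ({ width, height } in CSS pixels) meets
// the 44 x 44 minimum.
function meetsTargetSize(rect) {
  return rect.width >= MIN_TARGET_PX && rect.height >= MIN_TARGET_PX;
}

// In a browser you might run something like (selector hypothetical):
// document.querySelectorAll("a, button").forEach(el => {
//   if (!meetsTargetSize(el.getBoundingClientRect())) console.warn(el);
// });
```

An automated sweep like this only catches the obvious failures; small close icons inside pop-ups, like the Airbnb example, still need manual review.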
So the Airbnb pop-up is a failure. This is a pass. This is basically the Outlook email message, and the To, CC, BCC, and Subject fields are appropriately spaced out in terms of touch target sizes, but also the camera, the attach item, the download and the send, et cetera, also have appropriate touch target sizes. So a similar and kind of related requirement is inactive space, which is that actionable items have sufficient inactive space between them. Inactive space of at least 10 pixels should be provided around active elements. So this is also really important because touch is not as granular. And this is an example here: this is the Asana app, and up in the top right-hand corner you've got an edit button and a mark complete button, and they are very close together, and it's very easy to select one when you mean to select the other. So that's a failure. This is another example here. This is Wikipedia, and you've got related articles at the bottom, about Beyoncé, and they have no inactive space between them.
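The 10-pixel inactive space requirement can be sketched the same way. This assumes two horizontally adjacent targets described by plain `{ left, right }` rects, as you would get from `getBoundingClientRect()`; the function names are mine:

```javascript
// Minimum inactive space between actionable items, per the
// methodology's inactive space requirement, in CSS pixels.
const MIN_GAP_PX = 10;

// Horizontal distance between two rects ({ left, right } in CSS
// pixels); negative if the rects overlap.
function horizontalGap(a, b) {
  return a.right <= b.left ? b.left - a.right : a.left - b.right;
}

// True if two horizontally adjacent targets have at least 10px of
// inactive space between them.
function hasInactiveSpace(a, b) {
  return horizontalGap(a, b) >= MIN_GAP_PX;
}
```

A real audit would also check vertical neighbors, but the idea is the same: measure the dead space between edges, not the distance between centers.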
And we actually were talking in the committee about whether we should remove the requirement for inactive space if the actionable item was large enough. And we actually had some users come to us and say, "No, even if it's quite a large touch target, we still need some inactive space." And this is a pass. So you can see the space between the types, women, men, et cetera, the examples of what can be purchased, et cetera, all have sufficient inactive space around them. So the next category is actionable items, which has seven requirements: content on hover, focus or input; native UI; descriptive text links; non-keyboard options; infinite scrolling; color alone; and removal of touch. So we're gonna talk about color alone. So color alone should not be used to indicate actionable items if they are not underlined. A secondary method such as underlining or bold should be used in addition to color. So this technique is aimed only at visual users, and this relies very heavily on the existing Success Criterion 1.4.1: Use of Color.
However, that criterion allows exceptions for actionable items that differ from text in color alone if the difference meets color contrast requirements. Now, mobile devices are, by nature, mobile, and are used in a variety of environments, including full sun and full darkness. And so that means that color differences that might be obvious on desktop are not necessarily as obvious on mobile devices. And in addition to this, actionable items on desktop provide feedback to the user when the user mouses over them: at the minimum the cursor changes; when the user tabs to them, there should be a keyboard focus indicator; and usually there's a little address that appears at the bottom of the screen. But that information is not available on mobile devices. They don't provide that kind of feedback by default. So that's why this is more important on mobile than on desktop. And so this is an example here, a very confusing example from the Commonwealth Bank: call Michael, call Matthew, and call Froz, which are links.
And then you've got the emails, which are links as well, and they're just a change in color. And that's very difficult to differentiate, especially for people who have some kind of colorblindness. This is another example where you've got "looking for older documents, log into your account, see your previous policy, or call 133 233". And there are some links in there, and probably some of you can't see them, because they're blue text on a white background versus gray text on a white background. And so people with blue-red colorblindness are not going to be able to read that. This is an example here where, I mean, they're long URLs for links, which is not a great thing, but the links are very clearly underlined and a different color. And then we have navigational aids, which have five requirements: visual indicators, character key shortcuts, descriptive headings, inactivity time out, and navigation features. So visual indicators such as arrows, next and previous buttons have been used to indicate swipe or scroll areas or additional functionality.
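Whether a color difference "meets color contrast requirements" comes down to WCAG's contrast-ratio calculation. Here is a minimal sketch of that math (the relative luminance and ratio formulas are standard WCAG 2.1 definitions; the function names are my own):

```javascript
// WCAG 2.1 relative luminance for an sRGB color given as
// [r, g, b] channels in the 0-255 range.
function relativeLuminance([r, g, b]) {
  const channel = c => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// WCAG contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05),
// where L1 is the lighter color's luminance.
function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}
```

Black on white comes out at 21:1, the maximum; two similar link colors on the same white background will score far lower, which is exactly why color alone is so fragile on a phone in full sun.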
The visual indicators requirement is similar to WCAG2.1 2.5.1: Pointer Gestures. So this is also about how there's less feedback presented to the user as to the functionality available. And this is a good example. So here you have the BBC app, and if you swipe from right to left, it moves to the next tab. However, there's no visual indicator that that functionality is available. So you need to provide some kind of indicator. This is another example here with a health app: you've got the dates along the top, and there's no indicator that you can swipe along the top to get additional dates. This is a pass; this is what I mentioned before, where you have the images cut off slightly, and that is a visual indicator to show you that the swipe feature will show you additional content. Then we have audio and video, which have three requirements: transcript, captions, and live audio and video. So there is a requirement that all video and audio have an accessible transcript.
So people who are deaf or hard of hearing are reliant on captions, and people who are visually impaired or blind are reliant on audio descriptions. But there are some users who can't interpret either of those features, who are perhaps deafblind, or the video player itself is not accessible. So you need to provide the information from the video to them as well. So this is an example from the CW, "The Curse of the Dark Storm", an episode of Nancy Drew. There is no transcript alternative; there's just a description of the episode, which is one line. This is something else to be careful about as well. This is a video which has a transcript, which sounds good, but it's actually just a transcript of the speech. So this happens a lot, where people just give you the captions content in the transcript, and you have to include the audio description content as well. And this is an example here of a full transcript. It's a three-second or five-second video, and basically it's about a particular toe extension exercise, you know, for things like dislocated toes.
And the transcript is, "Sit in a chair with your feet flat on the floor, take a deep breath in and lift your toes up and hold the position, then relax." And so that's exactly what is said and what is shown in the video. And then we have forms, which have seven requirements: CAPTCHAs, context-sensitive help, error prevention, position of field labels, visible field labels, accessible name, and form and keyboard interaction. So the requirement for field labels is that field labels are positioned adjacent to their input field and appear closest to their respective input field in relation to other field labels and other input fields. And this is an example here where you've got a radio button that is actually closer to the "no" label when it's actually the "yes" radio button, and the "no" radio button doesn't meet color contrast requirements. So that's one that's very easy to get mixed up. This is another example where there is differentiation through borders to show where the fields are, but that's not really visible.
It certainly doesn't meet color contrast requirements. And this is an example here where you've got the field labels much closer to their respective input fields. Then you need to test mobile assistive technology and feature support. So basically it's all about how actionable items and content can be accessed and activated by the assistive technologies we talked about before. I just wanna give some examples of VoiceOver and TalkBack. So when you're testing these, you have to make sure that all actionable items and content can be accessed and activated by VoiceOver on iOS. And this is an example of a failure where you've got your social media links along the top, and they're not read as "Facebook, Twitter, YouTube, Pinterest"; they're read as "link, link, link, link". That is a failure to watch out for. This is another example, which is a little bit more complex, where you have a crossword, and if you activate the clue or you press enter on the clue, you don't get taken to that clue position on the crossword.
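The "link, link, link" failure usually means the icon links have no accessible name, neither link text nor an `aria-label`. Here is a tiny audit sketch using plain objects as stand-ins for DOM nodes; the shapes and names are hypothetical, and a real check would read `textContent` and `getAttribute("aria-label")`:

```javascript
// Very rough stand-in for accessible name computation: prefer the
// aria-label, fall back to the link's text content.
function accessibleName(link) {
  return (link.ariaLabel || link.text || "").trim();
}

// A link with an empty accessible name is announced as just "link"
// by VoiceOver or TalkBack, which is the failure described above.
function missingName(link) {
  return accessibleName(link) === "";
}
```

The real accessible name algorithm (the ARIA accessible name and description computation) has many more steps, including `aria-labelledby` and alt text on child images, but this captures the check you'd actually make: does each icon link announce something meaningful?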
With the crossword, it just swaps in a down clue instead of an across clue. So that's something to watch out for as well. So what's next? Updates from WCAG2.2, if it shall ever be released. So there have been some changes here around WCAG2.2. So at level A there's consistent help, so providing consistent help mechanisms, and redundant entry, that information previously entered is auto-populated. For double A, there's focus appearance, ensuring the keyboard focus indicator is visible to all; focus not obscured, that the current keyboard focus is not hidden; dragging movements, that dragging has an accessible alternative; and target size minimum, that targets have an adequate size. Sounds familiar, doesn't it? That's in double A now. And accessible authentication, providing easy methods of authentication. And there are two additional triple A requirements, which are focus not obscured (enhanced) and accessible authentication (enhanced). So we do want to incorporate those when 2.2 is released. There are also some additional assistive technologies and mobile features that we want to include, including Voice Control, increasing text size, font color from reader view, and testing with a mouse.
We also want to review existing test cases. For example, the text-to-speech traps were only in the native app methodology; however, they can also occur on the mobile site, so we need to look into that. And we also wanna remove some of the assistive technologies and mobile features, because we find a lot of them generate the same results. So grayscale in iOS and Android, color inversion in Android, and invert colors in iOS always fail together or pass together, so they only need to be tested once. Reduce motion on iOS and remove animations in Android always fail together or pass together as well. And classic invert on iOS doesn't provide any additional testing information beyond smart invert. And then we wanna create an online resource. These are currently Word documents, but of course it would be better for it to be a website, which would be searchable, easily updated, more likely to be used, and more accessible. So firstly, I'd like to say please get involved. Please email us at [email protected] if you would like to be involved.
We desperately need volunteers. Don't feel that you don't know enough. If you are here and you've been to this presentation and you are interested, then you know enough. We learned so much by doing this investigation, doing this research. So if this is something that you are interested in, then please do get involved. I just wanna say thank you for coming. I needed to shorten this presentation to give people a little bit more time for questions, of course. And we'd also like you to provide your feedback: if you go to mid.camp/6894, that would be great. You can access this presentation and all links at pz.tt/mobile23. And we have five minutes for questions.
SPEAKER:
All right. Thank you. Anybody in the room here have a question (INAUDIBLE)? Looks like we've got no questions.
GIAN WILD:
There you go. Any from the online crowd?
SPEAKER:
None there either.
GIAN WILD:
Excellent. Well, that's good. I'm glad I took 55 minutes. (LAUGHS) So I'll just finish then with sharing some of our resources, if I can find where Zoom went, because we have a lot of free resources. So let me just share my screen again. So we do basically everything to do with accessibility: application audits, mobile testing, accessible Word, accessibility during the web development life cycle, et cetera, et cetera. And we have a number of products. The CCC Videos are specifically aimed at developers, and the Factsheets are also aimed at developers, and they are free. So if you go to... let me start a browser. If you go to resources, we've got these developer CCC Videos, which are on the topics of HTML, Forms, and ARIA. They're all about five minutes each. So if you wanna learn more, that's definitely one place to go. The other place to go is these Factsheets. So we have Factsheets on all these different topics: images, PDF, video, interactive maps, HTML5, content, JavaScript, tables, coding, keyboards, source order, forms, and mobile.
And so the JavaScript one is specifically detailed, and so you've got a section on principles and, for all of them, impact on users. And then what we have is what we call a developer checklist. And so the requirement that visually dynamic information such as a progress meter should have a text equivalent actually has an associated appendix, which gives you actual code that you can use, just copy and paste. It's all under Creative Commons, and it gives you the actual live content. So there's a heap of information there as well. So if you are looking for the mobile information, just go to resources, mobile testing. So thank you very much. Please get in touch if you have any questions, and we'd love for you to be involved. (APPLAUSE)
SPEAKER:
OK. Thank you.
GIAN WILD:
Thank you. See you later.
SPEAKER:
Bye-bye.