We dislike voice command searches because they’re awful for multitasking. Heavy multitasking isn’t efficient to begin with, and even when we pair one simple task with one complex one, a voice command becomes a task that demands full attention. The technology is not the problem: Google’s word error rates were down to 8% as of April 2015.
Talking to our phones is tremendous for quick, easy Q&A, but typing or writing lets us document thoughts in pieces. Today we use voice commands as one-offs. They will find broader use as machine learning and better UI make the Siris and OK Googles of our world better at stringing questions together into complex solutions.
You can type while talking on the phone or carrying on a conversation, but good luck issuing voice instructions while speaking to a work colleague at the same time:
- You: “OK Google, what is the change in ridership on the NYC A line subway when temperatures are over 95 degrees?”
- You: “Ah, Ms. Bishop, sorry for the delay, I had the question on the other line, so yea, looks like we’ll do better product with giveaways for the very hot days. HUH, same super hot temps in NYC near A line but for handouts to pedestrians? Um, just a sec.”
- You: “OK Google, same question, but what if it’s pedestrians, not subway riders?”
- Google: “Sorry, what do you mean by ‘same question’?”
- Ms. Bishop: “Are you asking a robot to answer these questions? Just Google them on your laptop, get multiple results, and click on the best suited answer? Then you can multitask with me and a search engine.”
Apple recently took a big leap in its use of multi-touch gestures on its new Macs. These will no doubt reduce reliance on keyboard typing as well. Apple also made haptic feedback, a useful aid particularly on cell phones, a bigger part of its new Macs. One can imagine more interactive screens aiding communication by answering questions that depend not on keyword search but on reacting to sound prompts and visual areas of a screen.
Can you imagine a world where we present images or video to machine learning software and get answers to questions after analysis on par with a 6-year-old child’s? Don’t count on it any time soon. Recently Google identified an African American couple as “Gorillas” in an uploaded image, prompting jaw-dropping reactions usually reserved for the KKK’s activities. Using image recognition for tasks above the level of a 6-month-old infant still produces results with uniquely damaging repercussions.
Alternative Search Technologies and Accessible W3C Standards
20.6 million American adults aged 18 and older experience vision loss, according to a 2012 National Health survey. About 1 million adults with vision loss, defined as trouble seeing even when using contacts or glasses, use computers. What happened to the other 19 million?
Let’s hope that as companies code websites for different platforms and devices, they also make them more compatible with the devices the sight-impaired use, such as screen readers and braille embossers. Can Apple tweak its multi-touch technology so those with disabilities can more easily get answers to questions? Gestures are also easier for those with cognitive and physical disabilities, for example. And if devices return high-contrast results to users in high-glare situations, might it be easy to transfer that software indoors for the visually impaired?
The visually impaired do not have the luxury of choosing audio output from content on the Web. They rely on it. Fortunately, large websites and well-funded search engines are attuned to the importance of web accessibility standards. Google organizes its SERP so screen readers used by the visually impaired can efficiently and fully present results to users, who need answers to their queries delivered quickly and completely. Google also follows strict guidelines in the code that displays ads vs. organic results. For example, “Search result lists: Both the search results and ads are in ordered lists so you can get to them quickly with keyboard commands.”
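The pattern Google describes can be sketched in a few lines: render each group of results as an ordered list with a label that assistive technology can announce. This is a minimal illustration, not Google’s actual code; the `SearchResult` shape and `renderResults` helper are hypothetical names for this example.

```typescript
// Hypothetical shape for a single result; not a real Google API.
interface SearchResult {
  title: string;
  url: string;
  isAd: boolean;
}

// Render ads and organic results as separate ordered lists (<ol>).
// The aria-label lets a screen reader announce which list the user
// has jumped into, and list semantics let keyboard commands skip
// between items and between the two lists quickly.
function renderResults(results: SearchResult[]): string {
  const group = (items: SearchResult[], label: string): string =>
    `<ol aria-label="${label}">` +
    items.map(r => `<li><a href="${r.url}">${r.title}</a></li>`).join("") +
    `</ol>`;
  const ads = results.filter(r => r.isAd);
  const organic = results.filter(r => !r.isAd);
  return group(ads, "Ads") + group(organic, "Search results");
}
```

Keeping ads and organic results in separate, labeled lists is what makes the distinction perceivable to a screen reader user, who cannot rely on visual styling to tell them apart.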
Imagine a world where the Internet helps those with disabilities rather than creating extra challenges for them. You may not have to imagine: one day you may need live-in care, and your web browsing habits may require a machine reader’s assistance.
The awareness you’re building for your organization and the educational value your website provides are lost to this group if you don’t take time to understand accessibility. Head over to the World Wide Web Consortium’s (W3C) website and read about the Essential Components of Web Accessibility.
Do away with the gobbledygook millions of disabled people find when they visit your website. You’ll also notice an SEO benefit, since these standards push you toward other guidelines Google wants to see.