Q Actions Auto Mode

What if cars could understand ALL voice commands?


The following transcript was taken from a casual conversation with my son.

Son: Dad, what are you working on?

Me: It’s a new feature in our product called “Auto Mode”.  We just released it in version 2.1 of our Q Actions App for Android.  We even made a video of it.  We can watch it after dinner if you’re interested.

Son: The feature sounds cool.  What’s it look like?

Me: Well, here.  We have this special setting that switches our software to look like the screen in a car. See how the screen is wider than it is tall? Yeah, that’s because most car screens are like that too.

Son: Wait. How do you get your software into cars? Can’t you just stick the tablet on the dashboard?

Me: Hmm, not quite.  We develop the software so that car makers can combine it with their own software inside the car’s console.  We’ll even make it look like they developed it by using their own colors and buttons. I’m showing you how this works on a tablet because it’s easier to demonstrate to other people – we just tell them to pretend it’s the car console.  Until cars put our software into their consoles, we’ll make it easy for users to use “Auto Mode” directly on their phones. Just mount the phone on the car’s dash and say “turn on auto mode” – done!

Son:  So how do you use it?  And what does that blue button with a microphone in it do?

Me:  Well, we want anyone in the car to be able to say a command like “navigate to Great America” or “what’s the weather like in San Jose?” or “who are Twenty One Pilots?”.  The button is simply a way to tell the car to listen. When we hear a command, our software figures out what to do and what to show on the console in the car. Sometimes it even speaks back the answer.  Now we don’t always want people to have to press the button on the screen, so we’ll work with the car makers to add a button on the steering wheel or even a microphone that is always listening for a special phrase such as “Ok, Q” to start.

Son: How does it do that?  I mean, the command part.

Me: Good question.  Since you’re smart and know a little about software, I’ll keep it short.  Our software takes a command and tries to figure out what app or service can best provide the answer.  For example, if the command is about showing the route to say, an amusement park like Great America, we’ll ask Google Maps to handle it, which it does really well. Lots of cars come installed with mapping software like Google Maps so it’s best to let them handle those. For other types of commands that ask for information, like “what’s the weather like in San Jose” or  “who are Twenty One Pilots”, we’ll send it off to servers in the cloud. They then send us back answers and we format it and display it on the screen – in a pretty looking card like this one.

Me: Sometimes, apps running on our phones can best answer these commands and we use them to handle it.

Son: Wait. Phones?  How are phones involved? I only see you using a tablet.

Me:  Ahhh.  You’ve discovered our coolest feature.  We use apps already installed on your phone.  Do you see those rectangle-looking things in the upper right corner of the tablet? The ones with the pictures and names of people? Well, those are phone profiles.  They appear when a person connects their phone, running our Q Actions app, to the car’s console through Bluetooth, sort of like you do with wireless earbuds. When connected, our software in the console sends the phone your commands and the phone in turn attempts to execute the command using one of the installed apps.  Let me explain with an example. Let’s pretend you track your daily homework assignments using the Google Tasks app on your phone. Now you hop into the car and your phone automatically pairs with the console. Now suppose I ask you to show me your homework assignments. You then press the mic button and say “show my homework tasks”.  The software in the console would intelligently route the command to your phone (because Google Tasks is not on the console), open Google Tasks on your phone, grab all your homework assignments and send them back to the console to be displayed in a nice card. Oh, and it would also speak back your homework assignments as well. Let’s see what happens when I tell it to view my tasks.

Son:  Big deal.  I can just pick up my phone and do that.  Why do I need to use voice for that?

Me: Because if you’re the driver, you don’t want to be fumbling around with your phone, possibly getting into an accident! Remember, this is supposed to help drivers with safe, “hands-free” operation. You put your phone in a safe place and our software figures out how to use it to get the answers.

Son: Why can’t the car makers put all these apps in the console so you don’t have to use your phone?

Me: Great question.  Most people carry their phones on them at all times, especially when they drive.  And these phones have all their favorite apps with all their important personal information stored in them.  There’s no way the car makers could figure out which apps to include when you buy the car. And even if you could download these apps onto the console, all your personal information that’s on your phone would have to be transferred over to the console, app by app.  Clumsy if you ask me. I prefer to keep my information on my phone and private, thank you very much!

Son: Oh. Now I get it.  So what else does the software do?

Me: The console can call a family member.  If you say “call Dad”, the software looks for ‘dad’ in your phone’s address book and dials the number associated with it.  But wait. You’re probably thinking “What’s so special about that? All the cool cars do it”. Well, we know that a bunch of apps can make phone calls so we show you which ones and let you decide.  Also, if you have two numbers for ‘dad’, say a home and mobile number, the software will ask you to choose one to call. Let’s see how this works when I say “call Dad”.

Me: It asks you to pick an app.  I say ‘phone’ and then it asks me to pick a number since my dad has both a home and mobile number.  I say ‘mobile’ and it dials the number through my phone.

Son: Cool. But what if I have two people with the same name, like Julie?

Me: It will ask you to pick a ‘Julie’ when it finds more than one.  And it will remember that choice next time you ask it to call Julie.  See what happens when I want to call Jason. It shows me all the people in my address book who are named Jason along with their phone numbers.  If a person has more than one number it will say ‘Multiple’.

Son: Wow.  What else?

Me: How about sending a message on WhatsApp? Or setting up a team meeting in the calendar. Or joining a meeting from the car if you are running late. Or even checking which of your friends have birthdays today.  All these actions are performed on your phone using the apps you are familiar with and use.

Son: Which app shows you your friends’ birthdays? That’s kind of neat.

Me: Facebook

Son: I don’t use Facebook. I use Instagram. It’s way better.  Plus all the cool kids use it now.

Me:

Me: You get the picture though, right?

Son: Sure.

Son: So what if all of my friends are in the car with you and we connect to the console?  How does the software know where to send the command?

Me: We use the person’s voice to identify who they are and route the command to the right person’s phone automatically.

Son: Really? That seems way too hard.

Me: Not really.  Although we haven’t implemented it yet, the technology exists to do this sort of thing today.

Son: Going back to main screen, why does the list of actions under ‘Recent’ and ‘Favorites’ change when you change people?

Me: Oh, you noticed that!   Whenever the software switches to a new profile, we grab the ‘Recent’ and ‘Favorites’ sections from that person’s phone and display it in the tablet, er, console.  This is our way of making the experience more personalized or familiar to the way the app appears on your phone. In fact, the ‘Favorites’ are like handy shortcuts for frequently used actions, like “call Mom”.  

Me: One more thing.  Remember the other buttons on the home screen? One looked like a music note, the other a picture for messaging and so on.  Well, when you press those, a series of icons appear across the screen, each showing an action that belongs to that group.  If your phone had Spotify installed, we would show you a few Spotify actions. If Pandora was installed, we would show you Pandora actions and so on.   Check out what happens when I activate my profile. Notice how Pandora appears? That’s because Pandora is on my phone and not on the tablet like Google Play Music and YouTube Music.

  

Me: Same is true for messaging and calling.   Actions from apps installed on your phone would appear.  You would simply tap on the icon to run the action.   In fact, if you look carefully, you’ll notice that all the actions that show up on the console are also in the ‘My Actions’ screen in the Q Actions app on your Android Phone.   Check out what’s on the tablet vs. my phone.


Son: Yep.

Me: Oh and before I forget, there’s one last item I’d like to tell you about.

Son: What’s that?

Me: Notifications.  If you send me a message on WhatsApp, Messenger or WeChat, a screen will pop up letting me know I have a message from you.  I can listen to the message by pressing a button or respond to the message – by voice, of course, all while keeping my focus on the road.  You’ll get the response just as if I had sent it while holding the phone.

Son:  Cool. I’ll have fun sending you messages on your way home from work.

Me: Grrrrrr.

Son: Hey, can I try this out on my phone?

Me: Sure.  Just download our latest app from the Google Play Store.  After you get it installed, go to the Preferences section under Settings and check the box that says ‘Auto Mode’ (BETA).  You’ll automatically be switched into Auto Mode on your phone. Now this becomes your console in the car.

Of course, things appear a bit smaller on your phone than what I’ve shown you on the tablet.  Oh, and since you’re not connected to another phone, all the commands you give it will be performed by apps on your phone.  Try it out and let me know what you think.

Son:  Ok. I’ll play around with it this week.

Me: Great.  Now let’s go see what your mom’s made us for dinner.

 

Q Actions 2.0

Do more with Voice! Q Actions 2.0 now available on Google Play


Do more with Voice

Q Actions 2.0 is here. With this release, we wanted to focus on empowering users throughout their day. As voice is playing a more prevalent part in our everyday lives, we’re uncovering more use cases where Q Actions can be of help. In Q Actions 2.0, you’ll find new features and enhancements that are more conversational and useful.

Directed Dialogue™

Aiqudo believes the interaction with a voice assistant should be casual, intuitive, and conversational. Q Actions understands naturally spoken commands and is aware of the apps installed on your phone, so it will only return personalized actions that are relevant to you. When a bit more information is required from you to complete a task, Q Actions will guide the conversation until it fully understands what you want to do. Casually chat with Q Actions and get things done.

Sample commands:

  • “create new event” (Google Calendar)
  • “message Mario” (WhatsApp, Messenger, SMS)
  • “watch a movie/tv show” (Netflix, Hulu)
  • “play some music” (Spotify, Pandora, Google Play Music, Deezer)

Q Cards™

In addition to providing relevant app actions from personal apps that are installed on your phone, Q Actions will now display rich information through Q Cards™. Get up-to-date information from cloud services on many topics: flight status, stock pricing, restaurant info, and more. In addition to presenting the information in a simple and easy-to-read card, Q Cards™ support Talkback and will read aloud relevant information.

Sample commands:

  • “What’s the flight status of United 875?”
  • “What’s the current price of AAPL?”
  • “Find Japanese food”

Voice Talkback™

There are times when you need information but do not have the luxury of looking at a screen. Voice Talkback™ is a feature that reads aloud the critical snippets of information from an action. This enables you to continue to be productive, without the distraction of looking at a screen. Execute your actions safely and hands-free.

Sample commands:

  • “What’s the stock price of Tesla?” (E*Trade)
    • Q: “Tesla is currently trading at $274.96”
  • “Whose birthday is it today?” (Facebook)
    • Q: “Nelson Wynn and J Boss are celebrating birthdays today”
  • “Where is the nearest gas station?”
    • Q: “Nearest gas at Shell on 2029 S Bascom Ave and 370 E Campbell Ave, 0.2 miles away, for $4.35”

Compound Commands

An enhancement to our existing curated Action Recipes: users can now create Action Recipes on the fly using Compound Commands. Simply join two of your favorite actions into a single command using “and”. This gives users the ability to create millions of Action Recipe combinations from our database of 4000+ actions.

Sample commands:

  • “Play Migos on Spotify and set volume to max”
  • “Play NPR and navigate to work”
  • “Tell Monica I’m boarding the plane now and view my boarding pass”
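The core idea behind a Compound Command can be sketched in a few lines: detect the “and” joiner and hand each half off as its own action. This is a deliberately naive illustration, not our production parser, which has to cope with “and” appearing inside message content, as in the boarding-pass example above.

```python
# Naive sketch of compound-command splitting (hypothetical, illustrative only).
# The real matcher is far more robust; this just shows the "action1 and action2" idea.

def split_compound(command: str) -> list[str]:
    """Split a command on ' and ' into candidate sub-commands."""
    parts = [p.strip() for p in command.split(" and ")]
    return parts if len(parts) > 1 else [command]

print(split_compound("Play NPR and navigate to work"))
# ['Play NPR', 'navigate to work']
```

Each resulting sub-command can then be matched and executed independently, in order.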

Simply do more with voice! Q Actions is now available on Google Play.

Q Actions - Voice Talkback

Q Actions – Voice feedback from apps using Talkback™


Wonder why you can’t talk to your apps, and why your apps can’t talk back to you?  Stop wondering, as Talkback™ in Q Actions does exactly that. Ask “show my tasks” and the system executes the right action (Google Tasks) and, better yet, tells you what your tasks are – safely and hands-free, as you drive your car.

Driving to work and stuck in traffic?  Ask “whose birthday is it today?” and hear the short list of your friends celebrating their birthdays (Facebook). You can then say  “tell michael happy birthday”  to wish Mike (WhatsApp or Messenger). And if you are running low on gas, just say “find me a gas station nearby” and Talkback™ will tell you where the nearest gas station is and how much you’ll pay for a gallon of unleaded fuel.

Say it. Do it. Hear it spoken back!

Voice Enable System Settings with Q Actions 1.3.3!


Somewhere in the Android Settings lies the option for you to turn on Bluetooth, turn off Wifi, and change sound preferences. These options are usually buried deep under menus and sub-menus. Discoverability is an issue, and navigating to the options usually means multiple taps within the Settings app. Yes, there’s a search bar within the Settings app, but it’s clunky, requires typing and only returns exact matches. Some of these options are accessible through the quick settings bar, but discovery and navigation issues still exist.

In the latest release, simply tell Q Actions what System Settings you want to change. Q Actions can now control your Bluetooth, Wifi, music session, and sound settings through voice.

Configure your Settings:

  • “turn on/off bluetooth”
  • “turn wifi on/off”

Control your music:

  • “play next song”
  • “pause music”
  • “resume my music”

Toggle your sound settings:

  • “enable do not disturb”
  • “mute ringer”
  • “increase the volume”
  • “put my phone on vibrate”
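Under the hood, settings commands like these boil down to mapping a recognized phrase onto a device control. Here is a minimal sketch of that routing step, with stub handlers standing in for the platform calls (on Android, the real handlers would go through system APIs such as the Bluetooth and Wi-Fi managers):

```python
# Hypothetical sketch: route a normalized settings command to a device control.
# Handlers are stubs that record the requested state instead of touching hardware.

device_state = {}

def set_bluetooth(on): device_state["bluetooth"] = on
def set_wifi(on): device_state["wifi"] = on

SETTINGS_COMMANDS = {
    "turn on bluetooth":  lambda: set_bluetooth(True),
    "turn off bluetooth": lambda: set_bluetooth(False),
    "turn wifi on":       lambda: set_wifi(True),
    "turn wifi off":      lambda: set_wifi(False),
}

def handle(command: str) -> bool:
    """Look up the command and run its handler; return False if unknown."""
    action = SETTINGS_COMMANDS.get(command.lower().strip())
    if action:
        action()
        return True
    return False

handle("turn on bluetooth")   # device_state -> {"bluetooth": True}
```

In practice the lookup is not an exact-match table; the command matcher normalizes the many natural phrasings ("enable bluetooth", "bluetooth on") into the same action first.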

In addition to placing calls to your Contacts, Q Actions helps you manage Contacts via voice. Easily add a recent caller as a contact in your phonebook or share a friend’s contact info with simple commands. If you have your contact’s address in your Contacts, you can also get directions to the address using your favorite navigation app.

Place calls to Contacts:

  • “call Jason Chen”
  • “dial Mario on speaker”

Manage and share your Contacts:

  • “save recent number as Mark Johnson”
  • “edit Helen’s contact information”
  • “share contact info of Daniel Phan”
  • “view last incoming call”

Bridge the gap between your Contacts and navigation apps:

  • “take me to Rob’s apartment”
  • “how do I get to Mike’s house?”

Unlock your phone’s potential with voice! Q Actions is now available on Google Play.

Ever-growing index of App Actions

The largest mobile App Action index in the world!


You often hear the phrase “Going from 0 to 1” when it comes to the accomplishment of reaching a first milestone – an initial product release, the first user, the first partner, the first sale.   Here at Aiqudo, I believe our “0 to 1” moment occurred at the end of the summer in 2017 when we reached our aspirational goal of on-boarding a total of 1000 Actions. It was a special milestone for us as we had built an impressive library of actions across a broad category of apps, using simple software tools, in a relatively short time, with only a handful of devs and interns.  For comparison, we were only 5 months in operation and already had one tenth the number of actions as that “premier bookseller in the cloud” company. These were not actions for games and trivia – these were high utility actions in mobile apps that were not available in other voice platforms. On top of that, we did it all without a single app developer’s help – no APIs required. That’s right, no outside help!

So how were we able to accomplish this? Quite simply, we took the information we knew about Android and Android apps and built a set of tools and techniques that allowed us to reach specific app states or execute app functions.  Our initial approach provided simple record and replay mechanics allowing us to reach virtually any app state that could be reached by the user. Consequently, actions such as showing a boarding pass for an upcoming flight, locating nearby friends through social media or sending a message could be built, tested, and deployed in a matter of minutes with absolutely no programming involved!   But we haven’t stopped there. We also incorporate app-specific and system-level intents whenever possible, providing even more flexibility to the action on-boarding process and our growing library of actions including those that control Alarms, Calendar, Contacts, Email, Camera, Messaging and Phone to name a few. With the recent addition of system level actions, we now offer a catalog of very useful actions for controlling various mobile device settings such as audio controls, display orientation and brightness, wifi, bluetooth,  flash and speaker volume.
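To make the record-and-replay idea concrete, here is a toy sketch: an on-boarded action is essentially a named sequence of UI steps captured once and replayed on demand. This is illustrative only; real execution drives the app’s actual UI or fires an intent, and the app and step names below are made up.

```python
# Hypothetical sketch of "record and replay": an action is a named sequence
# of recorded UI steps. Replay here just logs each step instead of driving a UI.

from dataclasses import dataclass, field

@dataclass
class Action:
    name: str
    app: str
    steps: list = field(default_factory=list)   # recorded UI steps

    def replay(self):
        log = []
        for step in self.steps:
            log.append(f"{self.app}: {step}")   # stand-in for a real UI event
        return log

boarding_pass = Action(
    name="show boarding pass",
    app="United",
    steps=["open app", "tap 'My Trips'", "tap 'Boarding pass'"],
)
print(boarding_pass.replay())
```

Because the recording captures app states rather than code, a new action can be built and tested in minutes with no programming, which is exactly what made the rapid on-boarding described above possible.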

Our actions on-boarding process and global actions library solve the action discovery problem that we described in an earlier post. We do the heavy lifting, so all you need to say is “show my actions”, or “show my actions for Facebook” and get going! And you don’t need to register your credentials to invoke your personal actions.

Today our action library is ~4000 strong and supports 7 languages across 12 locales.  Not bad for a company less than a year and a half old! We haven’t fully opened up the spigot either! 

Of course, all of this would not be possible without the hard work of the Aiqudo on-boarding team whose job, among other things, is to create and maintain Actions for our reference Q Actions app as well as our partner integrations.   The team continues to add new and interesting actions to the Aiqudo Action library and optimize and re-onboard actions as needed to maintain a high quality of service.

Check back with us for a follow-on post where we’ll discuss how our team maintains actions through automated testing.

Automatically personalized action recipes.

Automate your day with Action Recipes in Q Actions 1.3.2!


Finding yourself routinely using the same set of actions as you commute to work or prepare for your workout session? Action Recipes string together your favorite actions to help you get through the day. With Action Recipes, you can run multiple actions from the apps you use with simple voice commands.

Hands on the steering wheel as you start your daily commute to work?

“start my morning commute”

  • Start streaming NPR as Google Maps navigates you through the best route to work

Earbuds on, phone stowed, as you get ready for your routine run around the neighborhood or nearby trail?

“start my workout”

  • Play your favorite tracks on Spotify, Pandora, or Google Play Music as MapMyRun, Mi Fit or Google Fit logs your workout session

Hands tied as you gather your carry-ons and prepare to board the plane?

“ready to board”

  • Send someone a quick message through SMS, WhatsApp, or WeChat as United, American Airlines, Alaska, or Delta brings up your boarding pass

These Action Recipes are already created for your convenience. Just grab the latest version of Q Actions from the Google Play Store and swipe left until you reach the My Action Recipes page to preview your supported Action Recipes. More interesting recipes will just start surfacing here as they come online.

Action Recipes are automatically personalized for you – the right actions are executed based on the apps you use for these tasks. We are working on further controls and customizability.
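One way to picture that personalization: a recipe is an ordered list of action slots, and each slot is resolved to whichever supporting app the user actually has installed. The recipe structure and app lists below are hypothetical, purely to illustrate the resolution step:

```python
# Hypothetical sketch: resolve an Action Recipe's slots against installed apps.

RECIPES = {
    "start my morning commute": [
        ("stream news",      ["NPR One", "TuneIn"]),
        ("navigate to work", ["Google Maps", "Waze"]),
    ],
}

def resolve(recipe_name, installed_apps):
    """Return (slot, app) pairs, picking the first supported app the user has."""
    plan = []
    for slot, candidates in RECIPES[recipe_name]:
        app = next((a for a in candidates if a in installed_apps), None)
        if app:
            plan.append((slot, app))
    return plan

print(resolve("start my morning commute", {"NPR One", "Waze"}))
# [('stream news', 'NPR One'), ('navigate to work', 'Waze')]
```

Two users saying the same recipe command can thus end up with different execution plans, each built from their own apps.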


We would love to hear your feedback on these Action Recipes. Please let us know what you think!

Q Actions power Moto Voice

Q Actions Platform now powers App Actions in Moto Voice. #HelloMoto


Our first official day at Aiqudo was in April, 2017.  One year later, we are excited to announce that our Q Actions platform is now live and powering app actions in Moto Voice. The experience is being rolled out, as we speak, to millions of Motorola phone users in 7 languages in 12 markets, with more to come. Watch the coverage of the always-on voice capabilities during Motorola’s recent launch event.

Most of the app actions we power are not currently available in other digital assistant platforms – actions in apps like Facebook, WhatsApp, WeChat, Netflix, Spotify, Hulu, and Waze, to name a few.  And we just got started …

On supported Motorola phones, you just say “Hello Moto” and issue simple commands – hands free.

Our solution provides high utility to users. You can get things done instantly within your favo(u)rite apps, privately and without having to register credentials. Check out the Voice-to-Action™ experience in the video below:

We’ve addressed several hard technical problems, including:

  • Command matching for simple, intuitive commands in multiple languages: You speak naturally – no need to learn a specific syntax. A single command can provide matching actions from multiple apps, providing user choice.
  • Action execution of personal app actions:  We execute actions in your favo(u)rite apps, including your private actions, without requiring registration or login credentials. We use several techniques for action execution, and can even execute tasks consisting of multiple actions in different apps.
  • Action on-boarding operations: We support actions in multiple versions of apps simultaneously – in multiple locales. Our on-boarding process takes minutes and does not mandate APIs, coding or developer engagement, enabling rapid scale. Our flexible Machine Learning systems are trained incrementally with simple exemplary commands.
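The first problem above, command matching, can be illustrated with a deliberately simple baseline: score a spoken command against each action’s exemplary commands by token overlap and pick the best. The production system uses trained, multilingual ML models; this sketch (with made-up exemplars) only conveys the shape of the idea.

```python
# Naive command-matching sketch (hypothetical): Jaccard token overlap between
# the spoken command and each action's exemplary commands.

def tokens(text):
    return set(text.lower().split())

EXEMPLARS = {
    "play music (Spotify)":   ["play some music", "play a song"],
    "navigate (Google Maps)": ["navigate to work", "take me home"],
}

def best_action(command):
    cmd = tokens(command)
    def score(action):
        return max(len(cmd & tokens(e)) / len(cmd | tokens(e))
                   for e in EXEMPLARS[action])
    return max(EXEMPLARS, key=score)

print(best_action("play music"))   # -> "play music (Spotify)"
```

Training with "simple exemplary commands", as described above, amounts to adding phrases to an action’s exemplar set; the model generalizes from there.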

We will be writing more about our contributions in these areas over the next few weeks.

For the most powerful, fully hands free experience, get a new phone with always on Moto Voice, and say “Hello Moto”!

Or, for other Android phones, you can download the Q Actions app from the Play Store.

Why Apps?


With all the hype around chatbots, skills, and other forms of custom voice UX, we’re often asked why we chose mobile apps as the first target domain for Q Actions – our voice AI platform.

The short answer is: apps are where the utility is – consumers spent a trillion hours using mobile apps last year. With voice, all those familiar apps are even easier to use.

We believe there is a critical gap in the voice assistant marketplace. The ideal assistant MUST:

  • Be ubiquitous – not just available in the kitchen or living room
  • Provide high utility – help us do useful things we do every day
  • Work intuitively – let users speak naturally, without the need to learn new syntax
  • Offer user choice – across platforms, applications and devices
  • Be private and secure – on device where possible

Mobile apps remain the best way to achieve these goals. Your phone is always with you, and mobile apps provide high utility for you wherever you are.

VentureBeat ran a survey last year asking 1000 people “Which of these (app, mobile website, or chatbot) would you prefer to use in order to engage with a brand?” There was a clear winner: Apps! It’s particularly interesting because these 1000 respondents were self-described “chatbot users”.

Users prefer to use Apps

Are we at “Peak App” and does it even matter?

We often hear the concept of “Peak App”, which describes a general state of app fatigue. In this narrative, people already have all the apps they need, so they no longer download new apps. And for developers, this peak means creating new apps is no longer exciting, and breaking through as a new app is increasingly rare, so maybe develop a skill or a chatbot for one of the closed platforms and see how that goes (aka starting over with your customers).  

Global app download rates defy the idea of “Peak App”. We’ve seen 60% growth over the past three years, and this trend continues in 2018, with app downloads (and revenue) breaking records yet again in Q1.

App downloads continue to grow

People continue to spend more money in the app economy. Both iOS and Google Play saw 20% year-over-year growth in worldwide consumer spend in Q4 2017. The total app spend in 2017 was $17 billion.

App spend continues to grow

As noted by Mary Meeker in her 2017 Internet Trends report, internet usage (engagement) continues to grow (+4% year over year), with US mobile usage now over 3 hours per day per user, versus less than 1 hour five years ago.

Mobile continues to dominate time spent

 

Peak is a moot point anyway, because…

People use 30 to 40 apps, and still have another 50+ apps installed on their phone.

Many apps are usable, but out of sight.

We want to bring easy-to-use voice AI to the apps people use, while also helping them make use of the apps that are installed but not used. Out of sight is out of mind, but if you could just ask, and the right action in the right app were executed, you would be more likely to use those installed apps. Further, if you don’t have to know which app can execute your command, you can just say what you want and our Q Actions platform will:

  • Understand what you intend to do
  • Determine which apps can get it done
  • Execute the action using the most relevant app installed on your phone
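The three steps above can be sketched as a tiny pipeline: map the command to an intent, list the apps that can handle that intent, then execute with the best one the user actually has installed. The intent table and keyword check below are placeholders for the real understanding layer.

```python
# Hypothetical three-step pipeline: understand -> determine apps -> execute.

INTENT_HANDLERS = {
    "play_music": ["Spotify", "Pandora", "Google Play Music"],
}

def run(command, installed):
    # 1. Understand: crude keyword stand-in for real language understanding
    intent = "play_music" if "play" in command.lower() else None
    if intent is None:
        return None
    # 2. Determine which apps can get it done
    candidates = INTENT_HANDLERS.get(intent, [])
    # 3. Execute with the most relevant installed app (first match here)
    app = next((a for a in candidates if a in installed), None)
    return (intent, app)

print(run("Play some jazz", {"Pandora", "Waze"}))
# ('play_music', 'Pandora')
```

The user never names the app; the platform carries the burden of app discovery, which is the point of the section that follows.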

It’s easier, more natural, and … faster! It reduces time to action.

Unlocking Utility

This approach unlocks the high utility of mobile apps by putting the effort of app discovery on the voice AI platform, not the consumer.

ComScore’s “2017 U.S. Mobile App Report” illustrates that many people have apps they consider “Hidden Gems”. These are gems because they are helpful and offer high utility when needed, but are not in the top 25 most used apps. We help people make use of these gems by simply issuing natural voice commands.

Hidden gems: apps that are not head apps, but provide huge utility

Most of these “hidden gems”, along with millions more – photo apps, payment apps, airline apps, etc. are just not available in existing voice platforms. Alexa Skills offer limited utility compared to mobile apps already installed on your phone.

Critical Gaps In The Big Voice Platforms

The big voice platforms don’t currently support many of the most popular, helpful, and engaging mobile apps. Here’s a look at top mobile apps vs apps currently supported through an Alexa Skill.

Many popular apps are not available on Alexa or Google Assistant

Current voice platforms don’t support enough useful actions. Even those apps supported by Alexa, Google Assistant, Cortana, Siri, et al, often limit voice support to a small number of app functions. For example, with Alexa, I can order a Lyft, but I can’t schedule one, or look at my ride history. Voice should make using these familiar apps easier, not require you to remember what Lyft can do with Alexa.

Don’t Reinvent The Wheel

Current voice platforms require new, custom development, ongoing maintenance and support. Why would a developer reinvent the wheel just to offer voice support to their customers, expanding their maintenance and support requirements in the process?

Voice-enabling your existing app lets developers and brands start capturing customer voice search commands, a valuable asset that should be protected from competitors, some of which operate digital platforms eager to disintermediate brands from their customers.

Apps Are Useful, Personal, Private, and Secure

A compelling consumer voice experience is our goal, and apps are a great starting point. Further, because you already trust the apps you use, and we don’t require registration or any user credentials, we execute the right actions for you privately.  We can enable personal actions, like playing your personal playlists, viewing your photos, sending payments to friends, and messaging family – quickly and securely.

Through our Q Action Kit (developer SDK) or our Q Actions App, Aiqudo’s action intent AI connects voice computing to the mobile app ecosystem, helping you take action quickly and easily, wherever you go.