
Open or Walled?


Voice has the promise to be the next disruption, upending massive, established business models and putting search, commerce, messaging, and navigation up for grabs again. But a walled garden mentality could stifle that disruption.

Even over the Internet’s relatively short history, we see a pattern of behavior: some innovator creates a marketplace for consumers, organizing information (Yahoo and AOL in their first iterations), commerce (Amazon), or a place to keep in touch with our friends (Facebook), and creates huge consumer value by bringing us together and providing tools that make it easy to navigate, buy, message, and more. But as the audience grows, there is always a slide away from an open marketplace toward a walled garden, with the marketplace operators first becoming toll takers and then moving toward ever greater control (and monetization) of their users’ experience and, more recently, their data.

Mobile carriers in the US tried to erect walled gardens around their users in the 1.0 version of mobile content — the carriers thought they had captive users and captive vendors, and so created closed platforms that forced their subscribers to buy content from them. Predictably, monopoly providers offered narrow product offerings at high prices and squeezed their vendors so hard that there was no free cash flow for innovation. Mobile content stagnated, as the carriers failed to cultivate fertile ecosystems in which vendors could make money and in which consumers had a growing variety of new and interesting content. When the iPhone came along (thankfully Steve Jobs could wave his magic wand over the guys at AT&T), consumers could finally use their phones to get to the Internet for the content they wanted, and the carriers went back to being dumb pipes.

Will voice platforms become walled gardens?

If you want to enable your users to reach you through Alexa, you have to create a Skill. Then you have to train your users to invoke your Skill using a precise syntax. Likewise Google Assistant. For Siri, your business has to fit into one of the handful of domains that SiriKit recognizes. There’s a reason we refer to them as voice platforms — their owners are in control.

Initially, there are good QA reasons for this: making sure users get a good experience. But pretty quickly, the walls will become constraints on who can be included in the garden (will Amazon and Facebook play nice together?) and, ultimately, will determine the tax that must be paid in order to offer services in the garden. For us as users, this means less openness, fewer choices, and constraints on our ability to quickly and easily do what we want to do, which typically involves using services from all of the different platform providers (does Tencent really think that blocking Alipay inside WeChat will make users stop using Alipay?).

The carriers’ experience should be a cautionary tale: walled gardens, with their limited choices and monopolist pricing, are bad for consumers. The Internet is a place of unlimited choice, and the world of mobile apps is vast and diverse, again allowing for broad consumer choice. This is what we expect, and if our horizons are constrained by a platform’s policies, we’ll abandon it. The carriers fumbled Mobile Content 1.0; their walled gardens never met their promise to become massive businesses, and today they don’t even exist.

Voice interfaces should be our gateway to everything we want to do, whether it’s in Alexa, in our mobile apps, or in our connected cars or homes. So will voice platforms be these open gateways that make our lives easier, or will they be cramped walled gardens that try to make our choices for us, funneling us to a narrow selection of preferred vendors?


Aiqudo Q Actions enhances Google Assistant


Google Assistant is popular among Android users. It is also integrated with Google Home, Google’s smart assistant device for the home. However, Google Assistant supports only a limited set of Actions, and there are many things it does not do well for you, even on your phone. Instead of executing the right action in the relevant app, Assistant often just offers web search results:

  • “Show my boarding pass” (You want to pull up your boarding pass in your airline app when you are in the security line at the airport)
  • “I need a haircut” (You want to check in to your favorite salon)
  • “Who’s at the front door?” (You want to open up the video from your security device app)

We integrated the Q Actions Android app with Google Assistant. Now you can talk to Google Assistant and execute actions instantly in your favorite apps on your mobile device. You can do this with Google Assistant on your phone or on Google Home.

The first step is to open Q Actions with the command – “Talk to Q Actions”

Now you can talk to the Q Actions voice application in natural language, for example:

  • “Play Narcos” -> will open the Netflix app and start playing the TV show
  • “I’d like to board” -> will open the United Airlines app and take you right to your boarding pass
  • “Show hotels in Chicago” -> will open the HotelTonight app and show hotel deals in Chicago
  • “Show my photo albums” -> will let you choose between the Flickr and Google Photos apps (if you have both installed) and show your albums

It’s easy and convenient to use Q Actions with Google Assistant. Simply say what you want to do. You don’t need to learn a specific syntax; you can speak naturally. Further, Q personalizes actions based on the apps installed on your device. If two installed apps match your command, you will be given the option to choose between them, using voice. In the future, the system will learn your preferences and execute the best action for you.
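
To make that disambiguation idea concrete, here is a toy sketch (in Python) of how a command might be routed only to actions in apps you actually have installed, with a voice prompt when more than one app matches. It is purely illustrative; the app catalog and matching logic below are made up and are not how Q works internally.

```python
# Toy illustration only: Aiqudo's real matching is ML-based and far richer.
from dataclasses import dataclass

@dataclass
class Action:
    app: str              # app that can perform the action
    description: str      # what the action does
    keywords: frozenset   # words that signal this action

# Hypothetical action catalog; only apps installed on the device are considered.
CATALOG = [
    Action("Google Photos", "show photo albums", frozenset({"show", "photo", "albums"})),
    Action("Flickr", "show photo albums", frozenset({"show", "photo", "albums"})),
    Action("Netflix", "play a show", frozenset({"play", "watch"})),
]

def match_actions(command, installed_apps):
    """Return catalog actions from installed apps whose keywords appear in the command."""
    words = set(command.lower().split())
    return [a for a in CATALOG if a.app in installed_apps and a.keywords & words]

matches = match_actions("show my photo albums", {"Google Photos", "Flickr", "Netflix"})
if len(matches) > 1:
    # More than one installed app can handle the command: ask the user to choose by voice.
    print("Which app would you like to use?", [a.app for a in matches])
elif matches:
    print("Executing in", matches[0].app)
else:
    print("No matching action found")
```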

We would love your feedback on your experience using Q Actions with Google Assistant. We are in the process of adding more features, so stay tuned!


Aiqudo Q Actions enhances Amazon Alexa


Amazon Alexa is currently the #1 smart assistant in the world. It also powers the Echo and Echo Dot, Amazon’s smart assistant devices for the home. However, Alexa supports only a limited set of Skills, and because it is hard to remember the specific command syntax, users re-use only about 3% of Alexa’s skills. Discovering new skills is also difficult.

There are many actions that Alexa currently does not perform optimally for you:  

  • “Show my boarding pass” (You want to pull up your boarding pass in your airline app when you are in the security line at the airport)
  • “I need a haircut” (You want to check in to your favorite salon)
  • “How far can I drive?” (You want to see how many miles are on your Chevy Volt in the OnStar app)

We integrated the Q Actions Android app with Amazon Alexa. Now you can talk to Alexa and execute actions instantly in your favorite apps on your mobile device. You can do this with Alexa on your phone or on an Amazon Echo device.

The first step is to install the Q Actions skill and open Q Actions with the command – “Talk to Q Actions” (Link to Alexa Skill)

Now you can talk to the Q Actions voice application in natural language, for example:

  • “Play Narcos” -> will open the Netflix app and start playing the TV show
  • “I’d like to board” -> will open the United Airlines app and take you right to your boarding pass
  • “Show Tesla stock” -> will open the ETrade app and show you a real-time stock quote for Tesla
  • “Show my photo albums” -> will let you choose between the Flickr and Google Photos apps (if you have both installed) and show your albums
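
For developers curious how a custom skill can capture free-form commands like these, here is a rough sketch of an Alexa skill backend written as a plain AWS Lambda handler in Python. It is not the actual Q Actions skill; the "RunCommand" intent, the "command" slot, and the idea of forwarding the command to a backend are assumptions for illustration only.

```python
# Illustrative sketch of an Alexa skill backend (AWS Lambda handler).
# Not the real Q Actions skill; the "RunCommand" intent and "command" slot are hypothetical.

def lambda_handler(event, context):
    request = event["request"]

    if request["type"] == "LaunchRequest":
        # "Alexa, talk to Q Actions" lands here.
        return _speak("What would you like to do?", end_session=False)

    if request["type"] == "IntentRequest" and request["intent"]["name"] == "RunCommand":
        # A catch-all slot (e.g. AMAZON.SearchQuery) carries the user's free-form command.
        slots = request["intent"].get("slots", {})
        command = slots.get("command", {}).get("value", "")
        # A real backend would match this command to an action in an app on the
        # user's phone and trigger it there; here we just acknowledge it.
        return _speak("OK, running: " + command, end_session=True)

    return _speak("Sorry, I didn't catch that.", end_session=True)

def _speak(text, end_session):
    """Build a minimal Alexa response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }
```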

It’s easy and convenient to use Q Actions with Amazon Alexa. Simply say what you want to do. You don’t need to learn a specific syntax; you can speak naturally. Further, Q personalizes actions based on the apps installed on your device. If two installed apps match your command, you will be given the option to choose between them, using voice. In the future, the system will learn your preferences and execute the best action for you.

We would love your feedback on your experience using Q Actions with Amazon Alexa. We are in the process of adding more features, so stay tuned!

Voice will be our interface to everything

Let’s face it, technology has not always been very user-friendly. Sometimes that felt intentional, as if coders wanted to keep their club small and exclusive. But usually there’s a step-function innovation that totally changes how we interact with technology and, in so doing, disrupts the old paradigm. The mouse and the graphical user interface launched the PC (if you’re old enough, you remember when saying GUI sounded cool). Touch screens were the brilliant innovation that enabled the whole new world of smartphones we live in today.

Voice is the next disruption. Voice will change how we search, how we shop and manage our experiences with retailers, and how we create and consume media. The big guys are placing big bets in the space, and we’re starting to see the payoff on some of the components now: voice recognition accuracy is above 90%, which is good enough to be workable. With improvements in AI, we’ll gain contextual understanding, the ability to maintain state, and eventually conversational capabilities.

But today, voice doesn’t do very much. Alexa sets a mean timer, but if I want to order an Uber, I have to go to my Alexa app to sign in and register, and even then I only get limited capabilities. Why wouldn’t I just go to my Uber app? My Uber app already has Home, Work, and SFO in it, plus my payment info, and I can share my ETA with my contacts. And if I want to check Surfline, forget it – there’s no Skill for that.

This is why we created Aiqudo. Our mobile apps already do tons of things for us – get rides, order food, check the surf – and handle loads of other interactions every day. But the touch-screen interface has turned each app into an individual silo: you have to open the app, navigate down to the action you want, maybe tap through a few screens to select your size or color, check out, and confirm, and then move to the next app and repeat. Aiqudo lets you use simple, intuitive voice commands to get instantly to the action you want, then seamlessly move on to the next action in another app. Do all the things you want to do in your favorite apps, but now at the speed of voice.

Voice will be our interface to everything, eventually. We’re starting with making voice the interface to the things we do every day with our mobile apps.


Announcing Q Actions


It’s been only about three months since we formally started working @Aiqudo, and we’re thrilled to announce the availability of Q Actions (Beta) on the Play Store.

You say it, we do it!

Q Actions allows you to use simple voice commands to instantly execute actions in your favorite Android apps.  Other voice assistants like Alexa or Google Assistant don’t do this!

We’ve solved a few hard problems:

  • Commands in natural language, without specific syntax – Unlike systems like Alexa, where you need to invoke a skill by name and use a specific syntax for your command to be recognized, you can use natural commands with Q. You don’t even have to mention an app in your command – we automatically figure out the right action for it. In fact, our AI Search (AIS) uses sophisticated machine learning algorithms to perform high-quality fuzzy matching of commands across multiple apps in multiple verticals (a toy sketch of this kind of matching follows this list).
  • Action invocation in apps without APIs or developer work – The Q Platform does not require app developers to expose specific APIs just for voice. We can enable key actions in apps without any APIs or code. You execute actions just as you normally would in the app, with the added benefit that it is faster and you don’t need to remember where the function lives deep within the app. Easier and faster.
  • Personal actions without registration or loss of privacy – Other assistant platforms expose only a few personal actions, and even these require the user to register third-party services on the platform. Since Q executes actions in apps directly, we don’t require registration, and you keep using the apps you already trust for messaging, banking, payments, stocks, etc.
  • Scalable action onboarding – We have figured out how to onboard actions within apps directly. We onboard and maintain actions on our end, so neither you nor the app developer has to worry about making actions broadly available.
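
To give a flavor of what fuzzy matching of commands across apps means, here is a toy sketch that ranks a handful of hypothetical action phrases by word overlap with the user’s command. The real AI Search uses machine learning models rather than this simple token overlap, and the catalog below is made up for illustration.

```python
# Toy stand-in for fuzzy command matching; the real AI Search (AIS) is ML-based.

# Hypothetical catalog mapping (app, action) pairs to a representative command phrase.
ACTION_PHRASES = {
    ("Netflix", "play a TV show"): "play a show on netflix",
    ("United Airlines", "show boarding pass"): "show my boarding pass",
    ("HotelTonight", "find hotel deals"): "show hotels in a city",
}

def similarity(command, phrase):
    """Jaccard similarity between the words of the command and of the action phrase."""
    a, b = set(command.lower().split()), set(phrase.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def best_action(command):
    """Rank every known action by similarity and return the best match."""
    return max(ACTION_PHRASES, key=lambda key: similarity(command, ACTION_PHRASES[key]))

print(best_action("show my boarding pass"))   # ('United Airlines', 'show boarding pass')
print(best_action("show hotels in Chicago"))  # ('HotelTonight', 'find hotel deals')
```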

All you have to do to get started with Q Actions is say “show my actions” – you’ll see a list of actions already available for your favorite apps out-of-the-box.

Download Q Actions now!

The Aiqudo Team


Day 1 at Aiqudo


Day 1

We’ve hit the ground running with our core team @Aiqudo.

Humble beginnings in 2 small rooms @Spaces in San Jose, near the beautiful Santana Row.

This is the day 1 team – the laptop represents yours truly!

We’re on a mission – make it super simple for users to get things done with simple and intuitive voice commands.

The state of the art is, shall we say, not good enough! Users should not have to learn skills – AI systems should be smarter!!

Voice to Action – You say it, we do it!

Rajat, for the Aiqudo Team!

Welcome to Aiqudo


 

Welcome to Aiqudo!

At Aiqudo (pronounced: “eye-cue-doe”),  we connect the nascent world of digital voice assistants to the useful, mature world of mobile apps through our Q Voice-to-Action™ platform. Q lets people use voice commands to execute actions in mobile apps across devices.

Aiqudo’s SaaS platform uses machine learning (AI) to understand natural-language requests and then triggers instant actions in the mobile apps consumers already prefer, helping them get things done quickly and with less effort.

Visit us to keep up with innovations in the world of voice!