2017/indiesiri

“Siri, what is Tantek doing?” was a session at IndieWeb Summit 2017.

Notes archived from: https://etherpad.indieweb.org/indiesiri
Video at: https://www.youtube.com/watch?v=1zLqOmXXxvw


When: 2017-06-24 17:00

Participants

Notes

We are all building an open database about ourselves. We have listening devices at home.

Goals:

  • Answer questions about my own activity (which is logged on my website)
  • Answer questions about my "network" (when is the next IndieWebCamp? Where is tantek?)

Use cases:

  • Location: where is Ben?
  • Appointments: ask Jonathan if he's available for lunch tomorrow at noon
  • Metadata: What was I watching on September 4th?
  • What is Ben's email? (see the h-card lookup sketch after this list)
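Some of these can already be answered from markup people publish today. As a minimal sketch (assuming the person's homepage carries an h-card with a u-email property, and using the mf2py parser; the URL is hypothetical), answering "What is Ben's email?" is just a lookup:

```python
# Sketch: answer "What is Ben's email?" from the h-card on a homepage,
# assuming the page publishes an h-card with a u-email property.
import mf2py

def find_email(profile_url):
    parsed = mf2py.parse(url=profile_url)   # fetch and parse microformats2
    for item in parsed.get("items", []):
        if "h-card" in item.get("type", []):
            emails = item.get("properties", {}).get("email", [])
            if emails:
                # u-email values are commonly published as mailto: URLs
                return emails[0].replace("mailto:", "")
    return None

print(find_email("https://ben.example/"))   # hypothetical profile URL
```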

A non-visual interface to webmentions.

aaronpk has set up an Alexa skill to post to his site: https://aaronparecki.com/2017/01/03/10/day14-alexa-app
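The linked post describes aaronpk's actual setup. Purely as an illustration (not his implementation), the backend of such a skill might turn a recognized utterance into a Micropub request to the user's own site; the endpoint URL and token below are placeholders:

```python
# Illustration only: post a note to your own site via Micropub --
# the kind of call a voice-skill backend could make once speech is recognized.
import requests

MICROPUB_ENDPOINT = "https://example.com/micropub"  # placeholder for your site's endpoint
TOKEN = "xxxx"                                      # placeholder IndieAuth access token

def post_note(content):
    resp = requests.post(
        MICROPUB_ENDPOINT,
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={"h": "entry", "content": content},
    )
    resp.raise_for_status()
    return resp.headers.get("Location")  # URL of the created post

print(post_note("Testing a voice-dictated note"))
```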

An open source siri alternative: https://github.com/patrickjquinn/P-Brain.ai

Suggestion: a protocol based on asking questions of websites. People could create custom agents to answer certain types of questions. It could fall back to notes and webmentions, and may require notifying the user when they are asked a question.

It's possible that this just works via webmentions to the user's websites, with markup indicating the question type. It's just a way for your website to ask my website a question on your behalf. Your website can then be the intelligent agent; my website can be another intelligent agent. You start stupid and then you build in the smarts.

  • Perhaps the "requestor" could send a webmention to the front page of the "requestee's" website, marked up with microformats to indicate what the question is (a sketch of the sending side follows this list).
  • If the requestee's website is smart enough to handle it automatically, it could; otherwise, it could just store the webmention and let the user handle it.
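A minimal sketch of the requestor side, assuming the question has already been published as a marked-up post on the requestor's own site (URLs are hypothetical). It discovers the requestee's webmention endpoint via the HTTP Link header (a full client would also check the HTML) and sends source/target:

```python
# Sketch of the "requestor" side: send a webmention whose source is a question
# post on my site and whose target is the requestee's front page.
# SOURCE and TARGET are hypothetical; the question post is assumed to exist.
import requests
from urllib.parse import urljoin

SOURCE = "https://your-site.example/2017/where-is-tantek"   # hypothetical question post
TARGET = "https://tantek.com/"                              # requestee's front page

def discover_endpoint(target):
    """Webmention endpoint discovery via the HTTP Link header only
    (a full client would also check <link>/<a> elements in the HTML)."""
    resp = requests.get(target)
    link = resp.links.get("webmention")
    return urljoin(resp.url, link["url"]) if link else None

endpoint = discover_endpoint(TARGET)
if endpoint:
    requests.post(endpoint, data={"source": SOURCE, "target": TARGET})
```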

You can set your own rules. It's more personalized. Your agent is yours, and you decide what the rules and the intelligent capabilities are.

It's agnostic of operating system - it'll work whether you use Alexa, Siri, Google Assistant, etc., as your interface.

It's better to have it be an API than a chatbot. It needs to be more systematized. We're breaking the chatbot into two pieces: language recognition (which we can outsource to Siri et al.) and the programmatic call-and-response mechanism that actually answers structured questions.
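One way to picture the second piece, assuming a small fixed set of question types (the type names and handlers below are made up for illustration): a plain call-and-response registry that any speech front end can invoke once it has recognized the words.

```python
# Illustrative only: the programmatic call-and-response half, separated from
# speech recognition. Question types and handlers here are made-up stubs.
def answer_location(params):
    # Stub: a real handler would read the latest checkin post from my site.
    return "last seen at: (latest checkin)"

def answer_watching(params):
    # Stub: a real handler would query my watch posts for that date.
    return f"on {params['date']} you watched: (lookup result)"

HANDLERS = {
    "location": answer_location,
    "watching-on-date": answer_watching,
}

def answer(question_type, params):
    handler = HANDLERS.get(question_type)
    return handler(params) if handler else None

# The voice assistant does the language recognition, then makes a structured call:
print(answer("watching-on-date", {"date": "2016-09-04"}))
```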

Doctors won't look at a medical record if a certain percentage is missing. Google says that users won't use a voice assistant if more than 4% of commands are recognized incorrectly. The same factors are at play here.

Constrained problems that are aligned with existing indieweb protocols would be interesting.

Website-to-website webmention resulting in an inbox is a good place to start. The resulting code is a bit like a mail filter: an IFTTT for webmentions (or any notification). Whether that's exposed via a web, voice, or other interface is almost a secondary decision.
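A sketch of that mail-filter idea (the rule set, post types, and actions are assumptions, not a spec): route each stored webmention by what its source post appears to be.

```python
# Sketch of a mail-filter-style router for an inbox of received webmentions.
# The rules, post types, and notify/auto_answer actions are illustrative stubs.
TRUSTED = {"https://aaronparecki.com/", "https://tantek.com/"}

def notify(mention):
    print("notify me about:", mention["source"])

def auto_answer(mention):
    print("answer automatically:", mention["source"])

RULES = [
    # (predicate, action) pairs checked in order, like mail filter rules
    (lambda m: m.get("type") == "question" and m.get("author") in TRUSTED, auto_answer),
    (lambda m: m.get("type") == "question", notify),
    (lambda m: True, notify),   # default: just tell me something arrived
]

def route(mention):
    for predicate, action in RULES:
        if predicate(mention):
            return action(mention)

route({"source": "https://their-site.example/ask-1",
       "type": "question",
       "author": "https://tantek.com/"})
```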

Notifications are important.

Examples:

Action Items

  • Site-to-site webmentions
  • Question and answer mf2 (a hypothetical shape is sketched after this list)
  • Proofs of concept where the above two items are not required (e.g. going to find information already encoded on indieweb sites, with no further intelligence required) - this will incentivize people to add more information to their websites
  • A simple interface for the above
  • Determine the initial questions
  • An accepted way to determine someone's friends / connections
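To make the "question and answer mf2" item concrete, here is one purely hypothetical shape, shown as the Python dicts a microformats2 parser would emit, reusing plain h-entry and in-reply-to rather than inventing new properties; the actual vocabulary is exactly what this action item still has to decide.

```python
# Hypothetical only: what a question post and its answer reply might look like
# once parsed as microformats2. URLs are placeholders; no new vocabulary is
# assumed beyond existing h-entry, h-card, and in-reply-to.
question = {
    "type": ["h-entry"],
    "properties": {
        "author": [{"type": ["h-card"],
                    "properties": {"url": ["https://your-site.example/"]}}],
        "content": [{"value": "When is the next IndieWebCamp?"}],
        "url": ["https://your-site.example/2017/question-1"],
    },
}

answer = {
    "type": ["h-entry"],
    "properties": {
        "in-reply-to": ["https://your-site.example/2017/question-1"],
        "content": [{"value": "(date and city of the next camp)"}],   # placeholder
        "url": ["https://my-site.example/2017/answer-1"],
    },
}
```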