How cool would it be to have an app which is completely controlled by the backend? No, I don't mean A/B tests, or deciding which colour a button should be, or what set of features should be available for a certain user. I mean EVERYTHING being controlled from the backend.

The age of voice assistants is here with Alexa, the Google Assistant and others. Thanks to the latest advancements in Machine Learning, we're now capable of interacting with machines through natural language. When we think about developing features that are voice-forward, we think about existing voice assistants such as Alexa and the Google Assistant. But what about the fully capable computers that we carry with us all the time, our smartphones? Some moments in our day-to-day lives are very well suited to voice interactions: while driving or cooking, for example. Let's not forget that voice interactions are also extremely accessible, not only physically (for people with dexterity or motion impairments) but also cognitively (I think we all have a loved one who really struggles with technology, and people in some emerging countries have very limited access to computers and are not at ease with them).

So, as an app developer, what can I do in my existing app in relation to conversational features? In this talk, I'll explain what integrations can be done in Flutter apps:

- On iOS: integration with Siri (SiriKit, Siri Shortcuts)
- On Android: integration with the Google Assistant via App Actions
- 3rd-party solutions such as Porcupine, Snips, Amazon Lex, Snowboy and PocketSphinx
- Other packages that integrate with solutions such as ML Kit, TensorFlow and Dialogflow
- Flutter packages that take advantage of APIs provided by Google or Apple
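As a taste of that last category, here is a minimal sketch of on-device speech recognition in Flutter using the community speech_to_text plugin (which wraps the speech APIs provided by Apple and Google). This is an illustrative sketch, not code from the talk; the plugin's API may differ slightly between versions.

```dart
// Sketch: listening for speech with the speech_to_text plugin.
// Add `speech_to_text` to pubspec.yaml and request the microphone
// permission on each platform before running this.
import 'package:speech_to_text/speech_to_text.dart';

final SpeechToText _speech = SpeechToText();

Future<void> startListening() async {
  // initialize() asks for permissions and checks device support.
  final bool available = await _speech.initialize();
  if (!available) {
    print('Speech recognition is not available on this device.');
    return;
  }
  // listen() streams partial and final results to the callback.
  _speech.listen(
    onResult: (result) => print('Heard: ${result.recognizedWords}'),
  );
}

Future<void> stopListening() => _speech.stop();
```

Under the hood this delegates to SFSpeechRecognizer on iOS and SpeechRecognizer on Android, which is exactly the kind of platform-API bridging the talk covers.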