
A First Attempt at An Amazon Echo Alexa Skills App Using Python: Parlibot, A UK ...


Over the last couple of years, I’ve been dabbling with producing simple textual reports from datasets that can be returned in response to simple natural language style queries using chat interfaces such as Slack (for example, Sketching a Slack Slash Parliamentary Auto-Responder Using AWS Lambda Functions). The Amazon Echo, which launches in the UK at the end of September, provides another context for publishing natural language style responses, in this case in the form of spoken responses to spoken requests.

In the same way that apps brought a large amount of feature level functionality to mobile phones, the Amazon Echo provides an opportunity for publishers to develop “skills” that can respond to particular voice commands issued within hearing of the Echo. Amazon is hopeful that one class of commands, Smart Home Skills, will be used to bootstrap a smart home ecosystem that allows you to interact with smart home devices through voice commands, such as commands to turn your lights on and off, or questions about the status of your home (“is the garage door still open?”, for example). Another class of services relates to more general information based services, or even games, which can be developed using a second API environment, the Alexa Skills Kit. For a full range of available skills, see the Alexa Skills Store.

The Alexa Skills Kit has a similar sort of usability to other AWS services (i.e. it’s a bit rubbish…), but I thought I’d give it a go, repurposing some old functions around the UK Parliament API, such as finding out which committees a particular MP sits on, or who the members of a particular committee are, as well as some new ones.

For example, I thought it might be amusing to try to implement a skill that could respond to questions like the following:

“What written statements were published last week?”
“Were there any written statements published last Tuesday?”

using some of the “natural language” date-related Python functions I dabbled with yesterday.

One of the nice things about the Alexa Skills API is that it also supports conversational contexts. For example, an answer to one of the above questions (generated by my code) might take the form “There were 27 written statements published then”, but session state associated with that response can also be passed back as metadata to the Alexa service, and then returned from Alexa as session metadata attached to a follow-up question. The answer to the follow-up question can then draw on context generated earlier in the conversation. So, for example, exchanges such as the following now become possible:

Q: Were there any written statements published last Tuesday?
A: There were 27 written statements published then. Do you want to know them all?
Q: No, just the ones from DCLG.
A: Okay, there were three written statements issued by the Department for Communities and Local Government last week. One on …. by ….; etc.

So how can we build an Alexa Skill? I opted for implementing one using Python, with the answer engine running on my Reclaim Hosting webserver rather than as an AWS Lambda Function, which I think Amazon would prefer. (The AWS Lambda functions are essentially free, but it means you have to go through the pain of using another AWS service.) For an example of getting a Python application up and running on your own web host using cPanel, see here.
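
On a cPanel host, the glue between the web server and a Flask application is typically a small Passenger entry point file; the sketch below is an assumption about that kind of setup (passenger_wsgi.py is the Passenger convention, and the parlibot module name is purely illustrative), not something described in this post.

# passenger_wsgi.py - hypothetical entry point for a Passenger-managed cPanel Python app
# Passenger looks for a module-level callable named "application"
from parlibot import app as application  # "parlibot" is an illustrative module name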

To make life simpler, I installed the Flask-Ask library (docs), which extends the Flask web application framework so that it plays nicely with the Alexa Skills API. (There’s a standalone tutorial that runs without the need for any web hosting described here: Flask-Ask: A New Python Framework for Rapid Alexa Skills Kit Development.)
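
By way of a rough sketch of the scaffolding this implies (the endpoint path and module layout here are illustrative rather than taken from the original code), the library is installed with pip and wired into a normal Flask app:

# pip install flask flask-ask
from flask import Flask
from flask_ask import Ask, question, statement, session

app = Flask(__name__)
# Route incoming Alexa requests to a single endpoint (the path is illustrative)
ask = Ask(app, '/parlibot')

if __name__ == '__main__':
    app.run(debug=True)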

The Flask-Ask library allows you to create two sorts of response in your application that can respond to “intents” defined as part of the Alexa skill itself:

a statement, which is a response to Alexa that essentially closes a session; and a question, which keeps the session open and allows you to pop session state into the response so you can get it back as part of the next intent command issued from Alexa in that conversation.
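
As a minimal sketch of the difference (the handler names are illustrative; @ask.launch handles the opening request when a user invokes the skill without a specific intent):

@ask.launch
def startSkill():
    # question() keeps the session open and waits for the user's next utterance
    return question("Welcome to Parlibot. What would you like to know?")

@ask.intent("AMAZON.StopIntent")
def stopSkill():
    # statement() speaks the response and then closes the session
    return statement("Goodbye.")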

The following bit of code shows how to decorate a function that will handle a particular Alexa Skill intent. The session variable can be used to pass session state back to Alexa that can be returned as part of the next intent. The question() wrapper packages up the response (txt) appropriately and keeps the conversational session alive.

@ask.intent("WrittenStatementIntent")
def writtenStatement(period,myperiod):
txt,tmp=statementGrabber(period=period,myperiod=myperiod)
session.attributes['period'] = period
session.attributes['myperiod'] = myperiod
session.attributes['typ'] = 'WrittenStatementIntent'
if tmp!='': txt='{} Do you want to hear them all?'.format(txt)
else: txt="I don't know of any."
return question(txt)
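
The statementGrabber() function itself isn’t shown in this post; purely as a sketch of the (summary, detail) contract the handler above relies on (the fetch_written_statements helper and the record fields are hypothetical placeholders, not the real implementation, which queries the UK Parliament API):

def statementGrabber(period=None, myperiod=None):
    # Hypothetical sketch: look up written statements for the requested period
    statements = fetch_written_statements(period, myperiod)  # placeholder for the Parliament API lookup
    if not statements:
        # Empty strings signal "nothing found" to the intent handlers
        return '', ''
    txt = 'There were {} written statements published then.'.format(len(statements))
    tmp = ' '.join('One from {} on {}.'.format(s['department'], s['title']) for s in statements)
    return txt, tmp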

We might then handle a response identified as affirmative (“yes, tell me them all”) using something like the following, which picks up the session state from the response, generates a paragraph describing all the written statements, and returns it, suitably packaged, as a session-ending statement().

@ask.intent("AllOfThemIntent")
def sayThemAll():
period= session.attributes['period']
myperiod= session.attributes['myperiod']
typ=session.attributes['typ']
txt,tmp=statementGrabber(period=period,myperiod=myperiod)
return statement(tmp)
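
The post doesn’t show a handler for the “No, just the ones from DCLG” style of follow-up from the example conversation above, but a hedged sketch might combine the stored session state with a department slot (the intent name, slot name, and the extra department argument to statementGrabber are all assumptions):

@ask.intent("DepartmentFilterIntent")
def filterByDepartment(department):
    # Recover the conversational context stashed by the earlier intent
    period = session.attributes['period']
    myperiod = session.attributes['myperiod']
    # Hypothetical: restrict the lookup to the named department
    txt, tmp = statementGrabber(period=period, myperiod=myperiod, department=department)
    if txt == '':
        txt = "I couldn't find any statements from {} then.".format(department)
    return statement(txt)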

So how do we define things on the Alexa side? To start with, we need to create a new skill and give it a name. A unique ID is created for the application and passed in all service requests, and we can use this as a key in our application logic to decide whether or not to accept and respond to a request from the Alexa Skill server. (For convenience, I defined an open service that accepts all requests. I’m not sure if Flask-Ask has a setting that allows the application to be tied to one or more Alexa Skill IDs?)
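
If you did want to do that check by hand, one hedged possibility (not from the original post, and assuming the session object exposed by Flask-Ask carries the raw session JSON, application ID included) might look something like this:

# Placeholder for the skill's application ID from the Alexa developer console
MY_SKILL_ID = "amzn1.ask.skill.xxxx"

def requestIsForThisSkill():
    # Compare the application ID sent with the request against our own skill ID
    try:
        return session['application']['applicationId'] == MY_SKILL_ID
    except (KeyError, TypeError):
        return False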



The second thing we need to do is actually define the interactions that the skill will engage in. This is composed of three parts:

an Intent Schema, defined as a JSON object, that specifies a list of intents that the skill can handle. Each intent must be given a unique label (for example, “AllOfThemIntent”), and may be associated with one or more slots. Each slot has a name and a type; a sketch of what this might look like for the intents above is given below.
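
As a hedged example only (the slot types shown, AMAZON.DATE and a custom LIST_OF_PERIODS type, are assumptions about how the period and myperiod slots might be declared, and should be checked against the Alexa Skills Kit documentation):

{
  "intents": [
    {
      "intent": "WrittenStatementIntent",
      "slots": [
        { "name": "period", "type": "AMAZON.DATE" },
        { "name": "myperiod", "type": "LIST_OF_PERIODS" }
      ]
    },
    { "intent": "AllOfThemIntent" }
  ]
}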
