Jonathan Gill, CTO, Watchfinder
Jonathan Gill is CTO at Watchfinder. Watchfinder buys watches from members of the public, restores them to as-new condition, provides a warranty, and sells them back to the public via its website and retail stores.
They have grown to more than 120 staff and a turnover of £70 million per year.
On the 3rd of November 2016, endjin hosted a two-day Bot Framework hackathon for Watchfinder and the Microsoft UK DX team.
Acquiring and restoring pre-owned watches is central to Watchfinder’s business. The aim was to use the Microsoft Bot Framework to build an integrated conversational experience that would quickly and engagingly capture details about an individual’s watch, connecting with Watchfinder’s product catalogue API to narrow down the search and with its product imagery API to display matching watch images for the user to confirm. We had two days to see if this was at all possible.
A room full of people and two days ahead of us: four from Microsoft, three from Watchfinder and a varying number (between two and eight) from endjin! The first day kicked off with a round of introductions, which focused on why people were there and how much (or how little) experience everyone had with the Microsoft Bot Framework.
Given the relative newness of the framework, it was unsurprising that most of the people around the table had done little more than watch the tutorials. One of the Microsoft team then gave a short presentation describing the Bot Framework.
Bots are tools that carry out a conversation with users, providing a more interactive and personal experience than a wizard or web form, across a wide range of channels: web, Skype, Slack and SMS, to name a few. They can be very simple – prompting users with questions and accepting their input as the answer – or more complex, carrying out natural language processing to determine the user’s intention.
The Microsoft Bot Framework supports these different types of conversational flow. The most flexible approach is to use Dialogs to handle the interactions; however, modelling a guided conversation – like describing a watch – can take a lot of effort, as each interaction might need clarification, and you might want to show progress or offer help along the way. The FormFlow model solves this problem by trading some of that flexibility for more structure in the conversation, and it seemed a great fit for us.
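To make the guided-form idea concrete, here is a minimal, language-agnostic sketch of the pattern in Python. The bot itself was built in C# with the Bot Framework, so this is purely illustrative; the field names and prompts are hypothetical.

```python
# Minimal sketch of a guided "form" conversation: ask for each unfilled
# field in turn, re-prompting until the answer validates. This mirrors
# the structure FormFlow imposes, not the actual SDK.
def run_form(fields, ask):
    """fields: list of (name, validate); ask: channel callback returning text."""
    model = {}
    for name, validate in fields:
        while True:
            answer = ask(f"What is the watch's {name}?")
            if validate(answer):
                model[name] = answer
                break
    return model

# Example: scripted answers stand in for a real conversation channel.
answers = iter(["Rolex", "Submariner"])
fields = [("brand", lambda a: bool(a)), ("model", lambda a: bool(a))]
print(run_form(fields, lambda prompt: next(answers)))
# {'brand': 'Rolex', 'model': 'Submariner'}
```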
However, there is also integrated support for using the Language Understanding Intelligent Service (LUIS for short) from the Microsoft Cognitive Services APIs to handle natural language processing within Dialogs.
We didn’t really have time to go all out for a complex LUIS-based bot, so we decided to stick with a guided bot using FormFlow, but we wanted to look at using LUIS as an alternative way to start the conversation.
For example, if a user said “I want to sell my Rolex watch”, we could infer an intent to sell a watch, and also prepopulate our form with the brand Rolex.
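As a rough illustration of that prepopulation step, the Python sketch below copies recognised entities into an otherwise empty form model so those questions can be skipped. The field and entity names are hypothetical, not Watchfinder’s actual schema.

```python
# Sketch: map recognised entities onto the form model before the guided
# conversation starts. Entity dicts mimic the general shape of a language
# understanding result; the "type"/"entity" keys are illustrative.
def prepopulate(model, entities):
    """Fill any empty model field that matches a recognised entity type."""
    for entity in entities:
        field = entity["type"]          # e.g. "brand" or "caseMetal"
        if field in model and model[field] is None:
            model[field] = entity["entity"]
    return model

model = {"brand": None, "caseMetal": None, "model": None}
luis_entities = [{"type": "brand", "entity": "rolex"}]
print(prepopulate(model, luis_entities))
# {'brand': 'rolex', 'caseMetal': None, 'model': None}
```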
We split into two teams and developed a rough plan for integrating a FormFlow bot with LUIS: LUIS would gather initial data about the watch from free text entered by the user, and the model passed into FormFlow would be prepopulated with this data where possible.
The FormFlow team started by putting together a simple form and wiring it into the bot, running the bot locally and using the Bot Framework emulator to test that everything was working.
The bot would integrate with the existing Watchfinder website using the DirectLine connector, which offers a REST API for communicating with the bot, and would prompt users to complete the various fields necessary to provide a watch valuation: case metal, strap type, dial colour, brand, model and serial number.
I was part of the LUIS team, and we started by creating a new LUIS application using the luis.ai site. This consists of entering a number of “utterances” and picking out the “intents” and “entities”, before training the model and publishing it as a web service.
If we take an example like “I want to sell my gold rolex”, the intent would be “sell” and the entities that could be selected would be “gold” (for the watch casing) and “rolex” (for the brand).
If you train the model with enough different utterances and examples of entities, you can then use it to recognise similar natural language requests. Once the trained model is published as a simple HTTP web service, you can send it the user’s text, and it sends back the potential intents and entities, along with a score for how likely each is to be correct.
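To give a feel for that flow, here is a hedged Python sketch that interprets a LUIS-style JSON payload: take the top-scoring intent and keep only the entities above a confidence threshold. The exact response shape varies between LUIS versions, so treat the field names as an approximation.

```python
import json

# Sketch of consuming a LUIS-style scored result. The JSON layout here
# approximates the general idea (scored intents and entities), not a
# guaranteed wire format.
def interpret(response_json, threshold=0.5):
    result = json.loads(response_json)
    intents = sorted(result["intents"], key=lambda i: i["score"], reverse=True)
    top_intent = intents[0]["intent"] if intents else None
    entities = [e for e in result["entities"] if e["score"] >= threshold]
    return top_intent, entities

raw = json.dumps({
    "query": "I want to sell my gold rolex",
    "intents": [{"intent": "sell", "score": 0.97},
                {"intent": "none", "score": 0.02}],
    "entities": [{"entity": "gold", "type": "caseMetal", "score": 0.91},
                 {"entity": "rolex", "type": "brand", "score": 0.88}],
})
intent, entities = interpret(raw)
print(intent)  # sell
```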
The LUIS application needed to be kept tightly focused and domain-specific to be feasible. It was intended to let the user enter a query before dropping into the form-like bot dialog, with the discovered entities already completed – meaning the user needs to be asked fewer questions to properly identify the watch to be valued.
Jonathan had created a repository containing a bot template project, published the bot and set up a continuous integration pipeline in advance, which saved a lot of setup time. The starting point for implementing FormFlow was to add a Watch class to act as the model, a form class and a dialog class, and to wire these up to the standard MessagesController. This was pretty straightforward, using the BotBuilder samples as a guide.
Throughout the afternoon we improved the form, adding the fields we needed to capture, the prompts for the bot to display, and validation for each field.
FormFlow offers the option to validate responses by defining model properties as enums, but we didn’t want to hard-code the names of thousands of brands and series, so we chose to define the allowed values dynamically, using a validation function as each field was created in the form class.
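The idea can be sketched like this in Python (illustrative only – the real bot was C#, and the hard-coded brand list here is a stand-in for a call to Watchfinder’s catalogue API):

```python
# Sketch of validating a field against values fetched at runtime rather
# than a compile-time enum. A closure captures the allowed values so the
# same factory works for brands, series, case metals, and so on.
def make_validator(allowed_values):
    allowed = {v.lower() for v in allowed_values}
    def validate(answer):
        ok = answer.strip().lower() in allowed
        feedback = None if ok else f"Sorry, I don't recognise '{answer}'."
        return ok, feedback
    return validate

# In the real system the list would come from the catalogue API.
validate_brand = make_validator(["Rolex", "Omega", "Breitling"])
print(validate_brand("rolex"))     # (True, None)
print(validate_brand("Casio")[0])  # False
```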
We integrated Watchfinder’s API to search its database of watches each time the user provided more information, and told the user how many results remained after filtering.
Towards the end of the day, I tried to get an integration working between a dummy website and our bot using the DirectLine connector. We had some difficulty getting it to connect, so that was left as something to solve the following morning.
With fresh eyes, one of the team discovered the cause of our DirectLine issue (a misunderstanding of the configuration documentation), which left us free to continue… until we hit the next issue.
The version of the bot being published to Azure via the continuous deployment pipeline was no longer working. After some unsuccessful attempts to diagnose why, we worked around it by publishing the bot manually from Visual Studio.
We resumed work on the FormFlow functionality, developing the code to dynamically present options to the user based on their previous inputs. We used the SetDefine() function, which can access the state of the model. Ideally, we wanted to present the user with a set of options if there were fewer than, say, five to pick from, or a free text field otherwise.
However, it was rather frustrating to find that the type of a field could not be conditionally switched from a list of values to free text depending on the state of the model, as the state object isn’t available to the SetType() function. So presenting the user with a set of options was left as a future enhancement – something the Watchfinder team could pick up and look into at a later date.
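For illustration, the behaviour we were after looks something like this Python sketch. The five-option cutoff is the one mentioned above, but this is plain Python rather than FormFlow’s SetType(), and the model names are assumptions.

```python
# Sketch of the prompt-selection rule we wanted: offer a pick-list when
# few options remain for a field, otherwise fall back to free text.
def prompt_style(remaining_options, cutoff=5):
    if len(remaining_options) < cutoff:
        return ("choice", sorted(remaining_options))
    return ("free_text", None)

print(prompt_style(["GMT-Master", "Submariner", "Daytona"]))
# ('choice', ['Daytona', 'GMT-Master', 'Submariner'])
print(prompt_style([f"model-{i}" for i in range(20)])[0])  # free_text
```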
By early afternoon the two teams had converged into one and the focus was on integrating the Bot into the real Watchfinder site using DirectLine. As well as querying the Watchfinder API to get live results and only display the possible values for each form field, we were now also trying to display product images in the Bot conversation once the result had been narrowed down enough.
The majority of the rest of the afternoon was spent adding some basic styling to the front-end UI, to fit in with Watchfinder’s existing branding, and fixing small bugs that were found when trying to use all the parts of the system together.
By the end of the afternoon, everyone gathered back in the room for a demo of the bot being used from the Watchfinder website. There was definitely still work to be done – for example, we didn’t get the images displaying properly on the site – but fundamentally the bot was working: starting with an input allowing the user to enter their natural language query, which fed into the bot’s dialog for gathering the remaining data required.
It was a fairly satisfying end to the hack days, though it was quite hard to down tools and stop on something that had absorbed so much attention over the past few days.
The hack days built the foundations of a tool that could be integrated into the new beta site with far less effort than the current web-based tool and, perhaps more importantly, extended to other channels such as Facebook Messenger, Skype or SMS very easily through the Bot Framework’s built-in connector configurations.
Overall, I really enjoyed taking part, and was quite pleased to get to play with building a LUIS application for a real-world offering. Hopefully Watchfinder will go on to polish the foundations created during these two days, and the bot will become an integral part of their watch acquisition process.