Today, we’re releasing in beta three new built-in entities that many of you have requested. These binary entities are of type trait and will help you detect when people say hello, bye, or thanks to your app!
Why trait? As you probably know, trait entities are spanless. Like the classic “intent” entity, a trait entity is determined by the sentence as a whole rather than by one specific word or span within it. For example, “I’ll see you later”, “see you later”, “bye bye”, and “goodbye” are all instances of the wit/bye entity.
Why binary? They have a single value, “true”: either the sentence carries the meaning and the value is true, or the entity is absent altogether. We’ve always wanted to support multiple intent detection, and binary trait entities now let you detect multiple intents in a single query.
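To make the multi-intent idea concrete, here is a minimal sketch of reading binary trait entities out of a query response. The exact response shape and field names are assumptions for illustration; consult the HTTP API reference for the real format.

```python
# Hypothetical sketch: detecting multiple binary trait entities in one
# Wit.ai response. The response shape below is an assumption based on
# the post, not the documented format.

def detect_traits(response):
    """Return the names of binary trait entities whose value is true."""
    detected = []
    for name, candidates in response.get("entities", {}).items():
        for candidate in candidates:
            # Binary traits carry a single value, "true".
            if candidate.get("value") == "true":
                detected.append(name)
                break
    return sorted(detected)

# A query like "thanks, see you later!" could yield both traits at once:
sample = {
    "entities": {
        "wit/thanks": [{"value": "true", "confidence": 0.97}],
        "wit/bye": [{"value": "true", "confidence": 0.94}],
    }
}
print(detect_traits(sample))  # ['wit/bye', 'wit/thanks']
```

Because each trait is independent, a single sentence can trigger any subset of them, which is what makes multiple intent detection possible.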
We hope you’ll like these new entities. Since they still need a bit of training, they’re being released in beta for English. Use them in lieu of your custom intents. As with all built-in entities, they learn from every Wit app that uses them, so they will become more inclusive and accurate over time.
As always, feel free to reach out if you have any questions, comments, or suggestions.
In September 2013, we created Wit to make it easy for developers to add a natural language interface to their apps or devices. You can read more on our journey here. For the past 3.5 years, our community has kept growing. Today, we are pleased to announce that we’ve passed 100,000 developers in the Wit community building mobile apps, devices, robots, and bots for Messenger, Slack, Telegram, and other platforms!
To cope with this outstanding growth (from 20,000 developers 12 months ago), we have been working a lot behind the scenes on scalability and stability by leveraging the Facebook stack. Now we are ready to scale and we will start improving our NLU algorithms by building on top of Facebook’s AI and NLP platforms.
We have learned a ton since launching Bot Engine last year. Bot Engine has an ambitious goal: to learn conversation flows from examples. It will stay in beta until it meets that goal. As an alternative, most of you have been using natural language understanding to extract structured data first, and then relying on custom code or other bot platforms to handle longer conversations.
Speaking of bot platforms, since launching Wit.ai three years ago, we’ve been focused on arming developers with a solid API to build great experiences for their users. We’ve been heavily reliant on our web app to manage certain steps, such as creating a new app or validating expressions with several entities. Because of the strong network of developers that are a part of Wit.ai, we’ve seen more and more platforms being built on our API, especially in the last 12 months with the rise of chatbots. These platforms have been building their own web interfaces tailored to their use cases. In true hacking fashion, they had to get very creative (i.e., reverse-engineer Wit) to use the API behind the scenes.
Today we are updating our Wit.ai API to make it even easier for developers and platforms to understand natural language at scale.
POST /apps to create a new app
PUT /apps/:id to update the language or timezone of an existing app
POST /samples to validate expressions with multiple entities (trait and/or keyword/free-text)
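The three endpoints above can be sketched as follows. This builds the request pieces without sending them, so the shape is easy to see; the payload fields (`name`, `lang`, `timezone`, `text`) and the app id are assumptions for the example — see the HTTP API reference for the real parameters.

```python
import json

# Illustrative sketch of the three new endpoints. Payload fields and
# ids are assumptions, not the documented schema.

API_BASE = "https://api.wit.ai"

def build_request(method, path, payload):
    """Assemble the pieces of an authenticated Wit.ai API call."""
    return {
        "method": method,
        "url": API_BASE + path,
        "headers": {"Authorization": "Bearer <SERVER_ACCESS_TOKEN>",
                    "Content-Type": "application/json"},
        "body": json.dumps(payload),
    }

# POST /apps to create a new app
create = build_request("POST", "/apps", {"name": "my-app", "lang": "en"})

# PUT /apps/:id to update the language or timezone (hypothetical id)
update = build_request("PUT", "/apps/my-app-id",
                       {"timezone": "America/Los_Angeles"})

# POST /samples to validate an expression with multiple entities
validate = build_request("POST", "/samples", [{
    "text": "goodbye and thanks!",
    "entities": [{"entity": "wit/bye", "value": "true"},
                 {"entity": "wit/thanks", "value": "true"}],
}])

print(create["method"], create["url"])  # POST https://api.wit.ai/apps
```

From here, handing each dict to your HTTP client of choice is a one-liner.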
We are also reintroducing the N-Best feature. Simply put, Wit.ai will now return not just the best intent but a ranked list of several possible intents. This is especially helpful for implementing a reformulation strategy in an FAQ bot: for an ambiguous query, your app could ask “Did you mean X, Y, or Z?”. It is also great for implementing your own training mechanism: when Wit is not very confident, you can retrieve the three most relevant entities and let your user validate the correct one.
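The reformulation strategy described above can be sketched in a few lines. The confidence threshold and the shape of the ranked list are assumptions for illustration, not part of the API.

```python
# Hypothetical sketch of a reformulation strategy on top of N-Best
# results, modeled as ranked (intent, confidence) pairs. The 0.85
# threshold is an arbitrary assumption for the example.

CONFIDENT = 0.85

def respond(nbest):
    """Route to the top intent, or ask the user to disambiguate."""
    if not nbest:
        return "Sorry, I didn't get that."
    top_intent, top_conf = nbest[0]
    if top_conf >= CONFIDENT:
        return f"Routing to: {top_intent}"
    # Not confident enough: offer the 3 most likely intents back.
    options = ", ".join(intent for intent, _ in nbest[:3])
    return f"Did you mean {options}?"

print(respond([("refund_policy", 0.93)]))
print(respond([("refund_policy", 0.41), ("shipping", 0.38),
               ("returns", 0.12)]))
```

The same ranked list also powers a training loop: log the low-confidence cases, let the user pick the right option, and feed the validated pair back as a new sample.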
Please take a look at the HTTP API reference for more details. You can also check out the new GitHub tutorial.
This is indeed a first step, but if you’re thinking of building a platform on Wit, this should get you started. If you run across issues, please drop us a line!
Finally, we wanted to thank you for being a part of our community. Your energy, continuous enthusiasm and endless support have been a great source of motivation in this journey.
We introduced composite entities in May 2015 but had to remove them temporarily when we launched Bot Engine last year. Many of you have asked us to reintroduce them since then, and you have been super patient ;-)
Today, composite entities are back.
Composite entities are spanful entities that contain other entities within them. For example, you can tag “2014 grey Pilot” as a car entity, with “Pilot” as the model, “grey” as the color, and “2014” as the year.
Composite entities are trained just like regular entities. First, select the entire span of the composite entity and assign it to a new entity just like you would before. Then, select spans within the tagged entity and tag them as any entity you want (user-defined or provided by Wit).
Reminder: Wit only processes one level of composite entities; you cannot nest a composite entity within another composite entity.
In the API response, entities found within composite entities are returned as a list under the field “entities” and are formatted just like regular entities. An example response is below.
"value": "2014 grey Pilot"
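The `"value"` fragment above belongs to the outer car entity. A fuller response might look like the sketch below; field names beyond those described in the post (`confidence`, `entity`) are assumptions for illustration.

```python
# Hypothetical sketch of a composite-entity response: the outer "car"
# entity spans "2014 grey Pilot", and the entities found within it are
# returned as a list under "entities", formatted like regular entities.
# Exact field names are assumptions, not the documented schema.

response = {
    "entities": {
        "car": [{
            "value": "2014 grey Pilot",
            "confidence": 0.92,
            "entities": [
                {"entity": "model", "value": "Pilot"},
                {"entity": "color", "value": "grey"},
                {"entity": "year",  "value": "2014"},
            ],
        }]
    }
}

# Pulling the sub-entities out of the composite:
car = response["entities"]["car"][0]
parts = {e["entity"]: e["value"] for e in car["entities"]}
print(parts)  # {'model': 'Pilot', 'color': 'grey', 'year': '2014'}
```

Because the inner entries are formatted like regular entities, any code you already have for handling entities can be reused on the nested list.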
As always, feel free to reach out if you have any questions, comments, or suggestions. We’re working hard to make Wit smarter and far more robust; see our community update. This is one step in that direction; expect more in the near future!
Responding to feedback on our FB group, we wrote a small community update to reflect a bit on our journey and share how we’re thinking about the future.
Wit started as a private beta in September 2013. The idea was to provide a simple tool to make it easy for developers to add natural language to their apps and devices. Wit introduced the .ai domain as well as “intent and entities”-driven natural language interfaces.
In January 2015, we decided to join FB to advance the state of natural language interfaces in everyday life and help Messenger become a successful platform. Our focus quickly shifted to a new project: a virtual assistant called M. We had a small team (7 engineers) and M was ambitious enough that we felt the need to put Wit.ai in maintenance mode for a bit.
In March 2016, Messenger opened up its API to send and receive text messages programmatically. Developers would be able to create conversational bots to connect users and businesses. In order to support this huge announcement, we started working again on Wit and shipped a new feature called Stories.
Stories is a dialog layer on top of the existing natural language understanding layer allowing bots to carry out complex transactions using text. Our goal was to provide something very flexible (and not slot-based like other services) so that anybody would be free to build what they need. We could then identify and focus on where the demand lies.
We’ve grown rapidly and have spent months interacting with the community, helping them build complex flows and interactions, listening to their needs and improving Stories and the rest of the product.
In September 2016, Messenger added support for more visual interactions with Menus, web views, etc. This enables interesting use cases to mix natural language understanding and visual interfaces to deliver the best end-user experience.
The goal of Wit is to make bots successful as a new means of communication between people and their favorite services/businesses. In order to do this, we will focus on a few key areas for the next few months:
1) Scale Wit to be reliable under any circumstances. To provide maximum robustness, we’ll build on top of proven Facebook technologies, like the ones powering facebook.com and our ads system.
2) Improve our NLU algorithms to make bots smarter by building on top of Facebook’s AI and NLP platforms: DeepText and FBLearner Flow.
3) Integrate more effectively with the Messenger API and GUI, so that bots will be smart by default.
4) Incorporate our learnings from Stories, witty-fiddle, etc. into new product features to make developers’ lives easier.
That’s it for now, please join us in our FB group to provide feedback. We’re looking forward to interacting more with the community and improving Wit!
After bringing him in to meet the team and sending him off with his new Oculus Rift and Oculus Touch last week, we caught up with Dhanush to find out a little more about him, his interest in bots, and what he took away from the contest. Here’s what we learned:
Tell us a little about yourself.
I’m currently a second-year computer science student at Diablo Valley College and planning to transfer to a 4-year university this fall. In my spare time, I make Android apps, Alexa skills, and bots – and I’m hoping to get into machine learning in the next year. I usually do 1 or 2 hackathons per month. Basically, I enjoy coding, and I spend a lot of time on it.
Where are you from?
I currently live in the East Bay. I was born in India, but I moved here when I was two.
How did you discover Wit?
I attended F8 in 2016 and that’s where Mark Zuckerberg announced the Messenger bots for the first time publicly on stage. It was really exciting to see this announced and, of course, they talked about Wit.ai. Although I didn’t know Laurent and the team at the time, I remember visiting their booth. And that’s why I used them for my bots when I decided to make them.
Was this your first bot?
This was my second bot. The first one I made was just a few days before: Kuakey, a bot that tells you the last earthquake that happened in a city or state using the USGS API.
Tell us about what you built for WittyCup.
Calo is something to help people in their everyday lives. I wanted it to be useful. Basically, it searches through Eventbrite, Yelp, and FB to give you event and restaurant recommendations. It goes through those APIs and brings back results based on user specifications. Even if the user doesn’t specify any search criteria, the bot will still find places to eat or events to attend.
What did you learn?
Obviously, I learned how to use the Wit platform. And after meeting the Wit team, I learned I was using 1 or 2 things incorrectly. I used search_query for when the user specifies the search type (food or event) and local_search for cuisine, but from what Laurent said, those are for general use, and if you want better results you should make your own entity type for cuisine. The default types pull data from all developer projects, so they wouldn’t be as perfect as your own custom type.
Any best practices you’d recommend for fellow builders?
Make the bot more naturally interactive instead of structured. I’m not an expert yet – though I hope to be. But from my experience so far, the one thing I would say is that you have to account for the different ways the user might want to search for something in your bot. You may have a flow that you imagine the user going through, but the user might query in unexpected ways. Not making your bot too rigid allows users different ways to express the same thing. They might say “Mexican cuisine”, just the word “Mexican”, or “I really want Mexican right now,” and this is definitely addressable with the Inbox and Understanding tabs in Wit.
What’s your next bot?
It would probably be something along similar lines (specifically focused on food or events). In general, the type of bots I would create are utilitarian ones – stuff that people would find useful in their everyday lives.
What’s your favorite bot that you’ve used?
I’ve only used a few bots so far but probably one called Sensay. I read about it online and happened to meet the founders at TechCrunch Disrupt. I thought it was cool that you could talk to other random people online.
Big thanks to Dhanush for using Wit and spending time with our team.