So I know I have discussed this lightly here before, but I would like to focus more on the nuts and bolts of an internal search feature in the bot. I think it could be accomplished by adding a keyword section to each gambit, plus a short description area (able to be toggled on and off) that would display to the end user. Then we add keywords to each gambit and offer a simple search feature off of that. To take it further, we could add a search button built into the TARS display that users can select regardless of where they are in the bot. A search feature is typically what other bot techs out there seem to focus on as their primary use case (personally I think that is silly, because there is already a search function on our websites, and with API integration we could replicate that in TARS while still having all the other original features other bot techs do not offer), but for large bot builds, an internal search could drastically speed up getting users to the information they need within mega bots. I have noticed a trend where users back up through multiple conversation flows because they are not quite sure where they need to go. In the beginning I supplemented this by adjusting wording and options, but you can only do so much without affecting the users who CAN find what they are looking for.
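To make that concrete, here is a rough sketch in Python of what a keyword index over gambits and a simple search against it could look like (the gambit names, keywords, and descriptions are made up for illustration; TARS would need to support something equivalent natively rather than in code):

```python
# Hypothetical sketch of a keyword index over gambits and a simple search
# against it. Gambit names, keywords, and descriptions are made up.

GAMBITS = {
    "real_id_info": {
        "description": "When you need a REAL ID to fly",
        "keywords": {"real id", "flight", "fly", "airport"},
    },
    "baggage_rules": {
        "description": "What you can pack in carry-on and checked bags",
        "keywords": {"baggage", "carry-on", "luggage", "pack"},
    },
}

def search_gambits(query):
    """Return (gambit_id, description) pairs whose keywords appear in the query."""
    text = query.lower()
    return [(gid, info["description"])
            for gid, info in GAMBITS.items()
            if any(keyword in text for keyword in info["keywords"])]

print(search_gambits("When do I need a REAL ID to fly?"))
# -> [('real_id_info', 'When you need a REAL ID to fly')]
```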
With our current bot system BEN, we could use this, but I wouldn’t add keywords to every gambit, because there are some areas I don’t want users to jump to out of the conversation flow. Being able to strategically identify which gambits can be jumped to by the user will make the search useful without becoming troublesome. The real question is how to jump to those gambits without interrupting the process by reloading a bot from a startgid, or how to do it internally in a bot deployed as a widget or embed. What do you think about that challenge @vinit?
Now that we have the conditional jump, I believe we could combine the auto-suggestion with the conditional jump to accomplish an internal search… It might be a touch difficult, but I will experiment with it. Maybe you could add an option to include tags (not visible to the user) on a list item, so that when the user starts to type, it pulls up suggestions based on those tags as well as keywords; then we use the conditional jump to branch users based on their selection. I believe adding tags would enhance the auto-suggestion in itself as well. Thoughts @vinit?
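Here is the rough behaviour I am imagining, sketched in Python (illustrative only, not an existing TARS feature; the labels, tags, and jump targets are placeholders):

```python
# Illustrative only, not a TARS feature: each suggestion carries hidden tags
# used for matching while the user types, plus a hidden jump target that the
# conditional jump could branch on once the user picks a suggestion.

ITEMS = [
    {"label": "REAL ID requirements", "tags": ["flight", "fly", "airport"], "jump_to": "real_id_gambit"},
    {"label": "Baggage rules",        "tags": ["luggage", "carry-on"],      "jump_to": "baggage_gambit"},
]

def suggest(typed):
    """Match on the visible label OR on the hidden tags."""
    needle = typed.lower().strip()
    return [item for item in ITEMS
            if needle in item["label"].lower()
            or any(needle in tag for tag in item["tags"])]

def jump_target(selected_label):
    """The value a conditional jump would branch on after a selection."""
    for item in ITEMS:
        if item["label"] == selected_label:
            return item["jump_to"]
    return None

print([m["label"] for m in suggest("fly")])   # -> ['REAL ID requirements']
print(jump_target("REAL ID requirements"))    # -> 'real_id_gambit'
```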
Interesting idea, I can see that it could be quite useful.
Currently there is a way to add an Option Val (uval) for the options in the Auto-Suggestion list by using a triple hyphen (---). So if you put a list like this in the auto-suggestion input field:
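```
Monday---1
Tuesday---2
Wednesday---3
Thursday---4
Friday---5
Saturday---6
Sunday---7
```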
You will have the days of the week as the options in the Auto-Suggestion list, with the numbers set as their Option Vals. So if the value of {{ursp.gid_n}} is Wednesday, then the value of {{uval.gid_n}} will be 3.
Now, this Option Val is hidden from the user, but currently the auto-suggestion does not search against it when the user types.
Also, the Option Val has a specific purpose, identifying the option selected by the user via an ID, so it does not make sense to put search tags in it. Maybe another field, similar to the Option Val, could be added to each option for this purpose.
Yes, I am thinking of something similar to uval in implementation that also works with the search in the auto-suggestion. My makeshift search isn’t perfect by any means, but it is very effective in allowing the user to feel more like they are chatting with the bot. If the user does not select an item from the suggested list, regardless of what they type, it sends them by default to a duplicate gambit that previews the list. If they still fail to choose from the list, I give them an option to submit their question to a representative. Like I said, not perfect, but still a good start. Looking forward to growing this capability further!
Man, this is so useful and on point.
Without the hassle of writing much and without the frustration of getting those pesky “I don’t understand you” errors.
I love the simplicity of using this bot. This can totally replace the FAQ section of a site with a much more interesting and interactive system. It could also hold the whole knowledge base for something, exposed in a much more navigable way.
Thanks! Maybe you can help me with something involving this. I tried to add a final condition at the end of the list that looked for keywords I identified as being the most likely searched-for subjects (flight, fly, flying, airport, plane, airline); each keyword was an “OR” condition in the same branch. I had “contains” selected in the hope that, after filtering through all the other conditions, if a user wanted to know when they need a REAL ID to fly (the most common question) but typed it in a way that missed the auto-suggestion on that topic, the condition would pick up those keywords and send them to the info anyway. I couldn’t get the condition to pick up any words when “contains” is selected. Any idea why that may be?
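For reference, the behaviour I was expecting from that final condition is roughly this (a plain-Python sketch of “contains any of these keywords”, not how TARS evaluates conditions internally):

```python
# Sketch of the check I expected: does the user's text contain ANY of the
# keywords? (Plain Python for illustration, not how TARS evaluates conditions.)

KEYWORDS = ["flight", "fly", "flying", "airport", "plane", "airline"]

def matches_any(user_text):
    text = user_text.lower()
    return any(keyword in text for keyword in KEYWORDS)

print(matches_any("When do I need a REAL ID to get on a plane?"))  # -> True
print(matches_any("What can I bring in my carry-on bag?"))         # -> False
```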
Hi @Levi,
Your bot construct is really impressive! Good job!
I was talking to clients yesterday, and they were actually looking for a similar experience. I will try to add something similar to my large HR bot in order to provide a smoother experience.
Thanks for leading the way on this! Have you had any further experience since then that you would like to share?
Cheers,
Kevin
I’ve been experimenting lately with logic jumps based only on keyword searches. The technique could emulate an actual chat conversation if done correctly, but it would take a lot of strategy around what the bot asks of users and capturing all the ways they could possibly respond. The good thing is we receive all the user input, so once a bot is live, we can adjust the keywords to capture what users typed that we didn’t think of.
I currently have plans to implement a search-type function in my bot system that works off keyword searches to jump users to different gambit start points. I believe I will have around 30 possible jump points based solely on what the user types in their own words. It will take a lot of work to fine-tune, but once I have the process down, I could replicate the steps in an article for the TARS community.
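The routing I have in mind is basically a big keyword-to-gambit map, something like this toy sketch (the gambit IDs and keywords are placeholders; in the actual bot this would be built out of “contains” conditions and conditional jumps, not code):

```python
# Placeholder sketch of keyword-driven jumps: the first group whose keywords
# appear in the user's text decides which gambit the conversation jumps to.
# Gambit IDs and keywords are invented; in TARS this would be a stack of
# "contains" conditions feeding a conditional jump rather than code.

JUMP_TABLE = [
    ({"real id", "license"},     "real_id_gambit"),
    ({"flight", "fly", "plane"}, "flight_info_gambit"),
    ({"park", "garage"},         "parking_gambit"),
    # ... roughly 30 of these in the full build
]

FALLBACK_GAMBIT = "preview_list_gambit"   # the duplicate gambit that previews the list

def route(user_text):
    text = user_text.lower()
    for keywords, gambit in JUMP_TABLE:
        if any(keyword in text for keyword in keywords):
            return gambit
    return FALLBACK_GAMBIT

print(route("Where do I park my car?"))   # -> 'parking_gambit'
print(route("Tell me a joke"))            # -> 'preview_list_gambit'
```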
That is a really interesting approach towards making an intelligent-looking bot assistant, without any AI/ML but with a lot of configuration. The good thing about this approach is that you are in complete control of the conversation flow, and it won’t fall back to some default non-answer.
Indeed, I think that as a first step, configuring the bot with complex keyword recognition can provide interesting perspectives while limiting the downside of keywords going unrecognized by the bot. However, I can see that becoming an issue with larger bot configurations. In any case, these are first steps in the right direction.
To follow up on this project, I started adding keywords based more on user interaction. The keywords (though time-consuming to build out) work pretty well in limiting the number of users that get auto-filtered to the gambit with the previewed list (or to a second failed-attempt prompt that allows them to submit their own question). I accomplished this by:
looking through the user interactions on my dashboard
analyzing the text from users that typed custom text instead of selecting from the list
picking keywords based on user questions and adding them to the list as they came up
Now, some strategy is required when common words spring up in multiple places and the logic may jump to the wrong answer, but I used AND to group two or more keywords that would likely be used together to ask a specific question. This seems to work pretty darn well.
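In code terms, the difference is roughly this (a toy sketch, not the TARS condition builder): a branch only fires when ALL of its grouped keywords are present, so a common word can’t hijack the jump on its own.

```python
# Toy sketch of AND-grouped keywords: a branch fires only if EVERY keyword in
# its group appears in the user's text, which keeps a common word from
# hijacking the jump on its own. Keywords and gambit names are illustrative.

AND_BRANCHES = [
    ({"id", "fly"},   "real_id_gambit"),
    ({"id", "renew"}, "renewal_gambit"),
]

def route_with_and(user_text):
    text = user_text.lower()
    for keywords, gambit in AND_BRANCHES:
        if all(keyword in text for keyword in keywords):
            return gambit
    return None   # fall through to the previewed-list gambit

print(route_with_and("Do I need a REAL ID to fly?"))   # -> 'real_id_gambit'
print(route_with_and("How do I renew my ID card?"))    # -> 'renewal_gambit'
print(route_with_and("What is a REAL ID?"))            # -> None
```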
I was also asked to help build an FAQ bot for another division. My favorite part of that bot is that it prompts the user for their email address at the start but also gives them the option to skip. If they skip but later end up submitting a question they couldn’t find an answer to, the conditional logic reads the “pass” (in this case you leave the value empty, as I found out) and sends them to a gambit where they are required to enter their email to submit the question. If they already entered their email at the beginning, no problem! It submits their question without annoying the user by asking for their email again. Pretty smooth trick to keep the user experience seamless! This is also a great trick for lead gen: by offering a chance to skip at the beginning, it may feel less “pushy” to the user, but once they are sold on your pitch, you can grab their email if they passed earlier.
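The whole trick boils down to one condition, sketched here in Python with made-up gambit names:

```python
# Sketch of the skip-then-require email flow. Gambit names are placeholders;
# in the bot it is a single condition checking whether the stored email is empty.

def next_gambit_after_question(stored_email):
    """Decide where to send a user who just submitted a question."""
    if stored_email.strip() == "":        # the user skipped the email at the start
        return "require_email_gambit"     # ask for it now so the question can be routed
    return "submit_question_gambit"       # already have it, so do not ask again

print(next_gambit_after_question(""))                   # -> 'require_email_gambit'
print(next_gambit_after_question("user@example.com"))   # -> 'submit_question_gambit'
```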
Loved the trick of re-asking for the email that was skipped earlier. I think this goes back to deconstructing what is going on in the user’s mind when they are in a conversation with the bot: try a soft close earlier, and come back to it later if required, after creating more value for the user.
I feel a lot more can be done in this direction: to start with, coming up with more such techniques, and then later structuring all of them into some framework that results in an engaging conversation.
@Levi, feel free to open a new topic here that discusses this specifically. I will share some of my own tips that I gathered while making the early bots in code.