Tags: c# · botframework · azure-language-understanding · adaptive-dialogs

Bot framework - Adaptive dialog


I am working with a Bot Framework adaptive dialog and have an issue getting the intents and resolved entities when reading the LUIS data via the recognizer. I am only getting the top-scoring intent in the response when reading "turn.recognized" in the child adaptive dialog. I have migrated my LUIS app to v3 and set the IncludeAllIntents property to true when calling LUIS. Did I miss setting a property on the LuisAdaptiveRecognizer? Could anyone help me resolve this? I have a scenario where the bot needs to check the second top-scoring intent. Is this an issue with adaptive dialogs?

I followed the MS docs to build the adaptive dialog bot.

One more thing: is there any way to extract the LUIS-resolved entities as a RecognizerResult from the result of turn.recognized?

Root dialog:

var rootDialog = new AdaptiveDialog(nameof(AdaptiveDialog))
{
    Recognizer = new LuisAdaptiveRecognizer()
    {
        ApplicationId = Configuration["LuisAppId"],
        EndpointKey = Configuration["LuisAPIKey"],
        Endpoint = Configuration["LuisAPIHostName"],
        PredictionOptions = new Microsoft.Bot.Builder.AI.LuisV3.LuisPredictionOptions
        {
            IncludeAllIntents = true,
            IncludeInstanceData = true,
            IncludeAPIResults = true,
            PreferExternalEntities = true,
            Slot = "producton"
        }
    },
    Triggers = new List<OnCondition>()
    {
         new OnIntent("Greetings")
        {
            Actions = new List<Dialog>()
            {
                new SendActivity("${HelpRootDialog()}")
            }
        },
    },
};

Child dialog:

public FindLinks(IConfiguration configuration) : base(nameof(FindLinks))
{
    _configuration = configuration;
    this._findLinksDialog = new AdaptiveDialog(nameof(FindLinks))
    {
        Triggers = new List<OnCondition>()
        {
            new OnBeginDialog()
            {
                Actions = new List<Dialog>()
                {
                    new CodeAction(ResolveAndSendAnswer)
                }
            },
        }
    };

    AddDialog(this._findLinksDialog);
    InitialDialogId = nameof(FindLinks);
}

private async Task<DialogTurnResult> ResolveAndSendAnswer(DialogContext dialogContext, System.Object options)
{
    JObject jObject;
    IList<string> queries = new List<string>();
    dialogContext.State.TryGetValue("turn.recognized", out jObject);

    // ... This is how I resolved the LUIS data from the turn.
}
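Regarding the second question (extracting the LUIS result as a RecognizerResult): since turn.recognized is surfaced in dialog state as a JObject, one approach is to deserialize that JObject back into the SDK's RecognizerResult type. A minimal sketch, assuming the standard Microsoft.Bot.Builder and Newtonsoft.Json.Linq namespaces are available (note that, per the answer below, the Intents collection will still only contain the single top-scoring intent):

```csharp
using Microsoft.Bot.Builder;   // RecognizerResult, GetTopScoringIntent()
using Newtonsoft.Json.Linq;    // JObject

// Inside ResolveAndSendAnswer, after reading turn.recognized:
if (dialogContext.State.TryGetValue("turn.recognized", out JObject recognized))
{
    // Deserialize the raw JObject into the SDK's strongly typed result.
    var recognizerResult = recognized.ToObject<RecognizerResult>();

    // Entities is a JObject holding the resolved entities.
    JObject entities = recognizerResult?.Entities;

    // Convenience extension for the (single) top-scoring intent.
    var (intent, score) = recognizerResult.GetTopScoringIntent();
}
```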

Solution

  • Unfortunately, adaptive dialogs are designed to include only one intent in turn.recognized, regardless of what kind of recognizer you use. You can see that in the source code here:

    result.Intents.Clear();
    result.Intents.Add(topIntent, topScore);
    

    It looks like the only place the other intents can be accessed is in your telemetry. So you have a few options, though I know they're not ideal.

    1. Call your LUIS endpoint explicitly instead of relying on LuisAdaptiveRecognizer. This could be done using an HTTP request as an action inside your adaptive dialog, or it could be done outside the dialog.
    2. Load the extra intents from the logged telemetry. This would perhaps be easiest if you made a custom telemetry client that made the data available in your bot's local memory.
    3. Make a feature request on GitHub, asking them to make all intents available in adaptive dialogs: https://github.com/microsoft/botbuilder-dotnet/issues/new/choose
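
    For option 1, here is a rough sketch of calling the LUIS v3 prediction REST endpoint directly with HttpClient so that all intents come back. The URL shape follows the documented v3 prediction API; the endpoint, app ID, key, and slot values are assumed to come from your existing configuration, and the response is returned as raw JSON for you to inspect:

    ```csharp
    using System;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Newtonsoft.Json.Linq;

    public static class LuisDirectClient
    {
        private static readonly HttpClient Http = new HttpClient();

        // endpoint e.g. "https://westus.api.cognitive.microsoft.com"
        public static async Task<JObject> PredictAsync(
            string endpoint, string appId, string key, string slot, string query)
        {
            // show-all-intents=true is what surfaces every intent with its score,
            // not just the top-scoring one.
            var url = $"{endpoint}/luis/prediction/v3.0/apps/{appId}/slots/{slot}/predict" +
                      $"?subscription-key={key}&show-all-intents=true" +
                      $"&query={Uri.EscapeDataString(query)}";

            var json = await Http.GetStringAsync(url);
            var result = JObject.Parse(json);

            // result["prediction"]["intents"] now contains all intents,
            // so the second top-scoring intent can be picked out here.
            return result;
        }
    }
    ```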