Video: Live Lab: How to use Prompt Lab and writing prompts | Duration: 33:16 | Summary: Live Lab: How to use Prompt Lab and writing prompts | Chapters: Welcome and Introduction (0:13), Creating Summary Field (4:32), Configuring AI Action (7:38), Contract Summary Population (12:55), Customizing AI Prompts (17:58), Conclusion and Q&A (28:09)
Transcript for "Live Lab: How to use Prompt Lab and writing prompts":
Alright, just give me another minute, and then we'll get started. Okay, let's get started. Hi, everybody. Welcome to our first Live Lab webinar. Today, we're focusing on how to use Prompt Lab within the Agiloft system. A few things to note before we go into the agenda: this will be recorded and emailed to you tomorrow. Please use the chat for any discussion during the webinar, and the Q&A pane for your questions. In a second, I'll take you into the Agiloft community to share some extra information relevant to this webinar. My name is Jack Wicks, and I'm a senior product manager here at Agiloft, responsible for all things AI in the Agiloft product. I'm joined today by Maya Lush. She's a senior legal knowledge engineer who brings her legal experience into the Agiloft product, and she'll be co-hosting with me today. I'm going to take us through how to configure an Agiloft prompt in Prompt Lab, and Maya is going to take you through some prompting tips and tricks to help you improve those prompts in these use cases, and to carry those learnings into other use cases as well. I'm going to jump us into the Agiloft community to start. If you've been in the Agiloft community before, that's great. If not, go to community.agiloft.com or just search "Agiloft community" and join us. We're talking about all things Agiloft in there. I want to call attention to the channels dropdown and the "AI on the Inside" channel. This is the place in the community to talk about everything AI and GenAI. I'll also call attention to the library. In the library, we are adding instructions, a guide to how to use all of our out-of-the-box GenAI Prompt Lab prompts. We're actually going to be going through two of these today, and if you download the PDF, you can follow along.
So you can take the recording of this call and the download of that PDF and add these to your own Agiloft system. Today, we're going to be creating a contract summary prompt and a document and data mismatch prompt. The contract summary will generate a GenAI summary describing the contract document in the contract record you're looking at. The document and data mismatch prompt will compare the information in that contract document against the metadata stored in your Agiloft system, so you can check that the data you've stored in Agiloft, the data you're reporting and searching on, actually matches what's in the agreement. This is especially effective for legacy documents that may not have had all of their information uploaded correctly, for double-checking what a requester has entered against what the contract actually says, or for contracts that have evolved, where the metadata in Agiloft has fallen behind and gotten out of sync. We're looking at both of these, first from how to configure them, and then Maya will take us through how to improve the out-of-the-box prompts; we'll have a look at each out-of-the-box prompt during configuration. Let's start with the contract summary. I'm going to jump into my Agiloft system. I know everyone's Agiloft system is a little bit different, so you'll see similarities and differences with the one I'm using today, but you should all be able to follow this process. Every time we want to add a new prompt from the Agiloft Prompt Lab, we need somewhere for the prompt to output into. So the first thing we're going to do is create a new field for the summary to live in. You might have an existing field that you want the summary posted to, in which case you can skip this step.
For me, I'm going to create a new text field called Contract Summary. We'll just use the default values in this text field; you might want to go through and change some of these. I like to copy the permissions from the Contract Amount field, so I'll do that quickly, and for the display, let's make it three rows and 90 characters, then hit Finish. We did recently improve quite a few of the default values when creating fields, so you can skip a lot of those settings and stick with the defaults. Okay, we've added our Contract Summary field; that's where our summary will be output. Let's add it to the layout as well. We'll go into our layout, choose a place, say below Contract Type, that seems good, search for our summary field, drag and drop it in, and save. So we've got our field and put it on the layout. Now we need to populate that field. I want my contract summary to be populated every time a new contract is uploaded and there isn't a summary there yet. Another option would be an action button, and we'll do that for the other use case. Let's go into our Rules tab; the easiest rule to edit for this is probably the all-edits rule that runs every time a record is edited. Let's find that rule: this one, all edits by Web/API. We'll go into Actions and edit this rule. This rule runs every time a record is edited, and we can use it to add a new section that says: if the record is edited, there's an attachment, and there's no summary, generate the summary. So let's add an if condition: if the Latest Attachment, the attachment we want to use to generate the summary, is not empty, and our new Contract Summary field is empty, then run our GenAI action.
So: attachment not empty, summary empty, then an action to populate that summary. Now we're finally creating our generative AI Prompt Lab action; you'll see it down here as "generative AI action." Click into that and give it a name, "Generate Contract Summary with AI." We'd probably want to give it a description too, but we'll skip that for today. In the Templates tab of the generative AI action, all of our existing templates are listed, but we're going to use an out-of-the-box template. If you click through into the templates prebuilt by Agiloft, you can see our list of out-of-the-box templates; these continue onto the next page as well. We're going to use the simple contract summary. Click on that, copy it over to your edited templates, and give it a rename; I'll just put my name at the end so I know this is the one I'm editing. You get to pick your model. Right now, the choices are GPT-3.5 and Claude 3 Haiku, and we'll be adding new models to this list. I'm going to pick Claude 3 Haiku; it's got a larger context window, which means it can accept larger documents, so it's probably the one you'd pick right now. If you wanted to bring your own generative AI model from OpenAI, Anthropic via AWS, or Azure OpenAI, you could click into "existing enterprise account" and add it there. So I've chosen my model; now let's go through to the prompt. We've got a bit of legal boilerplate here to make sure you understand what you're doing by using generative AI. We don't want that in the prompt itself, though, so once we've read it, let's delete it and look at our prompt. We're not going to change the content of this prompt here; Maya's going to cover that. What we are going to do is configure this action. We need to pick an output field, and luckily we already created our Contract Summary output field.
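The model choice above comes down to context window size. A rough pre-check of whether a document is likely to fit can be sketched as below; the token limits are illustrative placeholders (check your provider's documentation), and the "about four characters per token" ratio is a common rule of thumb, not an exact tokenizer count.

```python
# Rough feasibility check before sending a document to a GenAI action.
# Limits below are illustrative assumptions, not official figures.
CONTEXT_LIMITS_TOKENS = {
    "gpt-3.5": 16_000,
    "claude-3-haiku": 200_000,
}

def fits_context(document_text: str, model: str,
                 prompt_overhead_tokens: int = 2_000) -> bool:
    """Return True if the document plus prompt likely fits the model's window."""
    est_tokens = len(document_text) // 4  # ~4 characters per token heuristic
    return est_tokens + prompt_overhead_tokens <= CONTEXT_LIMITS_TOKENS[model]
```

A long contract that overflows the smaller model's window may still fit the larger one, which is why the bigger context window is the safer default here.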
Then we need to give it a quick test. I'm going to choose a contract in my Agiloft system; I'll just pick the top one, it doesn't really matter as long as it has an attachment, and hit Generate. We're going to look for two things: the quality of the output, and the quality of the input as well. Right now, we have the output, but when testing, we don't really want to look at the output first; we want to look at the input to make sure we've got our prompt correct. So let's hit Show Process. If we scroll down, it shows us exactly what was sent as the input, right here. We can see, okay, this is our prompt, this looks correct. But right at the bottom, where the contract is supposed to go, there's nothing listed. We actually haven't sent the contract as part of the input, so we need to do a little bit of troubleshooting and find out what went wrong. Let's scroll up and have a look. We've got this Agiloft field reference called Current Contract Document. Now, like I said at the start, everyone's Agiloft system is a little different, and I think this field doesn't actually exist in this system. Let's look up our fields, which we get to by hitting Formula Help; these are the lists of fields in Agiloft. A quick Ctrl+F for "current contract document" shows it's not a field in this system, so no data was pulled into our prompt. I know the name of the field we should have here: Latest Attachment, right there. You might have to look this up in the field wizard, but chances are, in your KB, it's either going to be called Current Contract Document or Latest Attachment.
Latest Attachment was the name in an older contract template, and Current Contract Document if you've got a newer Agiloft. Let's copy that in. What this references is the last attachment uploaded to this contract. So we've got Latest Attachment; let's hit Generate again, and then we'll have another look at the input and, hopefully, the output as well. It's taking a bit longer this time, which probably means it's doing the right thing. Okay, we've got our output. Scrolling down to our input, we can see here, after the word "contract," we've actually got the entire contract document. So the correct context, the correct input, was sent to the AI, and we could evaluate the output now, but I'll leave it for Maya to discuss how you might make improvements to this prompt to improve the output. Let's hit Finish. That's going to save... oh, okay. That's what I get for doing a live demo: this is giving me an error. Let's see if I can correct that; I might have to recreate the action real quick. Let me check, "Generate Contract Summary," how far through the save did I get? Okay, so it saved and just gave me an error. Error again. Okay, that's fine, it's saved. We'll hit Finish to add that action into our rule, and Finish again to save the rule. Now we can test it. Let's create a new contract and fill out the minimum amount of information: a contract summary record, make it a third-party agreement, and drop a file in. We'll hit Upload and Analyze, and we can see our Contract Summary field here is blank; that's the new field we added. This Upload and Analyze button will upload the file to our KB and will hopefully also cause that edit rule to run, populating the contract summary. I think this Upload and Analyze button also runs a bunch of other AI extractions in this KB.
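The troubleshooting above, where an unresolved field reference silently produced an empty input, can be sketched generically. This is not Agiloft's actual template engine: the `$field` syntax and the `render_prompt` helper are illustrative assumptions, but the failure mode they demonstrate (a placeholder naming a field that doesn't exist in this KB) is exactly what happened in the demo.

```python
import re

def render_prompt(template: str, record: dict):
    """Substitute $field placeholders with record values, and collect any
    placeholders that don't match a field on the record. An unmatched
    placeholder renders as empty text, which is why the document was
    silently missing from the model input."""
    missing = []

    def repl(match):
        name = match.group(1)
        if name not in record:
            missing.append(name)
            return ""  # unresolved reference -> empty input
        return str(record[name])

    rendered = re.sub(r"\$(\w+)", repl, template)
    return rendered, missing
```

Checking the `missing` list before sending anything to the model is the programmatic equivalent of Jack's "look at the input first" advice.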
Of course, your KB might have different contract submit buttons than we have here. As this takes a little bit to run, and I'm not sure what else it's doing in this KB, it's probably a good time to point out that it's worth testing a prompt multiple times with different contracts while you're in Prompt Lab, to make sure you're getting the output you're looking for. You want these prompts to be generic enough to work across different agreements. Alright, while this is running, I'll show you an example of one of the PDFs; in fact, we'll look at the contract summary one. You can go through to these PDFs in the Agiloft community, click through, and open them up. You can see for each of these, and this is quite a simple one, it walks through the steps we just walked through: creating a new field, editing the all-edits-by-Web/API rule, and creating the prompt. It explains what it solves and how it solves it. We've got one of these for six of the out-of-the-box prompts we have today, and we'll be continuing to add them for the remainder of the prompts. Let's have a look back in here and... oh, our contract summary didn't populate. I think we can force it. Let me make a quick edit and save again; that should cause the all-edits rule to run. No, that didn't work either. That's fine, let me do a quick bit of troubleshooting to fix this. I love live demos. We got an error in the middle. Okay, back into our edit action. Let's just take this out of our if condition; maybe that was the problem. And test again: I'll make another edit, save it... still nothing. Okay, that definitely should have worked. It's probably something to do with the error message I was getting when saving the summary action. Let's have a quick check again.
Alright, one last bit of troubleshooting before we move on. There we go, that was it: something wrong with that action when I created it. So we've now got our contract summary populated into our field in Agiloft, and we might want to display this summary in different places. Right now it's on the record: as soon as you open the record, you can view the summary. Perhaps you want to include this field in notification emails, so people can get a quick idea of what the contract is about; approvals are a great place for this. And I think the final great place to show this contract summary is in the e-signature envelope. You could edit the template for the e-signature envelope and include this for internal signers, and perhaps create a different summary for external signers. And speaking of creating different summaries for different roles, I'll hand it over to Maya to talk about how you would modify the prompts. Thanks, Jack. Okay, so what I'm going to talk about now is how we can customize our out-of-the-box prompts, or create new prompts, in order to get more targeted results. Jack already took us through a bunch of the possibilities and use cases for our prompts. Whether we're modifying them or starting from scratch, the best practices always remain the same. So we're back in our out-of-the-box contract summary prompt, and we can easily tailor it for specific use cases. The key thing to think about when it comes to prompting is that our prompts need to be really clear and direct, almost as if we're giving instructions to a junior team member.
We want to make sure there's no subjectivity, no room for interpretation, and that we give the AI the exact instructions it needs to follow consistently and properly. As we have here, you can see a number of elements, and these are the kinds of elements we always want to consider including in our prompts to get the most effective results. That would be the task and, optionally, a role. Then we want to make sure we provide inputs and context, we add all of our rules, and we specify exactly what output we're expecting. And then, like Jack said, we validate and make sure everything is working. So the framework is really important to follow: we want those elements included, at least to some degree, based on the use case. Maybe, starting from the out-of-the-box prompt, we just want to make a simple change and see clause numbers or section numbers. That would be as simple as instructing it, as long as we're being clear: something like, "include the clause number or section heading where the information was found." Or maybe, instead of a list format, we want a narrative summary, depending on what we're using it for. Again, we just have to be clear with the instructions: "provide a four to five sentence summary in plain English," and make sure we're changing the rest of the prompt to match the new instructions. These kinds of small changes really affect the output. So: think about what you're trying to achieve with the prompt, then instruct exactly how you want to see it. One way to use this is to generate summaries for different audiences; maybe we want one for the legal team and one for procurement. All we have to do is copy our out-of-the-box prompt, which is where I started from.
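The elements Maya lists, task, context, rules, and expected output, can be sketched as a small prompt builder. This is a hedged illustration: the section labels and the `build_prompt` helper are assumptions for the sketch, not Agiloft's template format.

```python
def build_prompt(task: str, context: str, rules: list, output_format: str) -> str:
    """Assemble a prompt from the elements discussed above:
    the task, inputs/context, rules, and the expected output format."""
    return "\n\n".join([
        "Task:\n" + task,
        "Context:\n" + context,
        "Rules:\n" + "\n".join("- " + r for r in rules),
        "Output format:\n" + output_format,
    ])
```

Tailoring the same prompt to a different audience then amounts to swapping one lever, for example changing the output format to "provide a four to five sentence summary in plain English," while leaving the rest of the structure intact.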
You can see the structure is very much the same, but this time we're adjusting our template to ask it to focus on risk, on what the legal team might care about. And this output, again, has the same structure and includes all of those key elements, but this time we're asking for a different output targeted at the legal team. This can be expanded and changed depending on why we want it. And quickly, I'll look at the same sort of thing for a procurement use case. Very similar, using the same format, but just by changing these few levers, even from our out-of-the-box prompt, we can see pretty significant differences in the results we're getting. So again, when prompting, think about all of those different levers, decide what you want, and make sure you're instructing really clearly and directly for the output and for the exact task you're asking the AI to do. Now I'll send it back over to Jack. Yep, thanks, Maya. Hi, everyone. In the interest of speed, I've pre-created an Agiloft field for the data mismatch prompt. So let's talk about the document and data mismatch prompt. The idea here is to use AI to identify the differences between the document and the data stored in Agiloft. Setting that up in Agiloft is the same process as the contract summary: we need to create a new field for the document mismatch, and I've already created that field. In this case, let's create an action button instead of a rule to run our Prompt Lab action. I'll go back into Fields, create an action button called "Run Document Data Mismatch Prompt," and create our action: a new generative AI action, also called "Run Document Data Mismatch Prompt." And we're back in our GenAI templates.
We'll go into the templates prebuilt by Agiloft, choose the Validate Contract Fields prompt, our document mismatch prompt, and copy it into our Agiloft system. We'll give it a name, make sure we've chosen Claude 3 Haiku, the model with the largest input size, and go into the prompt. We delete the legalese at the top, and remember from last time that we need to choose the correct field that our contract is stored in: it's probably going to be called Latest Attachment in your Agiloft system. Click on that, and pick the output field. Like I said, I've already created an output field for this; it's the same as the summary field, a text field with three rows. And we'll do our quick test, checking both the output and the input. A quick look at the input shows we've got our contract here, but we're supposed to have some fields in here too. Now, it might be that the fields referenced here are empty, which is the case for this example, or it might be that they're referenced incorrectly. I'd want to test this prompt with other contracts in the system to work out which it is; in this case there's just no value, and they are referenced correctly. So let's save our prompt, and again, Maya will come back to making edits here. We'll get that error we saw last time. Yes, it has saved, though, so let's go and find our document mismatch prompt. Oh, I guess it hasn't saved. There we go, now it's saved. "Run Document Mismatch Prompt" will run when this action button is clicked. I mentioned default values earlier; the default values for action buttons are really good now, so we can just save it and test it as an admin. I'll add that button to our layout, in a new tab I've added called Document Mismatch.
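In the product, the model itself performs the comparison, but the underlying idea, checking stored metadata against values found in the document, can be sketched deterministically. The field names, the helper, and the "No errors found" reply below are illustrative assumptions mirroring the demo, not the product's actual logic.

```python
def find_mismatches(stored: dict, extracted: dict) -> list:
    """Compare metadata stored in the system against values pulled from
    the document; report mismatches and fields not found in the document."""
    report = []
    for field, stored_value in stored.items():
        doc_value = extracted.get(field)
        if doc_value is None:
            report.append(field + ": not found in document")
        elif str(doc_value).strip().lower() != str(stored_value).strip().lower():
            report.append(
                field + ": stored '" + str(stored_value)
                + "' vs document '" + str(doc_value) + "'"
            )
    return report or ["No errors found"]
```

A fixed "No errors found" reply when everything matches is what makes the results easy to filter on later, as Jack does with the output field in the contracts table.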
Let's put that action button and the output field on our layout. And I'm actually going to go one step further, because we're going to want to run this prompt on multiple records in our Agiloft system. So I'm going to add that action button not only to the layout of a single record, but also to the action bar in our contracts table. This is a little tip, if you didn't know you could do this: you can add action buttons to your action bar, and I'll show you where this shows up in the Agiloft system. Let's find our data mismatch action button, add it to the end of the action bar, and hit Finish. By adding it to the action bar, when I go into the contracts table, you'll see I've got this action up here, and if I select multiple records, I can run this prompt on all four of those records at once. So I might do a search to find a whole batch of legacy contracts I've uploaded that have potentially incorrect values, and then run the prompt on multiple contracts at once. If I refresh this view, you can see we've got the outputs here, and in these cases, there are errors found for each of them. I think the prompt replies "No errors found" if there were no errors, so we could filter on this field to find the ones that have errors, then click in and review them. I'll hand you back over to Maya to talk about editing this prompt. Thanks, Jack. I do want to make sure we leave a little bit of time for questions, so I'll go through this really quickly. Remembering how we did it with the contract summary prompt, this time we're starting with the out-of-the-box data mismatch prompt, and by following that structure and those rules, we can see how to change the use we put it to.
So Jack took us through what you'll see with the out-of-the-box prompt. Maybe I want to see more detail: matches, mismatches, and not-found fields; suggested changes; and all of the records that are in the contract or in the KB. Those are all things I can prompt for, simply by including or changing the task, the context, the rules, and the format. You can see here how I've added additional rules telling the AI how it should answer the question, and also given it a different format that I want to see. When specifying any sort of structured format, it can be really helpful to give examples: the more context you give, and the clearer that context is, the more consistent and reliable your outputs will be. I'll stop sharing so we can open it up for some questions. Jack, back over to you if you have anything to add, or just take us right through the questions. Yeah, thanks, Maya. I'll take us through a couple of questions that have come in. The first one is for me: when can we expect new models in Prompt Lab? We're working on releasing GPT-4.1 Mini and GPT-5 Mini into the Agiloft-provided models; the plan is for October, and we should be on track for that. When you look at the data, those new models are almost ten times smarter than GPT-3.5, and smarter than Claude 3 Haiku as well, so that will unlock some exciting new Prompt Lab use cases: you might be able to use them for more complex documents and more complex prompts. You still have to be wary of the context window size, that is, the size of the document you can submit in one request. It still hovers around 100 to 150 pages for those models, so keep that in mind when you're working with larger contracts. But look out for new Agiloft-provided GPT models coming soon. Now a question for Maya.
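Maya's point about giving the model a worked example of the expected format can be sketched by appending a few-shot example to the prompt. The helper and the example text below are invented for illustration; they simply show the mechanic of pinning down a structured output.

```python
def with_format_example(prompt: str, example_output: str) -> str:
    """Append a worked example of the expected output format, which tends
    to make structured outputs more consistent across runs."""
    return prompt + "\n\nExample of the expected output format:\n" + example_output

# Hypothetical example output for a data mismatch check.
EXAMPLE = (
    "Counterparty: MATCH\n"
    "Contract Amount: MISMATCH - stored $50,000, document says $55,000\n"
    "Renewal Date: NOT FOUND"
)
```

Showing one fully worked row per status (match, mismatch, not found) gives the model a concrete pattern to copy, rather than leaving the format open to interpretation.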
How can I improve my prompting skills? What resources or methods are best? The number one method is really practice. You learn a lot as you iterate, practice your prompting, and assess the results; you really start to get a feel for what works best and what doesn't. There are also some slight nuances, and this is where we get into really technical prompting, between best-practice prompting for Claude versus GPT. There are differences and distinctions I didn't get into today, but if you know you're going to be using one of those models, you can structure your prompt in the way that best suits that specific model and helps you most accurately get what you want out of it. To learn those best practices, besides practice, there are so many resources online: both OpenAI and Anthropic have a ton of simple, short-form, almost bulleted lists of best practices for each model. Learn from those and keep practicing. Yep, definitely. I used all of those resources during my AI journey as well, so I'm very familiar with them; good shout-out. Okay, it looks like those were the two questions we had today. If you've got future questions about this, jump over to the Agiloft community and post in there; Maya and I will be in there answering questions, and so will the rest of the community. Explore what other Agiloft users are doing with Prompt Lab, look out for new prompt templates, and join us over there. Thank you very much for joining the first Live Lab session. Look out for future Live Lab sessions; I believe they're happening every month, so watch for information and invites. Thanks, everyone.
Have a good, rest of your day. Bye.