From 03657775a3f2b40df7baede7db0bcefc6c3f9984 Mon Sep 17 00:00:00 2001
From: Noah Ziems
Date: Mon, 25 Mar 2024 08:20:35 -0400
Subject: [PATCH] Create 2024-03-25-ziems.md

---
 _posts/2024-03-25-ziems.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)
 create mode 100644 _posts/2024-03-25-ziems.md

diff --git a/_posts/2024-03-25-ziems.md b/_posts/2024-03-25-ziems.md
new file mode 100644
index 0000000..a7ef87a
--- /dev/null
+++ b/_posts/2024-03-25-ziems.md
@@ -0,0 +1,16 @@
+---
+layout: post
+title: Noah Ziems
+---
+
+Lunch at 12:30pm, talk at 1pm, in 148 Fitzpatrick
+
+Title: Inducing Complex Instructions
+
+Abstract: Instruction following has emerged as an effective alternative to fine-tuning large language models (LLMs), as it is interpretable, requires fewer resources, and requires less expertise to implement than fine-tuning.
+In practice, however, many tasks require complex instructions made up of many smaller atomic instructions, which are difficult to write with limited prompt engineering experience.
+Further, the more complex the task is, the more trial and error is needed to accommodate the corner cases.
+Existing approaches to this problem use Instruction Induction, where input-output pairs for a given task are provided with the goal of learning the instruction that maps from inputs to outputs. However, existing approaches struggle significantly to induce an effective instruction for complex tasks.
+In this presentation, I cover the existing approaches to Instruction Induction/Prompt Optimization, introduce my proposed approach to addressing their limitations, and explore the exciting downstream datasets that can be created for complex instruction induction.
+
+Bio: Noah Ziems is a second-year PhD student in the Department of Computer Science and Engineering at the University of Notre Dame, and is a member of Dr. Meng Jiang's DM2 lab. His research focuses on instruction induction/prompt engineering, large language models, and information retrieval.