Create 2024-03-25-ziems.md
Ziems authored Mar 25, 2024
---
layout: post
title: Noah Ziems
---

Lunch at 12:30pm, talk at 1pm, in 148 Fitzpatrick

Title: Inducing Complex Instructions

Abstract: Instruction following has emerged as an effective alternative to fine-tuning large language models (LLMs), as it is interpretable, requires fewer resources, and demands less expertise to implement than fine-tuning.
In practice, however, many tasks require complex instructions made up of many smaller atomic instructions, which are difficult to write with limited prompt engineering experience.
Further, the more complex the task, the more trial and error is needed to accommodate corner cases.
Existing approaches to this problem use Instruction Induction, where input-output pairs for a given task are provided with the goal of learning the instruction that maps inputs to outputs. However, existing approaches struggle significantly to induce an effective instruction for complex tasks.
In this presentation, I cover the existing approaches to Instruction Induction and Prompt Optimization, introduce my proposed approach to address their limitations, and explore the exciting downstream datasets that can be created for complex instruction induction.
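As a rough sketch of the Instruction Induction setup described above (a hypothetical helper, not code from the talk): given input-output demonstrations, one builds a meta-prompt asking an LLM to state the instruction that maps inputs to outputs.

```python
# Hypothetical illustration of instruction induction: format input-output
# pairs into a meta-prompt that asks an LLM to recover the instruction.
def build_induction_prompt(pairs):
    """Turn (input, output) demonstrations into an induction query string."""
    demos = "\n".join(f"Input: {x}\nOutput: {y}" for x, y in pairs)
    return (
        "I gave a friend an instruction. Based on the instruction, "
        "they produced the following input-output pairs:\n\n"
        f"{demos}\n\n"
        "The instruction was:"
    )

prompt = build_induction_prompt([("2 + 2", "4"), ("7 + 5", "12")])
print(prompt)
```

For a simple task like the one above, an LLM completing this prompt would ideally propose something like "Add the two numbers"; the hard case the talk targets is when the true instruction is a composition of many atomic instructions.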

Bio: Noah Ziems is a second-year PhD student in the Department of Computer Science and Engineering at the University of Notre Dame, and is a member of Dr. Meng Jiang's DM2 lab. His research focuses on instruction induction and prompt engineering, large language models, and information retrieval.
