# Meta-Learning with Latent Embedding Optimization

## Overview

This repository contains the implementation of the meta-learning model described in the paper "Structured Prediction for Conditional Meta-Learning", presented at NeurIPS 2020.

The paper learns task-conditional meta-parameters using structured prediction, treating the meta-parameters as a structured output.
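To make the idea concrete, here is a minimal, purely illustrative sketch of task-conditional meta-learning via a similarity-weighted meta-objective. All names below (`similarity_weights`, `weighted_meta_objective`) are hypothetical and do not correspond to this repository's API; the softmax-over-distances weighting is an assumption chosen for simplicity.

```python
# Illustrative sketch only -- not the repository's implementation.
# The target task's meta-parameters are estimated by minimizing
# meta-train losses re-weighted by similarity to the target task.
import math

def similarity_weights(target_emb, train_embs):
    """Weight each meta-train task by its similarity to the target task
    (here: softmax over negative squared embedding distances)."""
    scores = [-sum((t - s) ** 2 for t, s in zip(target_emb, emb))
              for emb in train_embs]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def weighted_meta_objective(theta, task_losses, weights):
    """Similarity-weighted sum of per-task meta-losses at theta."""
    return sum(w * loss(theta) for w, loss in zip(weights, task_losses))

# Toy usage: scalar meta-parameters, quadratic per-task losses.
train_embs = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
target_emb = [0.9, 1.1]
weights = similarity_weights(target_emb, train_embs)
task_losses = [lambda th, c=c: (th - c) ** 2 for c in (0.0, 1.0, 5.0)]
obj = weighted_meta_objective(1.0, task_losses, weights)
```

Tasks whose embeddings lie close to the target dominate the objective, so the adapted meta-parameters are conditioned on the target task rather than shared uniformly across all tasks.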

The code uses the same pre-trained embeddings as provided with DeepMind's LEO paper.

## Running the code

### Setup

To run the code, you first need to install:

### Getting the data

The code looks for the extracted embedding directory at `~/workspace/data/embeddings` by default; this can be changed in `config.py`.
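For illustration, a configuration entry along these lines would set the embedding path; the variable name `embedding_dir` is hypothetical and may differ from what `config.py` actually uses.

```python
# Hypothetical sketch of the relevant setting in config.py;
# the actual variable name in the repository may differ.
import os

# Default location of the extracted LEO embeddings.
embedding_dir = os.path.expanduser("~/workspace/data/embeddings")

# Point this elsewhere if the embeddings live in a different directory,
# e.g. embedding_dir = "/data/leo_embeddings"
```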

### Running the code

Then, clone this repository using:

```shell
$ git clone https://github.com/deepmind/leo
```

Step 1: To construct a meta-train set of fixed size and compute the task similarity matrix, run:

```shell
$ python runner_tasml.py gen_db
```
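Step 1 produces a pairwise similarity matrix over meta-train tasks. The following minimal sketch shows what such a matrix looks like; it is not the repository's implementation, and the choice of cosine similarity over task embeddings is an assumption made for illustration.

```python
# Illustrative sketch of a task similarity matrix (not the repo's code).
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def task_similarity_matrix(task_embeddings):
    """Dense pairwise similarity matrix over meta-train tasks."""
    n = len(task_embeddings)
    return [[cosine(task_embeddings[i], task_embeddings[j])
             for j in range(n)] for i in range(n)]

# Toy embeddings for three meta-train tasks.
tasks = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
S = task_similarity_matrix(tasks)
```

Precomputing this matrix once lets later steps (structured prediction on target tasks) reuse it instead of recomputing similarities per target.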

Step 2 (optional): To train an unconditional meta-learning model for warm-starting structured prediction, run:

```shell
$ python runner_tasml.py uncon_meta
```

To run structured prediction on 100 target tasks, given a fixed-size meta-train set (Step 1) with warm-starting (Step 2), run:

```shell
$ python runner_tasml.py sp
```

This will train the model for solving 5-way 1-shot miniImageNet classification.
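For context, a 5-way 1-shot episode samples 5 classes with one labeled support example per class, plus a query set for evaluation. The sketch below illustrates episode construction; it is not the repository's data pipeline, and the query-set size of 15 per class is an assumption (a common miniImageNet convention).

```python
# Illustrative N-way K-shot episode sampler (not the repo's pipeline).
import random

def sample_episode(dataset, n_way=5, k_shot=1, n_query=15, seed=0):
    """Sample an episode: a support set with k_shot labeled examples per
    class and a query set used to evaluate the adapted model."""
    rng = random.Random(seed)
    classes = rng.sample(sorted(dataset), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = rng.sample(dataset[cls], k_shot + n_query)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

# Toy dataset: class name -> list of example ids.
toy = {f"class_{i}": list(range(20)) for i in range(10)}
support, query = sample_episode(toy)
```

With the defaults above, each episode yields 5 support examples (5 ways x 1 shot) and 75 query examples (5 ways x 15 queries).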