
Fix inference tests in CI #1225

Merged
merged 7 commits into from
Nov 6, 2023
Conversation

goliaro
Collaborator

@goliaro goliaro commented Nov 5, 2023

Description of changes:

PR #1219 changed the prompt used in the CI tests, so the generated text now starts at the very first line of the output. The bash scripts that check the alignment between FlexFlow and Hugging Face, however, still assumed that the generated text starts at the second line, so the first line of output was never checked. This PR fixes that bug.
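A minimal sketch of the off-by-one issue described above (the file names and `diff` usage are illustrative assumptions, not taken from the actual CI scripts):

```shell
#!/usr/bin/env bash
# Hypothetical alignment check between FlexFlow and Hugging Face outputs.
set -euo pipefail

printf 'first generated line\nsecond generated line\n' > flexflow_output.txt
printf 'first generated line\nsecond generated line\n' > hf_output.txt

# Old check: tail -n +2 starts at line 2, so any mismatch on
# line 1 of the generated text would go unnoticed.
diff <(tail -n +2 flexflow_output.txt) <(tail -n +2 hf_output.txt)

# Fixed check: compare the full outputs, including the first line,
# since generation now starts on line 1.
diff flexflow_output.txt hf_output.txt
echo "outputs aligned"
```

The key point is that `tail -n +2` silently discards exactly the line where the new prompt places the start of the generated text.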

In addition, this PR fixes several long-postponed latent issues related to the LLaMA tokenizer, and switches back to the original JackFram/llama-160m (instead of JackFram/llama-160m-base) after aligning its configs with the official LLaMA model.

Related Issues:

Linked Issues:

  • Issue #

Issues closed by this PR:

  • Closes #


@goliaro goliaro enabled auto-merge (squash) November 5, 2023 04:10
@goliaro goliaro merged commit b0fe522 into inference Nov 6, 2023
45 checks passed
@goliaro goliaro deleted the fix_ci branch November 6, 2023 02:45