Llama-3-8B-Mob is a large language model designed for long-term human mobility prediction across multiple cities. Leveraging instruction tuning, it models complex spatial-temporal patterns in human mobility data to predict future trajectories over extended periods. Our model was validated on real-world human mobility data from four metropolitan areas in Japan, showcasing significant improvements over previous state-of-the-art models.
- Instruction-Tuned LLM: Llama-3-8B-Mob employs instruction tuning, allowing it to handle mobility prediction in a flexible Q&A format (a hypothetical prompt sketch follows this list).
- Long-term Mobility Prediction: Unlike most models that focus on short-term prediction, Llama-3-8B-Mob excels in predicting individual trajectories up to 15 days in advance.
- Cross-City Generalization: Fine-tuned on a single city, Llama-3-8B-Mob demonstrates impressive zero-shot generalization to other cities without needing city-specific data.
- Superior Performance: 1st in Mean Rank, 2nd in Trajectory Semantic Similarity, and 3rd in Trajectory Shape Similarity in the Human Mobility Prediction Challenge @ SIGSPATIAL 2024.
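To give a concrete sense of the Q&A format mentioned above, here is a hypothetical sketch of what one instruction-tuning record might look like; the actual prompt template is generated by tools/Data_tools/make_dataset.py and may differ in wording and fields.

```python
# Hypothetical instruction-tuning record for mobility prediction.
# The (d, t, x, y) encoding follows the HuMob-style data layout
# (day index, time slot, grid coordinates); treat this purely as an
# illustration, not the repository's exact template.
example_record = {
    "instruction": (
        "Given a user's historical trajectory as (d, t, x, y) records, "
        "predict the user's locations for the following 15 days in the same format."
    ),
    "input": "d=1, t=18, x=102, y=47\nd=1, t=36, x=95, y=52\n...",   # observed history
    "output": "d=61, t=18, x=102, y=47\n...",                        # trajectory to predict
}
```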
Dependencies can be installed using the following commands:
conda create --name llm_mob \
python=3.10 \
pytorch-cuda=12.1 \
pytorch cudatoolkit xformers -c pytorch -c nvidia -c xformers \
-y
conda activate llm_mob
pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
pip install --no-deps trl peft accelerate bitsandbytes
conda install -y scipy
pip install wandb
pip install pandas==2.0.0
You can run the demo to experience the predictive capabilities of Llama-3-8B-Mob with the following command:
python demo.py
Note: Enter `work` for human mobility trajectory inference, or enter `chat` to interact with Llama-3-8B-Mob freely.
To get started with Llama-3-8B-Mob, follow these steps:
- Download the dataset from the official source, or use a custom dataset with a similar format.
- Modify the configuration in make_dataset.py, and then execute the script to convert the data into conversation datasets.
python tools/Data_tools/make_dataset.py
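Before running the conversion, it can help to sanity-check that your data matches the expected layout. The snippet below is a minimal sketch assuming a HuMob-style CSV with uid, d, t, x, y columns; the file name is a placeholder, and a custom dataset may use different column names.

```python
# Minimal sanity check of the raw mobility data before conversion.
# Assumes a HuMob-style CSV with columns [uid, d, t, x, y]; the path is a
# placeholder -- point it at wherever you downloaded the dataset.
import pandas as pd

df = pd.read_csv("data/city_A.csv")              # hypothetical file name
print(df.columns.tolist())                       # expect ['uid', 'd', 't', 'x', 'y']
print(df["uid"].nunique(), "users")
print(df.groupby("uid").size().describe())       # records per user
```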
- Log in to your wandb account and run your first fine-tuning!
python Finetune_Llama3.py
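Since the dependencies above install Unsloth, trl, and peft, Finetune_Llama3.py presumably builds on Unsloth's FastLanguageModel API. The sketch below outlines a typical Unsloth LoRA setup for orientation only; the base-model name, dataset path, and all hyperparameters are placeholders, and the authoritative configuration lives in Finetune_Llama3.py.

```python
# Rough sketch of an Unsloth LoRA fine-tuning setup (placeholder values);
# see Finetune_Llama3.py for the actual configuration used by Llama-3-8B-Mob.
from unsloth import FastLanguageModel
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

# Load a 4-bit quantized Llama-3-8B base model (assumed base-model name).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-Instruct-bnb-4bit",
    max_seq_length=8192,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16, lora_alpha=16, lora_dropout=0, bias="none",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Conversation dataset produced by make_dataset.py (hypothetical path and field name).
dataset = load_dataset("json", data_files="data/train_conversations.json", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",        # assumes each record exposes a flattened "text" field
    max_seq_length=8192,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
        report_to="wandb",            # logs the run to the wandb account you just logged into
    ),
)
trainer.train()
```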
- Evaluate the performance of the fine-tuned model!
python Evaluate_Llama3.py
- Infer with Llama-3-8B-Mob.
python infer.py --l_idx <left_index> --r_idx <right_index> --city <city_abbreviation>
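For example, `python infer.py --l_idx 0 --r_idx 100 --city B` presumably runs inference over one slice of users for city B; the index values here are purely illustrative, and the accepted city abbreviations and the exact meaning of the index range are defined in infer.py.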
LP-Bert is the champion model of HuMob'23. A reproduction of LP-Bert by RobinsonXING can be found here.
If you find anything in this repository useful for your research, please cite our paper :) We sincerely appreciate it.
@article{tang2024instruction,
title={Instruction-Tuning Llama-3-8B Excels in City-Scale Mobility Prediction},
author={Tang, Peizhi and Yang, Chuang and Xing, Tong and Xu, Xiaohang and Jiang, Renhe and Sezaki, Kaoru},
journal={arXiv preprint arXiv:2410.23692},
year={2024}
}