ChatEV

This is a very simple implementation of utilizing large language models (e.g., Llama-3.2-1B-Instruct) for time-series forecasting in electric vehicle charging scenarios. If it is helpful to your research, please cite our paper:

Haohao Qu, Han Li, Linlin You, Rui Zhu, Jinyue Yan, Paolo Santi, Carlo Ratti, Chau Yuen (2024). ChatEV: Predicting electric vehicle charging demand as natural language processing. Transportation Research Part D: Transport and Environment. Paper in TRD
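
The core idea is to serialize historical charging records into a natural-language prompt and have the LLM return the forecast as text. Below is a minimal sketch of how such a prompt could be built; the station name, numbers, and wording are illustrative only, and the actual prompt templates used in this repository may differ.

# Illustrative only: turn a short history of hourly charging demand (kWh)
# into a text prompt for the LLM. The real template in this repo may differ.
history = [12.4, 15.1, 18.3, 22.0, 19.6, 17.2]  # hypothetical observations
prompt = (
    "The hourly charging demand (kWh) at station A over the past 6 hours was: "
    + ", ".join(f"{v:.1f}" for v in history)
    + ". Predict the demand for the next hour."
)
print(prompt)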

1. Environments

For a simple implementation, we need five major packages, namely torch, pandas, numpy, transformers, and argparse. You can install them by:

pip install -r requirements.txt
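
As an optional sanity check, you can verify that the core packages import correctly; version pins are omitted here, since requirements.txt is the authoritative list.

# Optional import check for the main dependencies listed above.
import torch
import pandas
import numpy
import transformers

print("torch", torch.__version__, "| transformers", transformers.__version__)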

2. Meta-Llama hf_token

To get access to the Meta-Llama models, we need to apply for an access token ("hf_token") at https://huggingface.co/settings/tokens

Then set a valid "hf_token" in Line 99 of the "utils.py" file:

hf_token = "Your_HF_TOKEN"

P.S. Alternatively, you can download a local copy of the model through https://www.llama.com/llama-downloads/
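
For reference, the token is typically passed to the transformers loading functions roughly as follows. This is only a sketch: the model id and the exact loading code in "utils.py" may differ.

from transformers import AutoModelForCausalLM, AutoTokenizer

hf_token = "Your_HF_TOKEN"  # your personal Hugging Face access token
model_id = "meta-llama/Llama-3.2-1B-Instruct"  # assumed model id; adjust if needed

# The token authenticates access to the gated Meta-Llama weights on the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id, token=hf_token)
model = AutoModelForCausalLM.from_pretrained(model_id, token=hf_token)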

3. Simple Implementation

To conduct a simple implementation (inference only), we can run the "simple.py" file.

python simple.py

To conduct a simple finetuning implementation, we can run the "finetune.py" file.

python finetune.py

4. Full Implementation

All code for a complete implementation of ChatEV (including finetuning, validation, and testing) is included in the "code" folder. Besides the five packages used by the simple version, additional packages are required for the full implementation: [argparse, lightning, scikit-learn].

pip install argparse lightning scikit-learn

Please remember to change your working directory to the "code" folder.

cd code

Also set a valid "hf_token" in Line 228 of the "model_interface.py" file:

hf_token = "Your_HF_TOKEN"

We can run the "main.py" file to finetune a Llama model for EV charging data prediction:

python main.py

5. Alternative Configurations

  • If you want to load a checkpoint for finetuning:
python main.py --ckpt --ckpt_name='last'
  • If you want to load a checkpoint for testing:
python main.py --ckpt --ckpt_name='last' --test_only
  • Train the model in a few-shot scenario:
python main.py --few_shot --few_shot_ratio=0.2
  • Train the model using a simple and effective meta-learning approach, First-Order Reptile:
python main.py --meta_learning

More configurations can be found in the "parse.py" file.
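
For orientation, flags like those above are typically declared with argparse along these lines; the defaults shown here are placeholders, and the actual definitions in "parse.py" may differ.

import argparse

# Hypothetical declarations mirroring the options listed above;
# default values are placeholders, not the repository's settings.
parser = argparse.ArgumentParser(description="ChatEV configurations")
parser.add_argument("--ckpt", action="store_true", help="load a saved checkpoint")
parser.add_argument("--ckpt_name", type=str, default="last", help="name of the checkpoint to load")
parser.add_argument("--test_only", action="store_true", help="skip training and run testing only")
parser.add_argument("--few_shot", action="store_true", help="train in a few-shot scenario")
parser.add_argument("--few_shot_ratio", type=float, default=0.2, help="fraction of training data to use")
parser.add_argument("--meta_learning", action="store_true", help="use First-Order Reptile meta-learning")
args = parser.parse_args()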

6. Questions

  • If you encounter slow downloading, you can set a mirror source at the command terminal:
export HF_ENDPOINT=https://hf-mirror.com
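
If you prefer setting the mirror inside a Python script rather than the shell, the same environment variable can be set before the Hugging Face libraries are imported (a small sketch, using the mirror above):

import os

# Set the mirror before importing transformers / huggingface_hub,
# otherwise the default endpoint may already be in effect.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from transformers import AutoTokenizer  # downloads now go through the mirror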
