PaLM 2 chat fine-tuning with a custom dataset

I am fine-tuning the PaLM 2 chat-bison model with my custom dataset. How can I reduce the resources used during tuning, given that my dataset is very small?

Solved
1 ACCEPTED SOLUTION

Before fine-tuning, explore prompt engineering, zero-shot, and few-shot prompting; with a very small dataset these can often match or beat a tuned model's accuracy. If you do tune, the main levers for resource usage are the number of training steps and the tuning region, which determines the hardware: us-central1 tunes on 8 A100 80GB GPUs, while europe-west4 tunes on a 64-core TPU v3 pod.
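As a rough illustration (not official guidance), those two levers can be sketched with the Vertex AI SDK. The step-count heuristic, the bucket URI, and the example counts below are assumptions for illustration; only the region-to-hardware mapping comes from the answer above.

```python
# Sketch: picking conservative tuning parameters for a small dataset.
# Fewer train steps and a deliberate region choice are the two levers
# the answer above describes:
#   us-central1  -> 8 x A100 80GB GPUs
#   europe-west4 -> 64-core TPU v3 pod

def small_dataset_tuning_params(num_examples: int) -> dict:
    """Return tuning kwargs scaled down for a small dataset.

    Heuristic (an assumption, not official guidance): roughly one step
    per training example, floored at 10 and capped at 100 so the job
    finishes quickly and uses less accelerator time.
    """
    return {
        "train_steps": max(10, min(num_examples, 100)),
        "tuning_job_location": "europe-west4",   # TPU v3 pod region
        "tuned_model_location": "us-central1",   # where the tuned model is served
    }

params = small_dataset_tuning_params(num_examples=50)
print(params)

# The actual tuning call (requires GCP credentials and the
# google-cloud-aiplatform package; shown without executing):
#
# from vertexai.language_models import ChatModel
# chat_model = ChatModel.from_pretrained("chat-bison@001")
# tuning_job = chat_model.tune_model(
#     training_data="gs://my-bucket/chat_tuning_data.jsonl",  # placeholder URI
#     **params,
# )
```

With 50 examples this yields 50 training steps; a 500-example dataset would be capped at 100 steps, keeping the accelerator time bounded either way.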

I hope this is helpful!

