Fine-Tuning on Wikipedia Datasets for Improved Accuracy and Performance #NLP

This video walks through fine-tuning Llama 3 for a low-resource language using Wikipedia data. Complete scripts, future improvements, one-click fine-tuning, and LLM templates are available through Trelis. Trelis Livestreams run on Thursdays at 5 pm Irish time on YouTube, and the newsletter, resources, support, and Discord can be found on the Trelis website. Video resources include slides, datasets, and WikiExtractor.

Per the video timestamps, the topics covered are:

- Fine-tuning Llama 3 for a low-resource language
- Creating a Hugging Face dataset with WikiExtractor
- Fine-tuning setup with LoRA
- Dataset blending to avoid catastrophic forgetting
- Trainer setup and parameter selection
- Inspecting losses and results
- Learning rates and annealing
- Further tips and improvements

This is a valuable resource for anyone interested in fine-tuning language models and working with low-resource languages.
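The video's dataset-blending step mixes a share of general-domain data back into the target-language training set so the model does not lose its existing capabilities (catastrophic forgetting). A minimal sketch of that idea in plain Python follows; the 20% blend fraction, the row format, and the function name are illustrative assumptions, not values from the video:

```python
import random

def blend_datasets(target_rows, general_rows, general_fraction=0.2, seed=42):
    """Mix a fraction of general-domain rows into the target-language set.

    general_fraction is relative to the size of the target set; 0.2 is an
    assumed default for illustration, not the video's recommendation.
    """
    n_general = int(len(target_rows) * general_fraction)
    rng = random.Random(seed)
    # Sample general rows without replacement, then shuffle the combined set
    # so the two sources are interleaved across training batches.
    mixed = list(target_rows) + rng.sample(general_rows, n_general)
    rng.shuffle(mixed)
    return mixed

# Hypothetical toy rows standing in for real training examples.
target = [{"text": f"low-resource-{i}"} for i in range(10)]
general = [{"text": f"general-{i}"} for i in range(100)]
blended = blend_datasets(target, general)
print(len(blended))  # 10 target + 2 general = 12
```

In practice the same blending can be done with `datasets.interleave_datasets` from the Hugging Face `datasets` library, which also supports probability-weighted mixing.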
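The learning-rate and annealing discussion typically refers to a linear warmup followed by a decaying schedule. A common choice is cosine annealing; the sketch below implements it in plain Python under assumed hyperparameters (the peak rate, warmup length, and floor are illustrative, not taken from the video):

```python
import math

def lr_at_step(step, total_steps, peak_lr=1e-4, warmup_steps=50, min_lr=0.0):
    """Linear warmup to peak_lr, then cosine annealing down to min_lr.

    All hyperparameter defaults here are assumptions for illustration.
    """
    if step < warmup_steps:
        # Ramp linearly from peak_lr / warmup_steps up to peak_lr.
        return peak_lr * (step + 1) / warmup_steps
    # Fraction of the post-warmup schedule completed, in [0, 1].
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))

print(lr_at_step(50, 1000))   # at end of warmup: peak_lr = 1e-4
print(lr_at_step(1000, 1000)) # fully annealed: min_lr = 0.0
```

The Hugging Face `transformers` Trainer provides the equivalent behavior via `lr_scheduler_type="cosine"` together with `warmup_steps` in `TrainingArguments`.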


Source link: https://www.youtube.com/watch?v=bo49U3iC7qY
