In this tutorial, we walk step by step through fine-tuning Mixtral MoE from Mistral AI on your own dataset.
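As a rough sketch of the kind of setup the Colab uses, the snippet below defines a 4-bit quantization config plus LoRA adapters (the QLoRA approach typically used to fine-tune Mixtral on a single GPU). All parameter values, and the choice of target modules, are illustrative assumptions, not taken from the video.

```python
# Illustrative QLoRA configuration for Mixtral (values are assumptions).
# Requires: transformers, peft, bitsandbytes.
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit NF4 quantization so the large MoE model fits in GPU memory
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA adapters on the attention projections; rank and alpha are illustrative
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```

These two configs would then be passed to `AutoModelForCausalLM.from_pretrained(..., quantization_config=bnb_config)` and `get_peft_model(model, lora_config)` respectively.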
LINKS:
Colab (free T4 will not work): [ Link ]
Mistral 7B fine-tune video: [ Link ]
@AI-Makerspace
Want to Follow:
🦾 Discord: [ Link ]
▶️️ Subscribe: [ Link ]
Want to Support:
☕ Buy me a Coffee: [ Link ]
🔴 Support my work on Patreon: [ Link ]
Need Help?
📧 Business Contact: engineerprompt@gmail.com
💼 Consulting: [ Link ]
Join this channel to get access to perks:
[ Link ]
Timestamps:
[00:00] Introduction
[00:57] Prerequisites and Tools
[01:52] Understanding the Dataset
[03:35] Data Formatting and Preparation
[06:16] Loading the Base Model
[09:55] Setting Up the Training Configuration
[13:22] Fine-Tuning the Model
[16:28] Evaluating the Model Performance
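The "Data Formatting and Preparation" step above amounts to wrapping each dataset row in the model's instruction template. A minimal sketch, assuming the Mixtral-Instruct `[INST] ... [/INST]` chat format and hypothetical dataset columns named "instruction" and "output":

```python
# Hedged sketch: wrap one dataset row in the Mixtral-Instruct template.
# The column names "instruction" and "output" are hypothetical; adapt
# them to your own dataset's schema.
def format_example(example: dict) -> str:
    """Return a single training string in [INST] ... [/INST] form."""
    return f"<s>[INST] {example['instruction']} [/INST] {example['output']}</s>"

row = {"instruction": "Translate 'bonjour' to English.", "output": "Hello."}
print(format_example(row))
# → <s>[INST] Translate 'bonjour' to English. [/INST] Hello.</s>
```

In practice you would map this function over the whole dataset (e.g. with `datasets.Dataset.map`) before tokenization.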
All Interesting Videos:
Everything LangChain: [ Link ]
Everything LLM: [ Link ]
Everything Midjourney: [ Link ]
AI Image Generation: [ Link ]