GPT-like large language models can run on your laptop. The open-source community quickly jumped on Meta's LLaMA and enhanced it into something more. After some clever engineering, you can run the 65-billion-parameter model on an M1 Mac with a ton of RAM and get performance close to GPT-3. Thanks, Georgi. After some more clever engineering, you can install it as an npm package. After some MORE clever engineering, you can run the 7-billion-parameter version, with performance approaching GPT-3 and ChatGPT, on a Raspberry Pi.
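For the curious, a rough sketch of the llama.cpp route mentioned above (exact script and file names vary between llama.cpp versions, and you have to obtain the LLaMA weights yourself — nothing here downloads them):

```shell
# Clone and build Georgi Gerganov's llama.cpp (plain C/C++, runs on CPU)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Convert the 7B weights (placed under ./models/7B yourself) to the
# library's format, then quantize to 4 bits to fit in modest RAM
python3 convert.py models/7B/
./quantize models/7B/ggml-model-f16.bin models/7B/ggml-model-q4_0.bin q4_0

# Prompt the quantized model entirely on-CPU
./main -m models/7B/ggml-model-q4_0.bin -p "Explain how a CPU works:" -n 128
```

The 4-bit quantization step is what makes the 7B model small enough for a Raspberry Pi's memory, at the cost of some output quality.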
Current Sub Count: 8,110
Business Email: sid@siddhantdubey.com
Patreon: [ Link ]
🤖 Join my discord server: [ Link ]
📸 Instagram - [ Link ]
🐦 Twitter - [ Link ]
💻 GitHub - [ Link ]
🎵 Follow my TikTok: [ Link ]
Music In This Video:
Epidemic Sound: [ Link ]
(I do get benefits from the above link)
WHO AM I?
I'm a second-year Computer Science major at Georgia Tech. On this channel, I make videos about machine learning and programming, lifestyle vlogs, productivity tips, and more! If you're interested, be sure to hit the subscribe button and leave a like on this video! Subscribe here: [ Link ]