
Natural Language Programming | Prompting (chatGPT)


Welcome to !nlprog, where anything related to Natural Language Programming is fair game: prompts, ideas, projects, and more, for any model.

We follow Lemmy’s code of conduct.

GPT-4's details are leaked. (threadreaderapp.com)
submitted 11 Jul 2023 (last edited) by [email protected] to c/[email protected]
 

Parameter count:

GPT-4 is more than 10x the size of GPT-3: we believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture of Experts: confirmed.

OpenAI was able to keep costs reasonable by using a mixture-of-experts (MoE) model. It uses 16 experts, each about ~111B parameters for the MLP, and 2 of these experts are routed to per forward pass. (As a sanity check, 16 × ~111B ≈ 1.78T, which lines up with the ~1.8T total.)
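To make the routing concrete, here is a minimal sketch of top-2 MoE routing in PyTorch. The dimensions, layer shapes, and names (`Top2MoE`, `d_model`, `d_ff`) are illustrative toys chosen for the example, not GPT-4's actual configuration; only the 2-of-16 routing pattern comes from the leak.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    """Toy top-2 mixture-of-experts layer (sizes are illustrative,
    not GPT-4's claimed 16 experts of ~111B parameters each)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=16, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against each expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is its own MLP (feed-forward) block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):           # x: (n_tokens, d_model)
        scores = self.router(x)     # (n_tokens, n_experts)
        top_w, top_idx = scores.topk(self.top_k, dim=-1)
        top_w = F.softmax(top_w, dim=-1)  # normalize the 2 chosen weights
        out = torch.zeros_like(x)
        # Only the chosen experts run for each token; the rest are skipped.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[:, k] == e
                if mask.any():
                    out[mask] += top_w[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

moe = Top2MoE()
tokens = torch.randn(8, 64)
print(moe(tokens).shape)  # torch.Size([8, 64])
```

The point of the design: each token only pays for 2 expert MLPs per layer, so per-token compute scales with ~2 × 111B of expert weights rather than the full ~1.8T parameter count.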

Related Article: https://lemmy.intai.tech/post/72922

top 2 comments
[–] [email protected] 4 points 1 year ago

I understood about 1/10th of the article. It’s crazy how complex this is and I wish I understood it better.

[–] [email protected] 2 points 1 year ago