A medical paper from Microsoft and the University of Washington accidentally disclosed parameter counts for the GPT-4, GPT-4o, and o1 series of models; surprisingly, GPT-4o mini is listed at only 8B. Some users speculate that GPT-4o mini is a mixture-of-experts (MoE) model with roughly 40B total parameters, of which about 8B are active. The parameter counts disclosed in the paper are as follows: GPT-4, about 1.76 trillion; GPT-4o, about 200 billion; GPT-4o mini, about 8 billion; o1-preview, about 300 billion; o1-mini, about 100 billion; and Claude 3.5 Sonnet, about 175 billion. (NIC)
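The speculation above hinges on the difference between a model's total parameters and the parameters active per token: in a mixture-of-experts model, only a subset of experts is routed for each token, so the active count can be far smaller than the total. A minimal sketch of that arithmetic, using purely hypothetical numbers (nothing here is a confirmed figure for GPT-4o mini):

```python
# Illustrative sketch: total vs. active parameters in a mixture-of-experts
# (MoE) model. All numbers are hypothetical assumptions, not confirmed
# figures for any real model.

def moe_param_counts(shared_params: float, expert_params: float,
                     num_experts: int, experts_per_token: int):
    """Return (total, active) parameter counts for a simple MoE layout.

    shared_params:      parameters used for every token (attention, embeddings)
    expert_params:      parameters in one expert's feed-forward block
    num_experts:        experts instantiated per MoE layer overall
    experts_per_token:  experts the router selects for each token
    """
    total = shared_params + num_experts * expert_params
    active = shared_params + experts_per_token * expert_params
    return total, active

# Hypothetical config: 4B shared, 4.5B per expert, 8 experts,
# 1 expert routed per token -> 40B total, 8.5B active.
total, active = moe_param_counts(4e9, 4.5e9, num_experts=8, experts_per_token=1)
print(f"total = {total / 1e9:.0f}B, active = {active / 1e9:.1f}B")
```

This shows why an "8B" figure could describe either a dense 8B model or the active slice of a much larger sparse one; the leaked number alone cannot distinguish the two.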
