A 7B model rivals GPT-4o: the first large model training platform for medical code generation

The University of Texas Southwestern Medical Center and other institutions have developed MedAgentGym, the world's first large model training platform for medical code generation, addressing the data privacy, cost, and deployment constraints that hold back medical AI. The platform integrates 72,413 programming task instances spanning four major areas, including healthcare information retrieval and data science, and provides containerized isolation environments with interactive feedback mechanisms. The Med-Copilot-7B model trained on the platform improves performance by 42.47%, approaching GPT-4o level, and an AI validator raises the task success rate from 17% to 42%.
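
The reported jump from a 17% to a 42% success rate comes from using an AI validator (verifier) to rank multiple candidate solutions and keep the best one. Below is a minimal Python sketch of this best-of-N selection pattern; the `generate` and `verify` callables are hypothetical stand-ins for illustration, not the platform's actual interfaces.

```python
# Hypothetical sketch of verifier-guided best-of-N selection -- the general idea
# behind using an "AI validator" to pick the most promising candidate solution.
# The agent and verifier interfaces below are illustrative assumptions,
# not the actual MedAgentGym API.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Candidate:
    code: str      # candidate program produced by the coding agent
    score: float   # verifier's estimate that the program solves the task


def best_of_n(
    generate: Callable[[str], str],       # samples one candidate solution for a task prompt
    verify: Callable[[str, str], float],  # scores a (task, candidate) pair in [0, 1]
    task_prompt: str,
    n: int = 8,
) -> Candidate:
    """Sample n candidate programs and return the one the verifier ranks highest."""
    candidates: List[Candidate] = []
    for _ in range(n):
        code = generate(task_prompt)
        candidates.append(Candidate(code=code, score=verify(task_prompt, code)))
    return max(candidates, key=lambda c: c.score)
```

In this setup the verifier never has to produce a solution itself; it only needs to judge candidates, which is typically an easier task and is why selection can lift success rates well above a single greedy attempt.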
