AFM-4.5B is a 4.5 billion parameter instruction-tuned language model developed by Arcee AI. The model was pretrained on approximately 8 trillion tokens, including 6.5 trillion tokens of general data and 1.5 trillion tokens with an emphasis on mathematical reasoning and code generation.
Recent activity on AFM 4.5B (total usage per day on OpenRouter): 184K prompt tokens, 47K completion tokens, 0 reasoning tokens.
Prompt tokens measure input size, completion tokens reflect total output length, and reasoning tokens show internal thinking before a response.
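These counts mirror the usage fields returned with each API response. Below is a minimal sketch in Python, assuming the requests library, an OPENROUTER_API_KEY environment variable, and the model slug "arcee-ai/afm-4.5b" (check the model page for the exact identifier); the reasoning-token detail only appears when the provider reports it.

```python
import os
import requests

# Minimal sketch: send one chat completion request to OpenRouter and read
# the token usage fields back. Model slug and API key handling are assumptions.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "arcee-ai/afm-4.5b",  # assumed slug; verify on the model page
        "messages": [{"role": "user", "content": "Summarize AFM-4.5B in one sentence."}],
    },
    timeout=60,
)
usage = response.json()["usage"]

# prompt_tokens counts the input, completion_tokens the generated output;
# a reasoning-token count, when reported, sits under completion_tokens_details.
print("prompt tokens:    ", usage["prompt_tokens"])
print("completion tokens:", usage["completion_tokens"])
print("reasoning tokens: ", usage.get("completion_tokens_details", {}).get("reasoning_tokens", 0))
```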