GitHub Copilot per-model token limits

| Family | Max Context Tokens | Max Output Tokens | Vision | Version |
|---|---|---|---|---|
| gemini-2.0-flash | 1000000 | 8192 | TRUE | gemini-2.0-flash-001 |
| o3-mini | 200000 | 100000 | FALSE | o3-mini-2025-01-31 |
| claude-3.7-sonnet | 200000 | 16384 | TRUE | claude-3.7-sonnet |
| gpt-4.1 | 128000 | 16384 | TRUE | gpt-4.1-2025-04-14 |
| gpt-5 | 128000 | 64000 | TRUE | gpt-5 |
| gpt-4o-mini | 128000 | 4096 | FALSE | gpt-4o-mini-2024-07-18 |
| gpt-4-turbo | 128000 | 4096 | FALSE | gpt-4-0125-preview |
| gpt-4o | 128000 | 4096 | TRUE | gpt-4o-2024-11-20 |
| claude-sonnet-4 | 128000 | 16000 | TRUE | claude-sonnet-4 |
| gemini-2.5-pro | 128000 | 64000 | TRUE | gemini-2.5-pro |
| o4-mini | 128000 | 16384 | TRUE | o4-mini-2025-04-16 |
| claude-3.5-sonnet | 90000 | 8192 | TRUE | claude-3.5-sonnet |
| gpt-4 | 32768 | 4096 | FALSE | gpt-4-0613 |
| gpt-3.5-turbo | 16384 | 4096 | FALSE | gpt-3.5-turbo-0613 |
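If you want to plan prompts around these caps, here is a minimal sketch that hard-codes a few rows from the snapshot above and estimates how much room is left for the prompt once the full output budget is reserved. The `COPILOT_LIMITS` dict and `prompt_budget` helper are illustrative names of mine, not part of any Copilot API, and the numbers are just the table values at the time of this post.

```python
# Minimal sketch: per-model limits copied from the table above (a snapshot,
# subject to change), plus a helper estimating the prompt budget if the
# whole output allowance is reserved. Names here are illustrative only.
COPILOT_LIMITS = {
    # family: (max_context_tokens, max_output_tokens)
    "gemini-2.0-flash":  (1_000_000, 8_192),
    "o3-mini":           (200_000, 100_000),
    "claude-3.7-sonnet": (200_000, 16_384),
    "gpt-4.1":           (128_000, 16_384),
    "gpt-5":             (128_000, 64_000),
    "claude-3.5-sonnet": (90_000, 8_192),
    "gpt-4":             (32_768, 4_096),
}

def prompt_budget(family: str) -> int:
    """Tokens left for the prompt after reserving the full output budget."""
    max_context, max_output = COPILOT_LIMITS[family]
    return max_context - max_output

if __name__ == "__main__":
    for family in COPILOT_LIMITS:
        print(f"{family:20s} prompt budget ~ {prompt_budget(family):>9,} tokens")
```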

For GPT-4.1, the max context is 1M tokens when used through the OpenAI API, so the 128K cap in Copilot is quite disappointing.


For reference, an older snapshot of the data:

```json
[
  {
    "name": "gpt-3.5-turbo",
    "max_context_window_tokens": 16384,
    "max_output_tokens": 4096,
    "max_prompt_tokens": 12288,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "gpt-4o-mini",
    "max_context_window_tokens": 128000,
    "max_output_tokens": 4096,
    "max_prompt_tokens": 12288,
    "parallel_tool_calls": true,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "gpt-4",
    "max_context_window_tokens": 32768,
    "max_output_tokens": 4096,
    "max_prompt_tokens": 32768,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "gpt-4-turbo",
    "max_context_window_tokens": 128000,
    "max_output_tokens": 4096,
    "max_prompt_tokens": 64000,
    "parallel_tool_calls": true,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "gpt-4o",
    "max_context_window_tokens": 128000,
    "max_output_tokens": 4096,
    "max_prompt_tokens": 64000,
    "vision": true,
    "parallel_tool_calls": true,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "o1-ga",
    "max_context_window_tokens": 200000,
    "max_prompt_tokens": 20000,
    "structured_outputs": true,
    "tool_calls": true
  },
  {
    "name": "o3-mini",
    "max_context_window_tokens": 200000,
    "max_output_tokens": 100000,
    "max_prompt_tokens": 64000,
    "streaming": true,
    "structured_outputs": true,
    "tool_calls": true
  },
  {
    "name": "text-embedding-ada-002",
    "max_inputs": 512
  },
  {
    "name": "text-embedding-3-small",
    "max_inputs": 512,
    "dimensions": true
  },
  {
    "name": "claude-3.5-sonnet",
    "max_context_window_tokens": 90000,
    "max_output_tokens": 8192,
    "max_prompt_tokens": 90000,
    "vision": true,
    "parallel_tool_calls": true,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "claude-3.7-sonnet",
    "max_context_window_tokens": 200000,
    "max_output_tokens": 16384,
    "max_prompt_tokens": 90000,
    "vision": true,
    "parallel_tool_calls": true,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "claude-3.7-sonnet-thought",
    "max_context_window_tokens": 200000,
    "max_output_tokens": 16384,
    "max_prompt_tokens": 90000,
    "vision": true,
    "streaming": true
  },
  {
    "name": "gemini-2.0-flash",
    "max_context_window_tokens": 1000000,
    "max_output_tokens": 8192,
    "max_prompt_tokens": 128000,
    "vision": true,
    "streaming": true
  },
  {
    "name": "gemini-2.5-pro",
    "max_context_window_tokens": 128000,
    "max_output_tokens": 64000,
    "max_prompt_tokens": 128000,
    "vision": true,
    "parallel_tool_calls": true,
    "streaming": true,
    "tool_calls": true
  },
  {
    "name": "o4-mini",
    "max_context_window_tokens": 128000,
    "max_output_tokens": 16384,
    "max_prompt_tokens": 128000,
    "parallel_tool_calls": true,
    "streaming": true,
    "structured_outputs": true,
    "tool_calls": true
  },
  {
    "name": "gpt-4.1",
    "max_context_window_tokens": 128000,
    "max_output_tokens": 16384,
    "max_prompt_tokens": 128000,
    "vision": true,
    "parallel_tool_calls": true,
    "streaming": true,
    "structured_outputs": true,
    "tool_calls": true
  }
]
```
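Note that this older dump exposes a separate `max_prompt_tokens` field that is often much smaller than `max_context_window_tokens`. Below is a minimal sketch that parses the dump (assuming it has been saved to a file named `copilot_models_old.json`, a hypothetical filename) and lists the models where the prompt cap falls below the advertised context window.

```python
import json

# Minimal sketch: read the older dump above (assumed saved as
# "copilot_models_old.json") and show where the effective prompt cap
# (max_prompt_tokens) is smaller than the context window
# (max_context_window_tokens).
with open("copilot_models_old.json", encoding="utf-8") as f:
    models = json.load(f)

for m in models:
    ctx = m.get("max_context_window_tokens")
    prompt = m.get("max_prompt_tokens")
    if ctx is None or prompt is None:
        continue  # embedding entries only list max_inputs
    if prompt < ctx:
        print(f'{m["name"]:28s} context={ctx:>9,}  prompt cap={prompt:>9,}')
```

In this snapshot, for example, gpt-4o-mini advertises a 128,000-token window but caps prompts at 12,288 tokens.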