Codex 5.5 doesn't support high context; it seems to be only 258k

[image] 
However, I had already configured 350k (and just tried 512k / 1M as well):
model = "gpt-5.5"
model_reasoning_effort = "xhigh"
model_context_window = 350000
model_auto_compact_token_limit = 328000

Then I looked through today's issues and found that it appears to be a model problem rather than a settings problem:
1m context window gone after Gpt 5.5 Release. · Issue #19208 · openai/codex
config.toml context window setting is not respected · Issue #19185 · openai/codex