bayun (@bayun_127) posted in "The context in codex feels like a lot more than 200K"

Got this out of codex:
⚠ Heads up: Long conversations and multiple compactions can cause the model to be less accurate. Start a new conversation when possible to keep conversations small and targeted.
─ Worked for 5m 27s ──
This finally showed up. In actual use it feels like there's at least 500K of context, if not more.