
@gill_kyle Totally agree, but if that's the case, don't you think TUI solutions like Claude Code would be better?
erel vanono
29 posts
AI coding agents hit a wall when codebases get massive. Even with 2M token context windows, a 10M line codebase needs roughly 100M tokens (at about 10 tokens per line of code). The real bottleneck isn't just ingesting code - it's getting models to actually pay attention to all that context effectively.
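The back-of-envelope math in the post can be sketched quickly. This assumes ~10 tokens per line of code, a rough heuristic rather than a measured figure:

```python
# Rough context-window arithmetic from the post.
# Assumption: ~10 tokens per line of code (a ballpark, not a measurement).
TOKENS_PER_LINE = 10

def context_gap(lines_of_code: int, window_tokens: int) -> tuple[int, float]:
    """Return total tokens needed and how many full windows that represents."""
    needed = lines_of_code * TOKENS_PER_LINE
    return needed, needed / window_tokens

needed, windows = context_gap(10_000_000, 2_000_000)
print(f"{needed:,} tokens ~ {windows:.0f}x a 2M-token window")
# A 10M line codebase would need ~50 full 2M-token windows just to ingest once.
```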
