
how i use claude code & the @gmgnai api to catch every runner of the day and filter out the noise.

most trending tokens aren't tradeable: bundled, botted, or exit liquidity by the time you see them. filtering matters more than speed. 4 days running. 3,200 tokens entered the pipeline. 107 made the cut.

the system in stages:

1/ discovery. poll /v1/market/rank every 30 seconds. baseline filters on trending data cut the universe fast: holder count, age, dev concentration, top 10 holder share, wash trading.

2/ tracking. survivors get watched across multiple scans: holder growth, liquidity stability, buy/sell pressure, bundler rate, bot rate.

3/ deep dive. tokens that make it through get a /v1/token/info call: rug ratio, entrapment ratio, KOL presence, social duplicates.

4/ alert. what passes fires to a live dashboard at scgalpha.com/vault.

every alerted token gets journaled: a full snapshot every 30 seconds of price, mcap, liquidity, holders, bot rate, smart money, whales. that data does two things. first, it refines what actually separates runners from losers; the filters aren't static, they evolve as the journal grows. second, it powers backtests for algorithmic execution strategies, with realistic slippage and fees baked in, derived from actual on-chain trade data.

speed is a commodity. filtering is the moat.
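the discovery and tracking stages above can be sketched roughly like this. to be clear: the snapshot fields, threshold values, and function names below are all my own illustrative assumptions, not the actual gmgn api response schema or the author's real cutoffs; a live version would populate `TokenSnapshot` from the /v1/market/rank payload on each 30-second poll.

```python
from dataclasses import dataclass

# hypothetical snapshot shape -- field names and units are assumptions,
# not the real gmgn api schema
@dataclass
class TokenSnapshot:
    address: str
    holders: int         # current holder count
    age_minutes: float   # time since launch
    dev_pct: float       # dev wallet share of supply, percent
    top10_pct: float     # top 10 holders' share of supply, percent
    wash_score: float    # 0..1, higher = more wash trading
    bot_rate: float      # 0..1 share of bot trades
    bundler_rate: float  # 0..1 share of bundled buys

# stage 1 (discovery): baseline filters on trending data.
# thresholds are placeholders for illustration only.
def passes_discovery(t: TokenSnapshot) -> bool:
    return (t.holders >= 100
            and t.age_minutes >= 10
            and t.dev_pct <= 5.0
            and t.top10_pct <= 30.0
            and t.wash_score <= 0.3)

# stage 2 (tracking): compare consecutive scans of a survivor.
# require holder growth and acceptable bot/bundler rates.
def passes_tracking(prev: TokenSnapshot, cur: TokenSnapshot) -> bool:
    return (cur.holders > prev.holders
            and cur.bot_rate <= 0.4
            and cur.bundler_rate <= 0.25)

# two fabricated snapshots of a healthy-looking token,
# plus one fabricated rug-shaped token
good_t0 = TokenSnapshot("TokA", holders=250, age_minutes=45, dev_pct=2.0,
                        top10_pct=18.0, wash_score=0.1, bot_rate=0.2,
                        bundler_rate=0.1)
good_t1 = TokenSnapshot("TokA", holders=310, age_minutes=45.5, dev_pct=2.0,
                        top10_pct=17.0, wash_score=0.1, bot_rate=0.2,
                        bundler_rate=0.1)
rug = TokenSnapshot("TokB", holders=40, age_minutes=3, dev_pct=22.0,
                    top10_pct=60.0, wash_score=0.8, bot_rate=0.7,
                    bundler_rate=0.5)

print(passes_discovery(good_t0))         # True
print(passes_discovery(rug))             # False
print(passes_tracking(good_t0, good_t1)) # True
```

the point of splitting it into pure predicate functions is that the same code drives both the live pipeline and the journal-powered backtests: replay logged snapshots through the predicates, tweak thresholds, and see which cutoffs actually separate runners from losers.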











