Pinned Tweet

Here is my submission for #ElevenHacks @firecrawl @ElevenLabs
Beacon is a voice-first AI assistant built specifically for visually impaired users. You call a number, you talk, Beacon handles the rest.
Built on ElevenLabs Conversational AI, Beacon listens, reasons, and responds in natural speech using a custom voice. It uses Firecrawl Search as its live intelligence layer — fetching real-time weather forecasts, nearby services, load shedding schedules, community safety alerts, and road closures from across the web, then distilling that into clean spoken summaries. No raw data, no screen, no friction.
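The "distilling into clean spoken summaries" step above can be sketched roughly as follows. This is a conceptual illustration only, not Beacon's actual code: the result fields (`title`, `description`) mimic a typical web-search response shape, and `distill_for_speech` is a hypothetical helper name.

```python
# Sketch: condense Firecrawl-style search results into one clean sentence
# suitable for text-to-speech. Field names and the helper are assumptions
# for illustration, not Beacon's implementation.

def distill_for_speech(query: str, results: list[dict], limit: int = 3) -> str:
    """Turn raw search results into a single spoken summary, no raw data."""
    snippets = []
    for item in results[:limit]:
        # Prefer the description; fall back to the title.
        text = item.get("description") or item.get("title") or ""
        text = " ".join(text.split())  # collapse stray whitespace/newlines
        if text:
            snippets.append(text.rstrip("."))
    if not snippets:
        return f"Sorry, I couldn't find anything current about {query}."
    return f"Here's what I found about {query}: " + ". ".join(snippets) + "."

# Example with mocked search output:
mock = [
    {"title": "Cape Town outages", "description": "Stage 2 from 16:00 to 20:00 today"},
    {"title": "Utility update", "description": "Stage 4 expected overnight"},
]
print(distill_for_speech("load shedding", mock))
```

The design point is that the caller only ever hears one short sentence, never raw JSON or a list of links.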
What makes Beacon different is who it's built for. Most AI assistants assume a sighted user at some point. Beacon assumes limited screen interaction. It's independence, delivered through a phone call. (The number is not live; this is a concept build for the hackathon.) #ElevenLabsMusic & TTS used.