

Tether
@tether
Finance | Power | Data | Edu | Evo - Empowering Individuals, Communities, Cities, & Nations - Unstoppable Together | https://t.co/uYJ5YbfDFU | https://t.co/DvCYi4ZTMP $USDt



The QVAC SDK is the "LEGO block" of the next era of computing. It's a modular, local-first framework designed to turn anything, from a simple robot to an industrial server, into a sovereign, autonomous mind.

Why build with QVAC?
Atomic Intelligence: AI as a raw material embedded directly into your hardware.
No Cloud Dependency: Zero latency and total privacy. If the internet breaks, your world keeps thinking.
Infinite Scale: A single API for local AI that runs on any device, anywhere.

From a child's toy to the fabric of the universe, if you can dream it, you can build it.

Start building the future: docs.qvac.tether.io 🚀





QVAC SDK 0.9.0 (releasing in ~10 days) will support LoRA fine-tuning directly on-device, letting developers customize LLMs with their own data without sending anything to the cloud. You just load a base model, point it at your training dataset, and get a lightweight LoRA adapter back, all running locally. The fine-tuned model can then be used for inference immediately, with no extra setup.

Why it matters: LoRA (Low-Rank Adaptation) fine-tuning lets you specialize a general-purpose language model for your specific use case, like matching a brand's tone, mastering domain terminology, or following a particular output format, using a fraction of the compute a full fine-tune would require. QVAC handles the entire workflow locally: dataset preparation, training with configurable hyperparameters, checkpoint saving, and seamless inference with the resulting adapter. Your data never leaves the device.

The developer experience: Fine-tuning with QVAC is as simple as calling "sdk.finetune()" with your dataset and a few hyperparameters. Training runs entirely on your local hardware, produces a compact LoRA adapter file, and supports pause/resume so you can stop a job and pick it back up without losing progress. The result plugs straight into QVAC's inference pipeline: no model conversion, no deployment step, just immediate local completions with your fine-tuned model. qvac.tether.io
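A quick sketch of why the "fraction of the compute" claim holds. This is not QVAC code, just the standard LoRA parameter arithmetic: instead of updating a full d_out x d_in weight matrix W, LoRA trains two small low-rank factors B (d_out x r) and A (r x d_in), and only those factors need gradients and storage. The layer sizes below are illustrative assumptions, not QVAC defaults.

```python
def full_finetune_params(d_out: int, d_in: int) -> int:
    """Trainable parameters when updating the weight matrix W directly."""
    return d_out * d_in

def lora_params(d_out: int, d_in: int, r: int) -> int:
    """Trainable parameters for a rank-r LoRA adapter (B and A) on the same layer."""
    return r * (d_out + d_in)

# Example: a 4096x4096 projection layer at LoRA rank 8
full = full_finetune_params(4096, 4096)  # 16,777,216 trainable params
lora = lora_params(4096, 4096, 8)        # 65,536 trainable params, ~0.4% of full
print(f"LoRA trains {lora / full:.2%} of the full matrix")
```

The same ratio is why the resulting adapter file is compact: only A and B are saved, and at inference time their product is simply added to the frozen base weights.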

Tether Acquires 951 BTC, Total Holdings Reach 97,141 BTC, Now Fifth-Largest On-Chain Holder

A Bitcoin reserve address associated with Tether withdrew 951 BTC (approximately $70.47 million) from Bitfinex, representing part of its Q1 2026 purchases. Since 2023, the address has consistently accumulated BTC using roughly 15% of the company's profits and typically transfers the holdings from Bitfinex after each quarter ends. It currently holds about 97,141 BTC (valued at around $7.2 billion), ranking as the fifth-largest Bitcoin wallet on-chain.
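A back-of-envelope check shows the reported figures are internally consistent. Using only the rounded numbers from the post (assumed values, not official accounting): 951 BTC for roughly $70.47 million implies a price near $74,100 per BTC, and 97,141 BTC at that implied price lands near the stated ~$7.2 billion.

```python
# Figures as reported in the post (rounded)
purchase_btc = 951
purchase_usd = 70.47e6

# Implied purchase price per BTC, ~ $74,100
implied_price = purchase_usd / purchase_btc

# Total holdings valued at that implied price, ~ $7.2 billion
total_btc = 97_141
total_usd = total_btc * implied_price

print(f"implied price: ${implied_price:,.0f}, holdings: ${total_usd / 1e9:.1f}B")
```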




The new @tetherwallet is officially here, built from the ground up using Bare. By integrating our high-performance runtime, @tether is setting a new standard for what a lean, native-first wallet experience should feel like. The power of Bare, now in the palm of your hand. Try the wallet 👉 tether.me

Quality takes time, and for good reason. Our team has been heads-down ensuring the next phase of this journey is perfect. From 206 incredible submissions, we have narrowed it down to the best of the best. We're thrilled to announce our 35 Semi-Finalists! 👇

