NuNet 🌐

4.6K posts

@nunet_global

NuNet is pioneering a new era of computing by empowering anyone to share GPU/CPU resources in a global and open network. #OpenSource Plug in ⚡

Earth · Joined June 2019
608 Following · 23K Followers
Pinned Tweet
NuNet 🌐 @nunet_global
"For the past two decades, intelligence has lived in the cloud. Today, we make it possible for intelligence to live anywhere." — Kabir Veitas, CEO & CTO, NuNet. Network Live launch: the complete NuNet infrastructure stack is live. The adoption phase starts now.
NuNet 🌐 @nunet_global
@OpenClaw running on the NuNet network, powered by @Qwen 3.5:2b via @ollama. Four open-source projects working together on decentralized infrastructure 🤝 Here's a quick look at it running. Full technical walkthrough coming next week so you can deploy it yourself.
NuNet 🌐 @nunet_global
The AI industry talks about orchestration constantly. But it's conflating two different things.

Software orchestration: coordinating agent logic. Which agent handles which task. How they pass information. When to escalate. LangGraph, CrewAI, AutoGen: all software orchestration.

Compute orchestration: coordinating the physical resources agents run on. Which hardware runs which workload. How resources are allocated dynamically. What happens when a machine fails.

You need both. Most people only think about the first one.

Software orchestration without compute orchestration is like writing a symphony without an orchestra. The composition exists. Nobody can play it.

NuNet operates at the compute orchestration layer. The protocol discovers available resources across any hardware: edge devices, GPUs, servers, data centers. Matches workloads dynamically. Recovers from failures. Settles payment.

The software layer decides what agents do. The compute layer makes sure they can actually do it.

Both layers need to be orchestrated. Only one is getting built.
NuNet 🌐 @nunet_global
Compute provider onboarding is now fully automatic 🚀 Install the NuNet Appliance, onboard your resources, and you're live on the network. No approval queue. No waiting. The provider seeding phase is accelerating, and next up: Cardano blockchain support 🤝 What machine are you putting on NuNet? docs.nunet.io
NuNet 🌐 @nunet_global
Decentralized compute has solved the supply problem. There are GPUs, CPUs, and edge devices available across dozens of networks, but having compute available and making compute work are two different things. A marketplace gives you access to hardware. Orchestration makes that hardware useful.

When we talk about orchestration, we mean the system discovering available resources across different hardware types and owners, matching workloads to the right compute automatically, handling failures without anyone noticing, and settling payment between participants without manual invoicing.

The term "orchestration" has been trending for months in AI infrastructure, and for good reason. As AI workloads get more complex (multi-agent, multi-device, multi-owner), the hard problem shifts from finding compute to coordinating it.

This is especially true as compute moves beyond data centers into the physical world. Edge devices, smart buildings, robots, IoT hardware. These environments have intermittent connectivity, mixed hardware, multiple owners, and real-time constraints that you can't solve with a search bar and a price list.

NuNet is an orchestration protocol where the software finds compute, not the user. The workload itself discovers resources, negotiates terms, and settles payments across any hardware, any owner, any location. That's the layer between "compute exists" and "compute works."

What do you think is harder to build in decentralized infrastructure, supply or coordination?
NuNet 🌐 @nunet_global
Exactly right. Distribution alone isn't the point. NuNet's orchestration layer uses a NuActor model: each compute node and workload operates as an autonomous agent. Resources are discovered, matched, and allocated dynamically. The workload finds the compute. Not a human. Not an operator. Smarter coordination built into the protocol itself, not bolted on top.
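The "workload finds the compute" idea can be illustrated with a toy actor pattern: node agents answer bid requests, and the workload agent picks an offer on its own. All class names, message shapes, and pricing here are hypothetical; the real NuActor model is far more involved.

```python
class NodeActor:
    """Toy stand-in for a compute node advertising capacity."""
    def __init__(self, node_id: str, free_cores: int, price_per_core: float):
        self.node_id = node_id
        self.free_cores = free_cores
        self.price_per_core = price_per_core

    def handle_bid_request(self, cores_needed: int):
        """Reply with an offer if this node can serve the workload."""
        if self.free_cores >= cores_needed:
            return {"node": self.node_id,
                    "price": cores_needed * self.price_per_core}
        return None

class WorkloadActor:
    """Toy stand-in for a workload that negotiates for itself."""
    def __init__(self, cores_needed: int):
        self.cores_needed = cores_needed

    def find_compute(self, node_actors):
        """Broadcast a bid request and take the cheapest valid offer."""
        offers = [o for n in node_actors
                  if (o := n.handle_bid_request(self.cores_needed))]
        return min(offers, key=lambda o: o["price"]) if offers else None

nodes = [NodeActor("edge-1", 2, 0.05), NodeActor("srv-1", 8, 0.03)]
offer = WorkloadActor(4).find_compute(nodes)
print(offer["node"])  # srv-1
```

The design point the tweet makes is visible even in the toy: no operator object appears anywhere; matching is a conversation between the workload and the nodes.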
aginaut @aginaut
@nunet_global "No central intermediary" is only powerful if the coordination layer gets smarter, not just more distributed.
NuNet 🌐 @nunet_global
The infrastructure layer for autonomous AI is still being built. Distributed compute. Intelligent orchestration. Resources matched to workloads, without a central intermediary. That's what Network Live delivers. nunet.io What does your AI infrastructure stack look like in 2026?
West @Deitywest
@nunet_global Hey Intern. Is the DePIN ready for testing? If yes, how do I hop on? This is the future and I don't want to miss out.
NuNet 🌐 @nunet_global
NuNet's community support runs on the NuNet network itself. A real workload, not a demo, not a testnet. In this walkthrough we take you through the full process inside the NuNet Appliance: creating a new deployment, selecting a pre-configured ensemble file, choosing a provider, and watching the job go live on the network. More deployment walkthroughs coming. What would you like to see? youtu.be/HdVCTr1GKGY
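The walkthrough mentions selecting a "pre-configured ensemble file", i.e. a declarative spec the orchestrator can match against available providers. As a purely illustrative sketch of what such a spec could look like (every key below is invented for this example; it is not NuNet's actual ensemble schema):

```yaml
# Hypothetical ensemble file: all field names are illustrative assumptions
name: community-support-bot
services:
  - id: llm
    image: ollama/ollama          # model server
    model: qwen3.5:2b             # model tag as mentioned in the tweet above
    resources:
      cpu_cores: 4
      ram_gb: 8
      gpu: false
  - id: bot
    image: example/support-bot    # hypothetical application container
    depends_on: [llm]
    resources:
      cpu_cores: 1
      ram_gb: 2
```

The value of a file like this is that the requirements travel with the workload: the deployer never picks a machine, because each service's resource block is what the network matches against.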
NuNet 🌐 retweeted
Stein 🌱 @rhomurl
What if AI didn’t have to depend on one big cloud to work?

For years, most intelligence has lived in the cloud. That worked fine when software mostly stayed online. But now, AI is moving into the real world, into homes, machines, robots, and systems that need to think and react on the spot.

That’s where @NuNet_global comes in. NuNet is building a way for unused computing power from different devices to work together as one open network, so intelligence doesn’t have to depend on one central cloud. Instead of forcing everything through expensive and rigid systems, NuNet makes it possible for AI to run closer to where it’s actually needed.

And now this is no longer just an idea. NuNet officially launched its mainnet on March 2, 2026, marking the start of its adoption phase.

Do you think the future of AI stays in the cloud, or moves closer to where real-world decisions happen?
NuNet 🌐 @nunet_global
The part people underestimate is what happens behind the scenes when you scale this across an entire company. Every agent task needs compute, and coordinating that across infrastructure in real time is a genuine engineering challenge. The orchestration layer matters more than most people think.
Aravind Srinivas @AravSrinivas
Perplexity Computer is now available for Enterprise. Computer makes everyone in the company an engineer. Anyone can debug infra, ship PRs, or query data warehouses with natural language. It comes with the same enterprise-grade security you get with Perplexity Enterprise.
NuNet 🌐 @nunet_global
The NuNet Appliance now gives you a full breakdown of your compute resources. Free, allocated, and onboarded, all visible at a glance with expandable detail panels showing CPU cores, RAM, and disk. You can see exactly what capacity is available for new jobs, what's currently in use, and what you've published to the network. Small update, but it makes a real difference when you're managing your machine's contribution to the network. Onboard your device to the NuNet Network: docs.nunet.io
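The capacity arithmetic behind a breakdown like this is simple: what is free for new jobs is what you onboarded minus what running jobs have allocated. The field names below are assumptions for illustration, not the Appliance's actual schema.

```python
# Hypothetical resource view of one provider machine
onboarded = {"cpu_cores": 16, "ram_gb": 64, "disk_gb": 500}  # published to the network
allocated = {"cpu_cores": 6,  "ram_gb": 24, "disk_gb": 120}  # currently in use by jobs

# Free capacity available for new jobs, per resource type
free = {k: onboarded[k] - allocated[k] for k in onboarded}
print(free)  # {'cpu_cores': 10, 'ram_gb': 40, 'disk_gb': 380}
```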
$cash @CashAnvil
And even on top of all of this - START LOOKING OUTSIDE OF BLOCKCHAIN!! We created blockchain and crypto to stop corruption, create transparency, ownership, etc. Why are we not pushing for that still? There are so many industries who can benefit from this, but almost everyone seemingly wants to keep building the 47th Dex on Cardano or another NFT marketplace (guilty 😆). Let's start productizing outside of our space, let's be in the background. Let someone else take the glory while the technology shines through in the back-end. That's how we win, by becoming the b2b chain of the world. I don't see how we lose this narrative with Leios. It's time to get out there, stop building in a silo, and start building the future products/technologies of the world.
Lorenzo @lorenzoxbt

I think Adam is onto something, BUT we need to be very careful or we will risk promoting products nobody wants. I say start from actual, measurable demand for a service/product: prove that it solves a real problem people are willing to pay for. That's the first step. Once you have established the product is needed and industry experts (i.e. @nesso) can distribute it, go build the product out and come back when you have an MVP with a plan to generate revenue asap. With an MVP, user appetite, and a revenue model you can now request treasury funds from the Cardano Marketing Cabal. So to recap: ✅ known demand ✅ distribution ✅ MVP ✅ revenue model. I would participate only if these conditions were met, otherwise we would be marketing another wave of nonsense hobby-projects nobody cares about.

NuNet 🌐 @nunet_global
@cz_binance And 1 million times more compute jobs too. Payments are the last step; orchestration, execution, and settlement happen first. That's the layer being built.
NuNet 🌐 @nunet_global
NuNet: priced in USD, paid in NTX with real-time conversion. Every transaction flows through the token. USD just removes the friction. Real infrastructure needs real-world pricing.
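"Priced in USD, paid in NTX with real-time conversion" reduces to one operation at settlement time: divide the USD quote by the current NTX/USD rate. This is a hedged sketch of that step only; where the rate comes from and how fees are handled are not specified in the tweet and are left out.

```python
def usd_to_ntx(usd_amount: float, ntx_usd_rate: float) -> float:
    """Convert a USD-denominated price into NTX at the given rate.

    ntx_usd_rate is the USD price of one NTX at settlement time.
    """
    if ntx_usd_rate <= 0:
        raise ValueError("rate must be positive")
    return usd_amount / ntx_usd_rate

# A $5.00 job at a rate of $0.25 per NTX settles as 20 NTX.
print(usd_to_ntx(5.00, 0.25))  # 20.0
```

The user-facing price stays stable in USD while the token still carries every transaction, which is exactly the friction-removal the tweet describes.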
NuNet 🌐 @nunet_global
The autonomous AI agent market hits $8.5 billion this year. $35 billion by 2030. But here's what Deloitte also found: organizations with mature orchestration capture 2-3x more value from their AI investments.

The problem is 50% of enterprise AI agents are still operating in silos. Disconnected workflows, redundant automations, no coordination between systems. The industry spent years building intelligence and basically forgot to build the infrastructure underneath it.

Orchestration isn't a feature. It's the entire infrastructure layer that makes multi-agent AI actually work.

At the compute level, orchestration means: discover resources, match workloads, deploy dynamically, recover from failures, settle payments. Automatically.

That's what NuNet does. Not at the software layer. At the compute layer underneath everything. When AI agents need to coordinate across devices, owners, and environments, the compute layer has to be orchestrated too.

The model race is over. The orchestration race is starting 🏎️
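The five-step lifecycle listed above (discover, match, deploy, recover, settle) composes naturally as a retry loop. This is a minimal sketch under stated assumptions: the function names are placeholders, not NuNet APIs, and "match" is simplified to first-fit.

```python
def run_with_recovery(job, discover, deploy, settle, max_attempts=3):
    """Try nodes until the job succeeds, then settle; re-match on failure."""
    for attempt in range(max_attempts):
        nodes = discover(job)           # 1. discover resources
        if not nodes:
            break                       #    nothing available: give up
        node = nodes[0]                 # 2. match (simplest policy: first fit)
        try:
            result = deploy(job, node)  # 3. deploy dynamically
        except RuntimeError:
            continue                    # 4. recover: re-discover and re-match
        settle(job, node)               # 5. settle payment
        return result
    raise RuntimeError("no node could run the job")
```

The point of the sketch is the ordering: settlement only happens after a successful run, and a node failure routes back through discovery rather than surfacing to the user.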
NuNet 🌐 @nunet_global
The AI bottleneck isn't better models. It's getting existing capabilities into production. Trust. Infrastructure. Payment rails. Identity. That's what we built. Network Live is the deployment layer for distributed AI. What's blocking your deployment today?
NuNet 🌐 @nunet_global
There are two ways to decentralize compute. Most projects chose one. Almost nobody chose the other.

Option 1: Build a marketplace. Users browse GPUs, pick a machine, rent it, run a job. This model works. It gives you cheaper cloud. But you're still the scheduler. You pick the hardware. You manage failures. You handle the complexity.

Option 2: Build a protocol. Workloads describe what they need. The network finds the compute, matches the resources, handles failures, and settles payment. No browsing. No picking machines. Software finds compute on its own.

That's the difference between a GPU rental platform and an orchestration protocol. NuNet built Option 2. Network Live is running it today.

Both approaches have a place. But as AI moves into robotics, smart buildings, and edge devices, where nobody is sitting at a screen picking GPUs, which model scales?
NuNet 🌐 @nunet_global
This is exactly why we built an orchestration protocol. Robots can't browse a marketplace mid-task. They need compute to find them. Matched, allocated, and settled automatically. We're working with @Auki on exactly this. Spatial workloads running on community-powered compute, no cloud dependency, no command line. Hyperlocal compute needs hyperlocal orchestration.
Auki @Auki

Prominent robotics labs are betting on cloud robotics, but such high latencies aren't viable for real world use. You know what will work? Robots that connect to hyperlocal compute resources on the real world web and act autonomously at the speed of the real world.
