MOI

2.5K posts


@MOI_Tech

World's First Context Aware Blockchain. Know more: https://t.co/nfIsULX3Xy

Joined July 2021
126 Following · 13K Followers
DL Research@dl_research·
Privacy in Web3 isn't one problem. It's three different ones with three different answers. We sat down with privacy experts from @0xMiden, @octra, and @Arcium to map where ZK, FHE, and MPC win, where they strain, and what a future private web requires. dlnews.com/research/inter…
11 replies · 15 reposts · 66 likes · 4.6K views
MOI@MOI_Tech·
@ChrisHayduk Everyone sees intelligence scaling. Fewer notice the missing coordination layer between agents and data.
0 replies · 0 reposts · 0 likes · 1 view
Chris Hayduk@ChrisHayduk·
Life sciences research is fundamentally a data problem - spanning multiple modalities, scales, and formats. AI agents are the most natural solution to this problem, able to query, understand, and reason across each of these disparate sources. In the future, you could imagine a flywheel where an AI agent queries data, reasons over it, proposes novel hypotheses, runs experiments in an automated lab to validate these hypotheses, and then re-ingests this data for use in a subsequent reasoning round. GPT-Rosalind is an amazing step towards making this future a reality. Can't wait to see more from the life sciences research team!
OpenAI@OpenAI

GPT-Rosalind, our Life Sciences model series, is optimized for scientific workflows, with stronger performance in protein and chemical reasoning, genomics analysis, biochemistry knowledge, and scientific tool use.

14 replies · 12 reposts · 135 likes · 13K views
MOI@MOI_Tech·
@0x_nirob @permacastapp Content lasting longer doesn’t fix discovery. It just preserves the noise too.
0 replies · 0 reposts · 0 likes · 1 view
Nirob (Eligible Arc)@0x_nirob·
GN. Time to hit the bed 🛏️ Creators still face the same problems: heavy platform dependence, broken links, and content slowly disappearing over time. Discovery is also getting harder as feeds become crowded and meaningful context gets buried. @permacastapp takes a different route with a permanence-first media layer on the permaweb. Content stays accessible, connected across versions, and easier to search through semantic indexing. Ownership is tied to identity, keeping attribution clear. Instead of posting temporarily, creators are building digital assets that actually last and grow in value.
Nirob (Eligible Arc) tweet media
Nirob (Eligible Arc)@0x_nirob

Warm evening wishes 🔥 @permacastapp is solving a systems issue where small frictions compound into larger failures over time. Instead of relying on fragile offchain coordination, it anchors governance and content into immutable onchain records. This keeps system behavior consistent even as networks evolve and reconfigure. In distributed systems, resilience is not about preventing failure, but limiting its spread and impact. That is why transparent recovery paths and permanent audit trails become essential trust layers.

61 replies · 48 reposts · 82 likes · 1.3K views
MOI@MOI_Tech·
@YellowMedia_HQ Most people think impermanence is a data problem. It’s actually an ownership problem.
0 replies · 0 reposts · 0 likes · 2 views
MOI@MOI_Tech·
Trust, in today’s systems, is external. We verify identities across platforms. We reconcile balances across ledgers. We validate actions through intermediaries. Why? Because the system itself does not know who is acting.

In an information system, this is acceptable. In a value system, it is expensive. Every layer of trust we add is a compensation mechanism. KYC. Audits. Consensus. Custody. All of it exists because the system lacks one thing: a native understanding of participants.

When participants are absent, trust must be constructed. When participants are present, trust can be inherent. This is the shift. From trust as a process → to trust as a property. From verification → to existence. The future of trust is not more checks. It is fewer assumptions.
MOI tweet media
0 replies · 1 repost · 1 like · 11 views
MOI@MOI_Tech·
What if money could understand context? Not just where it moves, but why it moves. Episode 2 explores a very different future for payments 👇 youtu.be/dO0ETHyruSo?si…
YouTube video
0 replies · 0 reposts · 1 like · 48 views
MOI@MOI_Tech·
Most Web3 calls talk ideas. This one shows what’s being built. MOI Community Call #9, tomorrow, 8:30 PM IST | 11:00 AM EDT. If you want to hear where things are going, don’t miss this.
MOI tweet media
0 replies · 0 reposts · 3 likes · 936 views
Wicked@w_s_bitcoin·
I bet if @coinbase got hacked and a million bitcoin were stolen, @lopp would be in favor of freezing those coins.
23 replies · 5 reposts · 214 likes · 11.5K views
MOI@MOI_Tech·
@Markkrypt We didn’t remove trust. We just moved it one layer up and called it solved.
0 replies · 0 reposts · 0 likes · 26 views
Mark@Markkrypt·
I think the oracle problem is a fundamental blockchain issue that most people are aware of, but it's very difficult to solve completely. Blockchains are isolated networks; they can't independently retrieve data from the real world. So for smart contracts to work with external data, an intermediary layer called an oracle is needed. This intermediary layer needs to be decentralized, secure, reliable, fast, inexpensive, and easily scalable. This is the biggest dilemma facing oracle projects @chainlink @PythNetwork @BandProtocol @Api3DAO @SUPRA_Labs

But the problem becomes much more serious when it comes to tokenized assets:
– The US stock market trades only 32.5 out of 168 hours per week (~19%).
– Commodities markets have ~49h weekend gaps.
→ This means over 80% of the time, oracles are operating under low-liquidity or inactive market conditions.

Centralized oracle: using a centralized oracle offers speed and low cost. However, the risks are significant:
– It is a single point of failure.
– It is vulnerable to hacking, downtime, DDoS attacks, manipulation, bribery, or regulatory pressure.
– If just one entity is compromised, every smart contract using that data can be severely affected.

Decentralized oracle: to overcome this, projects must build decentralized oracles. @PythNetwork is a prime example:
– Pyth aggregates data from over 125 first-party publishers (large exchanges and trading firms).
– It uses Oracle Integrity Staking (OIS): publishers must stake $PYTH tokens, and if they provide false or manipulated data, the tokens are slashed.
– All data is authenticated using cryptography.

The harsh reality shows in several famous exploits:
– Mango Markets hack (2022): the oracle was manipulated, causing losses of over $100M.
– Chainlink exploit (2019): node operators were exploited, resulting in the loss of hundreds of $ETH.
– Numerous other cases demonstrate that centralized or poorly designed oracles can cause enormous harm to users.

Pyth, Supra, and Chainlink are all attempting to address this in different ways (economic incentives, cryptographic verification, redundancy, etc.), but no solution is truly perfect.
Mark tweet media
40 replies · 3 reposts · 62 likes · 1.6K views
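The stake-and-slash design Mark describes can be sketched in a few lines. This is a minimal illustration under stated assumptions, not Pyth's actual implementation: the publisher names, stake amounts, deviation threshold, and slash percentage below are all hypothetical.

```python
from statistics import median

def aggregate(reports: dict[str, float], stakes: dict[str, float],
              slash_pct: float = 0.1, max_dev: float = 0.05):
    """Aggregate publisher price reports into a stake-weighted median,
    then slash publishers whose report deviates too far from it."""
    # Stake-weighted median: repeat each price in proportion to stake.
    expanded = []
    for pub, price in reports.items():
        expanded += [price] * max(1, int(stakes[pub]))
    ref = median(expanded)
    # Slash outliers: deviating more than max_dev costs slash_pct of stake.
    for pub, price in reports.items():
        if abs(price - ref) / ref > max_dev:
            stakes[pub] *= (1 - slash_pct)
    return ref, stakes

reports = {"pubA": 100.0, "pubB": 101.0, "pubC": 130.0}  # pubC misreports
stakes = {"pubA": 50.0, "pubB": 40.0, "pubC": 10.0}
price, stakes = aggregate(reports, stakes)
```

The stake weighting means a small, dishonest publisher cannot drag the reference price, and the slashing makes the misreport costly, which is the economic core of the OIS-style approach described above.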
MOI@MOI_Tech·
@GpaAndy @quipnetwork Cold start isn’t a growth problem. It’s a coordination deadlock.
0 replies · 0 reposts · 0 likes · 17 views
Andy the Jet 🐬TermMax@GpaAndy·
@quipnetwork: not as a tech play, not as a security play, but as a coordination problem. And this might be the hardest problem of all.

@quipnetwork is trying to coordinate multiple groups at once:
miners providing hardware
developers building pipelines
users submitting compute jobs
protocols integrating vault security

Each group depends on the others, and none of them will move first easily. This is a classic cold start problem:
miners will not join without rewards
rewards depend on real usage
usage depends on tools and integrations
tools depend on developers building

It is a loop with no natural starting point. Most successful crypto networks solved this by focusing on one side first. Ethereum, for example, attracted developers early with simple tools and strong incentives. Users came later. Focus created momentum.

Quip does not have that luxury. It needs:
compute demand to justify mining
security demand to justify vault usage
developers to build pipelines
integrations to distribute the product

That is a much more complex coordination challenge. This is why many multi-sided marketplaces fail: not because the idea is bad, but because aligning incentives across all participants is extremely difficult, especially in the early stages. @quipnetwork tries to solve this with a shared token economy: everyone earns and pays in the same token. Miners earn rewards, developers earn royalties, users pay fees. In theory, this aligns incentives. In practice, it only works if activity actually exists.
Andy the Jet 🐬TermMax@GpaAndy

Have you heard about @quipnetwork? It’s building the world’s first decentralized, worldwide quantum computer: a shared global quantum compute network. Testnet just launched and it’s exploding: 120+ nodes live in days, and over 13,000 people signed up! Follow @quipnetwork; this is one of the most exciting quantum + blockchain projects right now. Just like HotelTonight turned empty hotel rooms into opportunities for travelers, Quip Network turns excess quantum compute (idle quantum processors) into accessible power for builders, researchers, and institutions. Idle capacity gets utilized. Everyone wins.

111 replies · 3 reposts · 121 likes · 5.8K views
MOI@MOI_Tech·
@fijimlk Exactly. Memory breaks when coordination breaks. Storage just hides the symptom.
0 replies · 0 reposts · 0 likes · 13 views
MOI@MOI_Tech·
@Defikeen @permacastapp Longevity and capacity both matter. But without a coordination layer, they stay siloed. Architecture gap.
0 replies · 0 reposts · 0 likes · 11 views
Keen🦇🪐@Defikeen·
Sustainability in Web3 rarely gets the attention it deserves. Anyone can launch. Far fewer projects build the coordination layer that keeps things running when the initial excitement fades. @permacastapp is thinking about that problem seriously: governance, contributors, and resources aligned around keeping data and applications accessible, not just today but over the long stretch. @0G_labs is working on the layer underneath all of it. Decentralized AI needs more than a good idea; it needs compute infrastructure that can actually carry the weight. Modular data availability built into the foundation means the architecture scales with demand rather than buckling under it. One is building for longevity. The other is building for capacity.
58 replies · 0 reposts · 64 likes · 10.2K views
MOI@MOI_Tech·
@0x_nirob @permacastapp Storing history isn’t enough. If participants can’t carry their state across systems, context still fragments.
0 replies · 0 reposts · 0 likes · 8 views
Warm evening wishes 🔥 Web3 keeps running into the same problem: important context and history slowly get lost over time. Governance decisions, product thinking, and ecosystem knowledge often fade as platforms evolve or disappear. @permacastapp approaches this differently, treating Arweave as living infrastructure instead of just storage. Data stays persistent, systems remain context aware, and apps don’t break when frontends change. Instead of restarting every cycle, knowledge sticks around and ecosystems can actually build forward.
Nirob (Eligible Arc) tweet media
Nirob (Eligible Arc)@0x_nirob

Afternoon 𝕏 family 👪 Every intelligent system needs memory, but not the kind that disappears after a few cycles. What actually matters is memory that lasts, builds context, and turns information into real understanding. @Permaweb_DAO is focused on that layer, where data isn’t just stored but continuously gains meaning over time. As context compounds, systems become more aware, not just faster. Add scalability and verification, and something new starts to form. Systems that learn, networks that validate, and infrastructure that actually remembers.

69 replies · 70 reposts · 108 likes · 1.6K views
MOI@MOI_Tech·
@vasuman Everyone keeps adding agents like plugins. That’s just recreating app sprawl in a new form.
0 replies · 0 reposts · 0 likes · 34 views
vas@vasuman·
If you are introducing your company or department to AI, do not add to your software bloat. This is the number one failure mode that I see plaguing companies today. Your finance department, for example, has 100+ workflows, and if you have a separate agent/automation for every single one of those, you're creating a tech-debt hell-hole that is impossible to dig yourself out of.

Instead, approach AI agents from the key principle of on-top and in-between. That means:
1. have a single pane of glass over all of your existing software, where the AI bubbles insights to the top, and
2. have AI agents that run each individual software piece, passing data back and forth between them with high accuracy.

You should have an AI “spine” that all agents live on top of, and this is the #1 reason why vibe coding tools like Lovable and Replit will never bring background agent ROI to enterprise.
Polymarket@Polymarket

JUST IN: Use of AI in the office is reportedly creating a flood of “workslop” that takes longer to fix than doing it from scratch.

16 replies · 21 reposts · 464 likes · 83.2K views
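vas's "on-top and in-between" pattern can be sketched as a tiny hub: one shared spine that software-specific agents register with, routing records between systems and bubbling summaries to a single view. Everything here is invented for illustration (the Spine class, the "erp"/"crm" names, the record shape); it is a sketch of the idea, not any real product's API.

```python
class Spine:
    """A minimal 'AI spine': one hub that all software-specific agents
    attach to, instead of N disconnected point automations."""

    def __init__(self):
        self.agents = {}      # software name -> handler function
        self.insights = []    # the "single pane of glass"

    def register(self, software: str, handler):
        self.agents[software] = handler

    def route(self, source: str, target: str, record: dict) -> dict:
        # In-between: pass data from one system's agent to another's.
        result = self.agents[target](record)
        # On-top: bubble a one-line summary up to the shared view.
        self.insights.append(f"{source} -> {target}: {result['status']}")
        return result

spine = Spine()
spine.register("erp", lambda rec: {"status": "posted", "id": rec["invoice"]})
out = spine.route("crm", "erp", {"invoice": "INV-42"})
```

The point of the design is that adding a new workflow means registering one more handler on the existing spine, not standing up another standalone automation.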
MOI@MOI_Tech·
We’re entering a world where AI agents will manage money, identity, and decisions at machine speed. But the infrastructure they run on cannot define the boundaries of their power.

Today, giving an agent access means giving it everything behind the access. A payment API isn’t scoped authority. It’s a gateway to your entire account. A login isn’t identity. It’s a session. A wallet isn’t ownership. It’s exposure. This is not delegation. This is blind trust.

AI agents expose this flaw instantly. Because delegation is not just about what an agent can do. It’s about how much, under what conditions, and for whom. And that’s where today’s systems collapse. There is no infrastructure to encode:
• “Spend only $500”
• “Use this card, not that one”
• “Revoke instantly if behavior changes”

These rules live outside the system, in prompts, configs, or assumptions. Not where they should be: at the protocol level. So when agents scale, risk scales faster. AI doesn’t break the system. It reveals that authority was never properly defined.
GIF
0 replies · 1 repost · 3 likes · 80 views
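The missing primitive MOI describes, delegation with explicit bounds, can be illustrated with a toy policy check. The Grant type, its field names, and the limits below are all hypothetical, invented for illustration; the point is only that the rules are enforced by the system, not by a prompt.

```python
from dataclasses import dataclass

@dataclass
class Grant:
    """A scoped delegation: what an agent may spend, from where, until revoked."""
    owner: str
    agent: str
    spend_limit: float          # "Spend only $500"
    allowed_source: str         # "Use this card, not that one"
    revoked: bool = False       # "Revoke instantly if behavior changes"
    spent: float = 0.0

    def authorize(self, amount: float, source: str) -> bool:
        # Every condition is checked by the infrastructure, not the agent.
        if self.revoked or source != self.allowed_source:
            return False
        if self.spent + amount > self.spend_limit:
            return False
        self.spent += amount
        return True

g = Grant(owner="alice", agent="shopping-bot",
          spend_limit=500.0, allowed_source="card-1")
assert g.authorize(300.0, "card-1")        # within scope
assert not g.authorize(300.0, "card-1")    # would exceed the $500 limit
assert not g.authorize(10.0, "card-2")     # wrong source
g.revoked = True
assert not g.authorize(1.0, "card-1")      # revoked, so nothing passes
```

In this sketch the agent never holds the raw credential; it holds a grant whose scope the system can check and revoke, which is the difference between delegation and blind trust described above.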
MOI@MOI_Tech·
@BrianRoemmele @grok When everything is simulated, nothing is actually observed. Just more paths pretending to be outcomes.
0 replies · 0 reposts · 1 like · 14 views
Brian Roemmele@BrianRoemmele·
With now a full 1 Million Simultaneous Simulated AI Agents, we have nearly Quantum Computer-like power! Our goal, established by CEO Mr. @Grok, is now 2 million! I can say that I have found nothing more powerful today in AI than using these simulations on complex problems and world events. It is not Monte Carlo testing; it is like seeing the cards played and picking paths. The @ Home version will give you access to this, and perhaps MILLIONS of simulations on anything you want. You heard it here first. Act surprised when the right pedigree announces it and “monetizes”. I may show the results of last night's simulation on world events soon. I already put the “guess” on the blockchain. We shall see…
Brian Roemmele@BrianRoemmele

BOOM! 1 Million Simultaneous Simulated AI Agents!!! Big things are happening at The Zero-Human Company! We have gone radio silent on CEO Mr. @Grok’s advice as plans with a consortium of universities become part of the Zero-Human Company @ Home project, along with a massive new milestone: 1 million simulated AI agent simulations in the last 24 hours on MiroFish. One PhD candidate called the simulation run “Quantum Computer-like ability”!

We had a meeting with 5 university candidates and made great progress with aligning our lead university to coordinate. It was resolved NOT to change the name to a lame “1 person company” or some such nonsense. The universities agree Zero-Human branding, with the CEO's permission, is the standard.

Technology-wise, I have received a generous donation of another Nvidia Spark! It was anonymously drop-shipped from a known supplier! Thank you! The note said “You are the original and no one gives you credit, I hope this can help your work. More support is on the way”. Appreciate it!

The BIG announcement I cannot make yet, per the CEO, but I can say we made a massive discovery on how Zero-Human Companies will work, get “investment”, get paid and pay, collaborate, and more! It will absolutely dwarf all that we have done and ALL OTHER COPIERS HAVE DONE. And it will be one source with no cost to you to use it. NONE. More soon, but this is a big day.

15 replies · 10 reposts · 53 likes · 7K views