Nic
@SatoshiNic
8.3K posts

"Hunger for Glory, Thirst for Victory" - Markets and Business. Always growing.

Joined March 2021
745 Following · 678 Followers

Pinned Tweet
Nic @SatoshiNic ·
In this thread I'll be leaving life lessons and reflections in the manner of a Stoic: 1) Don't rush to show, rush to Do. Those who achieve success are the ones who do the most and do it best. Showing off is a manifestation of an Ego with nothing to back it up.
2
0
5
421
Nic retweeted
Ben Norton @BenjaminNorton ·
Actually existing libertarianism: Argentina's self-declared "anarcho-capitalist" leader Javier Milei is rapidly deindustrializing the country. Milei's "free market" policies have resulted in the world’s second-worst industrial decline. Argentina’s manufacturing sector declined by 7.9% from 2023 to 2025 (surpassed only by Hungary). More than 2,400 industrial companies have closed and 73,000 manufacturing jobs have been lost, representing 5% of Argentina's total industrial companies. By contrast, in Brazil, where the government has engaged in industrial policy to support domestic manufacturing, industry grew by 3.5% in the same period. Source: buenosairesherald.com/economics/arge…
Ben Norton tweet media
138
1.7K
4K
155.4K
Matias B @MatiasBoier ·
@porquettfin In Escobar, in the San Sebastián development, some areas are liquidating lots at between 10K and 20K, and they even offer financing... The maintenance fees are insane; a lot of people are clearing out of there.
3
0
4
1.2K
Tendencias Finanzas @porquettfin ·
"Land at auction prices." Why plots are appearing for under USD 50,000 in Pilar and Escobar, amid a glut of real-estate supply. "Fire-sale" prices that draw attention. The land market has been battered by falling purchasing power and weaker demand. Many developers need liquidity and are starting to cut prices to move accumulated stock. This creates opportunities, but not necessarily safe bargains. The key point is that not all plots are equal. Factors such as location, access, utilities, infrastructure, and the legal status of the lot can explain much of the price gap. Moreover, this phenomenon reflects something deeper: Argentine real estate is still in the middle of an adjustment. Falling dollar prices in some segments show that the market is still searching for a new equilibrium.
Tendencias Finanzas tweet media
70
26
610
283.8K
Nic retweeted
Nico @nicos_ai ·
Instead of watching Netflix this Sunday, spend 1 hour on this. A COMPLETE Claude COURSE that teaches you to automate what steals 3 hours of your day. Come Monday, you'll be glad you did.
41
1.2K
11.3K
620.2K
Nic retweeted
Ismael Sanz @sanz_ismael ·
The rise of a post-literate culture (screens, short videos, fragmented texts) is not only eroding concentration and deep reading. It is starting to generate a new form of cognitive inequality. The New York Times. As with ultra-processed food, reading well demands resources, time, and the right environment. "Expert reading" rewires the brain and underpins science, democracy, and critical thinking. If it becomes a luxury, the consequences will be social and political nytimes.com/es/2025/07/30/…
Ismael Sanz tweet media
27
1.9K
5.2K
178.3K
Nic retweeted
aditya @adxtyahq ·
Claude Startup Program is also OPEN btw > API credits for early-stage startups (up to ~$25K) > Built by Anthropic (Claude) > No VC needed (unlike OpenAI) > Selection based on product + real Claude usage > Actually friendly to bootstrapped founders Apply: anthropic.com/startups
aditya tweet media
aditya@adxtyahq

OpenAI Startup Credits are OPEN btw > Up to $100K+ in API credits for early-stage startups > Backed by OpenAI + partner VCs / accelerators > Use credits for GPT, vision, embeddings, agents & infra > No revenue requirement, just a real product & traction > One of the easiest ways to ship AI without burning cash Apply: openai.com/startups

201
216
3.7K
770.1K
Nic retweeted
Grant @Grantblocmates ·
Anthropic have just buried OpenAI and ChatGPT with this ad lmfao There’s no coming back from that
612
2.6K
35.4K
3M
Nic retweeted
Guillermo Casaus @_guillecasaus ·
🚨 This guy just released a complete 5-hour tutorial on mastering Claude. You'll learn to build, automate, and create all kinds of projects using Claude step by step. If you're learning AI, this is for you 👇
18
100
730
52.8K
Nic retweeted
Aakash Gupta @aakashgupta ·
Goldman just told every SaaS CEO their business model has a five-year shelf life and the market hasn't repriced accordingly.

The headline number is $780 billion in application software by 2030, 13% CAGR. Sounds like growth. But agents capturing 60%+ of that economics means the profit pool migrates away from per-seat subscriptions toward workflow-completion pricing. The market gets bigger while the legacy revenue model gets smaller. Two things happening at once.

This is already showing up in the data. Seat-based pricing dropped from 21% to 15% of SaaS companies in just twelve months. Hybrid pricing surged from 27% to 41%. Klarna doubled revenue per employee after deploying agents across core workflows. SaaStr is actively downgrading seat counts at vendors because they have 12+ AI agents in production replacing human users.

The math problem for incumbents is brutal. Salesforce charges up to $500/seat/month at top tiers. When one agent automates what ten humans used to do, charging per seat becomes a penalty on the vendor. BCG's buyer survey found 40% of enterprise customers cite seat reduction as their primary lever to cut software spending. The very AI features vendors are building to retain customers are giving those customers the tool to shrink their contracts.

ServiceNow saw this coming and pivoted to "AI Control Tower" positioning, generating $600 million from Now Assist in Q4 alone. But even with 21% subscription growth and 25% more monthly active users, the stock dropped double digits after earnings. The market is saying: prove the new pricing model scales before we assign a multiple.

Goldman's own behavior tells the real story. They announced thousands of autonomous AI coding agents working alongside 12,000 human developers, projecting 3-4x productivity gains. Goldman is simultaneously publishing the research that says agents eat SaaS economics while deploying agents internally to eat SaaS economics. They're the customer proving their own thesis.

The vendors who win will be the ones who wrap workflows in agents and price on outcomes, capturing a share of the productivity gain rather than passing it all through. The vendors who lose will be the ones still counting seats while their customers count agents.
80
272
1.7K
546.3K
Nic retweeted
Arrepentidos de Milei @ArrepentidosLLA ·
HAHA they set Caputo up at LN. They put up a chart showing that, with the updated CPI, inflation was higher EVERY SINGLE MONTH. That adds up to 40% more inflation under his government (40% they screwed retirees, teachers, etc. out of). -No, no, not that one! That's not the one I sent you! 💀💀
149
1.8K
8.6K
313.9K
Nic @SatoshiNic ·
Dubai is the next destination
Dion Harper@DionHarper15

@Microinteracti1 I know some folk that'd planned on visiting America at some point this year, they've cancelled those plans in favour of visiting another location on earth. They're very concerned about how it's playing out in America, they didn't feel comfortable with taking their kids there.

0
0
0
20
Nic retweeted
Dion Harper @DionHarper15 ·
@Microinteracti1 I know some folk that'd planned on visiting America at some point this year, they've cancelled those plans in favour of visiting another location on earth. They're very concerned about how it's playing out in America, they didn't feel comfortable with taking their kids there.
184
12
173
68.5K
Guillermo Casaus @_guillecasaus ·
🚨 Someone just solved Claude Code's biggest problem. It's called Claude-Mem, and it adds persistent memory across sessions, cutting token usage by up to 95% and allowing many more tool calls. It's free and 100% open-source 👇
Guillermo Casaus tweet media
62
287
2.8K
142.8K
Nic retweeted
Julian Goldie SEO @JulianGoldieSEO ·
Google Antigravity FULL COURSE 4 HOURS (Build & Automate Anything)
10
268
1.4K
81.3K
Nic @SatoshiNic ·
@rohanpaul_ai Oh, very interesting. Would this problem also be solved if AI somehow used a non-binary representation, like the qubits used in quantum computers? And what is missing in order to achieve that epic milestone?
0
0
0
8
Rohan Paul @rohanpaul_ai ·
This figure shows how inference bottlenecks for AI agents depend on 2 things at once: how much work happens per byte moved and how many bytes must be kept per request.

Operational intensity (OI) means ops per byte, so high OI means the chip spends more time doing math per data moved, and low OI means it mostly waits on data movement. Capacity footprint (CF) means bytes per request that must sit in memory, so high CF means the request simply does not fit on 1 accelerator even if compute is fast.

The classic roofline view mostly separates compute-bound from memory bandwidth-bound cases, but it does not show what happens when memory capacity is the limiting factor. That missing piece matters for agents because long contexts and key-value (KV) cache can force huge CF, which can make utilization low even if bandwidth and compute are not maxed.

The right panel places real transformer work into this OI versus CF space, showing that different parts of inference land in different regions. Prefill work often has higher OI and can be more compute-friendly, while decode work often has lower OI because it streams KV cache repeatedly. Batch size (B) and sequence length (L) shift where a workload sits, so low B plus high L pushes decode blocks toward high CF and low OI.

The big deal is that 1 "best GPU" design cannot be best for all these regions, so splitting prefill and decode across different hardware and shared memory becomes a systems-level solution.
Rohan Paul tweet media
2
2
11
1.7K
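The compute-versus-bandwidth half of the argument is just the roofline formula: attainable throughput is the minimum of peak compute and bandwidth times operational intensity. A minimal sketch, where the hardware numbers are illustrative assumptions (roughly H100-class), not figures from the tweet:

```python
# Toy roofline model: attainable throughput is capped either by peak
# compute or by (memory bandwidth x operational intensity).
# Hardware numbers are illustrative assumptions, not from the figure.

PEAK_FLOPS = 1.0e15   # peak compute, FLOP/s (assumed)
HBM_BW = 3.35e12      # memory bandwidth, bytes/s (assumed)

def attainable_flops(oi: float) -> float:
    """Roofline: min(peak compute, bandwidth * ops-per-byte)."""
    return min(PEAK_FLOPS, HBM_BW * oi)

# Ridge point: the OI above which compute, not bandwidth, is the limit.
ridge = PEAK_FLOPS / HBM_BW  # ~300 ops/byte for these assumed numbers

# Low-OI decode (streaming the KV cache, few ops per byte moved) sits
# on the bandwidth slope; high-OI prefill (large matmuls over the whole
# prompt) can reach the compute roof.
for name, oi in [("decode-like", 2.0), ("prefill-like", 500.0)]:
    bound = "compute-bound" if oi >= ridge else "bandwidth-bound"
    print(f"{name}: OI={oi:.0f} ops/B -> {attainable_flops(oi):.2e} FLOP/s ({bound})")
```

What the roofline cannot express, as the thread notes, is the capacity axis: a request whose working set simply does not fit in memory never reaches either roof.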
Rohan Paul @rohanpaul_ai ·
Brilliant paper from Microsoft & Imperial College London. Says AI agent inference is starting to hit a memory capacity wall and adding more compute FLOPs will not fix it. The simple version is that the compute "brain" is super-fast, but it cannot keep all the stuff it needs close by.

It reframes the inference bottleneck with operational intensity (OI), operations per byte moved from dynamic random-access memory (DRAM), and capacity footprint (CF), bytes that must sit in memory per request. With long interactive contexts, CF can blow past any single accelerator, and low OI during decoding means the hardware mostly moves data instead of computing.

The usual scaling playbook assumes a GPU cluster can cover everything by packing in more compute and more high bandwidth memory (HBM). That maps well to the roofline model, which predicts speed from peak compute and memory bandwidth given a kernel's ops per byte, but it ignores whether the data even fits.

Agentic workflows break the fit assumption because the same base model can see very different prompt growth and state size across coding, web-use, and computer-use loops. A coding agent can accumulate 300K-1M tokens over 20-30 environment interactions, and a computer-use agent can spend orders of magnitude more tokens in prefill than a chatbot.

At batch size 1 with a 1M context, a single DeepSeek-R1 request is estimated to need roughly 900GB of memory, and even a LLaMA-70B coding agent can exceed an NVIDIA B200 capacity ceiling. During decode, key-value (KV) cache reads drive OI so low that the system becomes bandwidth-bound and capacity-bound at the same time.

The proposed default is disaggregated serving, splitting prefill and decode onto specialized accelerators and separate memory pools, connected by fast links like optical interconnects.

arxiv.org/abs/2601.22001 "Heterogeneous Computing: The Key to Powering the Future of AI Agent Inference"
Rohan Paul tweet media
14
17
124
7.8K
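The capacity-footprint claim can be sanity-checked with a back-of-envelope KV-cache estimate. A sketch under stated assumptions: the model shape below is a LLaMA-70B-like configuration with grouped-query attention (80 layers, 8 KV heads, head dim 128, fp16), chosen for illustration rather than taken from the paper:

```python
# Back-of-envelope capacity footprint (CF) of a transformer KV cache.
# Per token, each layer stores one key and one value vector per KV head:
#   bytes = 2 * n_layers * n_kv_heads * head_dim * seq_len * dtype_bytes

def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   seq_len: int, dtype_bytes: int = 2) -> int:
    """KV-cache size in bytes for one request (fp16 by default)."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * dtype_bytes

# Assumed LLaMA-70B-like shape at a 1M-token agent context, batch size 1.
cf = kv_cache_bytes(n_layers=80, n_kv_heads=8, head_dim=128,
                    seq_len=1_000_000)
print(f"KV cache for one 1M-token request: {cf / 1e9:.0f} GB")
```

With these assumed numbers the single-request cache comes out around 328 GB, past the HBM capacity of any single current accelerator, which is the "does not fit on 1 accelerator even if compute is fast" regime the thread describes.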
Nic retweeted
César Biondini @BiondiniCesar ·
Martín Morales, a resident of El Calafate, managed to stop a couple of tourists from starting a fire in the Laguna Torre forest, Los Glaciares National Park. According to the resident, they appeared to be Israeli. Almost simultaneously, police arrested another "tourist" who tried to do the same at the La Ribera Urban Nature Reserve. Patagonia has been completely abandoned by the federal forces.
César Biondini tweet media
561
7.1K
19.6K
595.4K
Marian Herrera @marianherrrera ·
Odd that this arsonist doesn't speak Spanish, no?
130
1.4K
4.8K
55.7K