
Qeqe Vibes
12.2K posts



ELON: VENTILATORS DID MORE DAMAGE THAN COVID ITSELF
"I called doctors in Wuhan and said, what are the biggest mistakes that you made in the first wave, and they said we put far too many people on intubated ventilators.
So then I actually posted on Twitter at the time, I said, hey, what I'm hearing from Wuhan is that they made a big mistake in putting people on intubated ventilators for an extended period, and that this is actually what is damaging the lungs, not Covid.
People yelled at me and said I'm not a doctor, I'm like, yeah, but I do make spaceships with life support systems, what do you do?"
Source: @joerogan

@HaileyLennonBTC You don't even need to bring food; you could, but it's optional.

@ribbita2012 You are being witty. This is a sign of intelligence. What's next?

@SwipeWright To reduce crime statistics.
Then they can present this as evidence socialism works.

@Artemisfornow @grok What are the main contributing factors for the franc's collapse?

@BIPOCracism @MrBeast Men and women have been getting married for thousands of years, STFU moron.

@elonmusk One of the most profound pieces of art in the 21st century.

Sydney Sweeney reveals in an interview that her current favourite rocket is Starship.
"Starship launches have been an enjoyable experience. I cried watching Flight 5, the booster catch was so majestic. For a while, SpaceX was fucking up and Starship was exploding on every flight, it was just boom, reset, boom, reset. It was like they were speedrunning Kerbal Space Program on nightmare mode.
But Flight 10? That one redeemed everything. Finally, less Michael Bay, more Christopher Nolan. Flight 10 has been a saviour, honestly."



💰 Google’s custom chips, tensor processing units (TPUs), are seen as the strongest alternative to Nvidia’s GPUs.
Research analysts say that if Google ever spun off this business along with its DeepMind lab, the combined unit could be worth $900 billion, though they do not expect Google to actually do it right now.
TPUs are chips Google designed specifically for machine learning, and they now rival Nvidia in both speed and cost-efficiency, with performance scaling up to 42.5 exaflops.
Developer activity around TPUs in Google Cloud grew by 96% in just 6 months, showing momentum among engineers and researchers outside of Google itself.
The 6th generation, called Trillium, is already in high demand, and the upcoming 7th generation, Ironwood, is expected to see even more interest because it is the first one built for large-scale inference, the step where AI models are actually used after training.
Big players like Anthropic and xAI are looking at TPUs because they now come with better software support through JAX, which makes them easier to run at scale compared to before.
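To give a sense of what that JAX software support looks like in practice, here is a minimal, illustrative sketch (not from the article): JAX code is compiled through XLA, so the same function runs unchanged on CPU, GPU, or TPU depending on which devices are available. The layer and values below are made up for the example.

```python
import jax
import jax.numpy as jnp

# jax.jit compiles the function with XLA; the identical code runs
# on CPU, GPU, or TPU, which is part of what makes TPUs easier to
# adopt at scale.
@jax.jit
def dense_layer(x, w, b):
    # A single dense layer: matmul plus bias, then ReLU.
    return jnp.maximum(x @ w + b, 0.0)

x = jnp.ones((2, 4))        # batch of 2 inputs with 4 features
w = jnp.ones((4, 3)) * 0.5  # toy weights
b = jnp.zeros(3)
y = dense_layer(x, w, b)
print(y.shape)  # (2, 3)
```

On a TPU-backed runtime the only difference is the device JAX detects; the model code itself does not change.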
According to media reports, Google struck a deal with at least one cloud provider, Fluidstack, a London-based company that will run Google’s Tensor Processing Units (TPUs) out of its New York data center.
Google has also tried making similar arrangements with other providers that specialize in NVIDIA hardware, like Crusoe, which is building a data center for OpenAI using a large fleet of NVIDIA chips, and CoreWeave, which supplies NVIDIA chips to Microsoft and OpenAI.
Google's approach effectively puts it in direct competition with NVIDIA, as NVIDIA primarily sells chips to these cloud service providers.
---
marketwatch.com/story/google-may-be-sitting-on-a-900-billion-gem-that-could-disrupt-nvidias-dominance-20662ec6
