Michael G
7 posts

Michael G
@M_Gschwind
AI acceleration; created the first general-purpose programmable accelerators (Cell SPE: PlayStation 3, Roadrunner)
Menlo Park, CA · Joined October 2019
1 Following  0 Followers

@karpathy @benjamin_bolte XFormer and Flash custom kernels come standard with PT2, so you can use them off the shelf. Check out PyTorch nightlies! PT2 has both separate functions for each custom kernel, and generic _scaled_dot_product_attention to pick the best implementation for specific parameters
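A minimal sketch of what that looks like, assuming a PyTorch 2.x build where the op is exposed publicly as torch.nn.functional.scaled_dot_product_attention (shapes, dtypes, and device below are illustrative):

    import torch
    import torch.nn.functional as F

    # Illustrative shapes: (batch, heads, seq_len, head_dim)
    q = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)

    # The dispatcher picks the best backend (math, memory-efficient/xFormers,
    # or Flash) for these parameters automatically.
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)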

@soumithchintala @karpathy Using the new scaled dot product attention custom kernels that come with PT2 (your choice of vanilla, xFormers, Flash) is as easy as using a context manager: with torch.backends.cuda.sdp_kernel(…)
See Google Colab example here => colab.research.google.com/drive/1L9xbagV…
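A sketch of that context-manager pattern, assuming the PyTorch 2.0-era API torch.backends.cuda.sdp_kernel (later releases moved this to torch.nn.attention.sdpa_kernel); the flag combination shown forces the Flash kernel:

    import torch
    import torch.nn.functional as F

    q = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)

    # Allow only the Flash implementation; with math and memory-efficient
    # disabled, the call errors out instead of silently falling back.
    with torch.backends.cuda.sdp_kernel(enable_flash=True,
                                        enable_math=False,
                                        enable_mem_efficient=False):
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)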

@karpathy Once you publish the "optimized minGPT" repo, we'll probably send some more patches in.
1. with the latest nightlies, the xformers/flash-attention kernels are in PyTorch core now (see the sketch after this list).
2. we have a matmul autotuner about to land that gives a significant perf boost
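A hypothetical sketch of how a minGPT-style causal self-attention block can lean on those in-core kernels; the module, its hyperparameters, and the layer names are illustrative, not an actual patch from that repo:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CausalSelfAttention(nn.Module):
        # Hypothetical minGPT-style block; n_embd and n_head are illustrative.
        def __init__(self, n_embd=768, n_head=12):
            super().__init__()
            self.n_head = n_head
            self.qkv = nn.Linear(n_embd, 3 * n_embd)
            self.proj = nn.Linear(n_embd, n_embd)

        def forward(self, x):
            B, T, C = x.shape
            q, k, v = self.qkv(x).split(C, dim=2)
            # (B, T, C) -> (B, n_head, T, head_dim)
            q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
            k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
            v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
            # Fused attention dispatches to the Flash / memory-efficient kernels
            # instead of materializing the full T x T score matrix.
            y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
            y = y.transpose(1, 2).reshape(B, T, C)
            return self.proj(y)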

@JimiDevine Amazing progress for civilization. Dismantle slums and build student housing.

@POTUS @VP @PeteButtigieg greetings from California where TESLA, an American car maker you may not have heard about, has created an alternative to internal combustion engines. Check this out:
teslarati.com/tesla-crushes-…


