cTuning foundation

578 posts

@c_tuning

Empowering everyone to participate in collaborative research (AI/ML/Sys), reproducible experiments and open science to solve the world's most complex challenges

Paris · Joined September 2011
43 Following · 103 Followers
cTuning foundation retweeted
Google AI @GoogleAI
Today on the blog, we’re excited to announce the release of @MLCommons Croissant, a metadata format to make ML datasets more easily discoverable and usable across a wide array of tools and platforms. Learn more and try it today → goo.gle/4335P4V #ml #datasets
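Croissant describes datasets as JSON-LD built on the schema.org vocabulary. As a rough illustration of the idea, here is a hand-written, minimal Croissant-style record — the dataset name, URLs, and exact property spellings are assumptions for the sketch, not output of the official mlcroissant tooling:

```python
import json

# Minimal, illustrative Croissant-style dataset description (JSON-LD).
# Property names follow the spirit of the MLCommons Croissant vocabulary;
# treat this as a sketch, not a spec-validated record.
croissant_record = {
    "@context": {"@vocab": "https://schema.org/"},
    "@type": "sc:Dataset",
    "name": "example-dataset",
    "description": "A toy dataset described with Croissant-style metadata.",
    "url": "https://example.org/datasets/example-dataset",
    "distribution": [
        {
            "@type": "cr:FileObject",
            "name": "data.csv",
            "contentUrl": "https://example.org/data.csv",
            "encodingFormat": "text/csv",
        }
    ],
}

# Serialize to JSON-LD; this is what a tool or dataset search index would ingest.
print(json.dumps(croissant_record, indent=2))
```

The point of the format is that a single machine-readable record like this can be consumed by many tools (search, loaders, validators) instead of each dataset shipping its own ad-hoc README.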
cTuning foundation retweeted
HPCA @HpcaArchConf
Distinguished Artifact Runner Up: Title: "An Optimizing Framework on MLIR for Efficient FPGA-based Accelerator Generation" Authors: Weichuang Zhang, Jieru Zhao, Guan Shen, Quan Chen, Chen Chen, Minyi Guo
HPCA@HpcaArchConf

Distinguished Artifact Award: Title: "Gemini: Mapping and Architecture Co-exploration for Large-scale DNN Chiplet Accelerators" Authors: Jingwei Cai, Zuotong Wu, Sen Peng, Yuchen Wei, Zhanhong Tan, Guiming Shi, Mingyu Gao, Kaisheng Ma

cTuning foundation retweeted
Grigori Fursin @grigori_fursin
We have released a new CM-MLPerf automation to benchmark commodity hardware for AI performance, power and cost efficiency - it helped to automate ~90% of MLPerf inference v4.0 submissions while achieving several top performance and power results: linkedin.com/pulse/new-cm-m…
cTuning foundation retweeted
Grigori Fursin @grigori_fursin
Check out our artifact evaluation report for the 56th IEEE/ACM International Symposium on Microarchitecture, a pilot @MLCommons project: https://www.linkedin.com/pulse/micro-2023-artifact-evaluation-report-56th-ieeeacm-symposium-fursin-bsgwe @MicroArchConf
cTuning foundation retweeted
Grigori Fursin @grigori_fursin
Urgent: @MicroArchConf'23 is looking for motivated artifact evaluators to #reproduce results for accepted papers. If you already have relevant AE experience, please use this form for self-nomination: forms.gle/dAtm13fKYUTjLV… . Deadline for self-nomination is August 14. Thank you!
cTuning foundation retweeted
HiPEAC @hipeac
📢 Calling all students! 👩‍🎓HiPEAC Reproducibility Student Challenge now open for registrations ⏳Deadline: 15 September 🔬 Get hands-on experience of the latest research 📜📜Contribute to reproducibility 🤝Network with top researchers at #HiPEAC24 👉 bit.ly/HiPEAC_Rep_Stu…
cTuning foundation retweeted
Grigori Fursin @grigori_fursin
Are you interested in knowing how fast the open-source GPT-J 6B #LLM model from @AiEleuther runs on your @nvidia GPU? Join the new MLPerf@home challenge to crowd-benchmark GPT-J and submit your results to the MLPerf inference v3.1 round (deadline: August 3rd): linkedin.com/feed/update/ur…
cTuning foundation retweeted
Grigori Fursin @grigori_fursin
Very excited to announce the 1st Collective Knowledge Cup - a set of open challenges prepared by @MLCommons organizations to let the community benchmark and optimize AI and ML Systems (latency, throughput, energy, accuracy, cost) linkedin.com/pulse/announci… via @LinkedIn
cTuning foundation retweeted
Grigori Fursin @grigori_fursin
@TheSeaMouse Oh, that's so true! After being hugely frustrated with all that, we developed a simple automation language to address some of these issues: zenodo.org/record/8105339 - we just tested it to reproduce MLPerf inference benchmarks and it seems to work but we will need more feedback ;)
cTuning foundation retweeted
Andrej Karpathy @karpathy
Very nice & inspiring, "no-gradient architecture" for high-level skills/learning. LLM here is the "prefrontal cortex" orchestrating the lower-level mineflayer API via code generation++.

Meta-comment is that I remember how hopeless it felt to work on agents in environments like Minecraft around ~2016, feeling stuck on how RL at the time would ever randomly explore their way into performing long-horizon tasks from super sparse rewards. This block has now to a very large extent been lifted - the correct thing was to forget all that, first train LLMs that learn (1) world knowledge, (2) reasoning and (3) tool-use (esp writing code) all from internet text, then point them back at the problem in this kind of a way.

TLDR If I had read about this "no-gradient" approach to agents in 2016 my mind would certainly be blown. Also haha @ source code in the voyager/prompts/*.txt directory :D
Jim Fan @DrJimFan

What if we set GPT-4 free in Minecraft? ⛏️ I’m excited to announce Voyager, the first lifelong learning agent that plays Minecraft purely in-context. Voyager continuously improves itself by writing, refining, committing, and retrieving *code* from a skill library.

GPT-4 unlocks a new paradigm: “training” is code execution rather than gradient descent. “Trained model” is a codebase of skills that Voyager iteratively composes, rather than matrices of floats. We are pushing no-gradient architecture to its limit.

Voyager rapidly becomes a seasoned explorer. In Minecraft, it obtains 3.3× more unique items, travels 2.3× longer distances, and unlocks key tech tree milestones up to 15.3× faster than prior methods.

We open-source everything. Let generalist agents emerge in Minecraft! Welcome you all to try today: voyager.minedojo.org Paper: arxiv.org/abs/2305.16291 Code: github.com/MineDojo/Voyag… Deep dive with me: 🧵

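The loop the tweets describe — write code, refine it, commit working skills, retrieve them later — can be caricatured in a few lines. Everything below, class and method names included, is an illustrative sketch under simplifying assumptions (keyword retrieval instead of embeddings, unconditional commits), not the actual MineDojo/Voyager code:

```python
class SkillLibrary:
    """Toy version of a Voyager-style skill library: 'training' is just
    committing working code snippets and retrieving them by keyword."""

    def __init__(self):
        self.skills = {}  # description -> source code of the skill

    def commit(self, description, code):
        # In Voyager a skill is committed only after it executes
        # successfully in the environment; here we store unconditionally.
        self.skills[description] = code

    def retrieve(self, query):
        # Voyager retrieves by embedding similarity; a plain keyword
        # match over descriptions stands in for that here.
        words = query.lower().split()
        return [code for desc, code in self.skills.items()
                if any(w in desc for w in words)]


library = SkillLibrary()
library.commit("mine wood with an axe", "def mine_wood(bot): ...")
library.commit("craft a stone pickaxe", "def craft_pickaxe(bot): ...")
print(library.retrieve("wood"))
```

The "trained model" is just this growing dictionary of composable code, which is the sense in which the approach is gradient-free.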
cTuning foundation retweeted
Grigori Fursin @grigori_fursin
Thank you, Flavio (@Flav1oV)! It was great to collaborate with you on the original CK framework to facilitate #reproducible research and technology transfer! I look forward to collaborating with you on the 2nd generation of our @MLCommons CM automation language in the future!
cTuning foundation retweeted
Carlos Maltzahn (LinkedIn: carlosmaltzahn)
We are delighted to have @grigori_fursin confirmed as keynote speaker at #acmrep23. The keynote's title is "Toward a common language to facilitate reproducible research and technology transfer: challenges and solutions".