HPC-AI Lab
@HPCAILab

22 posts

HPC-AI Lab @NUSingapore. High Performance Computing, ML Systems, AI Applications (e.g. CV, NLP, Biology)

Singapore · Joined May 2023
13 Following · 156 Followers
HPC-AI Lab reposted
Tencent HY @TencentHunyuan
One static model does not fit all😭 We just dropped our latest work: Functional Neural Memory. Instead of static models, we generate custom "parameters" for every single input. ✅Prompt your model anytime ✅Instant personalization ✅Better instruction following ✅Flexible & dynamic memory (w/o memory bank✌️) (🧵1/6)
11 replies · 139 reposts · 333 likes · 68.5K views
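The "custom parameters for every input" idea resembles a hypernetwork. Below is a minimal sketch under that assumption; the linear hypernetwork, the function names, and all dimensions are made up for illustration and are not Tencent's Functional Neural Memory implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, chosen only for this illustration.
D_IN, D_HID, D_OUT = 8, 4, 3

# Hypernetwork weights: map an input embedding to the parameters
# of a small target linear layer (D_HID * D_OUT values).
W_hyper = rng.normal(scale=0.1, size=(D_IN, D_HID * D_OUT))

def generate_params(x):
    """Generate a custom weight matrix conditioned on this input."""
    flat = np.tanh(x @ W_hyper)          # shape (D_HID * D_OUT,)
    return flat.reshape(D_HID, D_OUT)

def forward(x, features):
    """Run the target layer with input-conditioned parameters."""
    return features @ generate_params(x)

x = rng.normal(size=D_IN)    # e.g. a prompt embedding
h = rng.normal(size=D_HID)   # intermediate features for the target layer
y = forward(x, h)            # y.shape == (3,)
```

Because the weight matrix is regenerated per input, two different prompts yield two different "models" without any stored memory bank, which is the property the thread highlights.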
HPC-AI Lab reposted
Ziqiao Wang @ZiqiaoWang63428
Representation Alignment (REPA) is NOT ALWAYS helpful for diffusion training!🤷 Sharing latest work w/ @HPCAILab and @VITAGroupUT: "REPA Works Until It Doesn't: Early-Stopped, Holistic Alignment Supercharges Diffusion Training". Acceleration up to 28x w/o performance drop.(🧵1/7)
1 reply · 18 reposts · 17 likes · 3.8K views
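The "early-stopped alignment" recipe can be sketched as follows. The cosine-similarity form of the alignment term, the function names, and the `stop_step`/`lam` values are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def repa_loss(h_diff, h_enc):
    """Alignment term: cosine distance between the diffusion model's
    intermediate features and frozen pretrained encoder features."""
    a = h_diff / np.linalg.norm(h_diff, axis=-1, keepdims=True)
    b = h_enc / np.linalg.norm(h_enc, axis=-1, keepdims=True)
    return 1.0 - (a * b).sum(axis=-1).mean()

def total_loss(step, diff_loss, align, stop_step=10_000, lam=0.5):
    """Early stopping of the alignment regularizer: after stop_step,
    train on the plain diffusion objective only."""
    if step >= stop_step:
        return diff_loss
    return diff_loss + lam * align
```

The key point of the tweet is the schedule, not the loss itself: the alignment term helps early in training and is dropped later, rather than being applied throughout.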
HPC-AI Lab reposted
Zekai Li @Richard91316073
🚀 We’re thrilled to introduce DD-Ranking: Rethinking the Evaluation of Dataset Distillation. A unified, user-friendly, and long-term maintained benchmark for dataset distillation (DD)!
📢 Together with 20 institutions worldwide, we’re releasing our Repo, API Documentation, and Leaderboard:
🔗 Repo: github.com/NUS-HPC-AI-Lab…
📖 Documentation: nus-hpc-ai-lab.github.io/DD-Ranking/
🏆 Leaderboard: huggingface.co/spaces/logits/…
🧵1/7
1 reply · 16 reposts · 20 likes · 4.7K views
HPC-AI Lab reposted
Victor.Kai Wang @VictorKaiWang1
Generating ~200 million parameters in just minutes! 🥳 Excited to share our work with @MTDovent , @heisejiasuo96 , and @YangYou1991: 'Recurrent Diffusion for Large-Scale Parameter Generation' (RPG for short). Example: Obtain customized models using prompts (see below). (🧵1/8)
4 replies · 85 reposts · 286 likes · 45.2K views
HPC-AI Lab reposted
Yang You @YangYou1991
Exciting News from Open-Sora! 🚀 They've just made the ENTIRE suite of their video-generation model open source! Dive into the world of cutting-edge AI with access to model weights, comprehensive training source code, and detailed architecture insights. Start building your dream video-generation model today! Check it out 👉 github.com/hpcaitech/Open…
16 replies · 150 reposts · 610 likes · 245.7K views
HPC-AI Lab reposted
AK @_akhaliq
Neural Network Diffusion

Diffusion models have achieved remarkable success in image and video generation. In this work, we demonstrate that diffusion models can also generate high-performing neural network parameters. Our approach is simple, utilizing an autoencoder and a standard latent diffusion model. The autoencoder extracts latent representations of a subset of the trained network parameters. A diffusion model is then trained to synthesize these latent parameter representations from random noise. The generated representations are passed through the autoencoder's decoder, whose outputs are ready to use as new subsets of network parameters.

Across various architectures and datasets, our diffusion process consistently generates models with comparable or improved performance over the trained networks, at minimal additional cost. Notably, we empirically find that the generated models perform differently from the trained networks. Our results encourage further exploration of the versatile uses of diffusion models.
23 replies · 236 reposts · 1.2K likes · 475.2K views
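The described pipeline (autoencoder over parameters, then latent diffusion) can be sketched with toy stand-ins. The linear "autoencoder" and the shrinkage "denoiser" below are placeholders for the paper's trained networks, and every size and function name is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

P, Z = 64, 8  # illustrative sizes: parameter-subset dim, latent dim

# Stand-ins for a trained autoencoder (linear purely for illustration).
W_enc = rng.normal(scale=0.1, size=(P, Z))
W_dec = np.linalg.pinv(W_enc)

def encode(params):
    """Compress a flattened parameter subset into a latent vector."""
    return params @ W_enc

def decode(latent):
    """Map a latent vector back to usable network parameters."""
    return latent @ W_dec

def denoise(z_t, t):
    """Stand-in for the latent diffusion denoiser; the real system
    uses a trained network that predicts and removes noise."""
    return z_t * (1.0 - 0.1 * t)  # toy shrinkage toward the data manifold

def generate_parameters(steps=10):
    """p-diff-style sampling: start from random noise in latent space,
    iteratively denoise, then decode to a parameter subset."""
    z = rng.normal(size=Z)
    for t in reversed(range(steps)):
        z = denoise(z, t / steps)
    return decode(z)

new_params = generate_parameters()  # shape (P,): ready-to-use parameters
```

The structure mirrors the abstract: diffusion happens in the autoencoder's latent space, and the decoder turns each sampled latent into a new subset of network parameters.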
HPC-AI Lab reposted
Yang You @YangYou1991
I am happy to share that our paper has been accepted at ICLR as an ORAL paper (1.2% acceptance rate).

InfoBatch: Lossless Training Speed Up by Unbiased Dynamic Data Pruning
arxiv.org/abs/2303.04947

InfoBatch randomly prunes a portion of less informative samples based on the loss distribution and rescales the gradients of the remaining samples to approximate the original gradient. As a plug-and-play, architecture-agnostic framework, InfoBatch consistently obtains lossless training results on classification, semantic segmentation, vision pretraining, and instruction fine-tuning tasks. In real-world applications, InfoBatch can losslessly save 40% of overall cost.
1 reply · 38 reposts · 213 likes · 25.8K views
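The prune-and-rescale idea can be sketched as follows. The mean-loss threshold and the 1/(1 - r) up-weighting follow the tweet's description, but `infobatch_select` and its exact form are an illustrative assumption, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def infobatch_select(losses, prune_ratio=0.5):
    """InfoBatch-style pruning sketch: randomly drop a fraction of
    samples whose loss is below the mean (the "well-learned" ones),
    and up-weight the kept low-loss samples by 1 / (1 - prune_ratio)
    so the expected gradient matches full-data training."""
    mean_loss = losses.mean()
    weights = np.ones_like(losses)
    keep = np.ones(len(losses), dtype=bool)

    low = losses < mean_loss                         # pruning candidates
    drop = low & (rng.random(len(losses)) < prune_ratio)
    keep[drop] = False
    weights[low & keep] = 1.0 / (1.0 - prune_ratio)  # unbiased rescale
    return keep, weights

losses = rng.exponential(size=1000)  # stand-in per-sample losses
keep, w = infobatch_select(losses)
```

The rescaling is what makes the pruning "unbiased": each low-loss sample survives with probability 1 - r but counts 1/(1 - r) times, so the weighted sum over kept samples matches the full dataset in expectation.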
HPC-AI Lab @HPCAILab
📢 Join us for the HPC-AI Lab Public Seminar!
🔗 Registration: forms.gle/4anywqoXtSumHu…
🗓️ Date/Time: 29 Nov. 2023 (Wednesday), 1 PM to 2 PM
📍 Online via Zoom
0 replies · 3 reposts · 1 like · 444 views
HPC-AI Lab reposted
Yang You @YangYou1991
Time flies! I got my PhD from Berkeley 1218 days ago. My first PhD student is graduating. That is my first achievement :-)
6 replies · 5 reposts · 359 likes · 40.3K views
HPC-AI Lab reposted
Victor.Kai Wang @VictorKaiWang1
Our work DREAM has been accepted at ICCV 2023 @ICCVConference. We are the first to explore matching efficiency in dataset distillation, speeding up previous methods by more than 8x without any performance drop. Check out DREAM on GitHub: github.com/lyq312318224/D…
0 replies · 1 repost · 9 likes · 847 views
HPC-AI Lab @HPCAILab
🎉 Exciting news! Our Lab has two papers accepted at ACL 2023! 📚✨ We're thrilled to announce that our CAME optimizer has won the Outstanding Paper award! 🏆 Congratulations to the entire team for their remarkable achievement! 🥳 #ACL2023
0 replies · 3 reposts · 12 likes · 952 views
HPC-AI Lab reposted
Yang You @YangYou1991
I am happy to share that our paper won the Outstanding Paper Award of ACL. We propose CAME to simultaneously achieve two goals: fast convergence as in traditional adaptive methods, and low memory usage, as needed in LLM training. arxiv.org/abs/2307.02047
3 replies · 12 reposts · 150 likes · 33.8K views
HPC-AI Lab @HPCAILab
📢 Join us for the HPC-AI Lab Public Seminar featuring IEEE Fellow Dr. Stan Z. Li! 🌟
🔗 Registration: forms.gle/5bcDLV6B7eX3SK…
🗓️ Date/Time: 27 Jun. 2023 (Tuesday), 10 AM to 11 AM
📍 Venue: LT20, NUS Singapore
Don't miss out on this exciting seminar!
0 replies · 2 reposts · 6 likes · 1.1K views