Prakhar Mishra
@rattller

291 posts

MS (by research) scholar at IIIT-Bangalore | ML/NLP Content Creator (link below) ⬇️

Bengaluru, India · Joined April 2011
0 Following · 175 Followers
Prakhar Mishra retweeted
Yann LeCun @ylecun·
Current LLMs are trained on text data that would take 20,000 years for a human to read. And still, they haven't learned that if A is the same as B, then B is the same as A. Humans get a lot smarter than that with comparatively little training data. Even corvids, parrots, dogs, and octopuses get smarter than that very, very quickly, with only 2 billion neurons and a few trillion "parameters."
Quoting Yann LeCun @ylecun:
Animals and humans get very smart very quickly with vastly smaller amounts of training data. My money is on new architectures that would learn as efficiently as animals and humans. Using more data (synthetic or not) is a temporary stopgap made necessary by the limitations of our current approaches.

Prakhar Mishra @rattller·
1. PDFTriage represents a document as a hierarchical tree of elements.
2. It selects the document frame needed to answer the query and retrieves it directly from the selected page, section, figure, or table.
3. The selected context, together with the query, is used to extract the answer. 3/3
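The three steps in this thread can be sketched in a few lines. This is a toy illustration only: the `DocNode` structure and the word-overlap scorer below are hypothetical stand-ins, whereas the actual PDFTriage system issues LLM function calls over real PDF metadata.

```python
from dataclasses import dataclass, field

@dataclass
class DocNode:
    """One element of the document tree: a page, section, figure, or table."""
    kind: str
    title: str
    text: str = ""
    children: list = field(default_factory=list)

def flatten(node):
    """Yield every node in the tree (pre-order traversal)."""
    yield node
    for child in node.children:
        yield from flatten(child)

def select_frame(root, query):
    """Step 2: pick the element that best matches the query.
    A toy word-overlap scorer stands in for the LLM's retrieval call."""
    words = set(query.lower().split())
    def score(node):
        return len(words & set((node.title + " " + node.text).lower().split()))
    return max(flatten(root), key=score)

def answer(root, query):
    """Step 3: hand the selected context plus the query to a reader.
    Here the 'reader' simply returns the selected element's text."""
    return select_frame(root, query).text

doc = DocNode("document", "Annual Report", children=[
    DocNode("section", "Revenue", "Revenue grew 12% year over year."),
    DocNode("table", "Headcount", "Headcount: 1,240 employees."),
])

print(answer(doc, "How much did revenue grow?"))
```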
Prakhar Mishra @rattller·
Existing Q/A methods treat long documents as plain text, which is incongruous with the user’s mental model of these documents as richly structured. This paper represents documents as structured objects and performs focused Q/A while preserving the document structure. 2/3
Prakhar Mishra retweeted
Yann LeCun @ylecun·
According to Google Scholar's latest ranking of publication venues by h5-index, ICLR is #9 in all of science, a mere 9 years after its creation, just in front of NeurIPS. scholar.google.com/citations?view…
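For reference, the metric behind that ranking is easy to compute: the h5-index is the largest number h such that h articles published in the venue over the last five years have at least h citations each. A minimal sketch (the citation counts are made up for illustration):

```python
def h_index(citations):
    """Return the largest h such that h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank   # this paper still clears the bar
        else:
            break      # counts are sorted, so no later paper can
    return h

print(h_index([100, 80, 5, 4, 4, 1]))  # -> 4
```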
Prakhar Mishra retweeted
Segmind @Segmind_ai·
Check out Prakhar's (@rattller) informative insights into Efficient Text-to-Text Transformers! youtube.com/watch?v=ec-NVR… Keep an eye out for the Segmind demo in the video 👀🙌🏻 For more details, follow Segmind @_segmind
Prakhar Mishra @rattller·
⏩ Paper Title: Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing ⏩ Paper: arxiv.org/abs/2107.13586… ⏩ Authors: Pengfei Liu, Weizhe Yuan, Jinlan Fu, Zhengbao Jiang, Hiroaki Hayashi, Graham Neubig ⏩ Organisations: CMU, NUS
Prakhar Mishra @rattller·
Prompt-learning is a paradigm for adapting PLMs to downstream NLP tasks. 🔥 Can we get away from manually creating these prompts and mine them automatically? Interested in answering that question? 🤯 Watch Part 2 of the paper summary here - youtu.be/OsbUfL8w-mo #AIml
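The core prompt-learning idea surveyed in the paper can be sketched in a toy form: instead of training a task-specific head, reformulate classification as filling a slot in a natural-language template and let the LM's own objective pick the answer. Everything below is a hypothetical illustration; in particular, `toy_mlm_score` is a cue-word stand-in for a real masked LM such as BERT.

```python
# A manually written cloze template plus a "verbalizer" mapping labels to words.
TEMPLATE = "{text} Overall, it was a [MASK] movie."
VERBALIZER = {"positive": "great", "negative": "terrible"}

def toy_mlm_score(prompt, candidate):
    """Stand-in for P(candidate fills [MASK] | prompt) from a masked LM.
    A real system would query e.g. BERT; this toy counts sentiment cues."""
    cues = {"great": ("loved", "fun", "brilliant"),
            "terrible": ("boring", "awful", "slow")}
    return sum(word in prompt.lower() for word in cues[candidate])

def classify(text):
    """Fill the template, score each verbalizer word, return the best label."""
    prompt = TEMPLATE.format(text=text)
    return max(VERBALIZER, key=lambda label: toy_mlm_score(prompt, VERBALIZER[label]))

print(classify("I loved it, such a fun ride."))  # -> "positive"
```

Automatic prompt mining (the question raised in the tweet) would replace the hand-written `TEMPLATE` with templates searched or generated from data.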