Lukas Geiger
@_lgeiger
deep learning scientist | astroparticle physicist | software engineer
London, England · Joined October 2016
226 Following · 127 Followers
83 posts
Lukas Geiger @_lgeiger
@KyleRayKelley @zeddotdev @AtomEditor @nteractio Just looked at some old screenshots. Looks like we were copying what ipython did. However, ipython only showed `In[*]` starting in version 8. Before that it would look something like `<ipython-input-1-6fcf9dfbd479> in <module>()`, so even ipython didn't have line numbers.
Kyle Kelley @KyleRayKelley
Did Hydrogen in Atom show line numbers in replacement for `In[*]`? @_lgeiger
Kyle Kelley @KyleRayKelley
Indexing and retrieval of my local code base as well as documentation from outside sources. I'd love to be able to plop a link to a crate or PyPI project to help me and the LLM discover what's available to work with. I also want to be able to go straight from an inline error to chat/fix. By the way, any plan for interpreters? We should chat. The world needs Hydrogen again. 😁
Kyle Kelley @KyleRayKelley
The editing experience in @zeddotdev is *perfect* while the AI-enabled search in @cursor_ai is incredibly good. I wish the two would get married.
Lukas Geiger retweeted
Plumerai @plumerai
We made our ultra-fast and ultra-small inference engine for Arm Cortex-M processors even smaller. We now require 49% less RAM for 8-bit MobileNetV2 than TensorFlow Lite for Microcontrollers with Arm’s CMSIS-NN kernels. Read the article here: blog.plumerai.com/2021/10/cortex…
Lukas Geiger retweeted
Plumerai @plumerai
🏆 We built the world’s fastest and most memory-efficient deep learning inference software for Arm Cortex-M microcontrollers 🏆 for both Binarized Neural Networks and for 8-bit deep learning models. blog.plumerai.com/2021/10/cortex…
Lukas Geiger retweeted
Plumerai @plumerai
Our latest paper has been accepted to the #MLSys2021 conference! Congratulations Tom, Arash, Adam, Lukas, Tim, Leon, Jelmer and Koen! We wrote a short blog post about the paper and the importance of Larq Compute Engine: blog.plumerai.com/2021/04/mlsys2…
Lukas Geiger retweeted
Plumerai @plumerai
It was great to see so many people attending our #tinyml webcast earlier this week! In case you missed the event, a recording of the talk is now online along with a short blog post explaining the main takeaways from our presentation: blog.plumerai.com/2021/01/tinyml…
Lukas Geiger retweeted
Plumerai @plumerai
We've released v0.4 of Larq Compute Engine! Some of our internal work optimizing BNN performance on microcontrollers has made its way to mobile and Raspberry Pi, as have a few major architectural improvements. 🚀 Full change log: github.com/larq/compute-e…
Lukas Geiger retweeted
kamil @kamil_k7k
We just published a blog post about open source tools for #DeepLearning presented at our social event at the @iclr_conf! Shout-out to all the co-authors for their great contributions and efforts. Without your help, this would not have been possible! neptune.ai/blog/iclr-2020… (1/n)
Lukas Geiger @_lgeiger
@fchollet Improved contributor experience. Moving Keras out of the tf monorepo will hopefully help, but currently having to wait hours for tf to build on a laptop just for a small Python fix and then waiting weeks to get a PR review doesn't encourage community contributions.
Lukas Geiger @_lgeiger
@fchollet One recommended way to save and resume training:
- tf.train.Checkpoint is great but requires too much custom code
- Model.save_weights and ModelCheckpoint don't reload optimizer state or epoch
- Model.save doesn't play well with tf.distribute
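The checkpointing gap described in the tweet above can be sketched with `tf.train.Checkpoint` and `tf.train.CheckpointManager`. This is a minimal illustration using plain variables; in real code the same `Checkpoint` would also track `model=` and `optimizer=`, which is exactly the "custom code" being complained about:

```python
import tensorflow as tf

# Minimal trainable state for illustration; a real setup would track
# the model and optimizer objects here as well.
weights = tf.Variable(1.0)
epoch = tf.Variable(0, dtype=tf.int64)

# Bundle everything needed to resume training into one checkpoint.
ckpt = tf.train.Checkpoint(weights=weights, epoch=epoch)
manager = tf.train.CheckpointManager(ckpt, "/tmp/tf_ckpts", max_to_keep=3)

# Restore the latest checkpoint if one exists (no-op on the first run).
ckpt.restore(manager.latest_checkpoint)
start = int(epoch.numpy())

for e in range(start, start + 2):
    weights.assign_sub(0.1)  # stand-in for a real training step
    epoch.assign(e + 1)      # epoch must be tracked manually
    manager.save()
```

Note how the epoch counter has to be stored as an explicit `tf.Variable` inside the checkpoint; neither `Model.save_weights` nor the `ModelCheckpoint` callback restores it for you.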
Lukas Geiger retweeted
Plumerai @plumerai
We wrote a blog post about how we’re applying a software 2.0 approach in the new module for state-of-the-art binarized neural networks in Larq Zoo (`zoo.sota`). It also features some exciting improvements across the whole Larq stack: blog.larq.dev/2020/03/larq-e…
Lukas Geiger retweeted
Plumerai @plumerai
We’ve just released Larq v0.9, featuring:
🔲 Non-zero padding
💾 Easier export for binary weights
📋 Improved model summaries
And lots more! 🎉 github.com/larq/larq/rele…