onnxruntime

282 posts

onnxruntime
@onnxruntime

Cross-platform training and inferencing accelerator for machine learning models.

Joined September 2018
42 Following · 1.4K Followers
onnxruntime @onnxruntime ·
ONNX Runtime & DirectML now support Phi-3 mini models across platforms & devices! Plus, the new ONNX Runtime Generate() API simplifies LLM integration into your apps. Try Phi-3 on your favorite hardware! Read more: onnxruntime.ai/blogs/accelera… #ONNX #DirectML #Phi3
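A minimal sketch of what a Generate() API loop can look like in Python — assuming the `onnxruntime-genai` package and a local Phi-3 mini ONNX model directory; the path and the exact method names (which have shifted across early releases) are assumptions, not from the tweet:

```python
# Hedged sketch of the ONNX Runtime Generate() API with a local Phi-3 mini
# ONNX model. The import is guarded so the file still loads without the package.
try:
    import onnxruntime_genai as og
except ImportError:
    og = None  # pip install onnxruntime-genai

def generate_reply(prompt: str, model_dir: str = "./phi3-mini-onnx") -> str:
    """Greedy-generate a completion for `prompt` with a local ONNX LLM."""
    if og is None:
        raise RuntimeError("onnxruntime-genai is not installed")
    model = og.Model(model_dir)            # loads the model config and weights
    tokenizer = og.Tokenizer(model)
    params = og.GeneratorParams(model)
    params.set_search_options(max_length=256)
    generator = og.Generator(model, params)
    generator.append_tokens(tokenizer.encode(prompt))
    while not generator.is_done():         # token-by-token decode loop
        generator.generate_next_token()
    return tokenizer.decode(generator.get_sequence(0))
```

The point of the API is that the whole tokenize/decode loop lives in one small library call surface instead of hand-managed ORT session I/O.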
onnxruntime retweeted
Szymon @Szymon_Lorenz ·
Developers, don't overlook the power of Swift Package Manager! It simplifies dependency management and promotes modularity. Plus, exciting news: ONNXRuntime just added support for SPM! #iOSdev #SwiftPM #ONNXRuntime
onnxruntime @onnxruntime ·
#ONNX Runtime saved the day with our interoperability and the ability to run locally on-client and/or in the cloud! Our lightweight solution gave them the performance they needed, with quantization & configuration tooling. Learn how they achieved this in this blog! cloudblogs.microsoft.com/opensource/202…
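The quantization tooling mentioned above can be as simple as ONNX Runtime's built-in dynamic quantizer; a sketch, assuming an existing `model.onnx` on disk (the file names are placeholders):

```python
# Hedged sketch: shrink an ONNX model's weights to int8 with ONNX Runtime's
# dynamic quantization tooling. The import is guarded so this loads without
# onnxruntime installed.
try:
    from onnxruntime.quantization import quantize_dynamic, QuantType
except ImportError:
    quantize_dynamic = QuantType = None  # pip install onnxruntime

def quantize_model(fp32_path: str = "model.onnx",
                   int8_path: str = "model.int8.onnx") -> str:
    """Write an int8-weight copy of the model and return its path."""
    if quantize_dynamic is None:
        raise RuntimeError("onnxruntime is not installed")
    quantize_dynamic(fp32_path, int8_path, weight_type=QuantType.QInt8)
    return int8_path
```

Dynamic quantization converts weights offline and quantizes activations at runtime, so no calibration dataset is needed — a common first step for CPU deployments.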
onnxruntime retweeted
ONNX @onnxai ·
We are seeking your input to shape the ONNX roadmap! Proposals are being collected until January 24, 2023 and will be discussed in February. Submit your ideas at forms.microsoft.com/pages/response…
onnxruntime retweeted
Jingya Huang @Jhuaplin ·
Imagine the frustration of, after applying optimization tricks, finding that the data copying to GPU slows down your "MUST-BE-FAST" inference...🥵 🤗 Optimum v1.5.0 added @onnxruntime IOBinding support to reduce your memory footprint. 👀 github.com/huggingface/op… More ⬇️
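In Optimum, turning on that IOBinding support is a constructor flag; a sketch, assuming `optimum[onnxruntime-gpu]` is installed and using a hypothetical model id (the exact export keyword has varied across Optimum versions):

```python
# Hedged sketch: load a Transformers model through Optimum's ONNX Runtime
# wrapper with IOBinding enabled, so tensors stay on the GPU between steps
# instead of bouncing through host memory. Import guarded as above.
try:
    from optimum.onnxruntime import ORTModelForSequenceClassification
except ImportError:
    ORTModelForSequenceClassification = None  # pip install optimum[onnxruntime-gpu]

def load_io_bound_model(model_id: str = "distilbert-base-uncased-finetuned-sst-2-english"):
    """Return an ORT model that binds inputs/outputs on-device (CUDA)."""
    if ORTModelForSequenceClassification is None:
        raise RuntimeError("optimum[onnxruntime-gpu] is not installed")
    return ORTModelForSequenceClassification.from_pretrained(
        model_id,
        export=True,                       # convert the checkpoint to ONNX
        provider="CUDAExecutionProvider",  # IOBinding targets GPU providers
        use_io_binding=True,               # the v1.5.0 feature from the tweet
    )
```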
onnxruntime retweeted
efxmarty @efxmarty ·
Want to use TensorRT as your inference engine for its speedups on GPU, but don't want to deal with the compilation hassle? We've got you covered with 🤗 Optimum! With one line, leverage TensorRT through @onnxruntime! Check out more at hf.co/docs/optimum/o…
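The "one line" presumably amounts to selecting the TensorRT execution provider when loading the model; a sketch, assuming `optimum[onnxruntime-gpu]` plus TensorRT are installed and using a hypothetical model id:

```python
# Hedged sketch: route ONNX Runtime inference through TensorRT by picking its
# execution provider in Optimum; ORT handles the TensorRT engine build itself.
# Import guarded so this file loads without the library.
try:
    from optimum.onnxruntime import ORTModelForQuestionAnswering
except ImportError:
    ORTModelForQuestionAnswering = None  # pip install optimum[onnxruntime-gpu]

def load_trt_model(model_id: str = "deepset/roberta-base-squad2"):
    """Return an ORT model whose kernels are executed by TensorRT."""
    if ORTModelForQuestionAnswering is None:
        raise RuntimeError("optimum[onnxruntime-gpu] is not installed")
    return ORTModelForQuestionAnswering.from_pretrained(
        model_id,
        export=True,
        provider="TensorRTExecutionProvider",  # the one-line swap from the tweet
    )
```

Swapping `provider=` is the whole change versus a CUDA or CPU setup, which is the point being advertised.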
onnxruntime retweeted
Loreto Parisi @loretoparisi ·
Finally, tokenization with SentencePiece BPE now works as expected in #NodeJS #JavaScript with the tokenizers library 🚀! Now getting "invalid expand shape" errors when passing text tokens' encoded ids to the MiniLM @onnxruntime converted @MSFTResearch model huggingface.co/microsoft/Mult…
Loreto Parisi @loretoparisi

SentencePiece vocabulary and merge files generated; some minor issues occurring, hopefully @huggingface can help 🙏 github.com/huggingface/to…

onnxruntime retweeted
Anton Lozhkov @anton_lozhkov ·
🏭 The hardware optimization floodgates are open!🔥 Diffusers 0.3.0 supports an experimental ONNX exporter and pipeline for Stable Diffusion 🎨 To find out how to export your own checkpoint and run it with @onnxruntime, check the release notes: github.com/huggingface/di…
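The experimental pipeline looked roughly like this; a sketch assuming `diffusers` and `onnxruntime` are installed — note the class was named `StableDiffusionOnnxPipeline` in 0.3.0 and renamed `OnnxStableDiffusionPipeline` in later releases, so the name here is version-dependent:

```python
# Hedged sketch: run Stable Diffusion through ONNX Runtime via the
# experimental ONNX pipeline announced with Diffusers 0.3.0.
# Import guarded so the file loads without diffusers installed.
try:
    from diffusers import OnnxStableDiffusionPipeline
except ImportError:
    OnnxStableDiffusionPipeline = None  # pip install diffusers onnxruntime

def render(prompt: str = "a photo of an astronaut riding a horse"):
    """Generate one image with the ONNX Runtime-backed pipeline (CPU provider)."""
    if OnnxStableDiffusionPipeline is None:
        raise RuntimeError("diffusers/onnxruntime not installed")
    pipe = OnnxStableDiffusionPipeline.from_pretrained(
        "CompVis/stable-diffusion-v1-4",  # repo branch carrying ONNX weights
        revision="onnx",
        provider="CPUExecutionProvider",
    )
    return pipe(prompt).images[0]
```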
onnxruntime retweeted
Open at Microsoft @OpenAtMicrosoft ·
The natural language processing library Apache OpenNLP is now integrated with ONNX Runtime! Get the details and a tutorial explaining its use on the blog: msft.it/6013jfemt #OpenSource