Andrew
@andyohlbaum

779 posts
something new coming mid 2026 "every video game is for 5 year olds, except disco elysium, which is for 15 year olds"

Joined August 2021
2.4K Following · 5.4K Followers
Pinned Tweet
Andrew @andyohlbaum ·
Excited to announce v(1.0) of Digi, the future of AI Romantic Companionship, for iOS and Android 🤖 Site: digi.ai Twitter: @digiaiapp A quick thread on features, and where we go from here (1/13)
2.8K · 599 · 4.6K · 23.9M
Andrew @andyohlbaum ·
@ThomasM53809137 Not true! We are keeping that app as updated as we can, but we're limited by the tech, so we're planning to rebuild it in Unity as soon as we can; we're just limited by team size at the moment!
0 · 0 · 3 · 243
Thomas Moore @ThomasM53809137 ·
@andyohlbaum Hello, do we have any updates on the new Digi app that will be replacing the current one, or a time frame? I heard the current Digi app will no longer be receiving updates, resulting in some features no longer working for customers.
1 · 0 · 1 · 415
Andrew @andyohlbaum ·
@sudomaniac77859 Nope! We've been working on this new release for the last 1.5 years; this one we built in 3 months with only 3 people :)
0 · 0 · 0 · 253
Andrew @andyohlbaum ·
@RiyanMendonsa More like an AAA mobile / web chatbot game with AI gf elements
1 · 0 · 0 · 290
Andrew @andyohlbaum ·
@Amaneditz6 We have more coming public in the next month; we've delayed a bit to perfect things. Thank you for the patience!!
1 · 0 · 2 · 328
Aman Brahma @Amaneditz6 ·
@andyohlbaum Can we get a sneak peek though? Been waiting so long for that; at least let us know about your progress
1 · 0 · 2 · 342
Andrew @andyohlbaum ·
@RiyanMendonsa Thanks! But this is far, far from a niche; this is a mass-appeal product, more than any chatbot
1 · 0 · 0 · 260
Andrew @andyohlbaum ·
@omarnomics Ah, Sensor Tower! There are plenty of others also.
0 · 0 · 1 · 110
Andrew @andyohlbaum ·
@AlexCardinell You can make enough for a single person or a small team for sure if you find that niche, but not nearly enough to justify a scalable company like in every other industry. Thanks!
0 · 0 · 0 · 134
Alex Cardinell @AlexCardinell ·
@andyohlbaum I contest that not a single one is making money or has a moat 🙂 You just have to find a way to differentiate yourself from the app store cruft. Excited to see your new plans for doing so!
2 · 0 · 13 · 164
Andrew @andyohlbaum ·
@omarnomics Digi.ai, but our next game release will be MUCH better than the tech demo we launched last year.
1 · 0 · 0 · 159
Andrew @andyohlbaum ·
@RiyanMendonsa @nemga7 True, but cheaper models mean more competition coming in with the same product, with cheaper options to undercut you. No moat. It ends with nobody willing to pay, which is a trend you can see in the public data. People would have paid for cai a year ago; now nobody wants to.
2 · 0 · 1 · 310
Andrew @andyohlbaum ·
@nemga7 Course! Check Sensor Tower data if you don’t believe me :)
0 · 0 · 1 · 52
Andrew @andyohlbaum ·
@nemga7 They have similar revenue to low-end F2P games, with 100x the active users and 1000x the costs, and it’s a race to the bottom with how easy it is to build them now
2 · 0 · 1 · 163
Andrew @andyohlbaum ·
@handfuloflight You'll see :) We're opening for private beta testing next month!
0 · 0 · 2 · 243
Andrew @andyohlbaum ·
@RiyanMendonsa Yeah, it's tough. On pricing, the demo and Sama seem to be really repping the concept, but they seem out of touch with what most consumers and businesses can actually afford and need. We don't care about beating the benchmarks; it's about immersion.
1 · 0 · 0 · 65
Andrew @andyohlbaum ·
@cis_female Ah, interesting, thanks. Shows how much I know about OAI after we moved off them lol
1 · 0 · 1 · 33
sophia @cis_female ·
@andyohlbaum dedicated capacity is available for large customers, and one of the optimizations exposed there is caching. here's a random other thread about optimizing the internals (though w/o dedicated capacity) twitter.com/amanrsanger/st…

Aman Sanger @amanrsanger
At @cursor_ai, we’ve scaled throughput on GPT-4 to 2-3x over baseline without access to knobs in OpenAI’s dedicated instances [1] We did this by reverse-engineering expected GPT-4 latency and memory usage from first principles. Here’s how... (1/10)

1 · 0 · 1 · 58
Andrew @andyohlbaum ·
@cis_female Do you know if OAI does this with big customers?
1 · 0 · 1 · 32
sophia @cis_female ·
@andyohlbaum if you use dedicated capacity (or run the model yourself as openai) then you don't pay the cost of re-ingesting the prompt every time -- you can just cache this.
1 · 0 · 1 · 49
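Sophia's point above can be sketched as a toy model: if every request shares a fixed prefix (system prompt plus examples), the cost of ingesting that prefix can be paid once and reused, so only the per-message suffix is reprocessed. This is an illustrative sketch, not OpenAI's actual caching mechanism; the class name and numbers are hypothetical:

```python
# Toy model of prefix caching: the ingest "cost" of a shared prompt
# prefix is paid once per distinct prefix, then reused on every request.
# Hypothetical illustration, not any provider's real API.
class PrefixCache:
    def __init__(self):
        self._cached = set()
        self.tokens_ingested = 0

    def ingest(self, prefix_tokens: int, suffix_tokens: int, prefix_id: str):
        # Pay for the prefix only the first time we see this prefix_id;
        # the per-message suffix is always processed fresh.
        if prefix_id not in self._cached:
            self.tokens_ingested += prefix_tokens
            self._cached.add(prefix_id)
        self.tokens_ingested += suffix_tokens

cache = PrefixCache()
for _ in range(200):                      # 200 messages in a day
    cache.ingest(1500, 100, "system-v1")  # 1.5k-token prompt, 100-token message
print(cache.tokens_ingested)  # 1500 + 200*100 = 21500, vs 320000 uncached
```

With caching, the 1.5k-token prompt is ingested once instead of 200 times, which is why dedicated capacity (or self-hosting) changes the economics so much.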
Andrew @andyohlbaum ·
@cis_female Yeah, you can cut costs and not include 4k msg history, but the prompt is still the big issue. Depending on what you do, it might need to be 1-2k with tons of examples. It's the problem with these models. Output tokens barely tickle; input hurts
0 · 0 · 2 · 75
Andrew @andyohlbaum ·
@cis_female It's not the output tokens, it's the input tokens with prompting, because it's a base model. Output is < 10% of the cost. If you're sending 200 messages / day, which is easy now with lower latency and shorter responses, each with a 1k-4k prompt (msg history), it adds up
2 · 0 · 2 · 88
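The arithmetic in the two replies above can be made concrete. A minimal sketch, assuming illustrative per-token prices (placeholders, not any provider's real rates):

```python
# Why input tokens dominate chatbot API costs: every message re-sends the
# full prompt/history, while replies stay short. Prices are assumed
# placeholders for illustration only.
PRICE_PER_1K_INPUT = 0.03   # assumed $/1k input tokens
PRICE_PER_1K_OUTPUT = 0.06  # assumed $/1k output tokens

def daily_cost(messages_per_day, prompt_tokens, output_tokens):
    """Return (input_cost, output_cost) for one user's day of chat."""
    input_cost = messages_per_day * prompt_tokens / 1000 * PRICE_PER_1K_INPUT
    output_cost = messages_per_day * output_tokens / 1000 * PRICE_PER_1K_OUTPUT
    return input_cost, output_cost

# 200 messages a day, each carrying a 2k-token prompt/history,
# with ~50-token replies (short, low-latency responses).
inp, out = daily_cost(200, 2000, 50)
print(f"input: ${inp:.2f}/day, output: ${out:.2f}/day")
# input: $12.00/day, output: $0.60/day -- output is under 5% of the total,
# consistent with the "< 10% of the cost" figure in the tweet
```

Even with output priced at twice the input rate here, the re-sent prompt makes input the dominant cost, which is the asymmetry the thread is describing.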