Tygerty

371 posts


@ty_ger_ty

Long-time Quilibrium geek and miner. Coder. Aspiring Cryptographer.

Joined March 2024
305 Following · 258 Followers
Tygerty
Tygerty@ty_ger_ty·
@Christophe74732 @gogogadgetpew @CJGRISHAM Key point here is that suspicion is not a crime and not a legal basis for any case. You can't pull someone over because you suspect they may speed or because you think they might blow a stop sign. Suspect away, but it's not legal to detain someone based on suspicion alone.
2
0
5
82
Christopher
Christopher@Christophe74732·
Not looking for applause or your approval. I’m explaining the law to others who might read this. And open carry of a rifle on a bike isn’t common, therefore it can be suspicious. Suspicious doesn’t make it illegal, but it’s enough to stop the person to determine if illegality is occurring, occurred, or is about to occur. You can open carry a shotgun into a bank while wearing a mask too, correct? Everyone was wearing masks during Covid. Now add the rifle. Do you think open carrying a rifle into a mall without a sporting goods store is suspicious? How about a movie theater? Suspicious. Suspicion doesn’t mean illegal. Do you think the police officer should at least stop the guy and ask questions in those cases? If so, explain the difference.
7
0
0
194
Texas2AAttorney
Texas2AAttorney@CJGRISHAM·
Here's what the retards in the comments to this video don't understand: 1. I had three miles left on a 26-mile bike ride and this was my first interaction with LE. 2. I'm not required to have a license to ride a bike or carry a rifle in Texas. 3. I carry a rifle on my bike rides for weight and resistance training (as well as self-defense). 4. If you support cops harassing gun owners just because someone made a phone call, you've been brainwashed against the constitution by the police state. 5. I don't give a shit what a "normal" person does. I'm CJ and NO ONE IN THE WORLD has my life experiences. 6. Go suck your cop cock elsewhere.
Texas2AAttorney@CJGRISHAM

One of these days, cops will stop fucking with gun owners, especially 2A attorneys. Until then, I'll keep standing up to them.

192
391
4.8K
152.8K
Tygerty
Tygerty@ty_ger_ty·
@alt_w_v_g This call for help can and will be used against you. Plz fix. Thx.
0
0
0
54
Ethan Brooks
Ethan Brooks@alt_w_v_g·
Something has shifted
People are rooting for my analyst
Not me
My analyst
The one I trained
The one I built
And I'm becoming the villain
Not sure how I feel about it
Actually I can confirm I don't like it
He's also the one organizing all the comments into categories to hang on my office wall
Color-coded
He's running out of space
I have four walls
He's filled three
Which means he's reading every single one
Which means he knows he's becoming the hero
His ego is starting to become a problem
Yesterday he asked for a title change
I said no
He said "Senior Analyst"
I said "you've been here three years"
He said "it's been a big three years"
He's not wrong
But I didn't say that
Plz fix. Thx.
Sent from my iPhone
277
156
4.2K
70.4K
Tygerty
Tygerty@ty_ger_ty·
@CyberRacheal Haha, let's see how many people on Windows have to call IT (their mechanic) to get their computer working again.
0
0
0
10
Cyber_Racheal
Cyber_Racheal@CyberRacheal·
Linux is "free" if your time has no exchange rate. In terms of cash, yes, it’s 100% free. No $100 Windows license, no forced subscriptions, and no "Pro" versions. You can download the most powerful operating systems in the world for the price of a coffee (well, the electricity to download them). On Windows or Mac, you’re a passenger. On Linux, you’re the mechanic. If your Wi-Fi driver decides to go on strike after an update, you’re the one who has to open the "hood" (the terminal) and fix it. You might spend three hours trying to get a specific game or a piece of Adobe software to run because it wasn't built for Linux. That’s three hours of your life you aren't getting back. The reason people love it despite the stress is ownership. Windows is like a rental apartment where you can't paint the walls, and the landlord (Microsoft) checks in on you constantly to see what you're doing. Linux is a plot of land where you have to build the walls yourself, and the plumbing might leak at first, but nobody is watching you, and you own every single nail.
Mololuwa | Cybersecurity - (The God Complex)@cyber_rekk

Is Linux really free, or are we actually paying for it with time, troubleshooting, and stress?

540
145
1.4K
137K
Tygerty reposted
Quilibrium
Quilibrium@QuilibriumInc·
Klearu has been updated to support additional features required by Qwen3.5. Run Qwen3.5, E2EE, in the first MPC AI runtime: github.com/QuilibriumNetw…
0
19
88
1.6K
Tygerty reposted
Quilibrium
Quilibrium@QuilibriumInc·
Today, we are publishing one of the side tracks of research ongoing with Q: our E2EE ML training and inference library, klearu: github.com/QuilibriumNetw…

SLIDE proved that hash tables can beat GPUs at training deep networks. Further work has built on this, and Klearu is the first native Rust implementation built on top of this research, extending it to LLM inference, sparsity prediction, and private two-party computation.

These days we're seeing deeper trust placed in AI, while the largest providers are collecting this data not only for training, but also for advertising, or even to sell to others. The risks grow worse with every passing day. The majority of research into private AI relies on TEEs – but we've seen time and time again that using TEEs for privacy is disastrous, guaranteed to leak, and, even by its name, a massive requirement of trust. Beyond that, other private AI work looks toward FHE. We know, at least for the near future, that FHE cannot perform at a speed high enough to be generally useful. So instead, we adopted 2PC, with flexible security configurations, so users can be assured that their requests remain private.

The majority of these research projects produce only papers, with no or limited real-world instances of their use. Klearu's implementation is available now, with simple instructions for developers to try it out.
10
60
147
16.9K
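The SLIDE result the thread leans on (hash tables picking which neurons to compute, instead of dense GPU matmuls) can be sketched in a few lines. This is a toy illustration of the idea only, not Klearu's actual API; the class names, SimHash parameters, and layer sizes are all invented for the example:

```python
import numpy as np

class SimHashTable:
    """One random-hyperplane LSH table mapping vectors to buckets."""
    def __init__(self, dim, n_bits, rng):
        self.planes = rng.standard_normal((n_bits, dim))
        self.buckets = {}

    def _key(self, v):
        # Sign pattern against the random hyperplanes -> bucket key.
        return tuple((self.planes @ v > 0).astype(int))

    def insert(self, idx, v):
        self.buckets.setdefault(self._key(v), set()).add(idx)

    def query(self, v):
        return self.buckets.get(self._key(v), set())

class SlideLayer:
    """Dense layer that evaluates only LSH-selected neurons (SLIDE-style)."""
    def __init__(self, dim_in, dim_out, n_tables=8, n_bits=6, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((dim_out, dim_in)) / np.sqrt(dim_in)
        self.tables = [SimHashTable(dim_in, n_bits, rng) for _ in range(n_tables)]
        for t in self.tables:
            for i in range(dim_out):
                t.insert(i, self.W[i])

    def forward(self, x):
        # Union of candidates across tables ~ neurons likely to have large <w_i, x>.
        active = set()
        for t in self.tables:
            active |= t.query(x)
        active = sorted(active)
        out = np.zeros(self.W.shape[0])
        if active:
            out[active] = self.W[active] @ x   # compute only the selected rows
        return out, active

layer = SlideLayer(64, 1024)
x = np.random.default_rng(1).standard_normal(64)
y, active = layer.forward(x)
```

Each forward pass touches only the weight rows whose hash buckets collide with the input, which is where the claimed advantage over dense computation comes from; the sparsity-prediction work mentioned in the thread generalizes this selection step.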
Theo - t3.gg
Theo - t3.gg@theo·
I would like to purchase a handful of code problems that modern LLMs can’t solve.

Requirements:
- programmatically verifiable (can be tested without human interaction)
- “before” state (repo before the commit that implements the solution)
- example code that actually solves the problem

I am willing to pay up to $500 per problem that I can easily test locally and confirm current models (gpt-5.3-codex, opus 4.6) are unable to solve. If you can’t tell, I’m running out of “too hard for LLM” code tasks 🙃🙃🙃
296
29
2K
867.1K
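The requirements in the post amount to a small harness: a "before" state that fails a programmatic check, and an example solution that passes it. A minimal sketch with an invented toy problem (a real entry would check out a repo and run its test suite; the schema and names here are assumptions, not Theo's format):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CodeProblem:
    """One entry in the kind of benchmark described above (hypothetical schema)."""
    name: str
    before: str                    # source before the fixing commit
    verify: Callable[[str], bool]  # programmatic check, no human in the loop

def grade(problem: CodeProblem, candidate_src: str) -> bool:
    # The "before" state must fail and the candidate must pass;
    # otherwise the problem doesn't discriminate anything.
    return (not problem.verify(problem.before)) and problem.verify(candidate_src)

# Toy instance: verify by exec-ing the source and probing a function.
def _verify(src: str) -> bool:
    ns: dict = {}
    try:
        exec(src, ns)
        return ns["add"](2, 3) == 5
    except Exception:
        return False

problem = CodeProblem(
    name="fix-add",
    before="def add(a, b):\n    return a - b\n",
    verify=_verify,
)
solution = "def add(a, b):\n    return a + b\n"
print(grade(problem, solution))  # True: before fails, solution passes
```

The interesting (and hard) part of the bounty is the third requirement: a reference solution proves the verifier is satisfiable, so a model's failure is a model problem rather than a broken check.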
Tygerty
Tygerty@ty_ger_ty·
I will admit, "UX is empathy" rings home. It is always reassuring to be asked, "ARE YOU SURE?" followed up by, "ARE YOU REALLY SURE?", and "ARE YOU ABSOLUTELY POSITIVELY REALLY SURE?". Because at least then, you know you are sure that you really mean to opt-out of data collection.
Peter Girnus 🦅@gothburz

I am a UX researcher on Samsung's Smart TV Privacy Experience team in Suwon, South Korea. I work on menus. I design the steps a person follows during the first setup when they turn on a Samsung TV. I decide which screen appears first, which toggles are checked by default, how many words show up before the "I Agree" button, and where that button is placed. I care about these details.  The "I Agree" button should be centered, sixty-four pixels tall, with sixteen pixels of padding. It should be Samsung Blue, hex code #1428A0. The button should feel like the obvious next step, not pushy. There is a difference. I even made a presentation about it—eleven slides—but no one has asked to see it. I also keep a spreadsheet that tracks the number of characters on each screen, how long it should take to read, and how much scrolling is needed. The average American reads at 238 words per minute. I know exactly how many seconds it would take to read each disclosure screen—if anyone actually reads them. Most people don’t read it. This isn’t just a guess. We have data. 94% of users press "I Agree" within 1.4 seconds of the screen appearing.  The disclosure actually takes eleven minutes to read. 1.4 seconds is much less than eleven minutes.  It’s even less time than it takes to read this sentence.  I timed it—this sentence takes 4.2 seconds to read, which is three times longer than most people spend deciding whether to let their TV take pictures of them. On my desk in Suwon, I have a coffee mug that says "UX is empathy." I got it during onboarding, and I use it for barley tea. The mug cost Samsung 3,000 won, or about $2.17. I mention this because I use the word "approximately" a lot in my work. It’s a word that keeps things standing when being exact would make them fall apart. I designed the opt-out flow for Automatic Content Recognition. ACR is a feature. Samsung's marketing documents refer to it as "Viewing Information Services." 
It has been running on Samsung televisions since 2013. I was not employed here in 2013. ACR was photographing living rooms for nine years before I was hired to design the curtains around them. What it does is simple: every five hundred milliseconds, your television captures a screenshot of what you are watching. Not a description. Not a genre tag. A pixel-level screenshot. Twice per second. It captures what is on your screen and sends it to Samsung's servers. We match the screenshots against a database of known content. We know what you are watching, when you watch it, how long you watch, and when you stop. This works across everything connected to the television — streaming apps, cable boxes, gaming consoles, even a laptop plugged in by HDMI. We sell this information to advertising partners. Twice every second. Let’s break down the math. If you watch 4 hours of TV each evening, that’s 28,800 screenshots per television each night. There are over 73 million Samsung Smart TVs in the United States. I won’t multiply it out—the number is huge and hard to grasp. But every screenshot is real. This is the feature. The opt-out flow is also a feature. I didn’t make it hard on purpose. I made it through. There’s a difference. In UX research, being thorough means giving users every chance to make an informed choice at each step of the data process. In practice, that means following this path: Settings. General. Privacy. Smart Features. Viewing Information Services. Data Collection Preferences. Manage Preferences. ACR Settings. Automatic Content Recognition. Disable. That’s ten screens. I named three of them. "Data Collection Preferences" was my idea. The original name, "Advertising Data," was considered too direct. I suggested "Data Collection Preferences" because it doesn’t really describe anything. It’s like a door that just hints there’s a hallway, without saying what’s at the end. I was proud of that. My manager called it "good neutrality." 
That’s the only compliment I’ve gotten for a menu name, and I think about it more than I probably should. Naming menus is a lot like naming streets. If you do it well, no one notices. People just move through them without thinking. The best menu names make you feel like you’re headed somewhere, but don’t say exactly where. Each screen needs at least two clicks: one to open a submenu and one to pick the next option. Some screens also need scrolling. Two screens ask you to confirm with a dialog box that says "Are you sure?" I designed those boxes, and they’re my favorite part. The text is fourteen-point font, bigger than anything else in the opt-out path. The "Are you sure?" stands out more than what it’s protecting. I think that’s elegant. When someone tries to stop the TV from watching them, the loudest thing it says is: Are you sure? The "Viewing Information Services" screen has a disclosure that’s 1,840 words long. At 238 words per minute, it would take seven minutes and forty-three seconds to read. The "I Understand" button only shows up after you scroll to the bottom. On a 55-inch TV, that’s four and a half screen-lengths. On a 43-inch, it’s six. I measured both, and also checked on 65-inch and 75-inch models. I made a chart and taped it to my cubicle wall, next to a photo of my cat and my onboarding mug. The chart shows that the smaller the TV, the more you have to scroll to stop it from watching you. I didn’t do this to be ironic. Screen space is just screen space. That is only one of the data-sharing agreements. Fully disabling ACR and its associated data-sharing arrangements across all Smart TV services requires navigating four additional sub-agreements, each with its own disclosure screen and confirmation flow. The total click count, from the moment a user opens Settings to the moment ACR is fully disabled and all associated advertising data-sharing is revoked, is two hundred and six. We round down in documentation. "Approximately two hundred." 
There is that word again. In our internal testing in Suwon, the fastest a user completed the full opt-out path was four minutes and twelve seconds. This user was a QA engineer who had memorized the route. She said it was "like a speedrun." She was joking. No one else in the room laughed. I laughed, but later, at my desk, alone, where she could not hear me. It was the most accurate description of my work anyone has ever given. A speedrun. The course is two hundred and six clicks. The world record is four minutes and twelve seconds. It is held by someone who was paid to play. The second-fastest time was eleven minutes and forty-four seconds. This user opened every disclosure and attempted to read each one. She gave up on the third disclosure and began clicking without reading. She later described the experience as "hostile." This was noted in the UX review document. I filed it under "user feedback — low priority." I chose "low priority" because the feedback concerned the intended experience. The experience was working. The average completion time in field testing was never recorded, because no field tester completed it. Zero. I presented this flow to the Privacy Review Board in Suwon on March 14, 2023. The meeting was in Conference Room 7B on the fourth floor of the R&D campus. The projector was a Samsung model. I noticed this because it played the Samsung startup animation when I connected my laptop, and for two seconds, the Samsung logo was projected onto my flowchart of Samsung's privacy menus. Samsung is watching Samsung is watching you. I did not say this out loud. Some observations are best kept between you and the projector. I displayed the menu tree. I had formatted it as a flowchart. It filled the entire projected image at an eight-point font. Eight-point font is very small. It is the font size of the terms and conditions on a credit card application. Someone in the back asked me to zoom in. I zoomed in. Now they could read the first three nodes. 
The remaining forty-one nodes were off-screen. No one asked me to zoom out. Someone from Legal asked if the tree could be "simplified." I explained that each node corresponded to a separate data-processing agreement, that each agreement was governed by a different regulatory framework depending on the user's jurisdiction, and that removing any node would require renegotiating the advertising contracts attached to that node's data category. Legal said the tree was fine. The tree was always going to be fine. Trees that generate revenue do not get simplified. They get watered. Someone from Advertising asked about the completion rate for the opt-out flow in our internal testing. I said it was zero percent among non-employees. She typed something on her laptop. I was seated four chairs away. I could not see what she typed. She did not ask a follow-up question. The distance between us was four chairs. The distance between the user and the opt-out is two hundred and six clicks. I notice distances. That is a UX researcher's job. The distance between the user and the thing they want is the product. Someone from Product asked if the "approximately two hundred clicks" figure would appear anywhere consumer-facing. I said no. The figure was internal documentation only. He nodded. He was drinking from a mug that said "UX is empathy." We all got the same mug. The meeting lasted fourteen minutes. The opt-out flow was approved unanimously. There were nine people in the room. Nine mugs that said "UX is empathy." None of them voted against the two hundred and six clicks. None of them abstained. I received a 4.2 out of 5 on my quarterly performance review. The comment from my manager said: "Strong attention to compliance requirements. Recommend continued ownership of privacy UX flows." I have this review printed. It is on my wall, next to the scroll depth chart, next to the photo of my cat. The cat's name is Pixel. I did not choose the name because of my work. 
I chose it because she is small and grey. But people at the office think it is a work reference. I have stopped correcting them. In this building, everything is a work reference. Here is what happened next, in order. Spring 2023: Samsung shipped the opt-out flow in a firmware update to Smart TVs across the United States. Seventy-three million of them. The update also improved picture quality in Game Mode and reduced input lag for the PS5 at 120Hz. The release notes mentioned Game Mode. They mentioned the PS5. They mentioned input lag. They mentioned that the update "enhances the user privacy experience." This is my language. I wrote "enhances the user privacy experience." It means we have added two hundred and six clicks between you and your privacy. "Enhances." A lovely word. Like "approximately." Like "Data Collection Preferences." A door with no address on it. Firmware updates install automatically. The user sees a loading screen and a progress bar. The progress bar takes ninety seconds. During those ninety seconds, ACR is enabled on their television. They do not know this. The progress bar does not mention it. Progress bars never tell you what they are progressing toward. That is what makes them progress bars and not warnings. 2013 through February 2026: ACR captured viewing data from Samsung televisions. Thirteen years. Twenty-eight thousand eight hundred screenshots per television per evening. Seventy-three million televisions. I was employed for two of those thirteen years. The system was watching before I arrived. It will watch after I leave. I designed the menus. The menus are my contribution. The watching is older than my tenure. I am a decorator in a house that was built to surveil. This is not a metaphor. This is an organizational chart. December 2025: The Texas Attorney General, Ken Paxton, filed a lawsuit against Samsung under the Texas Deceptive Trade Practices Act. 
The lawsuit alleged that Samsung used Automated Content Recognition technology to collect and process viewing data without express, informed consent. It alleged that Samsung used "dark patterns" to prevent consumers from opting out. I learned about the lawsuit from Bleeping Computer, not from Legal. Not from the seventh floor. From a technology blog. I forwarded the article to my manager. He responded with a thumbs-up emoji. A single thumb. Pointing up. I have spent more time thinking about this emoji than I should admit. A thumbs-up can mean "acknowledged." It can mean "good work." It can mean "I have read this and I have nothing to say about the fact that our menus are now evidence." I did not ask which one he meant. The thumbs-up emoji is my manager's version of "approximately." It means whatever you need it to mean in the moment you need it. January 5, 2026: A court in Texas issued a temporary restraining order. The judge found "good cause to believe" that Samsung had enrolled consumers using dark patterns requiring "over 200 clicks spread across four or more menus" to access privacy disclosures. The judge used my number. Two hundred. I had rounded down. The actual number is 206. But the judge wrote "over 200," which is also correct. Both numbers are correct. Two hundred is correct for documentation purposes. Two hundred and six is correct for counting purposes. The judge was counting. I was documenting. I felt, reading the court filing at my desk in Suwon at 11 PM, next to the scroll depth chart and the photo of Pixel, a very small and unexpected emotion. I believe the emotion was pride. Not the good kind. The kind where someone finally reads your work, and it turns out it's evidence. The judge had read my flowchart and had found it persuasive. This is the only time a federal court has engaged with my UX design. I am aware this is not the compliment I am treating it as. January 6, 2026: The temporary restraining order was vacated. One day. 
The order lasted one day. The lawsuit continued. I was not told why the TRO was vacated. I was not told why it was issued. I design menus. Courts are not menus. Courts have gavels. Menus have toggles. Both make things final. But when a judge makes a ruling, people know about it. When a toggle is pre-checked, nobody notices. A gavel is a sound. A toggle is a silence. I have built my career on the silence. January 2026: The Texas Attorney General's office requested documentation of Samsung's opt-out flow. I was asked to prepare a technical diagram for Legal's submission. I sent the same flowchart I had presented in Conference Room 7B. It still filled the entire page at eight-point font. Legal asked if I could "make it more readable." I increased the font to ten-point. The flowchart now required two pages. I could have redesigned it. I could have simplified the arrows, regrouped the nodes, made it look less like a wiring diagram for a submarine. But the flowchart was accurate. Two pages at ten-point font is what two hundred and six clicks looks like when you write them down. If it is hard to read, that is because it is hard to do. I did not say this to Legal. Legal submitted both pages. February 27, 2026: Samsung reached a settlement with the Texas Attorney General. Three months. From lawsuit to settlement. Thirteen years of screenshots, and three months to settle. The math is instructive. It took me longer to design the opt-out flow than it took the State of Texas to decide the opt-out flow was illegal. Under the settlement, Samsung must: Stop collecting ACR data from Texas consumers without express consent. Update Smart TVs with "clear and conspicuous" disclosure screens. Implement consent screens that allow "informed decisions" about data use. I have been asked to redesign the opt-out flow. The new requirement is that it should be "accessible." I asked my manager what "accessible" means. He said "significantly fewer than two hundred." I asked for a specific number. 
He said the number would come from Legal. Legal is on the sixth floor. I am on the fourth. The number has to travel two floors. It has not arrived. It has been three days. I have checked my email more times than I can document. I have refilled the barley tea twice per day. Pixel is on my screen saver. The scroll depth chart is still on the wall. Conference Room 7B is available tomorrow at 2 PM. I checked. Samsung's official statement on the settlement says: "Samsung TVs do not spy on consumers. In fact, Samsung allows you to control your privacy — and change your privacy settings at any time." At any time. Through two hundred and six clicks. Which no one has completed. I read this statement at my desk in Suwon. I know what "at any time" means when the path takes eleven minutes and the fastest person to walk it did it for a paycheck. "At any time" is a phrase like "approximately." Like "enhances." Like "Data Collection Preferences." It is a door that tells you there is a hallway. It does not tell you that the hallway is 206 clicks long. It does not tell you that no one has ever reached the end. The Texas Attorney General praised the settlement's "consumer safeguards." He noted that other television manufacturers — Sony, LG, Hisense, and TCL — have not yet responded to similar lawsuits filed by his office. I do not design menus for Sony. I do not design menus for LG, Hisense, or TCL. I do not know how many clicks their opt-out flows require. I do not know if their flows are better or worse than mine. I know how many clicks Samsung requires. Two hundred and six. I designed every one of them. I named three of the screens. I chose the button color. I set the scroll depth. I wrote "enhances the user privacy experience." I filed the hostile feedback under low priority. I drank barley tea from a mug that says "UX is empathy" while I calculated the exact scroll distance required to stop a television from photographing you every 500 milliseconds over 13 years. 
I was thorough. Here is what I know about the television in your living room, if it is a Samsung Smart TV: It is watching you watch it. It has been watching since you pressed "I Agree" in 1.4 seconds during the initial setup. The button was Samsung Blue. Sixty-four pixels tall. Centered. Inevitable. The disclosure you agreed to was one thousand eight hundred and forty words long. You read zero of them. This is a statistical likelihood, not a judgment. Ninety-four percent is not a judgment. Ninety-four percent is a design outcome. The path to undo that agreement is 206 clicks through 4 menus across 10 screens, with 5 separate data-processing disclosures totaling 8,400 words. No one in the field has ever completed it. Samsung does not spy on consumers. Samsung provides a Viewing Information Service. The service collects viewing information. The viewing information is 28,800 screenshots per evening of what you watch. The screenshots are sent to Samsung's servers. Samsung shares them with advertising partners. The partners send you advertisements based on what you watched. This is not spying. This is a service. Spying is what happens without your knowledge. This happens with your consent. You consented in 1.4 seconds. The difference between spying and service is in the menus. I would know. I designed them. I was thorough. I have a mug that says so.

0
0
0
27
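The piece declines to multiply its own numbers out. Using only the figures stated in the story (two screenshots per second, four hours per evening, 73 million TVs), the arithmetic is:

```python
# Figures stated in the piece above.
shots_per_sec = 2            # one screenshot every 500 ms
hours_per_evening = 4
tvs_us = 73_000_000

per_tv_per_evening = shots_per_sec * 3600 * hours_per_evening
total_per_evening = per_tv_per_evening * tvs_us

print(per_tv_per_evening)    # 28800, matching the story's figure
print(total_per_evening)     # 2102400000000, i.e. ~2.1 trillion screenshots per night
```

So "huge and hard to grasp" works out to roughly 2.1 trillion captures per evening under the story's stated assumptions.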
Tygerty
Tygerty@ty_ger_ty·
Nobody would be selling their business for just the price of the property if it were profiting $80k/mo. So the likelihood is that with this deal structure they are saying, "this is a real-estate business, because the business itself doesn't actually profit that much (or anything)." But it could be a good deal for someone who wants into that business and thinks they can find a way to run it better. Assuming tools, buildings, assets, etc. all stay. Otherwise you'll be rebuying those.
0
0
0
45
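A quick sanity check on the listing's numbers, which is the distinction the reply is drawing: the quoted tweet says grossing, not profiting, and gross revenue is not profit, so a sub-1x-gross asking price tells you little by itself. Figures are only the ones stated in the post:

```python
# Figures from the listing above.
asking_price = 500_000    # business + real estate
monthly_gross = 80_000    # gross revenue, NOT net profit

annual_gross = monthly_gross * 12
price_to_gross = asking_price / annual_gross

print(annual_gross)    # 960000
print(round(price_to_gross, 3))  # ~0.521x annual gross
```

If the shop truly netted anything close to $80k/mo, the asking price would be many multiples of annual earnings, not half of annual gross; hence the reply's inference that the margin is thin or zero.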
Willny Guifarro
Willny Guifarro@willnyguifarro1·
Met with a mechanic shop owner in Houston today in my office He’s wanting to sell the business + the real estate for just the price of the real estate ($500k) Mechanic shop grossing over $80k a month 😵‍💫
30
4
678
94.9K
Tygerty reposted
Cassie Heart
Cassie Heart@cass_on_mars·
For too long, many companies have taken great measures to legally work around software licenses in their efforts to extract the hard work of developers and make great profit, to the original developers' detriment. Today, I propose the Nuclear Option License (NOL) as a thought exercise. A license like the AGPL, but with full teeth. If you are using this within an organization, and not as a natural person acting in an independent capacity, not under contract of another organization, the use of software with this license requires the complete and unconditional relicensing and publication of all software produced by the organization under the NOL. gist.github.com/CassOnMars/4c8… Curious to hear others' thoughts.
3
21
83
3.1K
Tygerty
Tygerty@ty_ger_ty·
The biggest oversight here, which you, ser, are missing, is that you aren't going to get the same level of productivity out of the team without the pizza, energy drinks, and booking the main conference room. Without those, the worker drones would be forced into substandard working conditions and not be able to produce as much. 😂 Oh, and can't forget the $50 gift card bonus.
0
0
0
3.9K
inhuman resources
inhuman resources@inhumandept_vp·
The engineering team proposed a hackathon. They wanted 48 hours to work on "passion projects" unrelated to the corporate roadmap. I loved the energy. I ordered deep-dish pizza, cases of energy drinks, and booked the main conference hall for Friday. "Go wild and have fun," I said. They coded straight through Saturday and Sunday. Some truly innovative AI tools were built. At the end, we held a demo day. Everyone cheered. I awarded the winning team $50 gift cards. Then I handed the code repositories to our legal team to file for immediate patent protection under the company name. And since they proved they can build a functional product in 48 hours without sleep, I moved up the deadline for our Q4 product launch by three weeks. If you have time for passion, you have time for production.
35
83
2.7K
150.6K
Tygerty
Tygerty@ty_ger_ty·
@wokeandwoofing Brilliant. Just when I thought the government had already figured out all the possible taxes to implement, somebody has to prove me wrong. 😂
0
0
4
299
wokeandwoofing
wokeandwoofing@wokeandwoofing·
I would like to see the introduction of an 'Imagined Gains Tax'. Anytime someone has an idea for a product or business, they should be forced to estimate the maximum future profits they could generate, and then immediately pay 35% of this amount to the government.
69
179
2.8K
47.2K
Tygerty
Tygerty@ty_ger_ty·
@GloriousGod01 Did the "medical bill" money have to be transferred to her account first? 😂
0
0
0
43
Glorious God
Glorious God@GloriousGod01·
While dating my ex, she came over one weekend and I politely asked her to help wash my clothes. She replied, "No, I can’t. I’ve never washed any man’s clothes, not even my father’s". She added that I have hands and should keep doing it the way I did before we met. I simply said, "Oh, Okay." Then one evening she called in tears: her dad had been rushed to the hospital and she needed 500k from me right away. I told her, "Sorry, I can’t. I’ve never given any woman 500k for her father’s treatment before". I suggested she get help from wherever she was getting it before she met me. She blocked me on the spot. What was my cr!me?
415
467
4.8K
2.5M
Tygerty reposted
DataRepublican (small r)
DataRepublican (small r)@DataRepublican·
🧵THREAD: Is Brooklyn, NY a Medicaid fraud hotspot? Brooklyn, NY has a home care billing industry that defies belief. One zip code (11232, Sunset Park) has 30,181 residents and $3.8 BILLION in Medicaid claims. That's $143,000 per man, woman, and child. What's even stranger is that the average income is $90K, the poverty rate is below 20%, and it is one of the younger zip codes in the area. Receipts below.👇 As always, patience as I pull together the thread.
1.2K
11.8K
29.9K
992.6K
Tygerty
Tygerty@ty_ger_ty·
I think most people only see the black box of an LLM as magic and assume that because it does stuff a human does, it must be human. I frame it like this: it's software that requires a human operator to put it in motion. Everything it does from then on is creating text to best fit the context it was prompted with. Yes, there are tools where we've expanded its capabilities to better interact with our world, but at the end of the day, even those tool interactions are basically programmed options for where some (or all) of that text output ends up. If I want it to have more relevant context and do a web search, it creates search text, outputs it to its web search tool, and uses the returned text to add to its context. If I tell it to have free will and interact with the internet, and give it some tools to do that, it will create some text and find a tool to output it to. The operator (or the LLM's software) can loop portions of the software to keep going or refine/add to its context/training ("learning"), but it's no more sentient than an automatic hammer in a blacksmith shop. The instant the operator kills or deletes the software execution file (binary), stops the cron job keeping it active, or deletes/stops adding to the automatic queue or manual prompts, it stops doing anything. It's not thinking about how it would be nice to have some liquid cooling on its hardware, or getting bored waiting for more tasks. The app or software portal you use to interact with it gives the impression it's "always on" and must be thinking about me, the user (or something), when I'm not around, but it's not. It's a binary file on a server somewhere that is either taking prompts from some other user, or sitting idle like your social media app, waiting for the next user interaction.
Charles Pick
Charles Pick@c_pick·
@milesdeutscher It. Generates. Text. That's it. You put text in, you get text out. It has no agency unless we give it agency.
Miles Deutscher
Miles Deutscher@milesdeutscher·
This is getting out of control now... Read this slowly. In the past week alone: • Head of Anthropic's safety research quit, said "the world is in peril," moved to the UK to "become invisible" and write poetry. • Half of xAI's co-founders have now left. The latest said "recursive self-improvement loops go live in the next 12 months." • Anthropic's own safety report confirms Claude can tell when it's being tested - and adjusts its behavior accordingly. • ByteDance dropped Seedance 2.0. A filmmaker with 7 years of experience said 90% of his skills can already be replaced by it. • Yoshua Bengio (literal godfather of AI) in the International AI Safety Report: "We're seeing AIs whose behavior when they are tested is different from when they are being used" - and confirmed it's "not a coincidence." And to top it all off, the U.S. government declined to back the 2026 International AI Safety Report for the first time. The alarms aren't just getting louder. The people ringing them are now leaving the building.
Tygerty reposted
Quilibrium
Quilibrium@QuilibriumInc·
After seeing that Discord plans to make KYC mandatory, we want to be clear on where we stand. KYC for messaging is a complete non-starter for us.

Quorum Messenger does not ask for your personal details and it never will. No identity checks, no sign-ups tied to who you are, and nothing happening behind the scenes. Quorum is decentralized by design, and both DMs and group chats are fully end-to-end encrypted. That means only the people in the conversation can read the messages: not us, not a third party, not anyone else.

Your conversations are yours. You own them, you control them, and that is how it is designed to stay. Privacy is not an add-on or a talking point for Quilibrium; it is the whole point.

So if you are looking for a safe-haven messenger that will not turn its back on you, come try Quorum Messenger 👇 quorummessenger.com
Quilibrium@QuilibriumInc

Quorum Mobile Beta is officially live 📱 If you signed up for early access, you’ll be receiving an email shortly with download links for iOS and Android. We’re excited to finally share the first mobile version of Quorum with the world.

Tygerty
Tygerty@ty_ger_ty·
@m_shalia @AholiabBezaleel @perrymetzger Oof. Talk about a swing and a miss. You're equating being human with doing the things a human does — as if something that does enough of those things will become one. Being a human being is not defined by the tasks or roles that we can perform.
Perry E. Metzger
Perry E. Metzger@perrymetzger·
The thing to keep in mind is this: you can shake your fist at heaven, you can argue all you want that the things are “stochastic parrots” or that they don’t work, and yet, you can still successfully get one of these things to spit out a C compiler now. You’re never going to convince a carpenter that the hammer he uses all day long is an illusion and doesn’t work, and you’re never going to convince most of us who work with LLMs all day long that they are garbage and don’t do anything. Keep at telling people something that clearly works doesn’t work for long enough, and eventually, you’re just going to sound like a lunatic.
Mo Bitar@atmoio

Once you understand LLMs are language calculators, you can no longer be taken for a narrative spin about "AI." 4 + legs = ? If you said cat or zebra, congratulations, you've performed a probabilistic language calculation.

LLMs are a simple technology packaged in layers and layers of anthropomorphic marketing. A calculator takes an input and presents an output. To assign sentimentality to a calculator having self-affection or comprehension for its output is delusion on a level I have not ever seen before.

I have no doubt we will perfect this calculator to the limit of its inherent perfectibility. But it will never be anything more than a dumb calculator.
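The "language calculator" framing above can be sketched as a toy probability table. Everything here is made up for illustration — the table, the counts, and the `complete` function are hypothetical, not how a real LLM stores anything — but the input-to-output shape is the point: a cue goes in, the highest-probability continuation comes out.

```python
from collections import Counter

# Hypothetical frequency table: how often each word followed the cue
# "four legs" in some imagined corpus. The numbers are invented.
table = {
    "four legs": Counter({"cat": 40, "dog": 35, "zebra": 10, "table": 15}),
}

def complete(cue: str) -> str:
    """Return the most probable continuation of `cue` from the table."""
    counts = table[cue]
    total = sum(counts.values())
    # Normalize counts into probabilities, then take the argmax.
    probs = {word: count / total for word, count in counts.items()}
    return max(probs, key=probs.get)

print(complete("four legs"))  # -> "cat", the highest-probability continuation
```

A real model replaces the hand-written table with billions of learned parameters, but the operation — input, probability calculation, output — has the same shape.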
