Identity.org

1.5K posts

@identity

Privacy-First Digital Identity that’s secure, private, and simple.

San Francisco, CA · Joined July 2018
223 Following · 1.8K Followers

Identity.org @identity
The global deepfake AI market is projected to grow from about $764 million in 2024 to nearly $20 billion by 2033. Growth like that means synthetic media will appear more often across advertising, entertainment, and online platforms. For viewers, the challenge is easy to state but increasingly hard to solve: when you see a person in a video or image, it is not always clear whether that person actually participated in creating it. Signals that confirm identity, source, and permission may become an important part of how people evaluate media online.

Identity.org @identity
Sometimes people discover an AI version of themselves by accident. A model recently said a fashion brand created an AI-generated ad that closely mirrored her face and past photos. She only found out after a friend sent her the video. That raises a difficult reality about digital likeness: you may not know when an AI version of your image is circulating online, or when someone else may be benefiting from it. The conversation about identity online is not only about removing content after the fact. It is also about creating ways for people to know when their likeness is being used in the first place. dailydot.com/model-fashion-…

Identity.org @identity
YouTube is expanding its AI deepfake detection system to include politicians, government officials, and journalists. The tool scans for AI-generated videos that recreate someone’s face and allows the person being impersonated to flag or request removal of the content. Impersonation has long been a concern for celebrities and creators. But when it involves political figures or journalists, the impact can extend to public trust and the information people rely on. Moves like this suggest platforms are beginning to treat AI impersonation not just as a moderation issue, but as part of protecting the integrity of online media. techcrunch.com/2026/03/10/you…

Identity.org @identity
A lot of the conversation around AI likeness focuses on takedowns. Someone’s image appears somewhere it shouldn’t. A request is sent. The content is removed. 🔁 That process has been the standard response across social media for years. But some creators and studios are starting to think about it differently. Instead of dealing with problems after content spreads, they are beginning to publish clearer usage guidelines and licensing terms that explain how a likeness can be used from the start. These frameworks do not prevent every issue, but they can reduce confusion around endorsement, attribution, and authorization. When expectations are defined early, fewer disputes need to be resolved later.

Identity.org @identity
Some performances remain part of culture long after they were first recorded. Films are restored. Archived interviews are replayed. Familiar voices and faces continue appearing in new productions built from earlier material. Before retiring from the role of Darth Vader, James Earl Jones worked with Lucasfilm to create a digital model of his voice using recordings captured throughout his career. That system allows the character’s voice to appear in future Star Wars projects. Examples like this show how a likeness, voice, or performance can continue circulating long after the original work was created. When past performances can be reused in new productions, questions about who authorizes those appearances naturally follow. Our latest article explores how identity can continue appearing after death and who often manages those decisions. identity.org/who-controls-y…

Identity.org @identity
Actor Jared Harris recently issued a cease and desist after an AI-generated version of him appeared in promotional material for a podcast he never agreed to participate in. The producers used AI to recreate his likeness for a fictional film pitch segment. Harris said he was never contacted and that his image and identity are protected under existing contracts and law. Situations like this are starting to appear more often as tools capable of recreating faces and voices become widely accessible. The technology may allow a likeness to be generated, but that does not automatically answer the question of who has the authority to use it, especially when it appears in promotional or commercial content. thewrap.com/culture-lifest…

Identity.org @identity
Fan culture has always embraced reinterpretation. Posters, fan fiction, mashups, and tribute edits have long been part of media ecosystems. New AI tools make those creations far more realistic. What once looked clearly fan-made can now resemble official studio content. The tension begins when the line between unofficial and official becomes unclear. Confusion around endorsement and approval follows. Creativity and protection do not have to conflict. Clear usage guidelines and transparent labeling can help audiences distinguish fan work from official releases. With clearer expectations, participation can continue while professional interests remain protected.

Identity.org @identity
Last week we explained why you generally cannot copyright your face or likeness. That raises the next question: what about trademark? Trademark applies when identity functions as a commercial identifier. A name on a product line. A catchphrase used on merchandise. A stylized logo built around a persona. In those cases, protection attaches to the branding element used in commerce, not to the person as a whole. Trademark does not grant ownership over a face or voice in the abstract. It protects defined brand elements tied to specific goods or services. ™️ 🔗 Our latest blog explains what trademark law can and cannot cover. identity.org/can-you-tradem…

Identity.org @identity
Meta has filed lawsuits in Brazil, China, and Vietnam against advertisers accused of using AI-generated celebrity images and voices in scam ads that asked people to send money or share personal information. This kind of misuse isn’t new. Public figures have appeared in fraudulent ads for years. What makes these cases clearer legally is the link to financial deception. When identity is used to mislead consumers for profit, fraud and consumer protection laws apply. The enforcement path is relatively direct. The harder question arises outside obvious scams. Digital identity laws vary across jurisdictions and remain unsettled in many places. Not every unauthorized use fits neatly into fraud statutes. Legal systems tend to move fastest where harm is measurable. Other forms of identity misuse often fall into gray areas, even when the impact is real. cp24.com/news/world/202…

Identity.org @identity
The market extracts value from identity as if it were an asset. But extraction isn’t the same as structured ownership. Brands monetize recognizable faces. Platforms optimize familiarity. AI replicates voice, image, and style. The value is undeniable. What’s missing is formalization. Identity isn’t consistently treated like intellectual property with clear valuation, governance, or proactive rights architecture. Most people don’t deliberately structure or license it. It’s monetized reactively, often platform-dependent and contractually vague. That gap matters. An asset isn’t just something that generates revenue. It’s something intentionally structured, protected, and managed over time. Identity is moving in that direction—but we’re still early. The real shift comes when individuals control replication, not just promotion.

Identity.org @identity
When serious legal pressure appears, platforms tend to respond quickly. Restrictions are introduced. Certain categories of content are limited. Policies tighten. That responsiveness makes one thing clear: the ability to act is there. What is less consistent is how that protection works for individuals. Most people do not have the resources to trigger immediate change when their likeness is used without permission. A global studio and a private individual may not carry the same legal weight, but the underlying issue is the same. Someone’s identity is being used without approval. Control over your face, voice, and name should not depend on scale or influence. If systems can protect major rights holders, they can be built to protect everyone.

Identity.org @identity
That Brad Pitt vs. Tom Cruise “fight scene” was hard to miss this past week. It looked like a real studio production. The lighting, the pacing, the performances felt convincing. But it wasn’t real. Now the Motion Picture Association is speaking out about tools like Seedance and how recognizable actors and major film properties are being handled. When the trade group representing major studios steps in, it signals something bigger than one viral clip. The conversation is moving from curiosity to governance. Likeness has value. Intellectual property has value. The platforms that create clear safeguards around both will earn long-term trust as this space evolves. variety.com/2026/film/news…

Identity.org @identity
Can you copyright your face? Is your voice protected by copyright? Can someone legally “own” a person’s likeness? As AI tools make it easier to generate realistic faces, voices, and performances, these questions are coming up more often. Copyright protects specific creative works like photographs and recordings. It does not automatically protect the person behind them. Understanding that distinction is key to knowing what the law actually covers when it comes to likeness. 🔗 Our latest blog breaks it down. identity.org/can-you-copyri…

Identity.org @identity
He’s thinking in the right direction. AI isn’t the enemy. Lack of ownership is. If you own your likeness, your voice, your identity — then AI doesn’t replace you. It comes to you. It asks. You decide. You set the terms. That’s agency. We don’t need to fight AI. We need to structure it so it works for us.
Variety @Variety

#MatthewMcConaughey predicts to #TimothéeChalamet that AI actors will crash the #Oscars: “It’s damn sure going to infiltrate our category.” “Will we, in five years, have Best AI Film? Best AI Actor? Maybe. I think it could become another category. I’m not sure. It’s going to be in front of us in ways we don’t even see.” wp.me/pc8uak-1lGVAd


Identity.org @identity
Major shift in India. Starting Feb 20, social platforms must:
• Remove deepfakes in 2–3 hours (down from up to 36)
• Clearly label AI-generated content
• Require user disclosure of AI posts
• Preserve metadata and traceability
• Face legal risk for non-compliance
This is big. Shorter timelines and clearer labeling are good news for users and creators who want faster responses and more transparency when harmful AI content spreads. techcrunch.com/2026/02/10/ind…

Identity.org @identity
Deepfake goes up. You find it. You file the request. You wait. You get the email. “After review, we found no violation.” It shouldn’t be on the person being copied to track it down and push it through layers of forms and reviews. Protection should start before it spreads, not after someone fights through a maze.

Identity.org @identity
AI has the potential to improve lives and expand how we create and innovate. But innovation needs guardrails to succeed long term. Last week, Senators Adam Schiff and John Curtis introduced the bipartisan CLEAR Act. The bill would require AI companies to disclose the copyrighted works used to train their models and establish transparency safeguards around that process. This kind of legislation signals something important. Growth in AI and protection for creators do not have to be at odds. When people know how their work is being used, they can make informed decisions, advocate for fair treatment, and participate in the future of technology with greater confidence. schiff.senate.gov/news/press-rel…

Identity.org @identity
Seeing a familiar face used to mean someone showed up. That assumption is changing. AI avatars allow a person’s likeness to move across platforms, regions, and campaigns without repeated filming. For some, this creates new commercial and branding opportunities. For others, it raises questions about consent and control. When does digital representation extend someone’s presence, and when does it start to replace it? 🔗Our newest article explores what AI avatars signal about the future of identity online. identity.org/ai-avatars/

Identity.org @identity
Have you read the part of a platform’s terms that explains how your content can be used? TikTok states that when you post, you grant other users a non-exclusive, royalty-free, perpetual, irrevocable, worldwide license to use, share, and adapt your content. That language is standard and supports features like duets and remix culture. But the meaning feels different today. Digital tools can now alter faces, voices, and video in more advanced ways. When your likeness is involved, reuse is not only about creativity. It is also about consent and control.

Identity.org @identity
Fans have always wanted more ways to connect with their favorite athletes. Not just highlights or merch, but experiences that feel personal and ongoing. The MLB Players Association partnering to create AI companions taps directly into that demand. It opens up a new kind of engagement and a new revenue stream built around a player’s identity, not just what happens on the field. How these companions are being introduced matters. They’re presented as officially approved and licensed by the players themselves, not as unofficial replicas or background experiments. When fans interact with a digital version of a real person, trust is the foundation. The experience only works if it reflects the player’s intent and boundaries. The opportunity here is real, but so is the responsibility to keep authenticity and control front and center as these experiences grow. forbes.com/sites/charlief…