The chardet Python library was rewritten with Claude in a weekend to escape LGPL. No court has ruled on whether AI-assisted clean room rewrites are legal. Commercial open source business models built on copyleft protection are running out of time.
A developer used Claude to rewrite chardet — the Python character encoding detection library — from scratch. The original was LGPL-licensed. The rewrite shipped as MIT. The whole thing took a weekend.
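For context, chardet's job is to guess the encoding of an arbitrary byte stream. Here is a toy sketch of that idea — BOM sniffing plus a trial decode. This is not chardet's actual algorithm (which uses per-language byte-frequency models and confidence scores); it only illustrates the kind of functionality being rewritten:

```python
def guess_encoding(data: bytes) -> str:
    """Toy encoding guesser: BOM sniffing plus a UTF-8 trial decode.
    Illustrative only -- chardet's real detector is statistical."""
    boms = [
        (b"\xef\xbb\xbf", "utf-8-sig"),   # UTF-8 byte-order mark
        (b"\xff\xfe", "utf-16-le"),
        (b"\xfe\xff", "utf-16-be"),
    ]
    for bom, name in boms:
        if data.startswith(bom):
            return name
    try:
        data.decode("utf-8")
        return "utf-8"
    except UnicodeDecodeError:
        # Every byte sequence decodes as latin-1, so use it as a fallback.
        return "latin-1"

print(guess_encoding("żółć".encode("utf-8")))  # utf-8
```

Even this trivial version hints at why a full rewrite was once a real project: the production library encodes years of accumulated heuristics for dozens of encodings and languages.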
This hit Hacker News with 499 points and 516 comments. The debate isn't about one library. It's about whether the licensing regime that commercial open source was built on still works when AI can replicate any codebase in hours instead of months.
If you're marketing an open-core product, or your company's moat depends on copyleft protection, this is the post where you stop and read carefully.
The chardet developer performed what's called a "clean room" rewrite — building a functionally equivalent library without copying the original source code. Clean room development has legal precedent going back to the 1980s (Compaq reverse-engineering the IBM PC BIOS). The difference: Compaq's clean room took a team of engineers months. This one took one person and an AI a weekend.
The LGPL license on the original chardet required anyone who modified and distributed it to share their changes under the same license. That obligation evaporates if you write equivalent code from scratch — the copyleft only applies to the original code and derivatives of it.
The old math: Rewriting a mature library from scratch is so expensive that copyleft effectively forces compliance. Nobody rewrites 50,000 lines of battle-tested code just to avoid a license.
The new math: AI makes that rewrite trivially cheap. The economic moat that copyleft created — "it's cheaper to comply than to rewrite" — just collapsed.
The legal landscape is a void. Three fundamental questions remain open, and until courts or legislatures address them, the entire commercial open source ecosystem is operating on assumptions:
1. Does AI trained on copyleft code produce derivative works?
If Claude or GPT ingested GPL-licensed code during training, is the output tainted by that license? The FSF and most legal scholars say the question is unresolved. If the answer is yes, every AI-generated code snippet is potentially a license compliance nightmare. If the answer is no, copyleft loses most of its teeth.
2. Is AI-assisted clean room development valid?
Traditional clean room development required strict separation — the team writing the new code couldn't have seen the original. But if the AI was trained on the original, is the "clean room" actually clean? The legal theory hasn't caught up.
3. Is AI-generated code copyrightable?
The US Copyright Office has taken the position that purely AI-generated content isn't copyrightable. If the rewritten code has no copyright, it can't carry a license at all — including the MIT license the developer slapped on it. This creates a paradox: the code might be free for anyone to use, but not because of the license. Because there's no copyright to license.
None of these have been tested in court. The chardet rewrite is the canary.
For companies running open-core business models, copyleft was a strategic asset. It ensured that if someone took your community edition and built on it, they had to contribute back or keep their changes open. That created a flywheel: community contributions improved the free tier, the free tier drove adoption, and the commercial tier captured enterprise value.
That flywheel assumed rewriting was prohibitively expensive. It no longer is.
Here's what this means practically:
Competitors can clone your core in days. A well-funded competitor can use AI to rewrite your GPL/LGPL codebase, relicense it under MIT or Apache 2.0, and ship it without any obligation to your community. The output may be functionally identical. The license is different.
Cloud providers get a new playbook. The AWS-vs-Elastic, AWS-vs-MongoDB fights of the 2010s were about cloud providers offering managed versions of open source projects. Companies responded with SSPL and BSL. But those licenses only cover the original code. A cloud provider that AI-rewrites the core under a permissive license sidesteps the entire problem.
Community contributions become riskier to rely on. If the copyleft mechanism that incentivized contribution-back no longer holds, the community flywheel slows down. Why contribute back if the license won't protect your work?
We've already seen the first wave of relicensing. HashiCorp moved Terraform to BSL in 2023. Redis adopted dual licensing (RSALv2/SSPLv1) in 2024. MongoDB moved from AGPL to SSPL; Elastic dropped Apache 2.0 for SSPL and the Elastic License; Grafana tightened from Apache 2.0 to AGPL. In every case, the standard open source licenses no longer felt like sufficient protection.
The chardet rewrite is going to accelerate this trend. Companies that relied on copyleft as a moat will preemptively move to source-available licenses that explicitly restrict competitive use, not just code copying:
| License | What it restricts | Who's using it |
|---|---|---|
| BSL (Business Source License) | Production use, until a change date converts it to an open license | HashiCorp (Terraform), MariaDB, CockroachDB |
| SSPL (Server Side Public License) | Offering the software as a service | MongoDB, Graylog |
| ELv2 (Elastic License v2) | Managed service offerings | Elastic |
| FSL (Functional Source License) | Competing products, converts to open after 2 years | Sentry |
Expect a second wave of relicensing announcements in 2026. The companies that haven't relicensed yet are now on a clock — every month they wait, the cost of an AI-assisted competitive rewrite drops.
If you're marketing a product with an open-core model, the messaging and positioning work starts today.
1. Audit your licensing narrative. How prominently does your current messaging lean on "open source" as a trust signal? If your license changes, does your positioning collapse? Start building messaging that's anchored in community, ecosystem, and product quality — not license type.
2. Prepare the relicensing communications plan. If your company hasn't relicensed yet, assume it's being discussed internally. The companies that handled this well (Sentry's FSL rollout, for example) invested in transparent communication months before the switch. The companies that handled it badly (HashiCorp's BSL announcement) lost developer trust overnight. Have the blog post, the FAQ, and the community talking points drafted before the decision is final.
3. Reframe the value prop around ecosystem, not code. A competitor can rewrite your code. They can't replicate your integrations, your plugin ecosystem, your documentation, your community expertise, or your enterprise support track record. Shift the narrative from "our code is open" to "our ecosystem is unmatched."
4. Monitor the legal landscape actively. The first court ruling on AI-assisted clean room development will reshape the entire conversation overnight. Have scenario plans for both outcomes: if courts validate AI clean room rewrites, your copyleft moat is officially gone. If they don't, you have breathing room — but the market has already moved.
5. Position for the developer trust conversation. Developers have strong opinions about licensing. Any change will generate backlash. The PMMs who get ahead of this are the ones who frame the change as protecting the community and the project's sustainability — not as a defensive move against competitors.
The chardet rewrite is a proof of concept. Not for AI coding — we already knew AI could write code. It's a proof of concept for licensing arbitrage: using AI to escape copyleft obligations that were once economically impractical to avoid.
Commercial open source companies built their moats on the assumption that rewriting was expensive. That assumption is gone. The companies that adapt — by relicensing, by building ecosystem moats, by shifting their value narrative — will survive. The ones that pretend the moat still holds will discover it doesn't when a competitor ships their rewritten core under MIT.
The code can be rewritten in a weekend. Can your positioning be?
See also: Open Source Coding Agents Are the New Browser Wars | SLG, PLG, OSLG: The Three Growth Paradigms | You Don't Need a Project. You Need a Problem.
Sources: HN Discussion — chardet relicensed LGPL to MIT via Claude rewrite · US Copyright Office — AI and Copyright · HashiCorp BSL announcement · Redis Dual License · Sentry Licensing · FSF on AI and Copyleft