Nobody reads the terms of service
In the span of a single month, three tools I use daily rewrote the rules. GitHub Copilot announced it would start using interaction data from Free, Pro, and Pro+ users to train AI models, with an opt-out deadline of April 24. Google's Gemini API costs reportedly tripled overnight for some developers around March 16, with no advance warning. ChatGPT's usage limits now fluctuate based on "system conditions," a phrase so vague it could mean anything. Meanwhile, cPanel quietly pushed through another 8 to 15 percent price hike for 2026. None of these changes required my approval. None of them triggered a negotiation. I just woke up, checked the news, and discovered the tools I depend on had silently altered the terms of our relationship. This is not a coincidence. It is the pattern.
The biggest lie on the internet
A 2017 Deloitte survey found that 91 percent of consumers accept terms of service without reading them. Among 18-to-34-year-olds, the rate climbs to 97 percent. A Pew Research Center study in 2019 confirmed the trend: only 9 percent of adults said they always read privacy policies before agreeing. In a controlled experiment, researchers at York University found that only a quarter of participants even opened the terms of a fictional social network, and those who did spent about a minute skimming thousands of words before clicking "I agree." We all know this. We joke about it. "I have read and agree to the terms" might be the most universally told lie in digital history. But the joke stops being funny when you realize what it actually means: billions of people are entering binding agreements they have never read, with companies that can change those agreements whenever they want. The terms of service is not a formality. It is the contract. And we are signing it blind.
The asymmetry of investment
Here is what makes this particularly painful for developers and creators. You do not just use these tools casually. You build on them. You invest hundreds of hours configuring workflows, writing integrations, training muscle memory, and structuring your entire creative or technical process around a specific platform. Consider what it actually means to depend on GitHub Copilot. Your coding patterns adapt to it. Your speed estimates account for it. Your team's onboarding process assumes it. You start writing code differently because you know the AI will fill in the gaps. That is not casual usage. That is deep integration. The same applies to any tool that becomes part of your daily workflow. ChatGPT for research and drafting. Gemini for API calls in production. A hosting panel for managing your servers. Each one represents accumulated investment, not just in money, but in time, knowledge, and process design. And then they change the deal. GitHub can decide your interaction data now trains their models. Google can triple your API costs over a weekend. OpenAI can throttle your access based on unspecified "system conditions." They announce it in a blog post, update the terms, and give you a few weeks to either accept or leave. The asymmetry is staggering. You invested months building around their tool. They invested a paragraph changing the terms.
What actually changed with Copilot
The GitHub Copilot situation deserves a closer look because it illustrates the pattern clearly. On March 25, 2026, GitHub announced updates to its Privacy Statement and Terms of Service. Starting April 24, interaction data, specifically inputs, outputs, code snippets, and associated context, from Copilot Free, Pro, and Pro+ users will be used to train AI models. The opt-out mechanism exists. You can go to your settings, find the Privacy section, and disable "Allow GitHub to use my data for AI model training." If you previously opted out, your preference is preserved. But here is the thing: participation is the default. If you do nothing, your data gets used. GitHub is banking on the same behavioral pattern the Deloitte study documented. Most people will not read the announcement. Most people who read it will not change their settings. Most people who intend to change their settings will forget. This is not malicious. GitHub's chief product officer, Mario Rodriguez, frames it as improving the product for everyone. And technically, it is: more training data generally means better AI. But the mechanism, quietly changing defaults and relying on user inertia, reveals the structural power imbalance at the heart of every terms of service agreement. Copilot Business and Enterprise users are exempt, of course. When you pay enterprise rates, you get the leverage to negotiate terms. When you are an individual user, you get a checkbox buried in settings.
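The arithmetic of an opt-in default is worth making explicit. In this back-of-the-envelope sketch, only the 9 percent figure comes from the Pew study cited in this article; the other two rates are assumptions chosen purely for illustration:

```python
# Illustrative back-of-the-envelope: how an opt-in default plus user
# inertia determines whose data gets used. Only the 9% "reads policies"
# figure comes from the Pew study; the other rates are assumptions.

reads_announcement = 0.09  # Pew: share who always read privacy policies
acts_if_read       = 0.50  # assumed: share of readers who flip the setting
remembers_later    = 0.10  # assumed: share of the rest who opt out later

opted_out = reads_announcement * acts_if_read
opted_out += (1 - opted_out) * remembers_later

data_used = 1 - opted_out
print(f"Share of users whose data is used for training: {data_used:.0%}")
# -> Share of users whose data is used for training: 86%
```

Even with generous assumptions about follow-through, the default decides the outcome for the large majority of users, which is precisely why the default matters more than the opt-out mechanism.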
The Gemini surprise
The Gemini API situation is a different flavor of the same problem. Around March 16 to 17, 2026, developers on the Google AI Developers Forum started reporting unexpected cost increases. One developer reported that their effective cost roughly tripled overnight: a dollar that had bought approximately 100,000 units of credit suddenly bought only about 35,000, with no change in their usage patterns. The number of requests had not changed. The models had not changed. The billing just silently shifted. Developers who had budgeted based on months of consistent pricing suddenly found their costs ballooning without warning. This is the risk of building on any platform with usage-based pricing and the unilateral ability to adjust rates. Your financial planning is only as stable as their pricing page, and that page can change at any time.
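A quick sanity check on the forum thread's approximate figures shows what "roughly tripled" means for a real budget (the $200 monthly figure below is a hypothetical example, not from the thread):

```python
# Sanity check on the reported Gemini API billing shift, using the
# approximate figures from the forum thread.

units_per_dollar_before = 100_000
units_per_dollar_after  = 35_000

# Effective cost per unit is the reciprocal of units per dollar,
# so the cost multiplier is the ratio of the two rates.
cost_multiplier = units_per_dollar_before / units_per_dollar_after
print(f"Effective cost multiplier: {cost_multiplier:.2f}x")
# -> Effective cost multiplier: 2.86x

# A hypothetical monthly bill budgeted under the old rate:
monthly_before = 200
monthly_after = monthly_before * cost_multiplier
print(f"${monthly_before}/month becomes ${monthly_after:.0f}/month")
# -> $200/month becomes $571/month
```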
The pattern is the point
These are not isolated incidents. They are the natural behavior of platforms operating under the standard SaaS model. The terms of service gives companies the legal right to modify pricing, features, data policies, and usage limits at their discretion. Most agreements include a clause that says something like: "We may update these terms from time to time. Continued use of the service constitutes acceptance of the updated terms." Translated: we can change anything, and if you keep using the product, you have agreed to whatever we changed. This is not a bug in the system. It is the system. Every SaaS product you use operates under this same framework. The only variable is how aggressively each company exercises that right and how much notice they give you before they do. The pattern accelerates during certain conditions. When a company is under pressure to grow revenue, prices go up. When AI training data becomes strategically valuable, data policies expand. When infrastructure costs rise, usage limits tighten. When a company gets acquired, everything is on the table. You are not a party to any of these decisions. You are subject to them.
The individual version of vendor lock-in
At the enterprise level, vendor lock-in is a well-understood risk. Companies employ entire procurement teams to negotiate contracts, ensure data portability, and maintain leverage with their vendors. They demand SLAs, pricing caps, and termination clauses. Individual developers and creators get none of this. We click "I agree" and hope for the best. Our lock-in is not contractual; it is behavioral. We are locked in by habit, by accumulated configuration, by the switching cost of relearning a new tool and rebuilding our workflows from scratch. This behavioral lock-in is arguably more powerful than contractual lock-in because we do not even recognize it as lock-in. We think of it as preference. "I like this tool" is what we say. "I am deeply dependent on this tool and would lose weeks of productivity if it disappeared" is what we mean. The tools know this. Every onboarding flow, every integration, every feature that embeds the tool deeper into your workflow is, from the company's perspective, increasing switching costs. That is not cynical; it is just business. But it means that by the time a company changes its terms in a way you dislike, leaving is genuinely painful.
A framework for managing tool dependency
None of this means you should stop using SaaS tools. That is not realistic, and many of these tools are genuinely excellent. The point is to use them with open eyes and a plan for when things change. Because things will change.

- Audit your critical dependencies. Make a list of every tool that would cause real disruption if it changed pricing, features, or data policies tomorrow. Be honest about how embedded each one is in your workflow. The tools that scare you most are the ones that need the most attention.
- Read the terms, or at least the changes. You do not need to read every word of every privacy policy. But when a tool you depend on announces a policy change, actually read the announcement. Understand what changed, what it means for you, and what your options are. The few minutes this takes could save you from a nasty surprise.
- Maintain awareness of alternatives. You do not need to actively use alternatives to every tool, but you should know they exist and have a rough sense of what switching would involve. For AI coding assistants, that might mean occasionally trying a different tool. For hosting, it might mean understanding what self-hosted options look like. The goal is not to switch constantly; it is to know that you can.
- Export your data regularly. If a tool offers data export, use it periodically. Not because you are planning to leave, but because the option to leave only exists if your data is portable. A tool that holds your data hostage has infinite leverage over you.
- Diversify where it matters most. For non-critical tools, single-vendor dependency is fine. For tools that touch your core workflow, your data, your code, or your communication, consider whether spreading the load across multiple tools or maintaining a self-hosted fallback is worth the overhead.
- Favor tools with open-source cores. When choosing between two comparable options, lean toward the one with an open-source foundation. Not because open-source tools are always better, but because they cannot change the deal on you in the same way. If the company behind an open-source project changes direction, you can fork the code and keep going. If a proprietary SaaS changes direction, you are along for the ride.
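The audit step can be sketched as a simple scoring exercise. Every tool name, weight, and threshold below is hypothetical; the point is to turn a gut feeling about dependency into an explicit, sortable list you can revisit:

```python
# A minimal sketch of a tool-dependency audit. All tools, weights, and
# thresholds are hypothetical; adapt the fields to your own stack.

from dataclasses import dataclass

@dataclass
class Tool:
    name: str
    embeddedness: int      # 1 (casual use) to 5 (core workflow)
    has_export: bool       # can you get your data out?
    has_alternative: bool  # do you know a viable replacement?

    def risk(self) -> int:
        # Deep integration raises risk; data portability and a known
        # alternative each lower it.
        score = self.embeddedness * 2
        if self.has_export:
            score -= 2
        if self.has_alternative:
            score -= 2
        return max(score, 0)

stack = [
    Tool("AI coding assistant", embeddedness=5, has_export=False, has_alternative=True),
    Tool("hosting panel",       embeddedness=4, has_export=True,  has_alternative=False),
    Tool("note-taking app",     embeddedness=2, has_export=True,  has_alternative=True),
]

# Highest-risk dependencies first; flag anything above the threshold.
for tool in sorted(stack, key=lambda t: t.risk(), reverse=True):
    flag = "  <- needs attention" if tool.risk() >= 6 else ""
    print(f"{tool.name}: risk {tool.risk()}{flag}")
```

Rerunning this after each terms-of-service change you hear about is a cheap way to notice when a "preference" has quietly become a dependency.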
The social contract we pretend exists
There is an implicit social contract between tool makers and tool users. We believe that if we pay for a product (or contribute our data to a free one), the company will not radically alter the deal without good reason and fair warning. We believe they will grandfather existing users into reasonable terms. We believe they will prioritize trust over short-term revenue. This social contract is not written down anywhere. It is not in the terms of service. In fact, the terms of service explicitly override it. The legal document says: we can change anything. The social contract says: but you will not, right? Most of the time, companies honor the social contract. Most terms of service changes are minor and reasonable. But "most of the time" is not "always," and the exceptions tend to arrive at the worst possible moment, when you are most dependent and least prepared. The mature response is not paranoia. It is preparedness. Acknowledge that the tools you love are businesses first, and businesses respond to incentives that may not align with yours. Use them fully, but never forget that the "I agree" button is not a handshake between equals. It is a signature on a contract that only one side can rewrite.
References
- GitHub, "Updates to GitHub Copilot interaction data usage policy," March 25, 2026. https://github.blog/news-insights/company-news/updates-to-github-copilot-interaction-data-usage-policy/
- GitHub, "Updates to our Privacy Statement and Terms of Service: How we use your data," March 25, 2026. https://github.blog/changelog/2026-03-25-updates-to-our-privacy-statement-and-terms-of-service-how-we-use-your-data/
- The Register, "GitHub: We're going to train on your data after all," March 26, 2026. https://www.theregister.com/2026/03/26/github_ai_training_policy_changes/
- MorganCDIP, "URGENT: The cost of API access has increased since March 16-17, 2026," Google AI Developers Forum, March 20, 2026. https://discuss.ai.google.dev/t/urgent-the-cost-of-api-access-has-increased-since-march-16-17-2026/134859
- BentoML, "ChatGPT Usage Limits: What They Are and How to Get Rid of Them," March 2026. https://www.bentoml.com/blog/chatgpt-usage-limits-explained-and-how-to-remove-them
- Underhost, "cPanel Price Increase 2026: What It Means for Your Hosting," October 2025. https://underhost.com/blog/cpanel-price-increase-2026-what-it-means-for-your-hosting/
- Deloitte, "91 percent of consumers accept terms of service without reading them," Business Insider, November 2017. https://www.businessinsider.com/deloitte-study-91-percent-agree-terms-of-service-without-reading-2017-11
- Pew Research Center, "Americans' attitudes and experiences with privacy policies and laws," November 2019. https://www.pewresearch.org/internet/2019/11/15/americans-attitudes-and-experiences-with-privacy-policies-and-laws/
- The Guardian, "Click to agree with what? No one reads terms of service, studies confirm," March 2017. https://www.theguardian.com/technology/2017/mar/03/terms-of-service-online-contracts-fine-print
- Forbes, "Companies Continue To Shift Away From SaaS," February 2026. https://www.forbes.com/sites/cio/2026/02/19/companies-continue-to-shift-away-from-saas/