In mid-July 2025, MIT published a report on AI and its adoption among businesses. The media picked it up at the end of August, and there has been no shortage of takes, opinions, and posts about it. I, too, have no shortage of takes on that report. In this post, I’ll focus on one of them.

The report identifies eight industries. I won’t reiterate them here as I see no point; go read the report. Of those eight, only two (Tech and Media) show clear signs of structural disruption, and Media substantially more so than Tech.

The first, Tech, has a number of fairly easy-to-grok reasons. My non-exhaustive take, in no particular order of gravity:

  • Forced adoption as dictated by C-levels
  • Adjacency to the industry that created AI
  • Seeing first-hand that these tools can work
  • The inquisitive nature of people in the Tech industry, who have, after all, been the first to try out new technologies for decades
  • Much faster iteration patterns, which lead to:
    • Minimal time spent recovering from failures. Which other industry do you know that can ship a new version of the product the next day and fix whatever mess was released the day before?
    • Creation of a culture that not only accepts failure, but is open about it, quantifies it and actively tries to address it.
  • Acknowledgement that there is a chunk of work that is arguably boilerplate and could be automated away. There have been efforts at this before; this is just the latest iteration of automating the boilerplate away
  • The gravity of failures is usually low. It’s usually OK if you cannot see your pictures or email or social media feed for a couple of hours. Chances are that nobody is going to be gravely injured by that (yes, I know we can concoct imaginative scenarios where this can happen, or has even already happened, but they are the exception, not the rule). If you need something concrete, compare Gmail not working to a bridge collapsing, to stay within the Engineering profession.

There are of course some exceptions. There are Tech-related high-stakes environments, for example huge trade clearing houses, where a continental financial catastrophe might happen. Those move substantially slower and want a lot more assurance that their code works. Similarly, some parts of the Tech industry write code for other industries that are critical (e.g. Energy and Materials). You don’t want an entire city in darkness because AI hallucinated the wrong code block.

That last part above probably also explains why Tech isn’t leading the adoption.

And who is leading? Media. Which is:

  • A profession with low salaries. There are multiple reasons for that, and I won’t digress into why. But it is clear that the incentive for quality work fades with lower salaries.
  • Fed largely by the Tech industry via ads, which creates incentives for click farming rather than good content.
  • Living off the news, so hearing about things early on and being able to adopt them sooner.
  • Staffed by people whose profession is to produce content offering a take or opinion on pretty much anything, without necessarily having any kind of deep knowledge. Lack of knowledge encourages mistakes.
  • With bad feedback channels. If they take the time to do their work better, they are essentially left starving.
  • With minimal rewards for reporters, who are part of the Media industry. At best, a prize, a little money, and some congratulations from colleagues. At worst, mortal enemies for life.
  • Owned by companies that have an actual interest in not seeing truthful, good-quality articles published. Yes, I know some outlets are still held to high standards of decency and respect, but the biggest part isn’t, and we are talking about majorities here, not exceptions.
  • A profession where a mistake is often not even reported. When it is, political power often needs to be exerted to get the mistake acknowledged and fixed, either through collective action or by knowing someone who can push it forward. And when that happens, an apology tends to be enough. As long as you avoid defamation and calls to violence, any mistake is more or less OK.

The end result? Quality often isn’t an incentive. Quite the contrary: it gets in the way.

So the summary is, I guess, that these industries have adopted AI so quickly because what they do is very often not critical. A failure isn’t gonna cost lives. So they can afford to use AI to produce whatever it is they produce.