This will be good long-term - it will make organisations look at testing, single points of failure, DR processes, and so on. And then everyone will forget again and so we go round the cycle.
I am a Linux sysadmin, so I have a professional duty to diss M$ at every opportunity, but even I can say it's Crowdstrike, rather than Microsoft, that's primarily at fault here.
I agree with comments upthread that coders are becoming less proficient. When I was learning to code at uni - after the era of punched cards, but only just - there was a focus on efficiency, on using no more memory than necessary (which is partly what led to Y2K problems), and no one is that bothered about that now.
A friend was telling me last night about going for a lecturing job a couple of years ago and learning that the average CompSci student doesn't even know what an IP address is these days. I don't know if they still have to spend hours analysing every field in network packets. I found that very tedious, but I do see it's important knowledge for some people in the industry to hold, and it's probably been helpful for me to have that understanding in my background, though I focus on other strengths.* Apparently they don't teach the OSI 7 layer model any more either, but the concept of there being different layers at which things happen, from 0s and 1s up to a pretty, interactive GUI, is important to understand. But then I also had a director who had never heard of Ada Lovelace, which rather surprised me.
* These include using MN to procrastinate instead of debugging a script that isn't working.