They are naming professors like "Now That's What I Call Music" albums now?
(I genuinely can't find why there's a '90 there, suspect it's a copy/paste error?)
Memory chips use more mature fabrication processes, for which electron microscopy likely already worked well enough.
The article is about a better method for processing the output of an electron microscope, which enables a better image resolution than in the past and the 3D reconstruction of the surface of the device. This is needed for the 2 nm/18A processes and their successors, for which the existing tools were insufficient.
Modern chip designs do include over-provisioned features, so designers can often selectively downgrade areas that are not viable.
Chenming Hu's books about solar cell physics and semiconductors are quite accessible. =3
Actual operating life is often determined by an economic feedback loop: manufacturers cut costs until basically all consumer products have roughly the same expected lifetime, regardless of the potential of the underlying technology.
* Or at least, the first millisecond after it starts using its normal operating clock, which might not be the very first millisecond
Which is why, despite being a huge BEV proponent, I laugh when I hear people say things like "BEVs are inherently more reliable due to having no transmission and fewer moving parts that could break". It might have been true in the early stage, which we're currently at the end of, but we already know that the reliability of a second-hand mid-range ICE car is what the market has been bearing for decades, so we can be certain BEVs will be "value-optimized" until they are just as unreliable.
Anyone who has had to deal with a carbureted engine, or an old-school hydraulic 'computer' based automatic transmission, is never going to extol their reliability or ease of repair.
Those were also doing 1/10th of the work modern systems do (automatic engine tuning, wear adjustment, on-the-fly power band adjustment, altitude adjustment, anti-pollution adjustment, etc.).
The reason why people complain about modern cars is because computers have made it exceptionally easy to add massive amounts of new (and poorly tested, in many cases) functionality.
And even the equivalent of DRM.
If you used current tech to implement the old feature set, and spent even a little effort making it open instead of DRM-ish, it could be even simpler and more reliable. But no one is doing that. Because it's more profitable to use it 'for evil', as it were.
Statistical process control is at the heart of profitability, and measurement of what we've actually built is how it gets its data. If the accuracy and frequency of measurement goes up, the control loop tightens accordingly.
Parameterizing features and defects is a really interesting multidisciplinary process. Figuring out how to correlate defects at EDS time with something that occurred 80 process steps ago is where all the money lives in the business. Once you draw the correlation, you can place it under SPC and people will automatically get paged in the middle of the night the moment something starts to drift into an unhappy range.
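To make the "place it under SPC and page someone when it drifts" idea concrete, here's a minimal sketch of a Shewhart-style control check. The baseline numbers, the parameter being measured, and the alerting are all hypothetical; real fabs use far richer rule sets (Western Electric rules, CUSUM, etc.), but the core loop is just this:

```python
# Minimal Shewhart-style SPC check: flag a measurement that drifts
# outside mean +/- 3 sigma of a known-good baseline.
# Parameter name, values, and the "alert" are illustrative only.
from statistics import mean, stdev

def control_limits(baseline, k=3.0):
    """Compute (lower, upper) control limits from baseline measurements."""
    m, s = mean(baseline), stdev(baseline)
    return m - k * s, m + k * s

def in_control(measurement, limits):
    """Return True if the measurement is inside the control limits."""
    lo, hi = limits
    return lo <= measurement <= hi

# Baseline: hypothetical line-width measurements (nm) from a known-good lot.
baseline = [12.0, 12.1, 11.9, 12.05, 11.95, 12.02, 11.98, 12.04]
limits = control_limits(baseline)

for value in [12.01, 12.6]:
    if not in_control(value, limits):
        # In production this is where the correlation pays off:
        # the page names the upstream step suspected of drifting.
        print(f"ALERT: {value} nm outside control limits {limits}")
```

The hard part the comment describes isn't this check, it's knowing *which* measured parameter to put under it, i.e. the one that correlates with a defect dozens of steps downstream.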