Another option is to design for easy migration, meaning migration that is easy both technically and from a business standpoint.
For example, iOS programs are native code, but if Apple decided to switch to a CPU with a different instruction set, such as x64, they could do so with relative ease: ship a new Xcode, ask developers to recompile and produce fat binaries (current binaries are already fat, containing both ARMv6 and ARMv7 code, so the foundation is already laid), and developers will be around and motivated to recompile because of the financial incentive of new sales. Where the old developer is gone, competition is sure to spring up, because the marketplace makes it easy for demand to elicit supply. This combination of technical and business decisions created a situation where Apple can switch CPUs if they need to. It clearly stems from their previous experience of switching from Motorola to PowerPC and from PowerPC to x86.
Similarly, when designing a database I try to keep in mind that I will have to redo it, so I avoid irreversible operations and instead strive to preserve the original data in my schema. That way I can always recompute the results of destructive operations (such as aggregates) later, even if the set of such operations changes over time.
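A minimal sketch of that idea, using a hypothetical orders schema and SQLite: the raw, append-only events are the source of truth, and the aggregate table is derived, so it can be dropped and rebuilt at any time (the table and column names here are invented for illustration).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Raw, append-only data: never destroyed or overwritten.
    CREATE TABLE order_events (
        id           INTEGER PRIMARY KEY,
        order_day    TEXT NOT NULL,
        amount_cents INTEGER NOT NULL
    );
    -- Derived aggregate: safe to throw away and recompute.
    CREATE TABLE daily_revenue (
        order_day   TEXT PRIMARY KEY,
        total_cents INTEGER NOT NULL
    );
""")
conn.executemany(
    "INSERT INTO order_events (order_day, amount_cents) VALUES (?, ?)",
    [("2023-01-01", 500), ("2023-01-01", 250), ("2023-01-02", 1000)],
)

def rebuild_daily_revenue(conn):
    # The "destructive" aggregation is reversible in effect, because
    # it can always be recomputed from the preserved raw events.
    conn.execute("DELETE FROM daily_revenue")
    conn.execute("""
        INSERT INTO daily_revenue (order_day, total_cents)
        SELECT order_day, SUM(amount_cents)
        FROM order_events
        GROUP BY order_day
    """)

rebuild_daily_revenue(conn)
print(dict(conn.execute("SELECT * FROM daily_revenue")))
# → {'2023-01-01': 750, '2023-01-02': 1000}
```

If the definition of the aggregate changes later (say, revenue net of refunds), only `rebuild_daily_revenue` changes; the raw events make the migration a recompute rather than a rescue.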
Works, but some programs will be orphaned, and thus won't be recompiled. Plus, this is an effort that could have been avoided through a more future-proof approach.
It's ok for apps to be orphaned: when a healthy developer ecosystem is in place, a replacement will be coded to meet any substantial demand.
And the effort could not have been avoided; you can only trade present effort for future effort, that is, more work put upfront into CPU design versus more work later to convince developers to recompile their apps. The problem with the former is that you make all the effort and still may not get the flexibility, whereas in the latter case the extra flexibility can come in handy in more ways than one, some of which we cannot foresee today.