> Sure, in the long term things should be refactored, structured, and optimized.
How often does that really happen though? Once you've amassed enough technical/data debt, resistance to refactoring increases until it never happens at all. Having well-defined, coherent data models and schemas from the start will pay off in the long run. Applications begin and end with data, so why half-ass this from the get-go?
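To make "well-defined from the start" concrete, here's a minimal sketch (Python, with hypothetical entity and field names) of what an explicit model buys you over passing raw dicts around:

```python
# A minimal sketch: an explicit, typed model instead of an ad-hoc dict.
# The entity and field names (Order, total_cents, etc.) are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Order:
    id: int
    customer_id: int
    total_cents: int        # money as integer cents, not floats
    created_at: datetime

# The schemaless alternative, where every consumer has to guess
# what keys exist and what their types are:
#   order = {"id": 1, "cust": 7, "total": "19.99", "ts": "yesterday"}
```

Every consumer of Order now knows exactly what it holds; the dict version pushes that question onto every call site.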
Assuming you're not omniscient, you'll be refactoring regardless. The difference is whether you'll be paying as you go (clients want a JSON API; we need to add new columns to a table, but it'll lock rows) or taking on technical debt to be repaid in the future (turns out Mongo sucks and we'd do much better with Cassandra for serious horizontal scaling).
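For the "pay as you go" column example, here's a hedged sketch (Python with the standard-library sqlite3 module; the table and column names are made up). In SQLite an ADD COLUMN is a cheap metadata change, but in some engines, older MySQL versions for instance, the equivalent statement can rewrite the whole table and block concurrent writes, which is the lock being paid for here:

```python
# Sketch of the "pay as you go" migration above. "users" and "email"
# are hypothetical. SQLite treats ADD COLUMN as a cheap metadata change;
# some other engines may rewrite the table and block writes while doing it.
import sqlite3

conn = sqlite3.connect("app.db")
conn.execute("ALTER TABLE users ADD COLUMN email TEXT")
conn.commit()
conn.close()
```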
I believe that if you aren't extremely certain about what the future holds, it may be best to work with a more flexible technology first and transition to a more structured setup once you've solved your problems and identified the features you intend to build. And if you are extremely certain about what the future holds, you're either insanely good at your job or just insane.
I think the key is to do regular refactoring as you go -- it has to be an ingrained part of the process. It's really a management issue. Not every company/project has the foresight to budget for this, of course. If a team can't or won't regularly improve their infrastructure, then yes, a more structured approach would probably be better for anything that needs to last.
Of course there are other considerations. A more "planned" structure always makes sense for systems or components that are life-critical or that handle large flows of money. The "fast and loose" approach makes the most sense when you can tolerate occasional failures and need fast iterations to get new features to market quickly.
In my experience the likelihood of your scenario (increasing resistance to refactoring) is almost always inversely proportional to the amount and quality of refactoring tools available.
500KLOC JVM/.NET application? No big deal.
50KLOC JS/HTML-based SPA? Pfooooh. That could take a while... do we really need to?