I work on both apps and the web. My preference and long-term bet is definitely on the web. App stores are mostly a tool for locking customers in; they're superficially attractive to both customers and devs, but in the long run they're too restrictive for both and too tied to the interests of the app store owners. The Apple App Store, for example, has been used to effectively ban alternative browsers, force Google apps to be second-class citizens on the store, and stop Amazon from selling ebooks with a buy button in their app, etc. - all actions in the interests of Apple, but not of customers or developers. That's before you get into the problem of rewriting your stack n times, because the different platforms force you to develop with their own stack and constantly work to make a cross-platform product infeasible, with n increasing each time a new device is announced.
For all the faults of web development, I also prefer it to working within the blessed stack chosen for me by the OS vendor, and being forced to migrate every few years when they decide their old stack isn't worth maintaining (secretly I wonder if this constant churn isn't also useful to them on some level: it stops app devs ever considering other platforms, since they're running just to stay still). Web development, in contrast, lets you choose your own tools. You're not limited to JavaScript if you don't subscribe to the latest fad that all logic must live in the front-end and be written in JavaScript (which never seemed either interesting or inevitable to me), and you can stick to the original interface of the web, which was so dumb and simple it was incredibly powerful:
Simple text files (markup, styles) sent over the network to a client, which interprets them in a predictable way to show your UI and data (the predictable part has improved a lot over the last decade).
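As a sketch of that model (a minimal, hypothetical example - the route and markup here are made up, not from any real app), the server's whole job can be to assemble plain text and hand it to the client:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_page(title, body):
    """Assemble a plain-text HTML document; the client's parser does the rest."""
    return (
        "<!doctype html>\n"
        f"<html><head><title>{title}</title>"
        "<style>body { font-family: sans-serif; }</style></head>"
        f"<body><h1>{title}</h1><p>{body}</p></body></html>"
    )

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Any backend stack could produce this byte-for-byte identical response.
        page = render_page("Hello", "Markup and styles are just text on the wire.")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(page.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```

The point isn't the code; it's that the contract with the client is nothing more than text in a known format, which is exactly why the backend can be swapped out without the user noticing.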
I do agree with you, though, that not having a bytecode for the web is a severe restriction on front-end development. It's why I prefer doing backend web work and using JS in a limited way for Ajax etc., rather than trying to use it to generate entire documents/UIs. Personally I find JS fine for small tasks, but painful for writing whole apps. I expect that to be addressed at some point soon, either by asm.js (not optimal) or something like NaCl. Till then, it's bliss working with whatever tools I want on the backend, even switching tools if I want to, without the end user ever knowing I'm using a different stack. To them what matters is the app, not what it's written in - that's as it should be, and not something you see on other platforms. That's the attraction of web development for me, and one that native development will never equal.
I disagree that webapps will die - I think they'll just evolve and eventually become universal applications based on HTML. JavaScript will die, but not the web. That's really a political question, though, about how hard OS vendors will push users towards native apps (where they make money both from sales and, more importantly, from lock-in).
Will OS vendors continue to dominate computing? I think that's a more interesting question, and one we'll see play out over the next few decades. The web may yet reduce native platforms to a set of badly debugged device drivers.