Hacker News | OD_'s comments

But Sublime isn't really a native app either. When did we start corrupting the meaning of "native"? When Electron apps appeared, and we wanted to divide "compiled" desktop apps from "interpreted web apps"?

Cross-platform toolkits, whether proprietary one-offs like Sublime's or things like Qt, GTK, wxWidgets, or Java Swing, were NEVER considered native. Just because an app is written in C or C++ doesn't make it platform-native.

TextMate is a native editor for macOS. Sublime is not.


How would you create a native GUI app on Linux for either Gnome or KDE, considering that GTK and Qt are not allowed by this definition?

And yes, you may think: "Easy, if a GTK app runs on Linux it's native; if it runs on another platform, it's not." Does that mean JavaScript apps for Gnome[1] can be called native, because they have officially blessed bindings to the underlying framework/platform and are fully integrated with GTK?

If not, does that mean that as soon as you use any kind of binding/bridging technology, "nativeness" is ruled out? Think of C++ apps for Gnome: they use bindings…

If yes, we are only left with defining what the "official" way of doing GUIs on a given platform is. Whatever the vendor gives us and preinstalls on its OS? OK, no Electron then: it uses the Chrome rendering engine, which is definitely third-party and NOT native!

But… What if I open a WebKit WebView from Objective-C[2] to render my HTML from there and write my logic making use of JavaScriptCore[3] on macOS? Are we native yet? :)

[1]: https://developer.gnome.org/gnome-devel-demos/stable/beginne...

[2]: https://developer.apple.com/library/content/documentation/Co...

[3]: https://developer.apple.com/reference/javascriptcore


> How would you create a native GUI app on Linux for either Gnome or KDE, considering that GTK and Qt are not allowed by this definition?

> And yes, you may think: "Easy, if a GTK app runs on Linux its native, if it runs on another platform, it's not". Does that mean JavaScript apps for Gnome[1] can be called native, because they have officially blessed bindings to the underlying framework/platform and are fully integrated with GTK?

Actually, yes: by definition a JavaScript app written with GObject bindings is native to a Gnome desktop. It fully integrates with the standards of the system, be it text boxes, keyboard shortcuts, general behavior or widget look. Most important, even more than our personal little preferences, is that a native app can be accessed by things like screen readers, which are the only way for a person with impaired sight to use a computer. Anything not written with GTK on Linux is going to be a black box for Orca. In fact, web apps would be less of a pain than a compiled app that uses a random crappy cross-platform toolkit. While the web's accessibility could do with some improvements, it's still better than the absolute nothing that a cross-platform toolkit offers.

Sometimes devs put extra effort into making cross-platform apps accessible, but they're the exception rather than the norm: https://www.parhamdoustdar.com/2016/04/03/tools-of-blind-pro...

The reason is that native apps get "most of the work" done for them for free when they use the native toolkit, while cross-platform apps require a significant amount of work to get them to talk to screen readers correctly.

Android Studio seems to be improving on that front despite the original platform being pretty poor. And of course Sublime Text is absolutely unusable in that scenario.

Some apps do cross-platform the right way, although they're rare: they have a platform-specific GUI rather than using a generic cross-platform toolkit. Transmission is a solid example:

https://transmissionbt.com/

The app has Cocoa, GTK, Qt, TUI and Web end-user interfaces. It's native on all the officially supported OSes. There's a non-native, Qt-based Windows port, but it's a third-party fork not supported by the main devs.


From what the developer has stated on Reddit, it's more that he wants to aggressively make changes to the filesystem right now, before any attempt at mainlining into the kernel, so as not to end up like btrfs, which, in his view, was mainlined prematurely.


It does make sense to have it rock-solid stable before mainlining, so that people don't get burned by it early on.


The cheese graters cost two to four times the price of an iMac, depending on the era we're talking about (price fluctuations of the PowerMac/Mac Pro), and that's before counting the external monitor you need to buy to go along with one. That counts for something.

Apple never had a usable entry-level desktop tower: something with high performance on consumer-class CPUs rather than Xeons and ECC RAM. The Mac Mini was always crippled to keep it from competing with the iMac and the cheese graters among people who want something better-performing.

The iMac is popular because it has the best performance-to-price ratio, not because the form factor is any good. For the longest time the entry-level MBP (and the unibody MacBook before it) was also Apple's best-selling laptop, and they only recently cut it from their line-up and replaced it with the Air as their entry-level offering. The Air will also exist for as long as they keep selling the current MacBook at those prices, because most people are not willing to spend 1,449 euros on a machine that barely performs better.


The SUSE folks adopting btrfs says more about SUSE than it does about btrfs. I still remember how, in the past, I got tricked into believing ReiserFS (3, not the 4 that never made it into the kernel) was a good filesystem, despite the broken fsck, the lack of defragmentation tools and the many other issues listed here: https://en.wikipedia.org/wiki/ReiserFS#Criticism It was also the only time I've had a filesystem really eat my data after forced shutdowns (cutting the power). They kept pushing ReiserFS as the default until 2006, five years after the release of ext3, which added journaling to the ext family of filesystems. Ext3 was a much better filesystem all around. SUSE only switched to ext3 because of the controversy around ReiserFS's author and the uncertain future of Reiser4, not because they admitted that ReiserFS was bad.

I will not make the mistake of putting any stock in SUSE's word again. Doing it again with btrfs shows they really have a knack for going edgy with filesystems, while absolutely no other Linux distribution is willing to recommend btrfs.

I trust the Red Hat guys to show more care and this is what they had to say only a year ago about btrfs:

https://www.reddit.com/r/linux/comments/37l2mf/i_am_matthew_...

"The btrfs developers keep telling us that it's not ready, so we're following that. (From one of our storage experts: "Btrfs will be ready in two years. The problem is, that's also going to be true next year, and in two years....") We try to be first where we can, but not at the cost of data loss for users.

-- Matthew Miller, Fedora Project Lead"

Doesn't look good to me.

As long as RH or Debian do not start recommending btrfs, one cannot give it consideration in good conscience.


One way to look at this is where each company has spent its resources. SUSE has more Btrfs developers than I can count on one hand; Red Hat has zero these days. Red Hat might have more LVM and XFS developers than I can count on one hand. So it stands to reason that each company's output will be biased: they're going to support (development, QA, and tech support) the things they're spending resources on.

Considering that a big chunk, possibly the single largest chunk, of upstream is SUSE, and that they have used it by default for several years on both the openSUSE and enterprise offerings, it doesn't really make sense that "btrfs developers say it's not ready". This just doesn't square. What's going on, in my opinion, is that neither Red Hat nor Fedora have the resources, nor are they willing to add resources, to triage Btrfs-related bugs, and therefore Fedora isn't ready for Btrfs, not the other way around.

Even SUSE goes very light on what is supported with Btrfs multiple-device setups, by the way. On single-device volumes I've reported a bunch of minor bugs, but never lost data. The multiple-device stuff is harder to qualify: if you're familiar with the warts, you're at a net advantage over mdadm and LVM RAID. If you're not and run into trouble, there are traps, and Btrfs's claims of focusing on fault tolerance and ease of use can betray the user.


> it doesn't really make sense at all that 'btrfs developers say it's not ready'. This just doesn't square.

Just looked at this page: https://btrfs.wiki.kernel.org/index.php/Status and I can see how they would interpret that as "not ready". There are a good number of "mostly OK" entries, and one is unstable, with comments like "write hole still exists, parity not checksummed", "auto-repair and compression may crash", and others.

It might be good enough for some, but I can see how many Red Hat customers would not want to trust their crown jewels to anything that says "mostly OK".


"mostly OK" and storage should not ever mix.

That's like the ¯\_(ツ)_/¯ of storing things. Scary.


Exactly! You phrased it better than me. Storage of customers' data and "mostly OK" should be very far from each other.


None of those are default configurations that you'd get with mkfs. And there is no UI in any installer to enable raid56.


Some of the niches that the 3DS owned up until now might be disrupted by the Switch, though. I'm one of those old-school RPG gamers (as in, thoughtful turn-based combat, whether tactical à la FE or FFT, or more about party building and dungeon exploration like SMT and Etrian) who bought the 3DS solely for games like the Fire Emblem, Etrian Odyssey, Shin Megami Tensei and Devil Survivor series, and of these, Atlus has already announced that the next major sequel to SMT will be on Switch instead of 3DS. The DS and 3DS up until now had all the major games of the series (Strange Journey, IV, IVA). It's nice to see the series go back to a more powerful console; we have missed seeing those demons animated in 3D the way they were in Nocturne on the PlayStation 2. Can't say I will miss the 3DS if the other game series follow suit.

Fire Emblem also traditionally had home console releases alongside the portable games, but the Wii U flopped too hard for niche games to have much room. Concentrating developer efforts on a single console that does both portable and large-screen home play is a smart move for Nintendo.


It really depends on the game genres you like.

For example, I like fighting games, and that's why I bought the 3DS (even though you can count them all on both hands). Street Fighter IV was one of the early launch games for the 3DS and holds up surprisingly well.

Meanwhile in the Switch announcement, Capcom announced a rehash of the older 2D Street Fighter II, which is a less than impressive announcement. At least the Wii U got Tekken Tag Tournament 2.

If I hadn't just bought a GPD Win recently (which plays the Windows Steam versions of SFIV and Street Fighter x Tekken, among others, at close to 60fps @ 720p in a 3DS XL-like form factor), I would still choose my 3DS over the Switch for fighting games for the foreseeable future. I still play my 3DS a lot because I somehow started playing Pokemon for the first time (Sun/Moon) and got hooked on it.

I actually pulled the trigger on my GPD Win a couple of weeks after being disappointed by the Switch webcast. It basically killed any remaining desire I had to get a Switch, but as I mentioned earlier, it all comes down to the genres you play and which platforms serve them best.


The SNES is not underpowered compared to the Genesis if you compare them as a whole and not just spec for spec, like comparing MHz on a CPU. Console hardware is made for running games; consoles are not general-purpose computing devices. The SNES had many hardware-supported graphical modes that allowed it to push the envelope without relying on a beefier CPU, the most famous of which is Mode 7, which enabled the pseudo-3D you see in games like F-Zero, the FF world maps, Mario Kart, Secret of Mana, Super Mario RPG and so on. And then there was the very common practice on the SNES of embedding coprocessors and DSPs on the game cartridge, which is what enabled the graphical effects of Star Fox and Yoshi's Island, notably. There was no such practice on the Genesis.
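For the curious, Mode 7's pseudo-3D boils down to a 2x2 affine transform that maps each screen pixel into the background texture, with the matrix typically changed per scanline to fake perspective. A minimal sketch of the math in Python (function and parameter names are mine for illustration, not actual SNES register names):

```python
def mode7_sample(x, y, a, b, c, d, cx, cy):
    """Map screen pixel (x, y) to a texture coordinate using the
    2x2 affine matrix [[a, b], [c, d]] centered on (cx, cy)."""
    tx = a * (x - cx) + b * (y - cy) + cx
    ty = c * (x - cx) + d * (y - cy) + cy
    return tx, ty

def perspective_scale(scanline, horizon, strength):
    """Hypothetical per-scanline zoom: rows further below the horizon
    sample a wider slice of the texture, producing the receding-plane
    look of F-Zero or Mario Kart tracks."""
    return 1.0 + strength * max(scanline - horizon, 0)

# With the identity matrix the background is drawn unchanged:
print(mode7_sample(10, 20, 1, 0, 0, 1, 0, 0))   # (10, 20)

# A row well below the horizon gets a larger scale factor, so it
# maps to texels further from the center:
s = perspective_scale(scanline=30, horizon=20, strength=0.1)
print(mode7_sample(10, 0, s, 0, 0, s, 5, 5))
```

The real hardware rasterizes this per pixel with fixed-point registers; the sketch only shows why no extra CPU power is needed: the transform is a handful of multiply-adds done by the PPU, not by the 65816.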

> The list of Super NES enhancement chips demonstrates the overall design plan for the Super Nintendo Entertainment System, whereby the console's hardware designers had made it easy to interface special coprocessor chips to the console. This standardized selection of chips was available to increase system performance and features for each game cartridge. As increasingly superior chips became available throughout the SNES's vintage market years, this strategy originally provided a cheaper and more versatile way of maintaining the system's market lifespan when compared to Nintendo's option of having included a much more expensive CPU or a more obsolete stock chipset.

https://en.wikipedia.org/wiki/List_of_Super_NES_enhancement_...

As far as graphical capabilities were concerned, the SNES ecosystem was definitely more powerful. By the time Sega considered extending the Genesis with the 32X, it was already too late in the console's lifecycle to matter, and it flopped (released very close to the Sega Saturn and one to two years before the Nintendo 64, depending on your region [NA, EU, JP]).

There was also the Sega CD, but all it did was enable a library of not-quite-games: pseudo-interactive movies.


> The SNES is not underpowered compared to the genesis if you compare them as a whole and not just spec for spec like comparing mhz on a CPU.

This is literally what the parent said, phrased differently.


I didn't want to state the obvious and be too detailed, but yes, that's exactly what I meant. And it's what people were trying to assess above too: what are the reasons behind Nintendo's hardware choices. To phrase it yet another way, Nintendo tries to be vertically integrated for "entertainment"; it doesn't really matter how, as long as they achieve quality fun.


Yes, but the Genesis came out a year earlier and had quite a beefier CPU (even two, IIRC). That makes the SNES's "processing" power a bit tame in a way. Otherwise I agree: visual and musical capabilities were vastly more important for gamers. Lots of games looked magical on a SNES but rusty on a Genesis.

