Hacker News | new | past | comments | ask | show | jobs | submit | Jacked's comments

Sounds like he's referring to "image persistence", which is similar in appearance to burn-in, but a different mechanism.

Burn-in is more likely to be found on CRTs and can take a long time to occur. Image persistence can happen in seconds or minutes and is what is seen on LCDs. It's usually temporary or reversible. (And plasma owners can experience both, yay! :)


Hmm, you're right. I chose the wrong word; my problem is more image persistence[1][2] than CRT-type burn-in, which in my case magically appeared after a few minutes of use and took a long time to disappear.

This doesn't change my point that Apple support is awful outside the US, though. I also had an issue with a horizontal line on a 2009 iMac screen; they refused to acknowledge the problem until very recently (October 2011). I'm happy that it's finally fixed, but it shouldn't have taken two years.

[1]: http://en.wikipedia.org/wiki/Image_persistence

[2]: http://compreviews.about.com/od/monitors/a/LCDBurnIn.htm


Hence the bluetooth keyboard as the first thing on his list ;)


Then why don't you just get a laptop? Or a slate?


Looks like Sublime Text 2 meets his requirements. Goto Line is simply Ctrl-G, and up pops a simple input line, similar to hitting ":" in vim.

As an added bonus, the input line disappears if you press Ctrl-G again, hit <esc>, or press enter without entering anything. So, fast and easy to get into, fast and easy to get out of.


Hah, nice. My first thought was: what's the big deal? Someone installed Pow, though I wouldn't expose my dev machine to the public like that. Oh, wait....


Hey, you and me both, brother! Some of the most fun I've had programming was writing an assembly TSR that would scrape the screen buffer of the PAL BBS game Flash Attack (I think that was the name of it, anyway) to calculate the firing angles for me.

I also tied into the keyboard hooks so that I could build an entire decoy base with a single key press (just some simple buffer stuffing) along with a cool little "death blossom" attack.

Ahhh, those were the days ;)


It's also interesting to note that in the U.S., companies are thought of as a standalone entity and referred to in the singular, rather than as a group of people.

So, it'd be "BankSimple is based in...", instead of "BankSimple are based in...".

When I first started reading things from the UK, I initially thought the writers simply used poor grammar :)


So what are the frameworks or tools that you recommend? Serious question from someone who has so little Java experience, you might as well call it "none."

All of the XML config stuff was, to some extent, what turned me away from Java so many years ago.

If I were to start over today, what kind of frameworks would be the wiser choices for desktop apps (gui frameworks) or web apps (web frameworks, maybe an ORM)?


The Play framework (http://www.playframework.org/) is a good example of how things can be done better with Java. None of the bloat or XML config, just a productive set of APIs to work with. It seems to mostly try to stay out of your way. I would love to try it on a real project one day.


That is a hard question to answer in general terms, because the answer will depend on what sort of problems you solve. For a lot of my professional life I have worked on large-scale information processing or search engines, and thus I know more about designing server components and relatively little about frontend stuff.

But if I were to come up with an answer, it would be something along these lines: don't use a framework or tool that cannot, with relative ease, be replaced by something else. For instance, in a well-designed networked application it is usually simple to replace one networking library with another, or to replace one HTTP implementation with another. And I am not really talking about drop-in replacements.

Years ago I spent about a week replacing the entire networking layer in a high-traffic server, going from a pre-NIO blocking design to an asynchronous NIO-based design. This actually changed the entire execution model of the system as well as the networking parts, but it had a lot less impact than you'd think, because there was proper separation of concerns, strong non-leaky abstractions, and a lot of code that was written by people who were disciplined and consistent designers.

As for GUIs, I am not really the right person to ask. I've dabbled a bit in GWT, but I am not entirely certain I like it. Perhaps not so much because of GWT itself, but because it is tiresome to deal with building and deploying.

ORMs are generally a bad idea. Avoid them. You will feel some initial thrill when you can do some simple magic tricks, and then everything ends in tears when you find that you actually have to understand exactly how it works and dig into the innards. Definitely not worth the trouble. (If you use ORMs by way of annotations you are doubly fucked, because you will have one more thing that can go wrong, which then necessitates dipping your toes into territory that you are not dealing with on a daily basis. I have no idea where some developers find the guts to depend on complex yet fragile subsystems that they have zero understanding of.)

Instead you should design internal, application-specific APIs for dealing with stored state.

For instance, if you are writing a blogging server, you should design an interface that provides the operations you need against the blog store. Start by just implementing the storage operations in an implementation class. Then, if you need support for different types of blog stores, you extract an interface definition and write implementations of that interface. (Of course, when you write the first implementation class you keep in mind that you might want to turn it into an interface later. This should keep you honest and ensure that you never, ever leak types that are specific to the underlying storage through your API.)
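The approach above can be sketched roughly like this. This is only an illustration under my own assumptions (the interface, class, and method names are all invented), not the commenter's actual code:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Application-specific storage API: callers see only these operations,
// never types from the underlying storage (JDBC, Cassandra, ...).
interface BlogStore {
    void savePost(String id, String body);
    String getPost(String id);      // returns null if the post is absent
    List<String> listPostIds();
}

// First implementation: plain in-memory collections, enough to
// prototype against before any real database exists.
class InMemoryBlogStore implements BlogStore {
    private final Map<String, String> posts = new LinkedHashMap<String, String>();

    public void savePost(String id, String body) {
        posts.put(id, body);
    }

    public String getPost(String id) {
        return posts.get(id);
    }

    public List<String> listPostIds() {
        return new ArrayList<String>(posts.keySet());
    }
}

public class BlogStoreDemo {
    public static void main(String[] args) {
        BlogStore store = new InMemoryBlogStore();
        store.savePost("hello-world", "First post!");
        System.out.println(store.getPost("hello-world")); // prints First post!
    }
}
```

A later SQL- or Cassandra-backed class would implement the same `BlogStore` interface, so callers never need to change.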

In one of my current projects I did just that: I created an abstraction over what I needed to store in a database. The initial prototype didn't even use a database -- it was backed by in-memory data structures. Lists and Maps. This allowed me to prototype, experiment and discover what I actually needed without being side-tracked by details on how to realize this in a database.

Eventually we wrote implementations for both an SQL database (mostly as an experiment) and Cassandra. At that time, people depending on this server had already integrated with it -- before it was even capable of persisting a single byte to disk. As I wrote the in-memory implementation I wrote extensive unit tests, both to test the correctness of the code and to document what behavior was expected of an implementation. Not only did we later apply the same battery of unit tests to the other implementations, but the unit tests became the measure of whether new backends would be compliant.
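The "shared battery of tests as a compliance measure" idea can be sketched as a set of behavioral checks that run against any implementation of the store interface. Again, this is a hypothetical sketch (the interface and method names are made up), not the actual test suite:

```java
import java.util.HashMap;
import java.util.Map;

public class StoreComplianceTest {
    // Minimal store interface for the sketch.
    interface KeyValueStore {
        void put(String key, String value);
        String get(String key); // returns null if the key is absent
    }

    // The first, in-memory implementation used while prototyping.
    static class InMemoryStore implements KeyValueStore {
        private final Map<String, String> data = new HashMap<String, String>();
        public void put(String key, String value) { data.put(key, value); }
        public String get(String key) { return data.get(key); }
    }

    // The shared battery: every backend (in-memory, SQL, Cassandra, ...)
    // must pass exactly these checks to count as compliant.
    static void checkCompliance(KeyValueStore store) {
        if (store.get("missing") != null)
            throw new AssertionError("absent key must return null");
        store.put("k", "v1");
        if (!"v1".equals(store.get("k")))
            throw new AssertionError("get must return the last value put");
        store.put("k", "v2");
        if (!"v2".equals(store.get("k")))
            throw new AssertionError("put must overwrite an existing value");
    }

    public static void main(String[] args) {
        checkCompliance(new InMemoryStore());
        // later: checkCompliance(new SqlStore()); checkCompliance(new CassandraStore());
        System.out.println("in-memory store is compliant");
    }
}
```

Because the checks are written against the interface, the tests double as the specification of what a new backend has to do.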

As I said earlier, it is hard to give general advice, but I think it is very important to learn how to design software rather than picking a framework that will dictate the design for you. It is very hard to undo a choice of architecture, so at the very least one should make an effort to learn how to think about, and design, architecture. If nothing else, so you can later choose the Least Evil Alternative.

I think 90% of people who got on the J2EE bandwagon were clueless about architecture and just did as they were told. The remaining 10% may have cared about architecture, but were not sufficiently averse to complexity and mindful of programming ergonomics to realize what a horribly bad idea it was. Of course, by the time people realized J2EE was a waste of time, they had all this value locked into code that was really, really hard to reuse in a different context.

As for Spring and the over-use of dependency-injection and autowiring, that too will pass once the loudest monkeys in the tree get to change jobs a couple of times and realize that breeding complexity by scattering knowledge across a bunch of files is not a terribly bright thing to do. People usually get to hate Spring once they inherit someone else's non-trivial Spring-infested codebase.


What point am I supposed to be getting here? Yup, it's a benchmark. F# was consistently slower than Java, but consistently used less memory and fewer lines of code.

Are you saying this is good, or bad?


Well... most of the stats aren't that different, but then I'm reminded that this is Mono, and it's using significantly less memory than Java. Mono has a really sloppy garbage collector... what does that say about Java's in this benchmark?

