Hacker Times | new | past | comments | ask | show | jobs | submit | pagejim's comments

Isn't it impossible as per laws of physics to achieve less than absolute ZERO temperature?


It depends on what we mean by "temperature".

If we think of temperature as a measure of how much the atoms move around, then it is not possible to have less than zero movement.

But if we think of temperature as related to the ratio of how much a system's entropy changes when a certain amount of energy is added to (or removed from) the system [1],

1/T = dS/dq

then it's possible to construct systems with a negative T.

[1] Eqs. 8–10 in http://en.wikipedia.org/wiki/Temperature#Second_law_of_therm...
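A textbook illustration of this (my addition, not from the linked article) is a set of N two-level systems, each with excitation energy ε. With n of them excited, the entropy and its energy derivative are:

```latex
S = k_B \ln \binom{N}{n}, \qquad E = n\epsilon
% Stirling's approximation gives
\frac{1}{T} = \frac{\partial S}{\partial E}
            = \frac{k_B}{\epsilon} \ln\frac{N-n}{n}
% For n > N/2 (population inversion), the logarithm is negative, so T < 0.
```

So a negative temperature only arises in systems whose energy is bounded above, where adding energy can decrease the entropy.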


If I understand correctly, it's like saying a car has negative speed because it's going backwards. If you accelerate it forwards you actually reduce its speed towards 0, but that does not mean that the car is more immobile at -5km/h than at 0km/h.

It's just a matter of convention, a negative temperature just means that the temperature gets closer to 0 as you add energy. Did I get that correctly?


Negative temperature actually has some nuance to it. By the relation 1/T = dS/dq where T is temperature, S is entropy and q is heat added to the system, a negative temperature means that entropy decreases when you add heat to the system and increases when it emits heat. By the second law, entropy always goes up, so a negative temperature object will always emit heat. In that way, something with a negative temperature is extremely hot.


In fact, it's a negative absolute temperature because it's hotter than infinity.


> If I understand correctly, it's like saying a car has negative speed because it's going backwards.

Well, kind of, but I think that would be an oversimplification.

Negative temperatures have a concrete physical meaning, being "hotter than infinity", in the sense that if we bring two systems into contact, one with an arbitrarily high positive temperature and one with a negative temperature, energy (heat) will flow from the negative-temperature system to the positive one.

This looks tidier if we use the thermodynamic beta (beta = 1/T) instead of temperature. Then we can say that heat always flows from a system with a smaller beta to systems with a larger beta.

http://en.wikipedia.org/wiki/Thermodynamic_beta
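To make that heat-flow rule concrete (my own sketch, not from the linked article):

```latex
\beta = \frac{1}{k_B T}
% As T \to +\infty, \beta \to 0^+; any T < 0 therefore has \beta < 0,
% which is smaller than the \beta of every positive-temperature system:
\beta_{T<0} \;<\; 0 \;<\; \beta_{T>0}
% Heat flows from smaller \beta to larger \beta, so a negative-T system
% gives up heat to any positive-T system, however hot.
```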

Maybe debt is a somewhat useful metaphor here. If someone has a debt of 5 apples, in some ways it makes sense to say that this person owns -5 apples. But in some other ways it makes no sense: a person with 5 apples can eat them to get less hungry. A person with -5 apples cannot eat them to get more hungry.



Are they sure it's a planet? Could be a large spaceship.


This really interests me as well - how do they know that it's a 12-million-year-old planet?

I suppose a big reason would be that it gives off a significant (if not AS significant) heat signature to begin with - a large spaceship would very likely try to conserve as close to 100% of its energy as possible.

Then again - what about a Dyson sphere that is still under construction?


Berserker


Space hulk!


Maybe it's Jenova's?


Got me thinking whether an application can detect if it's running on a virtualized OS.

Found this : http://stackoverflow.com/questions/154163/detect-virtualized...

Whether something like this exists for our own, possibly virtualized, environment, we have yet to find out.
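For the mundane (non-metaphysical) case, here is a minimal Linux-only sketch of two heuristics along the lines of that Stack Overflow thread: the CPUID "hypervisor present" bit, which the kernel exports in the `/proc/cpuinfo` flags, and the DMI product string, which often names the hypervisor vendor. Both are heuristics only - a hypervisor can hide them - and the vendor list below is just a few common examples.

```python
def hypervisor_flag_set(cpuinfo_path="/proc/cpuinfo"):
    """True if the kernel reports the CPUID 'hypervisor' flag."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags") and "hypervisor" in line.split():
                    return True
    except OSError:
        pass
    return False

def dmi_vendor_hint(dmi_path="/sys/class/dmi/id/product_name"):
    """Return a hypervisor vendor name if the DMI product string reveals one."""
    known = ("VirtualBox", "VMware", "KVM", "QEMU", "Bochs")
    try:
        with open(dmi_path) as f:
            product = f.read()
        for name in known:
            if name.lower() in product.lower():
                return name
    except OSError:
        pass
    return None

if __name__ == "__main__":
    print("hypervisor flag:", hypervisor_flag_set())
    print("DMI hint:", dmi_vendor_hint())
```

On a bare-metal machine both checks come back empty; inside a typical VM at least one of them fires.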


The point with any "blue pill / red pill" technology/experiment is that you can design it only because you have "outside knowledge": you are outside the VM and know how the VM and the physical processor work. The knowledge required to detect whether something is running as a simulation comes from outside the simulation. Even if by "miracle" you were to get this "outside knowledge", you wouldn't know whether it is real knowledge or something spitting out a random result. You'd need to be "outside" to know that what you thought was "spooky outside knowledge" actually was.

Even in The Matrix, the red pill is made using knowledge from outside The Matrix :) Think: how would Neo know that the outside-the-Matrix experience wasn't just an "acid trip"? He wouldn't, were it not for the "bending the laws of physics" phenomena that could be experienced by others too. But those were only possible because The Matrix was programmed to allow them in special circumstances, and you could still classify them as "science so advanced you can't distinguish it from magic" if you, as a movie viewer, didn't have the "outside knowledge".


Historically speaking, "outside knowledge" is what we call religion.


If you have time to write long blogs and exhaustive comments, then I am not sure you are really working hard enough on the thing you love the most.

And that is why most people here on HN (me included) might not ever get to know what it is like to be in the "zone".

Hence this whole debate of working hard or not working hard or productivity is all really BS.


Yes, and for some people like me, reading and finishing the complete TAOCP series is a 'forever' project.


One of the side projects I have to embark on is writing a companion to TAOCP that shows how to implement the algorithms in a functional language, probably Scheme or some subset of Common Lisp. The main complaint I have about TAOCP is Knuth's insistence on using assembly language to illustrate the algorithms -- otherwise, they give what is probably the most in-depth description of the motivation and analysis of algorithms I have seen (much more detailed than CLRS, though CLRS is more complete).

Also, it is worth pointing out that there is no "complete TAOCP" yet. It remains a work-in-progress; I am looking forward to Volume 4B (though I have yet to actually finish any of the current volumes).
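To give a flavor of what such a companion might contain (my own sketch, not an existing project): TAOCP Vol. 1 opens with Euclid's algorithm, Algorithm E, presented in MIX assembly. A functional-style rendering is a three-line recursion:

```python
def gcd(m, n):
    """Greatest common divisor by Euclid's algorithm, recursively.

    Each step replaces (m, n) with (n, m mod n); the invariant is
    that the gcd of the pair never changes, and n shrinks to 0.
    """
    return m if n == 0 else gcd(n, m % n)

# e.g. gcd(544, 119) == 17
```

That brevity compared to the MIX version is exactly the contrast the companion would highlight, while deferring to Knuth's text for the analysis.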


If you read the OP carefully, you will find that it is an implicit announcement. Google hasn't announced it explicitly. But the amount of hard work and time that they have put in puts them years ahead. The number 400 is open to discussion and interpretation.

Read some of the other comments to find out how it can be interpreted.


Linkbait headline, got it.


The Internet, being the most widely reaching medium and the easiest to publish on, is quite attractive to anyone who lacks the means (money/influence) to get content published in more traditional media like newspapers, magazines and TV.

Hence you will find all kinds of material on the Internet, including all kinds of hate speech against religions, ideologies, personalities, etc. Google, being an indexer of the Web, will naturally present such material if you go looking for it. It's not Google's fault, nor should it be its worry. After all, you can't really blame or shoot the messenger.

In a way, the Internet has had a leveling effect on the kind of content and information that is available to common people around the world (at least those who have some kind of access to it).

If, for any bizarre reason, the Internet starts to become more censorship-prone, it will lose its magical value. It is the most important invention of our times. It's somehow better if we don't have control over it. Just sit back and try to recall the amazing pace at which it has changed and how beautifully it has sustained itself. It has an evolution path of its own.

The logic inside my head says that Google was correct to remove the video in Libya/Egypt. But something in my heart says that we will lose the innocence of the Internet if we go down this path too often.

Remember how reckless, carefree and full of possibilities we were when we were children. Then we came of age and learnt how to behave and think and do things like adults. Along the way, we somehow became ... well, just boring.

Remember, the Internet is still a child. I hope we don't force it into becoming just another boring adult.


Hi,

I am not an iPhone user, but if what you have described above are problems that a significant number of iPhone users face, then I am pretty surprised/disappointed.

All of these would hamper the user experience (something Apple excels at), regardless of whether the user is a casual one or a heavy one.

Also, if these problems are easily reproducible and quite prevalent, isn't Apple solving them in upcoming updates? In other words, they must be getting some kind of feedback/bug reports, and these problems should surely be part of that.


Given its financial performance, I can only imagine that iOS's quality is priority #0, #1, and #2 through 10 at Apple. From an anecdotal perspective, it has all gotten a great deal better over the years.


>I am not an iPhone user, but if what you have described above are problems that a significant number of iPhone users face, then I am pretty surprised/disappointed.

As noted by other posts, most of the issues are with OSX applications rather than with iOS.

To be honest, I think that the storage/backup issue is a legacy design issue with iOS. Unlike Android, iOS was never originally designed to be a standalone OS. Apple designed iOS to sync heavily with iTunes, which led to a lot of this "if you just restore from a backup, it'll magically be fixed" nonsense. Apple has started to move away from that, and more towards iOS being its own thing, with OTA updates and by giving users a lot more control over storage usage through the device itself instead of iTunes, but they still haven't really broken away from its necessity.


Those are mostly problems with Mac OS X (except the storage issue on the iPhone).


Wonder how real-time it is? Encoding/decoding of data/signals per the LTE specs is a multi-step process with lots of math-intensive operations.

Normally, custom-made baseband DSP processors are used for this kind of stuff; they contain special hardware accelerators for this kind of intensive computing.

Would be really interesting to see how he has implemented it and how practical it is.


Had it been somebody else, I would have pondered on the truth of the claim. But Mr. Bellard... that is another thing entirely.

My guess is that he does face certain limitations, for instance in the number of connections he can manage and so on. But I do think modern hardware can be made to perform well if you know what you are doing. There are many ways to implement the DSP routines on a modern PC that would be fast enough.

Also, a guess is that one of the reasons DSPs are preferred is that they have a better power profile. You don't need that for testing purposes.

All in all, I think it is a great project.


It uses a USRP N210 (https://www.ettus.com/product/details/UN210-KIT) which has an FPGA that is most likely used for the signal processing. However, quite a lot can be done on a standard PC these days with SSE or GPGPU programming.
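A rough back-of-the-envelope sketch (my own, using numpy) supports the "a lot can be done on a standard PC" point. A 20 MHz LTE carrier runs at 30.72 Msps with 2048-point FFTs and 14 OFDM symbols per 1 ms subframe, i.e. 14,000 FFTs per second per direction - and the FFT is only one stage; channel coding is usually the heavier part:

```python
import time
import numpy as np

# LTE 20 MHz: 30.72 Msps, 2048-point FFT, 14 OFDM symbols per 1 ms subframe
FFT_SIZE = 2048
FFTS_NEEDED_PER_SEC = 14 * 1000

def ffts_per_second(batch=14_000):
    """Measure how many 2048-point complex FFTs numpy does per second."""
    data = (np.random.randn(batch, FFT_SIZE)
            + 1j * np.random.randn(batch, FFT_SIZE)).astype(np.complex64)
    start = time.perf_counter()
    np.fft.fft(data, axis=1)  # one OFDM-symbol-sized FFT per row
    elapsed = time.perf_counter() - start
    return batch / elapsed

if __name__ == "__main__":
    rate = ffts_per_second()
    print(f"{rate:,.0f} FFTs/s measured vs {FFTS_NEEDED_PER_SEC:,} needed")
```

On any recent desktop CPU this measures well above the required rate, which is consistent with leaving only the tighter real-time stages (or nothing at all) to the USRP's FPGA.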


The list of reference books [1] on the site is quite formidable. Each book in that list deserves a lot of time to do justice to the content they have in them and the hard work/experience authors have put in.

Anyway, even if you start developing your own OS, you may not end up writing the whole of it all by yourself. Most people get a lot of help from outside, and that is not at all bad; in fact, the open-source operating systems around today have already proved it to be the way to go.

[1] http://wiki.osdev.org/Books

