Revision as of 00:42, 22 November 2005
Archive
View the archive of User_talk:Fresheneesz(archive)
Talk below
Images
- How did you upload an image without clicking on the upload link?? And how does the licensing of the image affect whether you can upload or not? — Omegatron 22:32, 9 November 2005 (UTC)
- - Well.. I didn't.. I created new pages with new files by using the "Upload file" link in the toolbox. But many image pages have an "Upload new version of this file" link that allows you to directly update the file without any doubts. Fresheneesz 02:31, 10 November 2005 (UTC)
- Aha! I never saw that link before. Since my images are on Wikimedia Commons, you'd have to go to the Commons description page first, and then you will see that link. — Omegatron 03:58, 19 November 2005 (UTC)
- - Alright, I actually still don't see that link. For example at there's an "edit this file in an external application" link - but it just allows you to DL something (.. I dunno if it's actually the file). The link I was talking about goes directly to an upload page with the file name already put in for you, so that you know it's replacing the correct file. Fresheneesz 01:16, 20 November 2005 (UTC)
- Hmmm... I see "Upload a new version of this file" immediately above that link. I don't know.
- The "Upload file" link always works, regardless. — Omegatron 14:39, 21 November 2005 (UTC)
- I think you have the link no matter what if you are the original uploader. But try going onto the picture's page without logging on. Fresheneesz 18:03, 21 November 2005 (UTC)
- Yeah, you're right. I don't know. I always just use the regular upload link. — Omegatron 19:02, 21 November 2005 (UTC)
Entropy/Disorder
Hello Fresheneesz. First let me say that I am absolutely in support of the WikiData idea (it is always good to question others' data processing methods; I've seen... some interesting data processing in my time on the MER mission). Anyways, I wanted to let you know that your edits to articles involving entropy seem rather misguided. It's really a very complex part of modern physics (and not always terribly easy for individuals to understand, even with a physics degree). I think for pages like that, the majority of editing affecting content (i.e. not typographical or structural) really should be performed by individuals with credentials in (and familiarity with) the subject. If you feel that there's something significant missing or incorrect in these articles, I would suggest using the talk pages; it is a much better approach IMHO, because then if (as in this case) your ideas don't line up with accepted modern physics, someone there can explain why.
Science is never a finished process and we have much to learn in the years ahead, but Wikipedia is committed to reflecting the current consensus of scientific thought on any matter, "right" or "wrong." - JustinWick 19:05, 20 November 2005 (UTC)
- I apologize, I'm traveling for the holidays for the next week so I'm a bit sporadic. I must admit that I am very much in disagreement with you about how best to present Entropy etc... I have several textbooks on the subject I plan to consult in the next few days (I do need a refresher course on such things), but I believe that it is very correct to state that entropy is, above and beyond anything else, a measurement of the information content of a physical system. Yes, this is related to various free energies etc., however that is more of a "side effect" than anything else. This definition is nice for many reasons, but my favorite is that it is unitless (well, I guess you could say it's in "bits", but that seems to be the basic unit of information in information theory). I think that defining it in terms of free energy is confusing (entropy is only one variable in determining how much energy can be extracted from a system... Carnot limits etc. also apply). I don't have time at the moment to dissect your edits (I'll get to that later), but certainly you understand that even a maximally disordered system can have energy extracted from it with the use of a low-temperature sink. I think this makes some of your edits rather erroneous (or at best, highly misleading). I also really do not see the point of tying everything back to "free energy" - sure, it may be more "intuitive" for some at first, however it really hampers any actual use of the concept, as it's certainly not mathematically intuitive, and it has tenuous connections at best to micro statistical mechanics. I see you've been doing some edits on the disorder page, and I think it'd be best to define entropy in terms of the information content of a system, with links to that page and to free energy (where appropriate, rather than relentlessly). - JustinWick 07:25, 21 November 2005 (UTC)
- Well, entropy "exists" just as much as any other theoretical entity (say, an electron) "exists" - we can use the idea of this entity to predict the physical universe. No known "macroscopic" system has ever repeatably/reliably been observed to violate any of the laws of thermodynamics, so I do not see anything wrong with the concept. Of course, as systems become smaller, the second law of thermodynamics begins to break down - interestingly enough, this also happens as one's time scale goes up - if a nonexpanding universe lasted forever, it would spontaneously reorganize itself an infinite number of times, in every possible configuration, in violation of the second law of thermodynamics. This is, however, so incredibly unlikely during any human lifetime that I would not mind saying it is "impossible", because it's more likely that you would be struck by lightning than for you to observe this kind of reversal on a macroscopic scale.
- RE your comments about possible macroscopic violations of thermodynamics by humans, there are no scientifically accepted ways of doing this, even in theory. Certain forms of FTL or time travel (basically the same thing) are considered more plausible in theory than any "perpetual motion"/"free energy" (I mean energy for free, not the technical term in this case) machine anyone has ever devised. It is true that biological processes (such as evolution, on a large scale, or simple biological growth) can locally reduce entropy, but this is at the cost of increased entropy in other parts of the universe (such that the total is nondecreasing).
- As for pedagogical introduction, it's true that S is intimately related to free energy, however that quantity already has its own designation. I firmly believe that entropy is best described in terms of the information content of a system (the minimum number of bits required to reconstruct the system precisely). I'm not sure that the order in which these two concepts are introduced in the article is the best - however, this could simply be my own bias towards statistical mechanics (which is a much more accurate model of reality than thermodynamics). I think I'll mention this on the talk page, as I don't feel comfortable making sweeping changes to such an important article on this kind of thing myself.
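(As an illustrative aside, not part of the original exchange: the "minimum number of bits" reading of entropy mentioned above corresponds to Shannon's formula H = -Σ p·log₂(p). The function name below is hypothetical, chosen just for this sketch.)

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution over 4 microstates takes 2 bits to specify a state.
print(shannon_entropy_bits([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A system with only one possible state carries no information.
print(shannon_entropy_bits([1.0]))  # → 0.0 (printed as -0.0)
```

This is the same quantity that, scaled by Boltzmann's constant, appears in the statistical-mechanical definition of entropy.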
- Incidentally, I think there is a bit too much confidence in physical "laws" - Newton's laws of motion were not only completely "wrong" in their ideas, but all of modern physics can be done without any notion of explicit forces (energies can be used instead; see Lagrangian mechanics if you are unfamiliar with this). Mass is not a constant, even for the constituent particles of an object; momentum is quantized and uncertain; forces are noninstantaneous. Interestingly enough, Thomas Gold, a professor at Cornell University (my alma mater), once told me that he thought that "conservation of momentum is a silly law anyways," in response to my accusations that one of his pet theories violated it. Interestingly, though, he firmly believed in the second law of thermodynamics as being far superior and a more fundamental truth.
- I guess in closing I should thank you for all the attention you've paid to various articles (some of your edits seemed to be quite good), however I would caution that making sweeping changes to fundamental articles without any discussion is probably a good way to get reverted. I think that any time you see something that's a sincere issue in an article, you should probably put something up on the talk pages... some people tend to be rather protective of these pages, and the subject matter is very difficult and contains many subtle issues. I have a bachelor's degree in physics from a decent school, but I don't dare change any of these pages without consulting a textbook... Mathematics is such a precise language, and physics such a precise study, that even small errors can have profound implications (the Bohr model confused me for several years as a schoolkid), so the stakes are very high. I do think that a lot of these pages could use some overhauling (especially the more obscure pages), and copyediting never hurts :) Cheers - JustinWick 00:42, 22 November 2005 (UTC)
2nd law vandalism
I have to go now, so please watch over the page. Don't worry about the "3 reverts rule" - it only applies for non-vandalism. If that guy keeps adding that flawed paragraph, just hit revert. :) Infinity0 00:17, 21 November 2005 (UTC)
If worse comes to the worst, you can always find an admin to lock the page and ban that IP. I don't know any personally, but there's bound to be a list somewhere. Try Wikipedia:Administrators? Infinity0 00:20, 21 November 2005 (UTC)
- Wikipedia:Vandalism in progress is even better as a place to announce vandals. Yes, if vandalism is persistent, it definitely must be announced there. Oleg Alexandrov (talk) 00:47, 21 November 2005 (UTC)