User talk:Jheald/Archive 1
Welcome and Software patent
Hello and welcome! Thanks for your contribution to the software patent article. I have added two comments within the text which read: "Please cite your sources" (visible only when editing). It would be nice to integrate authoritative sources to support the matter you added (particularly after the FOLDOC definition and regarding the definitions according to taste of "pure software patent"). See also No original research. --Edcolins 12:04, Nov 30, 2004 (UTC)
Thanks for adding the sources in the article Software patent. But what do you exactly mean by "EP article 2(ba)" and "EP article 2(a)"? --Edcolins 07:31, Dec 12, 2004 (UTC)
- The references are to the amendments passed to the software patent directive by the European Parliament in September 2003, see eg http://www.ffii.org.uk/swpat/eudir/texts/articles.html . Note that in most cases for readability I am summarising; quoting phrases directly only where indicated.
Moving
Hello, if you want to move a page could you please use the "move" function (in the topbar), instead of copying and pasting from one page to the other, because this loses the edit history, thanks. G-Man 22:26, 6 Jan 2005 (UTC)
- I did (??). But double-redirects I think have to be edited by hand.
Patentable subject-matter
If you don't mind, I moved the whole discussion to Talk:Patentability so that everyone can join in. --Edcolins 20:43, 17 October 2005 (UTC)
Math notation
Hello. Please don't use "<" and ">" for angle brackets in TeX. Observe:
Orthonormal functions are normalized such that ⟨f, f⟩ = 1 with respect to some inner product ⟨f, g⟩.
(See my follow-up edits after your edits to normalizing constant.) Michael Hardy 20:15, 28 October 2005 (UTC)
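For the record, the notational fix being suggested looks like this in TeX (a minimal illustration of the convention, not Hardy's actual edit):

```latex
% Avoid: "<" and ">" are relational operators in math mode,
% so they render with the wrong shape and spacing
$ <f, g> $

% Prefer: proper angle-bracket delimiters
$ \langle f, g \rangle $
```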
- Thanks. (I'm still on a learning curve here).
Thank you and remark
Thank you for your additions to catastrophe theory. And one remark. Using the edit summary and minor edit button a bit more often does not hurt. :)
By the way, you may be interested in Wikipedia:WikiProject Mathematics and its talk page. That's where a lot of discussion takes place. Oleg Alexandrov (talk) 04:23, 18 November 2005 (UTC)
Mediator response
The edits on Creationism should be put in other places. This article is mainly on physics topics, and there are plenty of other possibilities for creating new articles on Creationism and its relationship with the second law of thermodynamics. I am awaiting your response.
- I suggest you create new articles on Creationism and its relationship with the second law of thermodynamics, and leave the article on the second law of thermodynamics free of ambiguous links. -- Bonaparte talk 13:11, 28 December 2005 (UTC)
Second Law of thermodynamics
Hello - could you take a look at the edits by Flying Jazz on the second law of thermodynamics page? They relate to our discussion on available energy on the talk:entropy page, concerning the energy available for work. Among other things, I don't like the "fluxion" notation FJ is using, but that's only because it's not standard. It is in fact more informative. Thanks - PAR 19:35, 4 January 2006 (UTC)
- Hi! In general, I have a sense that the thermo in wikipedia is great but open systems seem under-represented even though in some cases they're simpler to deal with mathematically than closed ones and more applicable to real-world power generation too. I'm not sure yet about your changes to the second law content I added, but I know I like the R subscripts better than I had there. Could you also look at the changes I made on the enthalpy page? Thanks! Flying Jazz 08:12, 10 January 2006 (UTC)
- Hi - more of the same. Could you check out recent edits to Free energy - I still don't have an intuitive sense of this free energy stuff except mathematically. Thanks - PAR 02:20, 30 January 2006 (UTC)
After there are no (non-talk page) links pointing there, would you support a changing of the redirect to Phoenix, Arizona? Leave a note on my talk page, please. Matt Yeager 00:00, 27 January 2006 (UTC)
Great work!! Very nice and informative article. deeptrivia (talk) 21:01, 2 February 2006 (UTC)
Matt Yeager
Hi, you may be interested in reading the comments at Wikipedia:Requests for adminship/Matt Yeager. Austrian 20:54, 3 February 2006 (UTC)
Software patents under the European Patent Convention
Thanks for your message on the talk page of this article. Your suggestion does not seem to have been heard. Instead, I received nice compliments from User:80.237.152.53. I will not revert it back but I am confident that a reasonable solution can be found... --Edcolins 08:20, 15 February 2006 (UTC)
{category redirect} description?
Hi. I saw your comment on the Feb 12, 2006 CFD page regarding {{category redirect}}'s instructions. Currently the talk page for that template does say:
- Q: Why doesn't the bot operate on my category?
- A: Any category using this template now requires the last edit of the category to be made by an administrator.
I think that the instructions for cat redirects in general could be made clearer though, so I may take a look at improving them. As an admin, I didn't even know that's how one flagged the category for moving until several days after I started at CFD. :/ --Syrthiss 14:13, 20 February 2006 (UTC)
- Yep, that's what I was thinking too. That is indeed the preferred behavior, IMO. --Syrthiss 14:26, 20 February 2006 (UTC)
Clan Fraser
Hi. Thank you for taking an interest in the Clan Fraser article. There are many theories as to how the Frasers came to Scotland; your Norman theory is only one. I find it POV to label the whole of the greater Clan as Scoto-Norman. If you wish to create an article for Clan Fraser of Philorth (now Saltoun), feel free to label them Scoto-Norman. However, the Lovats, and the Family, are not Scoto-Norman. We are a Scottish Highland Gaelic Clan, and have been so since the 13th century or so. I'll be removing the label from Clan Fraser, and especially from Clan Fraser of Lovat. I will, however, add mention of the Norman theory.
Haplogroup R1a (Y-DNA)
Hi there Jheald. Why did you change R1a to R1a1 in Haplogroup R1a (Y-DNA)?
I'm not a geneticist, but AFAIK, R1a1 is only one subgroup of R1a, associated with the change M17. There is also the branch R1a*, without the M17 change. See this chart. --Saforrest 08:22, 27 February 2006 (UTC)
- Hi there. I had a look at the Anatolia paper; I couldn't find any mention of M17 in the first couple of pages, but both Figure 2 and Table 2 associate M17 with R1a1 specifically. They don't even mention R1a itself or any R1a subgroup other than R1a1; presumably the others are not significant for this study.
- On the other hand, the National Geographic Genographic Project openly equates M17 with R1a, but that's a bit fishy since they also equate M173 with R (when M173 is actually R1).
- Now, it certainly appears that R1a1 is overwhelmingly more abundant everywhere (not just in Europe) than any other R1a haplogroup. I agree with your point: if almost every car on the road is a Ford Taurus, we should say so and not just call them all Fords.
- With that in mind, I propose moving the page Haplogroup R1a (Y-DNA) to Haplogroup R1a1 (Y-DNA). R1a* can be mentioned there or in the page Haplogroup R (Y-DNA), which can serve as a survey of all the R subgroups, including tiny ones. Sound reasonable? --Saforrest 17:36, 27 February 2006 (UTC)
Entropy history section
Jheald, in the entropy history section you changed John von Neumann’s quotes around; in a sense, putting words in his mouth that he did not say. I would appreciate it if you would go back and replace the original quotes. Editing is one thing; changing history is another. Thanks: --Sadi Carnot 04:59, 10 April 2006 (UTC)
- Version according to (Jheald):
- Claude Shannon introduced the very general concept of information entropy, used in information theory, in 1948. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and the earlier work in thermodynamics; but the mathematician John von Neumann certainly was. "You should call it entropy, for two reasons," von Neumann told him. "In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
- Version according to (John Avery):
- An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of “lost information” in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics. In 1949, however, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann, who asked him how he was getting on with his theory of missing information. Shannon replied that the theory was in excellent shape, except that he needed a good name for “missing information”. “Why don’t you call it entropy”, von Neumann suggested. “In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.”[1]
- Reference
- ^ Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 9812384006.
Does this revised version sound better? I've cleaned it up a bit; it is sourced to Nobel Prize-winning author John Avery, in what is essentially a small textbook on information theory. The chapter from which the above paragraph is copied, word for word, contains seven sources by Shannon, from the years '48 to '93. I find it hard to believe that a famous 1949 story about the "father of information theory" would only first appear in 1971, 22 years after its inception. Either we can work together to reach a compromise, or we can put both our versions on the entropy talk page to see what other editors think.--Sadi Carnot 16:04, 10 April 2006 (UTC)
- Moved discussion to talk:entropy; I think we should let this debate sit there for a while to see if anyone else has an opinion?--Sadi Carnot 17:21, 10 April 2006 (UTC)
- I started the page History of entropy and moved our discussion there. I tried to incorporate both of our views into the section. I hope it will be agreeable to you?--Sadi Carnot 01:03, 11 April 2006 (UTC)
Frequency probability
I note that you are one of many users who have had a problem with User:INic at Frequency probability. S/he routinely censors any discussion of Bayesian alternatives to frequentism. I've had a series of reverts on an attempt to include a single sentence. I thought I might tag you in, if you're interested.
Jheald, what are you doing? You just reverted all of my work on the intro! See: Talk:entropy. --Sadi Carnot 04:39, 20 June 2006 (UTC)
haplogroup R
hold your horses, please. There can always be an independent R1a1 article per WP:SS, but I'm working on building a coherent central article here; don't demolish that. The article was nowhere near unwieldiness, it was just approaching 'substantiality'. dab (ᛏ) 21:27, 28 June 2006 (UTC)
Risk
Heh, good catch -- I forgot the big one! --Dhartung | Talk 20:31, 11 July 2006 (UTC)
Thank-you for the Kelvin link
The University of Wisconsin has a scan of the paper online here. Thomson, William, "Kinetic theory of the dissipation of energy", Nature, IX, pp. 441-444 (April 9, 1874). It can also be found reprinted in Harvey S. Leff, Andrew F. Rex, editors, Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing, 2nd edition, Institute of Physics, 2003. -- Jheald 07:47, 12 July 2006 (UTC)
- Thank you very much Jheald. These old papers and books are very hard to get hold of. If interested, here's a link to a partial selection of some of his other papers. The guy who types these papers up and puts them online is Lyle Zapato, a big Kelvin fan who lives in France. Thanks again: --Sadi Carnot 11:43, 12 July 2006 (UTC)
- I think you may have been misinformed by agents of the Belgian Conspiracy. I do not live in France; I live in the Republic of Cascadia. Lyle zapato 09:26, 9 October 2006 (UTC)
Barnstar
The E=MC² Barnstar
For your work in thermodynamics and statistical mechanics. Blnguyen | rant-line 04:56, 2 August 2006 (UTC)
Micro black hole / Black hole electron
Jheald; Your edit of "Micro black hole" was very well done. The article has definitely been improved. I would like to encourage you to edit "Black hole electron" also. The article could show that the Bh concept may explain an electron property that is not explained in any other way.
The point particle problem (where we think of photons being created and absorbed at mathematical points, and of the infinite charge density implied by point electron particles and point interactions) is explainable when the electron is defined as a black hole. The density of a black hole is very high but not infinite. With gravitational collapse, the resulting size will approach the gravitational radius. To a first approximation this is zero radius with infinite density. Black hole density is explainable; infinite density is not. DonJStevens 16:00, 5 August 2006 (UTC)
Marginalization
Hi, I have been trying to improve this article, so please take a fresh look at it and see how we might improve this important article further. Many thanks, Peter morrell 16:04, 9 September 2006 (UTC)
Black hole electron question
Hi Jheald; I added the equation below under "Talk black hole electron". The L2 length is defined as 1/2 of the electron Compton wavelength, or 1.2131551×10^-12 m.
(L2)^2 = (L3)(2π)(Planck length)(3/2)^(1/2)
When this equation is solved for L3, the value obtained is 1.1834933×10^10 m (using 1.616×10^-35 m as the Planck length value). If the true L3 value is slightly larger, 1.1835332×10^10 m, then the L3 length is (2π)^2 times one light second. Note that 1.1834933/1.1835332 is 0.9999663.
I would like to describe the (L2)^2 equation as a "self-evident truth" indicating that "L3 is (2π)^2 times one light second", because this can be easily verified to 4 significant digits by anyone who is interested. Do you think this is too speculative for Wikipedia? DonJStevens 15:57, 21 September 2006 (UTC)
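The arithmetic in the message above can be checked with a short script (a sketch only: the variable names L2 and L3 and the quoted constants come from the thread, and the numerical coincidence is the correspondent's conjecture, not established physics):

```python
import math

# Values as quoted in the thread (assumptions of this check)
L2 = 1.2131551e-12   # half the electron Compton wavelength, in metres
l_p = 1.616e-35      # Planck length value used in the thread, in metres
c = 2.99792458e8     # speed of light, in m/s

# Solve (L2)^2 = (L3)(2*pi)(l_p)(3/2)^(1/2) for L3
L3 = L2**2 / (2 * math.pi * l_p * math.sqrt(3 / 2))

# Compare with (2*pi)^2 times one light-second
L3_conjectured = (2 * math.pi) ** 2 * c * 1.0

ratio = L3 / L3_conjectured
print(L3, ratio)  # agreement to about 4 significant digits, as claimed
```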
When (if) we accept the value 6.6717456×10^-11 as the actual G value that applies at the subatomic particle level, then equations defining electron quantized mass and time dilation (gravitational potential) at the electron black hole radius (3Gm/c^2) become simple. Evidence relating to this is described at my "User" page. DonJStevens 15:23, 24 September 2006 (UTC)
You may find the simple equation for the electron Schwarzschild radius interesting.
radius = (2/3)(Le/4π)(Le/2L3)^2
The Le value is the electron Compton wavelength. This radius value is 1.3524368×10^-57 m. This will be numerically equal to 2Gm/c^2 only if G has the specific value implied earlier (6.6717456×10^-11). DonJStevens 17:23, 30 September 2006 (UTC)
The other important electron black hole radius is 3Gm/c^2.
3Gm/c^2 = (Le/4π)(Le/2L3)^2
m = (c^2/3G)(Le/4π)(Le/2L3)^2
The value (h/c)(1/Le) is then substituted for m.
(h/c)(1/Le) = (c^2/3G)(Le/4π)(Le/2L3)^2
This equation is then solved for Le.
Le = 4π(3π hG/c)^(1/4)
Do you see any way to clarify these equations so that readers will know that Le is a length value? Your comment on this would be appreciated. DonJStevens 17:35, 9 October 2006 (UTC)
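The closed-form result and the radius quoted in the preceding messages can be evaluated numerically as a sanity check (a sketch only: the G value is the nonstandard one proposed in the thread rather than the CODATA value, and L3 is the length from the earlier message):

```python
import math

h = 6.62607e-34    # Planck constant, J*s
c = 2.99792458e8   # speed of light, m/s
G = 6.6717456e-11  # G value proposed in the thread (not the CODATA value)
L3 = 1.1834933e10  # L3 length from the earlier message, in metres

# Le = 4*pi*(3*pi*h*G/c)^(1/4), as derived above
Le = 4 * math.pi * (3 * math.pi * h * G / c) ** 0.25

# The Schwarzschild-style radius from the earlier message:
# radius = (2/3)(Le/4*pi)(Le/2*L3)^2
radius = (2 / 3) * (Le / (4 * math.pi)) * (Le / (2 * L3)) ** 2

print(Le)      # close to the electron Compton wavelength, ~2.4263e-12 m
print(radius)  # ~1.3524e-57 m, the value quoted in the thread
```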
Emergency!
First, I would like to thank you for your courteous response to my (unfortunately normal) overly emotional reaction to information theory (NOT to you as an individual!). Recently, I had been writing only about mixtures because that had been the focus of the last month. Energy dispersal, of course, applies directly and obviously to thermal transfer of all sorts. (And space is involved, but is an ignorable part of the calculation of entropy change compared to the q/T.)
However, my writing you at the moment is truly an emergency -- Sadi Carnot, aka Libb Thims (sic and sick!) at ... has just deleted all of our discussions of the past month from the Talk:Entropy page. I do not know what to do, not being skillful in Wikipedia high-level procedures. I have sent "Thims" an email requesting clarification of his two degrees in engineering (which my ex-student, a prof in engineering at the U. of Mich., could not verify) and his enrollment in a non-existent "University of Berkeley". (The University of California at Berkeley has never been called that.) His major non-Wiki writing appears to deal with human interactions and thermodynamics; I hardly feel that "the thermodynamics of sex" is a legitimate area for thermo, but "Carnot" does!
Can you appeal to the authorities? This kind of disaster, in my view, is vandalism to the Wikipedia system -- but what can one do? Frank Lambert: email at flambert@att.net. I would appreciate aid in any way. FrankLambert 16:22, 6 October 2006 (UTC)
Information theory references
Hello - Do you have any references on the relationship of information entropy and thermodynamic entropy? Email or talk page, either is fine, thanks for any help. PAR 18:53, 19 November 2006 (UTC)
New Thermo Template
Hello - could you look at the discussion page for the thermodynamics template? (HERE) - thanks PAR 06:49, 29 November 2006 (UTC)
Ideal Gas
Hi - could you check out Talk:Ideal gas#nR is amount of gas and give your opinion? Thanks PAR 14:59, 11 December 2006 (UTC)