Wikipedia:Reference desk/Archives/Computing/2012 August 17
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
August 17
is the human brain a Turing Machine?
I was having a debate with someone and he asked me to prove that a human being was a Turing Machine. In some ways it seems so obviously true that humans are at a minimum Turing Machines that I had never really thought through a proof or convincing argument. I can think of several, but when I searched I was a bit surprised that I didn't find any definitive statements from Turing, Church, or others with a lot more credibility than I have on the subject. I just want to make sure I haven't missed something: has anyone like Turing, Church, Von Neumann, Chomsky, Dennett, etc. written on the topic? Mdebellis (talk) 02:26, 17 August 2012 (UTC) — Preceding unsigned comment added by Mdebellis (talk • contribs) 02:24, 17 August 2012 (UTC)
- Is it literally a Turing machine? Clearly not (no such device can exist - infinite tapes are in rather short supply, for a start). Is it even metaphorically one? I very much doubt it. AndyTheGrump (talk) 02:29, 17 August 2012 (UTC)
- I should have been more clear in my question; I didn't want to make it too long. I recognize that it obviously isn't literally a Turing machine, since there is no infinite tape, but by that standard no computer is a Turing machine either: no computer has infinite memory. And the same goes for errors: no human can be without errors if they try to "be" a Turing machine, but computers eventually make errors too (not talking about software but about the HW actually failing). True, a computer will go a lot longer than a human before an error, but the principle is the same; it seems to me that any implementation of a Turing machine is an approximation of the formalism. Mdebellis (talk) 16:50, 17 August 2012 (UTC)
- Anything a Turing Machine can do, a person can do in principle. Bubba73 You talkin' to me? 03:26, 17 August 2012 (UTC)
- Dennett has written a little on computers in Consciousness Explained, but I don't remember in what context :( Уга-уга12 (talk) 03:31, 17 August 2012 (UTC)
- A Turing machine, if the term means anything at all, has a discrete set of states and a well-defined transition matrix that determines how it behaves. The human brain has neither a discrete set of states nor a simple transition matrix. The Church-Turing thesis implies that anything algorithmic that the brain can do, can be replicated by a Turing machine, but there is lots of microstructure in the brain that isn't doing anything functional. To put this in different words, the human brain can perhaps be simulated by a Turing machine to any specified degree of accuracy, but not to perfect accuracy. Looie496 (talk) 05:57, 17 August 2012 (UTC)
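For concreteness, the kind of object being described here can be written out directly. Below is a minimal sketch (plain C, purely illustrative and not drawn from any of the posts above) of a two-state machine whose transition rules invert bits as it moves right, halting at the first blank cell:

    #include <stdio.h>

    int main(void) {
        char tape[] = "1011010_";          /* finite stand-in for the infinite tape; '_' is blank */
        enum { FLIP, HALT } state = FLIP;  /* the discrete set of states */
        int head = 0;

        while (state != HALT) {
            char sym = tape[head];
            /* transition rules: (state, symbol) -> (write, move, next state) */
            if (sym == '0')      { tape[head] = '1'; head++; }
            else if (sym == '1') { tape[head] = '0'; head++; }
            else                 { state = HALT; }   /* blank: stop */
        }
        printf("%s\n", tape);              /* prints 0100101_ */
        return 0;
    }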
- I think the better question is whether the human brain is Turing complete. It doesn't matter if it is actually a Turing machine, and talking about metaphorical Turing machines is far less interesting than talking about whether it is Turing complete or not. --Mr.98 (talk) 13:09, 17 August 2012 (UTC)
- I tend to agree with Looie that it's unclear how to apply these computing concepts to the brain. In any case, Turing machines can accept arbitrarily large inputs, so unless human memory capacity is infinite, the answer is presumably no. -- BenRG (talk) 18:18, 17 August 2012 (UTC)
- There are two distinct questions: "is the brain a Turing machine," and "can the brain be reasonably-well modeled as a Turing machine?" The Chinese Room thought-experiment contemplates whether these questions are identical. Nimur (talk) 18:43, 17 August 2012 (UTC)
I think the question is intended as the opposite of whether the brain is Turing-complete (it obviously isn't, per the lack of infinite memory). Rather, it's whether a Turing machine could in principle simulate a brain, and yes, philosophers and some mathematicians have spilled a great deal of ink on this question. It's a sub-question of the Church-Turing thesis. These days I'd say most scientists agree that a TM could simulate a brain. See The Emperor's New Mind for one of the few contemporary exceptions. However, like theology, it's not something that can be scientifically or mathematically proven. 67.122.211.84 (talk) 18:48, 17 August 2012 (UTC)
- Thanks very much to everyone who answered, they were all very helpful. I will check out The Emperor's New Mind. Although I'm not sure I agree the question is like theology. We can't scientifically or mathematically prove it one way or another now but I don't see any theoretical reason why we never will be able to. Actually, I think Ray Kurzweil assumes it to be a given that we will one day be able to simulate brains on computers in at least one of his books. He might be another author worth checking into. Mdebellis (talk) 20:51, 17 August 2012 (UTC)
- I agree that there's no reason, a priori, to assume that this is unprovable. We can prove that many types of computational devices are Turing complete. When we know more about how mental computation works — we are still impressively ignorant on that front — we should be able to determine whether it is Turing complete or not. That we cannot do so now is just a reflection of our ignorance; there is nothing metaphysical or supernatural about it. --Mr.98 (talk) 01:32, 19 August 2012 (UTC)
- I don't see how it can be provable. Consider the following religious doctrine: 1) the one true deity has computational capabilities that exceed those of a Turing machine; 2) humans possess souls, and souls are an extension of the deity; 3) in particular, through suitable rituals/prayer/cheese sacrifices, the deity's followers (but nobody else) can receive divine inspiration telling them the Kolmogorov complexity of arbitrary strings, even though this is an uncomputable function. So you might flip a coin 10000 times and claim you have an arithmetically random string with 10000 bits of real entropy, but an adherent of the religion then asserts that the Kolmogorov complexity of your string is actually no larger than 2346 bits and he can know this because he has a soul and contemplated your string deeply before receiving that answer from above. Of course another adherent might tell you the upper bound is 6023 bits, but that just means one of the souls is (unknown to them) not completely in a state of grace so he doesn't get as sharp a bound from the deity. There is no way to disprove any of these assertions. Turing machines of the sizes under discussion are large enough to have instances whose halting problem is independent of any mathematical theory likely to gain any widespread acceptance. A Turing machine simply can't prove that an arbitrary, purportedly random string is actually random. 69.228.170.132 (talk) 19:49, 22 August 2012 (UTC)
Concurrency
Hello, I was wondering, can one actually implement reliable multithreading in software (w/o hardware support, like atomic fetch-increment-write-back instructions or similar)? How does x86 do this stuff (I think there was some operation that was atomic, perhaps XCHG?) Уга-уга12 (talk) 03:23, 17 August 2012 (UTC)
- Amateur programmer answering here - but I think I know the answer. Given that a Java virtual machine for instance can multithread, I think one can take this as a given - but as you say, you need at least one atomic instruction to ensure reliability. Java has for example the wonderfully-named atomic Boolean [1], and presumably this ultimately has to be supported at the hardware level. With regard to the x86 itself, see x86 assembly language#Instruction types: "Contains special support for atomic instructions (xchg, cmpxchg/cmpxchg8b, xadd, and integer instructions which combine with the lock prefix)". AndyTheGrump (talk) 03:51, 17 August 2012 (UTC)
- One builds safe concurrent programs from operations like mutexes (I'm pretty sure that if one has one of mutexes, counting semaphores, and safe queues one can build the others just from that). So your question amounts to "how does one build mutexes". As the mutex article notes, this is typically done with an atomic instruction like test-and-set and compare-and-swap. These do require more of the hardware than do ordinary compare or arithmetic operations - they effectively interlock concurrent execution units while they run. So that, to my mind, counts as "hardware support". As Mutual exclusion#Software solutions says, one can build mutual exclusion without such an instruction. These work in single-processor multithreading environments, but they make assumptions about instruction completion ordering that do not necessarily hold in multi-processor environments. Multiprocessor designs can implement means of supporting these, like memory barriers - but those too require inter-processor hardware support. -- Finlay McWalterჷTalk 13:13, 17 August 2012 (UTC)
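A minimal sketch (C11, illustrative only, not from any of the posts above) of the step described here: building a mutex-like lock from a single test-and-set primitive. atomic_flag_test_and_set is typically lowered by the compiler to one of the interlocked x86 instructions quoted earlier:

    #include <stdatomic.h>

    /* One lock word; ATOMIC_FLAG_INIT is the clear (unlocked) state. */
    typedef struct { atomic_flag locked; } spinlock_t;
    #define SPINLOCK_INIT { ATOMIC_FLAG_INIT }

    void spin_lock(spinlock_t *l) {
        /* Atomically set the flag and get its previous value;
           loop until the previous value was clear, i.e. we took the lock. */
        while (atomic_flag_test_and_set_explicit(&l->locked, memory_order_acquire))
            ;  /* busy-wait */
    }

    void spin_unlock(spinlock_t *l) {
        atomic_flag_clear_explicit(&l->locked, memory_order_release);
    }

    /* Usage: spinlock_t lock = SPINLOCK_INIT;
              spin_lock(&lock);  ... critical section ...  spin_unlock(&lock); */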
- If you're working on a simple, single microprocessor/microcontroller (not a multiprocessor system, etc), you can simulate atomic test/exchange instructions by disabling all interrupts, doing the read and write, and then enabling interrupts again. Since a processor can only change thread if there's an interrupt to break the instruction sequence, this is reliable as long as you can disable all interrupts. This mainly applies if you're not using an operating system; if there is an OS you would use its mutexes, semaphores, etc. --Colapeninsula (talk) 15:32, 17 August 2012 (UTC)
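A sketch of that single-processor, bare-metal technique. disable_interrupts()/enable_interrupts() here are placeholder names standing in for whatever the particular MCU or toolchain provides (e.g. wrappers around cli/sei or CPSID/CPSIE style instructions); they are not real library calls:

    /* Hypothetical wrappers -- substitute the target's own intrinsics. */
    void disable_interrupts(void);
    void enable_interrupts(void);

    static volatile unsigned long shared_counter;

    /* With interrupts masked, nothing can preempt the read-modify-write,
       so on a single core it behaves as if it were atomic. */
    void increment_shared_counter(void) {
        disable_interrupts();
        shared_counter = shared_counter + 1;
        enable_interrupts();
    }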
Thank you everyone Уга-уга12 (talk) 17:26, 17 August 2012 (UTC)
.gif files do not animate on my browser
I suddenly found that my computer, while searching the internet for .GIF files, does not recognise pictures as animated GIFs and ends up not animating them as it should. How can I fix that problem? 72.235.221.120 (talk) 04:37, 17 August 2012 (UTC)
- Are you sure those are animated GIFs ? There are also static GIFs. StuRat (talk) 04:39, 17 August 2012 (UTC)
- Which browser and browser version do you have? Some have settings on whether to animate and whether to loop the animation (e.g. for Firefox, search the web for "image.animation_mode"). 94.101.10.162 (talk) 05:14, 17 August 2012 (UTC)
- In Internet Explorer, if you press ALT + T and go to Internet Options → Advanced and scroll down to Multimedia, ensure that Play animations in web pages is checked.—Best Dog Ever (talk) 05:56, 17 August 2012 (UTC)
- Yes, they are supposed to be animated GIFs. I have Internet Explorer on Windows 7, and "Play animations in web pages" is checked. Oh, and I forgot to mention this gadget:
Digital World Clock: the "name scrolling" option does not work anymore, presumably in conjunction with the failure to display animations. 72.235.221.120 (talk) 07:24, 17 August 2012 (UTC)
- That link links to a jpg and does not animate. - Purplewowies (talk) 07:38, 17 August 2012 (UTC)
- That is supposed to be an example picture. It is the gadget that fails to animate on my computer, along with all GIF files. The picture I posted here is not supposed to animate. 72.235.221.120 (talk) 11:12, 19 August 2012 (UTC)
Ubuntu 12.04 1) Delete Icons from taskbar and 2) Minimize all open windows
I have upgraded to Ubuntu 12.04 (a late upgrade), and that has given me a bunch of headaches.
- In 11.10 I used to delete taskbar icons using Alt + right-click + Del, but it is not working in 12.04. See the unanswered question in the Ubuntu forum.
- In 12.04 Window+D is not working to minimize all open windows!
With my good wishes
Tito Dutta ✉
07:52, 17 August 2012 (UTC)
- I assume you're using Unity (and not GNOME 3). To see a summary of keyboard shortcuts, hold down the super (Windows) key for a second, and it pops up a little reminder pane. For your latter question, the key combination is Window+ctrl+D, and the same again to restore them. -- Finlay McWalterჷTalk 12:10, 17 August 2012 (UTC)
- For your first question, it sounds like you want to remove an icon (for a non-running application) from the Unity Launcher (the bar at the left). First alt+F1 to put keyboard focus into the launcher; then ↑ and ↓ to move to the icon you want to remove, then → to bring up its quicklist. Then arrow to "unlock from launcher" and ↵ Enter. -- Finlay McWalterჷTalk 12:27, 17 August 2012 (UTC)
- Incidentally, I've assumed you prefer keystrokes for this latter task; with the mouse, one right-clicks on the launcher icon and again picks "unlock from launcher". -- Finlay McWalterჷTalk 12:31, 17 August 2012 (UTC)
- Window+ctrl+D is working Great!
- Post number 6 here has a screenshot. I want to remove those launchers. Their suggestion is not working in 12.04. I asked there a few days ago but still have not got any reply. --Tito Dutta ✉ 14:25, 17 August 2012 (UTC)
is Linux platform independent or not?
is Linux platform independent or not? — Preceding unsigned comment added by Ravichandran123 (talk • contribs) 12:02, 17 August 2012 (UTC)
- Independent from what? In what sense? ¦ Reisio (talk) 13:52, 17 August 2012 (UTC)
- Is it just me who cannot make sense of this question? Comploose (talk) 17:50, 17 August 2012 (UTC)
- Strictly speaking, Linux was originally just the Linux kernel, which Linus Torvalds wrote, intended for the GNU operating system. GNU is intended as a Unix-like operating system composed wholly of free software. Theoretically, for a kernel to be platform independent (I assume this is what you meant?) means it need only achieve hardware independence, since you can build an arbitrary software framework atop any given kernel. Bearing that in mind, I believe you will find your answer in List of Linux supported architectures, if I have correctly inferred what you meant to ask. BigNate37(T) 18:48, 17 August 2012 (UTC)
who is the best SEO company chairman in India?
Usually the best company chairman has some principles; it will help us in developing our company. Is it useful for us or not? — Preceding unsigned comment added by Ravichandran123 (talk • contribs) 12:10, 17 August 2012 (UTC)
- Companies associated with SEO (beyond simply properly assembling a website) are unprincipled. ¦ Reisio (talk) 13:53, 17 August 2012 (UTC)
New folders in Windows
On this Windows 7 computer, when I make a new folder now, the data it shows in the columns defaults to the ones for music files. There must be a setting somewhere to have it default to the standard ones for a new folder. Is there such a setting, and where is it? Bubba73 You talkin' to me? 13:09, 17 August 2012 (UTC)
- There is, but in my experience no amount of configuring will make it stay that way. It's been like this since Vista. I heard it was because they changed most of the Explorer view settings backend but never updated the registry part that should control it. I imagine if consistency is what you want, your best bet is to replace Explorer entirely with something else. ¦ Reisio (talk) 13:58, 17 August 2012 (UTC)
I never had this problem with new folders in Windows until about a week ago. Bubba73 You talkin' to me? 15:12, 17 August 2012 (UTC)
- Well I s'pose technically you may have updated your Windows system about a week ago and something changed, but more likely about a week ago was the first time you started playing with new folders in whatever area of the filesystem you are playing in, and that area is one preconfigured to try and be smarter than you. Such areas include the desktop, and almost any top level directory in your profile directory, including anything starting with 'My '. ¦ Reisio (talk) 23:13, 17 August 2012 (UTC)
No, I've made hundreds and hundreds of new folders, and probably none of them were in "My...", in the profile, etc. Bubba73 You talkin' to me? 23:44, 17 August 2012 (UTC)
- That's what I'm saying. Elsewhere Windows doesn't try to be so clever. ¦ Reisio (talk) 00:17, 18 August 2012 (UTC)
- But what I'm saying is the new folders I was having the problem with are not in those areas either. Bubba73 You talkin' to me? 00:54, 21 August 2012 (UTC)
Market research, solar power, battery to USB
I'm trying to find a product in the market, or maybe it doesn't exist. I'm looking for an arrangement where a charging station with some sort of battery is recharged by solar panels, and then my gadgets can charge off that battery via USB. I have a solar panel that charges directly via one USB, but then my (only one at a time) gadget has to be in the same place where the sun is shining, when the sun is shining. I would want to charge the battery via direct sun during the day, and then recharge my gadgets off the battery via multiple USB charging ports at night (or on un-sunny days). Thanks if anyone can point me to such a device... — Preceding unsigned comment added by 94.208.75.76 (talk) 13:55, 17 August 2012 (UTC)
- It might exist, but I should point out that it won't work very well. First, solar panels are highly inefficient, then you get additional inefficiencies when charging its batteries, losses while its batteries sit, and more inefficiency when its battery is discharged/the battery in your device is charged. So, by the time you get to the charge in your device, you might only have something like 1% of the energy that hit the solar panel. Then consider that rechargeable batteries get weaker as they age, and solar panels become less efficient, making the equation even worse. The result is that you'd need a huge solar panel and/or a very low power device to make this work.
- Also, if you're going to schlep a battery along with you, you might as well charge it at home, so why add the additional complexity of the solar panel ? (The only scenario I can see is while camping or otherwise away from electrical outlets for an extended period, in which case hauling a large solar panel around won't be pleasant.) StuRat (talk) 15:30, 17 August 2012 (UTC)
- You can get various solar powered devices (just Google "solar powered" or put "solar powered" in the Wikipedia search box), which would be a more efficient way of charging them, but apart from very low power devices like a wrist-watch (I've had a very good solar-powered Casio for several years) this is mostly just a marketing gimmick with disappointing results.--Shantavira|feed me 09:24, 18 August 2012 (UTC)
Adobe_Flash#Alternatives is rather incomplete. What alternatives are there?
I don't mean alternatives to the Adobe Flash tool for creating Flash, nor to the Adobe Flash player for viewing it. What other alternatives are there to Adobe Flash besides the ones linked there (HTML5, JavaFX and Silverlight)? Wouldn't jQuery be kind of an alternative? jQuery could be even better, since it doesn't require installing anything, just turning JS on. Comploose (talk) 17:32, 17 August 2012 (UTC)
- jQuery isn't a platform, but sort of an 'API' for building web applications. So jQuery would fall under HTML(5). Unilynx (talk) 18:02, 17 August 2012 (UTC)
There's Gnash, Lightspark, and combining various HTML/CSS/JS things. ¦ Reisio (talk) 22:54, 17 August 2012 (UTC)
- Reisio is wrong. Gnash and Lightspark are clearly alternatives to the Flash player, not to the Adobe Flash platform. The OP didn't ask about that. BTW, the Adobe Flash article is not incomplete. There are just these alternatives, which do not completely cover the functionality of Flash. OsmanRF34 (talk) 21:22, 18 August 2012 (UTC)
- The thing that makes this very hard to answer is that Flash does many things. Depending on what you are trying to replicate, different software does the trick. If the functionality you care about is scripting, there are lots of alternatives. If it's playing movies, lots of others. If it's doing complicated vector graphics, there are others for that. The three linked to there do more or less all of those things; the other ones you've mentioned do some of those things but not all of them. jQuery can't play a movie, for example, in the way that Flash can, nor can it do as many complicated vector operations. You could use it in conjunction with other technologies, though, to replicate many of the functions provided by Flash. --Mr.98 (talk) 01:29, 19 August 2012 (UTC)
Scared straight for kids on the Internet
I'm worried about my young nephews, Huey, Dewey and Louie. Is there a Scared Straight for kids that warns them about what not to do on the Internet?
Personally I don't have time to hang out on all the social media sites and find out the pitfalls of each. I work with databases. You've heard of weaving straw into gold? Well I'm brought in after the horses have already processed the straw and I'm expected to take their output and turn that into gold. Hcobb (talk) 17:41, 17 August 2012 (UTC)
- Are you Donald Duck, by any chance ? There was a PSA on TV in the US which featured a girl who "sexted" her b/f, only to find the pic posted on the school (physical) bulletin board, and magically reappear every time she tried to remove it, with the perverted janitor taking a copy. Other than earning the ire of school janitors, that ad seems to get the point across about how anything you send electronically is accessible by everyone, forever. StuRat (talk) 17:50, 17 August 2012 (UTC)
- On top of that, face recognition is getting better and better. So, pictures of you drunk or doing drugs when you were younger will be found some day. Comploose (talk) 21:45, 18 August 2012 (UTC)
- This is one of the reasons I wear that leather face mask in all the pornos I make. :-) StuRat (talk) 02:12, 20 August 2012 (UTC)
Norton Security Suite
My internet provider offers Norton Security Suite as part of the subscription. Is it a good choice to provide antivirus, firewall, antispyware, etc? The PC previously had Windows Defender, which I hear is more of an antispyware than an antivirus. Should Defender not be used if Norton is installed? Thanks. Edison (talk) 19:10, 17 August 2012 (UTC)
- It seems to work for me. I have seen reviews of Norton indicating that Norton has improved greatly in recent years. I would not run two different anti-malware products that are designed to do the same thing; they might conflict with each other. Jc3s5h (talk) 19:34, 17 August 2012 (UTC)
It's okay if you aren't paying (extra, in this case) for it, but keep in mind that if you ever change providers you might no longer be able to use this application for "free", and might be motivated to replace it, which means retraining yourself (if ever so slightly). Good free antivirus apps are Avira and Avast! ¦ Reisio (talk) 22:58, 17 August 2012 (UTC)
- I use Norton because I get it free from my ISP. It works for me but some people complain about it a lot. It is much better than Windows Defender, and don't use both at the same time. From time to time I also run Windows Safety Scanner and MalwareBytes. Bubba73 You talkin' to me? 23:41, 17 August 2012 (UTC)
- It's not my favorite even among commercial antivirus applications, but to be fair what really makes Norton a pain is if you decide you want to replace it with something else (that is, if you decide to not pay to use it for eternity), as attempting to uninstall it (that's right, attempting) can often not only not uninstall it, but break your networking! If you like Norton and are happy paying for it (or for whatever bundles it in for free) for the foreseeable future, that seems sane enough. ¦ Reisio (talk) 00:35, 18 August 2012 (UTC)
- The old PC seems to be damaged and not worth repairing. Is it still common to use a desktop, or do most users go for a laptop? I note that a big monitor and a conventional keyboard can be connected to a laptop to emulate a desktop. Are desktops more powerful, for a given purchase price? Edison (talk) 02:25, 18 August 2012 (UTC)
- Until fairly recently, in the UK at least, you got more computing power for your money with a desktop, but laptop prices have continued to fall so that there is now little difference in price for the same power. You will still pay more in total if you need to buy a separate large monitor. Dbfirs 06:58, 18 August 2012 (UTC)
- My experience (in the US) is that the only thing laptops have going for them is portability. Desktops have better CPUs, more memory, bigger and faster hard drives, and more USB ports. They are more expandable, and adding more memory or a bigger HD is possible and cheaper on a desktop. In addition, you have a real keyboard, a real mouse, and a real monitor without having to buy additional ones. Bubba73 You talkin' to me? 16:25, 18 August 2012 (UTC)
- Have to agree with Bubba73 here, except to say laptops may also have an energy-efficiency advantage (you can get energy-efficient desktop CPUs and systems, but they're rare, whereas most products intended for laptops are energy efficient), and perhaps to note that I expect you're more likely to find a laptop with an SSD. Of course most systems nowadays are fairly energy efficient compared to 5 or so years ago. There are also a few other minor possible advantages, e.g. possibly non-USB SD card readers (although that can be a disadvantage for compatibility in other areas). Nil Einne (talk) 07:12, 19 August 2012 (UTC)
- Yes, I guess you are both correct that the differential has not yet shrunk to zero, but I can remember the time when a laptop cost many times the price of an equivalent desktop. I haven't noticed much difference in USB ports, or in memory or hard drive size recently, but I agree that desktops often have faster processors for the same money, and the expandability is an advantage. Dbfirs 16:36, 19 August 2012 (UTC)
Inadvertent reinstallation of Windows XP
My usual PC suddenly quit opening Windows XP a month or so ago, so I've been using a borrowed laptop. The PC said to insert the installation disc. With that in place, it would open OK and I could access files and programs. But then somehow it went ahead and re-installed Windows XP. Now there is no sign of my old files or programs, except what came with the new Windows XP installation. Should it be possible to regain access to the old files and programs? Edison (talk) 19:47, 17 August 2012 (UTC).
- Usually the first step in the install process is to format the disk partition. If it did so, and your files were there, then they are gone. It might technically be possible to restore those portions which weren't overwritten by the new O/S, but it's not likely to be feasible, unless those were some rather valuable files. StuRat (talk) 19:56, 17 August 2012 (UTC)
- If the files are valuable and you are going to try to recover them, then don't use your new windows installation, but connect your hard drive to another computer for recovery. A high-level format doesn't actually delete the data, but it might be difficult to piece it together from fragments, and any files overwritten by the new installation will not be recoverable by any method. After something similar happened to me in the early days of Windows, I always save my data on a separate partition (sorry, I know that's not helpful to you just now), though Microsoft often makes this difficult because all defaults are set to save in subfolders of Windows. I suppose it's too late to mention regular backups, and unless you saved a disk image (or at least the old registry), the old software will have to be reinstalled. Sorry we can't give you much hope of recovery. There is software available that will recover pictures (e.g. Convar's PC Inspector Smart Recovery), and List of data recovery software might have something similar for other files. Dbfirs 21:03, 17 August 2012 (UTC)
- If you had partitions set up and the installation process wiped them out, you may wish to look at TestDisk, which is an excellent open source tool for partition and file table recovery. BigNate37(T) 21:10, 17 August 2012 (UTC)
- Just to clarify some of what was already said: the format during Windows' installation does not overwrite data, but the actual writing of the OS files does. So you've probably lost data equal to the size of the Windows OS files, more likely at the beginning of the disk (which is more likely the previous OS files, and other older files of the previous system). ¦ Reisio (talk) 23:06, 17 August 2012 (UTC)
- The format may not delete them, but it makes them inaccessible, even if they weren't yet overwritten. This is unless you're willing to resort to forensic methods to try to save portions of them. StuRat (talk) 23:35, 17 August 2012 (UTC)
- It'd be a pain, yeah. ¦ Reisio (talk) 00:02, 18 August 2012 (UTC)
OSX 10.4/5/6 - Override "interactive" login and switch to textbox, username/password domain-enabled login
Our school has a domain login system set up so that users can login to their school account from most of the computers in the building. However, some of the class machines (ones that aren't considered "public") are set to an interactive, user-tile login screen that only allows login to local accounts. The only way to switch them to a domain login screen is to log in as an administrator and change the login options from inside the System Preferences, which is semi-permanent until someone goes back and changes it. Is there a shortcut or other method to perform a one-time override and force the system to show the domain login screen? (Kind of like double-tapping Ctrl+Alt+Delete in most versions of Windows XP) Hmmwhatsthisdo (talk) 22:11, 17 August 2012 (UTC)
- I'm sure this is possible; I remember doing it myself in the past. According to this page, the shortcut is ⌥ Opt+Esc, followed by clicking on a user. — Earwig talk 19:33, 19 August 2012 (UTC)
stripping a video of red and green, keeping blue channel only
I started with a blue-and-black-only video and converted it to MPEG to upload to YouTube, so I can put my iPhone next to a 3"x4" arena of Drosophila and image them with red light. However, light from the red channel seems to be leaking through, because it is visible on my camera, which has been optically filtered to accept red light only. It is absolutely required that there be zilch/nada/zero output coming from the red channel, otherwise it will disrupt the imaging of the flies in red light. The blue light is to drive fly behavior only. 137.54.1.117 (talk) 22:51, 17 August 2012 (UTC)
Maybe I should ask, how would I completely remove the red and green channels from my video? 137.54.1.117 (talk) 23:12, 17 August 2012 (UTC)
- I'd say making it lossless would be a good start, and MPEG formats just aren't usually meant for that. Maybe a lossless JPEG2000 sourced MJPEG would be a better idea. ¦ Reisio (talk) 23:27, 17 August 2012 (UTC)
- Most video data is not stored as RGB pixels (as you may have become accustomed to with still image formats). So it's nontrivial to extract a "red only" "channel." It can be done, but you should be aware of the caveats.
- Even JPEG uses a subsampled YUV colorspace (not RGB values for each pixel; see the conversion sketched after this comment), so JPEG isn't a great choice if you're planning to filter specific colors. Motion JPEG has an advantage over MPEG because it uses no interframe coding, but each individual frame is still compressed with a pixel-position-altering, color-altering lossy JPEG algorithm. For scientific image processing, you should use uncompressed, unconverted video if possible; or tread very carefully with the image data.
- You might try putting a blue-only filter on the display screen when you are filming/capturing the test scene, and a red-only filter on the camera - to help isolate the signals you care about before you even capture the video. Nimur (talk) 00:26, 18 August 2012 (UTC)
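For reference, the conversion behind that YUV (Y'CbCr) representation looks roughly like the BT.601 full-range form below; exact scale factors and offsets vary by codec, but the point is that "red" is never stored as its own plane, only mixed into the luma and the two difference signals:

    Y' = 0.299 R' + 0.587 G' + 0.114 B'
    Cb ≈ 0.564 (B' - Y')   (plus a fixed offset)
    Cr ≈ 0.713 (R' - Y')   (plus a fixed offset)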
- RE: lossy JPEG: hence JPEG 2000 (or other lossless JPEG), but I couldn't comment on whether you can actually easily create an MJPEG…2000. :p ¦ Reisio (talk) 00:41, 18 August 2012 (UTC)
- JPEG2000 can use a lossless scheme only after color-space conversion, which might contain roundoff errors! Another caveat to the uninitiated video-processing engineer! Nimur (talk) 01:08, 18 August 2012 (UTC)
- Heheh, woe! If it were me I'd use a light that only emitted the wavelength I wanted and obscure as required. :p ¦ Reisio (talk) 02:18, 18 August 2012 (UTC)
- But, this is all moot: even if the video data is perfect, or placed on screen in RGB colorspace, the individual pixels may still leak colors... because it sounds like the real problem is the display screen, not the source video. LCD panels have white backlights and liquid-crystal pixel color arrays. They aren't designed to be monochromatic or even narrow-band color filters: they're designed to look good to the human eye. Even a phosphor screen CRT would have better isolation because it has no backlight. I'd also worry about the polarization that may come out of an LCD screen: isn't polarization a physical property known to affect insect vision? Nimur (talk) 15:23, 19 August 2012 (UTC)
- Perhaps an AMOLED phone would be a better bet.... However, looking back at the older post, since they just want stripes, I can't help wondering if something like a few stripes of RED LEDs would be better, even if slightly more difficult to design. Nil Einne (talk) 16:37, 19 August 2012 (UTC)
- An optical filter isn't going to be 100%. Sounds like you need to digitally filter it instead (which is pretty much what you are asking about, I'm just pointing out the futility of the optical filter). StuRat (talk) 23:31, 17 August 2012 (UTC)
- I think optical filters—a blue one on the screen and a red one on the camera—are the only hope here. Unless you're using super-expensive calibrated equipment, you can't expect to show blue on a screen and record it on a camera without leakage into the R and G channels. LCD subpixels emit a pretty broad spectrum, and there's no range of the visible spectrum that's detected by only one of the camera's three sensor types (otherwise it couldn't distinguish hues in that range). Cameras do a lot of internal processing of the raw sensor data even before JPEG compression. -- BenRG (talk) 02:09, 18 August 2012 (UTC)
- I could certainly write a program to read a pic, strip it of all colors but the desired one, save it again, and create an animated GIF from those frames. That would just need a pre-processor to split the video into single frame images, and a post-processor to convert the animated GIF to your desired format. How many frames, at what res, are we talking about, though ? (Too many would make it take too long to process.) StuRat (talk) 02:31, 18 August 2012 (UTC)
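A minimal sketch (C, illustrative only) of the per-frame step StuRat describes, assuming each frame has already been decoded to a packed 8-bit RGB buffer; the decode and re-encode stages around it are omitted:

    #include <stddef.h>
    #include <stdint.h>

    /* Zero the red and green components of every pixel, keeping only blue. */
    void keep_blue_only(uint8_t *rgb, size_t width, size_t height) {
        size_t npixels = width * height;
        for (size_t i = 0; i < npixels; i++) {
            rgb[3 * i + 0] = 0;   /* R */
            rgb[3 * i + 1] = 0;   /* G */
            /* rgb[3 * i + 2] is B -- left untouched */
        }
    }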
- A little red light will still escape from an LCD monitor displaying only blue - view it at a steep angle above or below to see. A CRT monitor shouldn't have that problem. For either monitor, if they have a VGA connector, you can trim out the blue pins on a cable in order to make sure that no blue signal ever makes it to the display. Not a software solution, but simple, effective and easy to apply to any source. 209.131.76.183 (talk) 15:39, 20 August 2012 (UTC)
- I should have said cut the red and green pins, to make sure only blue gets through. 209.131.76.183 (talk) 15:40, 20 August 2012 (UTC)
- As I said above, this won't work. CRT phosphors are not even close to monochromatic (see File:CRT_phosphors.png for an example) and neither are the camera's sensitivity curves (see here for an example). -- BenRG (talk) 17:15, 20 August 2012 (UTC)
- That is a nice chart - I was thinking the phosphors were monochromatic. It looks like a filter on the display is the way to go. 209.131.76.183 (talk) 13:37, 21 August 2012 (UTC)