
Wikipedia:Reference desk/Archives/Computing/2016 March 27

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


March 27


Why aren't computers and phones advertised by GFLOPS and fillrate or benchmarks?


Sure, GFLOPS can be gamed somewhat, but so can megapixels and many TV specs, and those are still used. And phone marketing at least tells you how many milliamp-hours the battery has, which seems even more jargony than pixels per second, so why not use that too? At any rate, people don't have to know what mAh actually means; they just have to know that this number is bigger, so this battery is better. (Why they don't use joules or watt-hours is beyond me.) What is wrong with manufacturers? Why aren't they bragging about how many GFLOPS their phones can do? It could only help make people's equipment feel outdated so that they buy stuff whether they need to or not. How is calling an improved processor i3 every year supposed to encourage them to buy things? At least car ads tout the tiniest differences in hp or torque. Sagittarian Milky Way (talk) 04:06, 27 March 2016 (UTC)[reply]

There is no standard benchmark for "GFLOPS" or "fill rates" - it depends entirely on the application structure. Also, many many applications have an almost pure integer workload, so GFLOPS is not even a good proxy. Megapixels (assuming it refers to screen size) are countable. mAh are measurable (although also quite irrelevant if you don't know what your phone needs). I can speculate that they use mAh because maybe that is the relevant value - the voltage of any battery has a theoretical maximum determined by its chemistry and layout, but in practice is somewhat variable and typically drops over time. But if every clock cycle transports a (nearly) constant amount of charge, mAh is a better measure than Joule. --Stephan Schulz (talk) 11:53, 27 March 2016 (UTC)[reply]
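To make the "no standard benchmark" point concrete, here is a minimal sketch of the peak-GFLOPS arithmetic. All of the core counts, clocks and FLOPs-per-cycle figures below are made-up illustrative values, not any real chip; the point is that the per-cycle term depends entirely on instruction mix, so one chip can honestly be quoted at very different numbers.

```python
# Theoretical peak GFLOPS = cores * clock (GHz) * FLOPs retired per cycle.
# The per-cycle figure varies with instruction mix (wide SIMD fused
# multiply-adds vs. plain scalar operations), so the same hypothetical
# quad-core chip yields very different "peak" numbers.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

quad = dict(cores=4, clock_ghz=2.0)
print(peak_gflops(flops_per_cycle=16, **quad))  # SIMD-FMA peak: 128.0
print(peak_gflops(flops_per_cycle=2, **quad))   # scalar workload: 16.0
```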
Maybe you're right about mAh. I had thought they were just gaming the specs to obfuscate the amount of energy in the phone to make it look better (since my cheapest phone had a lower-voltage battery than its upgrade). Similar to phone cameras adding megapixels even after reaching the limit of the optics, or very reflective TVs measuring contrast ratio in a pitch-dark room. Sagittarian Milky Way (talk) 17:45, 27 March 2016 (UTC)[reply]
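The energy point can be made concrete: stored energy is roughly capacity times nominal voltage, so a larger mAh figure at a lower voltage can actually mean less energy on board. A minimal sketch with hypothetical battery numbers:

```python
# Energy in watt-hours = capacity (mAh) / 1000 * nominal voltage (V).
# Two hypothetical batteries: the one with the bigger mAh number
# actually stores less energy because of its lower nominal voltage.

def watt_hours(mah, volts):
    return mah / 1000 * volts

print(watt_hours(3000, 3.7))  # approx. 11.1 Wh
print(watt_hours(3200, 3.0))  # approx. 9.6 Wh - bigger mAh, less energy
```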
How do you know the voltage was different? The nominal voltage published on the battery is normally pretty meaningless. There can be minor differences in voltage levels (in particular the maximum voltage and possibly the cutoff voltage) relating to safety and battery-longevity decisions made by the manufacturer, but often these aren't that significant. Probably the biggest factor is that some manufacturers just plain lie about the mAh ratings, although this isn't actually that bad with most phones, even cheapish Chinese ones AFAIK. (Third-party Chinese batteries are another thing.) As mentioned, the other factor is that it doesn't tell you how long your phone will last in a given circumstance unless you know a lot more, and a phone with a higher-capacity battery could easily not last as long due to higher power usage even when basically doing the same thing. (Various runtime figures are normally also published, although these also suffer problems.) Nil Einne (talk) 16:04, 29 March 2016 (UTC)[reply]
Related articles: Megahertz myth & BogoMips. I am surprised I couldn't find an article about the bitwars. I probably used a wrong search term. The Quixotic Potato (talk) 19:28, 28 March 2016 (UTC)[reply]

Dual monitors now different resolutions

File:Screenshot showing my problem.jpg
Screenshot showing the problem

I have a dual-monitor Windows 10 system. I had a DVI monitor as my primary monitor and a VGA on the right as a secondary monitor. Both were at 1920x1080. I replaced the VGA monitor with another one (with a physically larger screen). When I did that, the primary and secondary monitor designations switched. I got them arranged back like I want, but now the DVI monitor is at 1536x864 instead of 1920x1080! See the screenshot of Speccy. The Advanced Display Settings shows: "multiple displays - extended" and "resolution: 1920x1080 (recommended)".

I've tried rebooting the system and powering the monitor completely off and back on - nothing fixes it.

How can I get both monitors at 1920x1080? Bubba73 You talkin' to me? 05:59, 27 March 2016 (UTC)[reply]

I found that the DVI monitor (only) was set to 125% font size. Setting it to 100% fixed the problem. Bubba73 You talkin' to me? 06:30, 27 March 2016 (UTC)[reply]
Resolved
It's quite a broken design that changes font sizes by changing screen resolution - one of those quick hacks with a "fix later" comment that gets shipped anyway and haunts you for decades... --Stephan Schulz (talk) 17:53, 27 March 2016 (UTC)[reply]
It doesn't change screen resolution and AFAIK never has.

Since Vista, programs which don't advertise themselves (in the manifest) as being aware of the DPI setting are presented with a lower resolution, but the actual screen resolution remains the same. This was at least partially because probably 99.9% of programs (including Microsoft's) didn't actually do anything in response to the DPI setting. So to ensure scaling actually worked, the solution was to render the window for a lower resolution and upscale it. The results are uglier, but at least they are actually the right size (in terms of UI elements etc.).
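The behaviour described above (render the unaware program's window at the smaller logical size, then stretch it back up) can be sketched with a toy nearest-neighbour upscaler. This is an illustration of the general idea only, not Windows' actual compositor code, and it handles only whole-number scale factors:

```python
# Toy model of the scaling applied to DPI-unaware programs: the window is
# rendered at the logical (smaller) size, then stretched back up with
# nearest-neighbour sampling - correct UI sizes, but blocky/blurry output.

def upscale(pixels, factor):
    """Nearest-neighbour upscale of a 2-D list of pixel values."""
    out = []
    for y in range(len(pixels) * factor):
        row = [pixels[y // factor][x // factor]
               for x in range(len(pixels[0]) * factor)]
        out.append(row)
    return out

small = [[1, 2],
         [3, 4]]
print(upscale(small, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```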

After the system was changed in Vista, along with improved APIs for DPI scaling, things improved slightly, but actually not that much. Some programs claimed to be DPI aware but did a poor job, or no job, of actually scaling the UI. In some cases (e.g. games, video players and perhaps stuff like Photoshop, where accurate pixels definitely matter) perhaps this made sense (in that it was a choice between a tiny UI or a screwed-up program). In others, it did not. Many programs correctly don't present themselves as DPI aware, so are scaled and will see a lower monitor resolution. (For games, Windows will generally automatically disable DPI scaling in the compatibility settings, although you may need to start the program once.) A tiny number of programs actually did decent scaling and presented themselves as such. (I guess initially there were a tiny number of programs which actually did DPI scaling but didn't have the manifest setting, so were screwed up. Although if they were at all common, I suspect they were detected during MS compatibility testing and added to the list, and the Windows scaling was disabled.)

With Windows 8, there was more attention particularly from Microsoft. Also with this and with the rise of high PPI displays, other software developers started to actually pay attention so there was finally some real improvement particularly with web browsers.

Things changed again in Windows 8.1, given the additional support for a per-monitor DPI setting. Now programs need to be able to handle per-monitor DPI changes when windows move between monitors. Programs which don't present themselves as being per-monitor aware are scaled if needed. I believe the highest DPI at logon is presented as the default DPI. So programs which aren't aware of the per-monitor setting but are aware of the general DPI will be presented with the higher DPI, and the resolution for the monitor which would be needed for that DPI, and then downscaled appropriately. So a bit more work and slightly uglier, but without the blurriness due to upscaling if programs were presented with the lower DPI and upscaled as needed. (Admittedly this doesn't seem to correlate with the OP's experience, so I could be wrong, or maybe the DPI setting wasn't set at login, or something else went wrong.) Programs which aren't aware of the DPI at all will be treated as they used to be (presented with the resolution needed to give the DPI when scaled). Programs which present themselves as per-monitor aware will be left be.

You can turn off the automatic scaling of unaware programs by changing the compatibility setting for that program. (You can also change the manifest but that's much more involved.) I think there may be a way to turn off the forced scaling for unaware applications completely but can't remember offhand. (I tried to search but just found people talking about setting to default DPI completely which isn't the point.) It's definitely possible to turn off the per monitor setting and only have a single DPI.

Of course you wouldn't have to worry about forced scaling (or apparently incorrect resolutions) if all programs actually implemented proper scaling, but even now many don't, and ultimately there's nothing that Microsoft can do about that. (You'd still need to make sure you have an appropriate DPI setting.) Admittedly I think there are still some Microsoft programs which don't seem to quite handle DPI changes properly. In more general terms, one factor is that 125% is small enough that most people don't care if the UI is a bit small, so they often prefer no scaling to the unsharp forced scaling for unaware programs. Once you start to get to 150%, let alone 200%, it becomes much less acceptable. The allowance of non-integral settings obviously complicates scaling, but my understanding is the APIs are actually fairly decent, just poorly used. (One factor is that legacy APIs, open-source APIs and other reasons mean there are many different ways of doing things on Windows which are still used for modern programs. It also seems a lot more accepted to use a non-standard OS API on Windows than it is on an OS like OS X.)

P.S. I probably didn't explain poor programs well enough. For some of them, they can become unusable because you can't properly read the text as it's cut off by other UI elements. Unfortunately there's no good solution for these programs other than not using them, AFAIK. I don't think Microsoft ever implemented a way to tell such a program the DPI was standard and then use the automatic scaling (or no scaling - but that is mainly an option for large scalings, since small ones might be ugly but mostly usable) rather than relying on the program's borked internal scaling.

Nil Einne (talk) 23:05, 27 March 2016 (UTC)[reply]

I think that what happened is that I had 125% size set on the DVI monitor before the change. Somehow when I changed out the VGA monitor, it took over and the system adjusted the resolution of the DVI monitor. Note that it went from 1920x1080 to 1536x864 which is dividing by 1.25 in both dimensions. Bubba73 You talkin' to me? 23:05, 27 March 2016 (UTC)[reply]
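That division is exactly what a 125% scale factor produces: the logical resolution presented to a DPI-unaware program is the physical resolution divided by the scale factor. A quick sketch of the arithmetic (nothing Windows-specific):

```python
# Logical resolution seen by a DPI-unaware program = physical / scale factor.
def logical_resolution(width, height, scale):
    return (round(width / scale), round(height / scale))

print(logical_resolution(1920, 1080, 1.25))  # (1536, 864) - the OP's case
print(logical_resolution(1920, 1080, 1.0))   # (1920, 1080) - 100%, no change
```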

OFC


What type of OFC (optical fibre cable) is required for 4 Mbps unshared, uncompressed, symmetrical, dedicated leased internet access last-mile connectivity? What are the different options for OFC, and how do the costs compare? What accessory equipment will be needed at the customer premises? I understand a router is redundant, though the service provider tries to push-sell one. An accelerating UTM and a layer 3 switch are needed if I want high network security and content filtering. Please advise about the different, especially the most cost-effective and efficient, OFC and related equipment. The vendors are advising multimode Tx/Rx OFC and say that 6 joints will be required. Please answer. 150.107.176.227 (talk) 10:43, 27 March 2016 (UTC)[reply]

This is a question for your service provider and your network architect; I doubt anyone here will be bothered doing your research for you. Vespine (talk) 21:52, 29 March 2016 (UTC)[reply]
Are you the same questioner from Bengal that was asking design questions earlier? A layer 3 switch may not have all the same functions and capabilities as a router, but unless you know what to do with them, perhaps you will not be using those functions. Are you the only IT support person in your organisation? Graeme Bartlett (talk) 21:07, 31 March 2016 (UTC)[reply]