4th February 2017, 9:49 PM
(This post was last modified: 5th February 2017, 10:46 AM by A Black Falcon.)
Dark Jaguar Wrote:Windows 7 really is Vista but better, because the interface fixes numerous issues Vista had, making it more user friendly.

Didn't Vista fix a lot of the issues people had with it later on anyway? I never wanted to move up to 7 because, first, I've never changed OSes without getting a new computer (though I'm sure I could figure that out, or get a new drive and install it there), and more importantly, I didn't see any benefit. Now I can see a bit of benefit: for instance, I could run Windows 7 on this computer if I had a copy, but I can't run Vista, because as I said there aren't motherboard drivers for Vista anymore. And I'm sure 7 is so much better than 10 in all of the ways I'm already disliking 10, since Vista certainly is! From the "flat" OS graphical design style to the abysmal Start menu, 10 is kind of bad in some important ways.
Quote:I switched to Windows 7 from XP years ago. The Core Duo was a 64 bit processor, after all. Funny thing is, x64 includes the full 32 bit instruction set, and the full 32 bit instruction set includes the full 16 bit instruction set, so all 64 bit processors can run 16 bit code. However, when the processor is running in "64 bit mode" it only provides the 32 bit set. Run it in 32 bit mode and it provides the 16 bit set. It's an awkward setup for those of us who like backwards compatibility. There is one workaround. If your 64 bit processor supports hardware-based virtualization instructions, you can get the processor running in 64 bit mode and 32 bit mode "simultaneously", using one set for the 64 bit OS and another for the virtualized 32 bit OS. That means you can run your 16 bit programs on the virtualized 32 bit OS just fine, or even run a virtual instance of Windows 3.11 or some version of DOS. If you want a free virtualization client, try out VirtualBox. Keep in mind that virtualization isn't emulation (with a few exceptions); it runs natively off your existing hardware, so long as the hardware supports virtualization. Think of it as a backup plan for those 16 bit programs you wish you could run, but also keep in mind driver support for modern devices like graphics cards and sound cards on those older OSes. That's where a little bit of emulation trickery comes in handy.

VirtualBox... sure, something like that would be great if it works. It's really unfortunate that MS chose to lock out 16 bit software, though; is there a good reason not to support it? That reminds me of low resolutions: on my Vista computer, nothing will run fullscreen under 640x480, I believe, so old Windows games that try to run fullscreen at 320x240 won't work. It's a real pain, and is one reason why I still need that WinME machine, for games like those...
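On the virtualization point above: whether that workaround is even available depends on the CPU advertising hardware virtualization, which on Intel chips shows up as VT-x (the "vmx" CPU flag) and on AMD chips as AMD-V ("svm"). A minimal sketch of how you might check for it on Linux by parsing /proc/cpuinfo; on Windows, VirtualBox itself will tell you in a VM's settings, and the function name here is my own, not from any tool:

```python
# Sketch: detect hardware virtualization support on Linux by
# looking for the "vmx" (Intel VT-x) or "svm" (AMD-V) flag in
# /proc/cpuinfo. Linux-specific; a hypothetical helper, not a
# real library API.

def has_virt_flags(cpuinfo_text: str) -> bool:
    """Return True if any 'flags' line lists vmx or svm."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags or "svm" in flags:
                return True
    return False

if __name__ == "__main__":
    with open("/proc/cpuinfo") as f:
        print("virtualization supported:", has_virt_flags(f.read()))
```

Even when the flag is present, it sometimes has to be enabled in the BIOS/UEFI setup before VirtualBox can use it.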
Quote:M.2 (or U.2, depending on which format wins out) is pretty much a necessity moving forward. SATA has reached its limits as far as speed goes. I'm aiming for M.2 (which my motherboard supports) down the line once flash drives reach a certain point. What I can say is there's really no point hooking up an HDD to M.2 yet, because HDDs just aren't fast enough to need it.

I probably mentioned something about this at the time, but unfortunately, the CRT monitor I used through the '00s, a Dell that came with the WinME computer I got in '01, died several years ago. When that happened, I used a not-very-good mid-'00s Dell LCD (it uses a regular VGA plug, not DVI, and is certainly a TN panel) for a bit, but it's awful, so I researched monitors and got a new one not long afterwards. That newer monitor is now my main one, while I still use the Dell LCD as a second screen, since it's often handy to have two.
Quote:I want to know one thing though. Do you have a new monitor? You mention you use two, but is either of those a decent modern-resolution monitor? I only just upgraded to a 1080p monitor of decent size myself. I made sure it had the lowest latency I could manage. It works pretty well, but I do miss those perks of a CRT monitor. You know, better blacks, contrast, colors, viewing angle, response time, and biggest of all, scaling of various resolutions. NVidia still doesn't support a basic integer scaling mode, preferring to fill up the whole screen and apply a blurry filter over the result. I've found some workarounds to manually "double" the resolution on a game-by-game basis while turning off scaling entirely in the graphics card, but it isn't a universal solution. I mean, I do still have my old CRT, but both NVidia and AMD are in a mad rush to kill off analog signaling as quickly as they can (and my old CRT doesn't support DVI or any form of digital signal, really), so I can't hook that up as a backup for older games. A thorny situation, so thorny...
The newer monitor is an Asus ProArt, 27" if I remember right. It's an IPS panel with a 16:10 aspect ratio, so the native res is 1920x1200. It's 'only' a 60hz monitor, but that's fine on an LCD; on the CRT, 60hz was eye-hurting and I always ran it higher, at least 75hz, but an LCD doesn't flicker the way a CRT's refresh does, so 60hz doesn't bother me there. I got it because the Dell LCD's colors look terrible compared to the CRT I had before, so I wanted a monitor with colors as good as LCDs get, which means an IPS panel. I know it doesn't match the colors of a CRT, but it does look pretty good, certainly far better than the other monitor I have.
As for resolution scaling, though, yeah, there's nothing you can do about that; all LCD screens look kind of awful when they scale resolutions. It's unfortunate, and it definitely makes me wish I still had a CRT monitor that works, but sadly I don't. I have a few CRT TVs, but not a computer monitor... too bad.
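For what it's worth, the "integer scaling" the quoted post wishes NVidia supported is simple to describe: every source pixel becomes an s-by-s block, so 320x240 lands exactly on 640x480 with no blurry filtering. A rough sketch of the idea in plain Python (pixel values in nested lists, purely illustrative, not any driver's actual API):

```python
# Sketch of nearest-neighbour integer scaling: each source pixel
# is repeated into an s x s block, so edges stay sharp instead of
# being smeared by a bilinear filter.

def integer_scale(frame, s):
    """Upscale a 2D grid of pixel values by an integer factor s."""
    out = []
    for row in frame:
        # Widen the row: each pixel repeated s times...
        scaled_row = [px for px in row for _ in range(s)]
        # ...then repeat the widened row s times to scale vertically.
        out.extend(list(scaled_row) for _ in range(s))
    return out

lowres = [[1, 2],
          [3, 4]]
print(integer_scale(lowres, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

A real implementation would run on the GPU, of course; the game-by-game workarounds mentioned above (custom doubled resolutions with GPU scaling turned off) approximate the same effect.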
Oh, on the issue of response times: I don't know, it seems fine? Given how bad I am at noticing framerate issues unless they get really severe (think of how I'm often unable to tell the difference between 30fps and 60fps), I'm not sure I'd notice response-time problems much... unless I just haven't had a TV or monitor that's really bad at them? Not sure.