I wrote this over on my neowin blog:
http://www.neowin.net/forum/index.php?automodule=blog&blogid=8&showentry=2280
Bottom line: Given the choice between an SSD or running 64-bit with 8 gigs of RAM, get the extra RAM.
This part is true. An application runs on a single core unless it's designed with multiple threads.
I should point out, however, that having dual cores does help responsiveness a bit, since the OS can spread tasks between the cores. This has diminishing returns as the number of cores increases, though. The best approach is to use applications actually designed for multiple cores.
Well, it can theoretically work like this, yes, but realistically most OSes do things a bit differently. You CAN do this by setting processor affinity yourself (or by using software that sets it for you), but most OSes simply try to level out CPU usage, paying more attention to how much CPU an application is taking than to whether it's in the foreground or not.
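To make that concrete, here's a minimal Windows-flavored sketch (my own illustration, not from the blog) of pinning the current process to core 0 with SetProcessAffinityMask:

```cpp
// Sketch: restrict the current process to a single core on Windows.
#include <windows.h>
#include <iostream>

int main() {
    // Bit 0 set = only logical processor 0 may run this process's threads.
    DWORD_PTR mask = 0x1;
    if (SetProcessAffinityMask(GetCurrentProcess(), mask)) {
        std::cout << "Process pinned to core 0\n";
    } else {
        std::cout << "SetProcessAffinityMask failed: " << GetLastError() << "\n";
    }
    return 0;
}
```

Task Manager's "Set Affinity" context-menu entry does the same thing without any code, which is what most of that affinity-setting software boils down to.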
You'd be surprised how many applications are multithreaded . . .
It might be more accurate to say that 99% of applications are not designed to use cores evenly or take full advantage of them. They mostly throw threads at the CPU hoping the OS will take care of distributing them.
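For contrast, here's a rough sketch of what "designed to use cores evenly" looks like: asking how many cores the machine has and deliberately splitting the work into that many even chunks. This assumes C++11's std::thread; the workload is a stand-in.

```cpp
// Sketch: evenly dividing a summation across all available cores.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> data(1000000, 1);  // stand-in workload
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<long long> partial(n, 0);
    std::vector<std::thread> pool;

    std::size_t chunk = data.size() / n;
    for (unsigned i = 0; i < n; ++i) {
        std::size_t begin = i * chunk;
        std::size_t end = (i + 1 == n) ? data.size() : begin + chunk;
        // One thread per core, each summing its own slice.
        pool.emplace_back([&, i, begin, end] {
            partial[i] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0LL);
        });
    }
    for (auto& t : pool) t.join();
    std::cout << "sum = "
              << std::accumulate(partial.begin(), partial.end(), 0LL) << "\n";
}
```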
Even without multiple cores, threads have long been used to do things like keeping an application's GUI responsive while other parts of the application wait for resources. This type of threading isn't designed with multiple cores in mind, although it will still work on them.
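A toy illustration of that classic pattern, with a console loop standing in for the GUI event loop (the names and timings are made up):

```cpp
// Sketch: a worker thread does slow loading while the "UI" thread stays live.
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

std::atomic<bool> done{false};

void load_resources() {
    std::this_thread::sleep_for(std::chrono::seconds(2)); // stand-in for slow I/O
    done = true;
}

int main() {
    std::thread worker(load_resources);
    while (!done) {
        std::cout << "UI still responsive...\n"; // stand-in for pumping events
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }
    worker.join();
    std::cout << "Resources loaded.\n";
}
```

Note that this runs fine on one core; the threads just take turns. That's the sense in which it "works" on multicore without being designed for it.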
Servers have also long had multiple CPUs/cores because they have to manage many clients simultaneously. Perhaps we are opening the floodgates for people running their own personal servers? With the big push for cloud computing, I wouldn't be surprised if more people started wanting to run the server side, and not just the client side, of things.
It's an event that seems to repeat itself throughout computer history: something starts in big businesses, on large servers and mainframes, then it gets miniaturized and put into PCs, and before we know it everybody is doing it in a small, inexpensive box. It's happened before, and it'll happen again. If you want to see what people will be doing in their homes a few years from now, look at what businesses are doing with large server racks right now. A few years from now, a PC will be as powerful as all of those racks. A few years more, and your phone will be that powerful as well.
This whole Software as a Service thing? It'll last until people start realizing they can host their own software. Next thing you know, they'll want to own their own software so they can host it for themselves. Geeze, where have I seen that before?
History will, as always, repeat itself.
I think gaming will increasingly take advantage of multiple cores as well. A lot of tasks in games are well suited to parallelism; physics and AI come to mind. Graphics would also work, although that is usually handed to the GPU.
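A hedged sketch of the idea: physics and AI updated on separate threads each frame, joined before rendering. The function names are placeholders, not from any real engine.

```cpp
// Sketch: independent per-frame simulation work spread across threads.
#include <thread>

void update_physics(float dt) { (void)dt; /* integrate motion, resolve collisions */ }
void update_ai(float dt)      { (void)dt; /* pathfinding, decision-making */ }
void render()                 { /* normally handed off to the GPU */ }

int main() {
    const float dt = 1.0f / 60.0f;
    for (int frame = 0; frame < 600; ++frame) {
        std::thread physics(update_physics, dt);
        std::thread ai(update_ai, dt);
        physics.join();
        ai.join();
        render(); // render once both subsystems agree on this frame's state
    }
}
```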
I also think that we'll find more uses for multiple cores - in computer science there is a lot of theory about problems that may be better suited to parallel machines, and some programming languages, such as Smalltalk and Erlang, use paradigms that are well suited to parallelism. It's possible that there's some killer application lurking around somewhere.
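Erlang's trick is that concurrent pieces share nothing and only exchange messages. You can loosely approximate that style in C++ with a mailbox per "actor"; this is just a sketch of the idea, not any particular library's API.

```cpp
// Sketch: one worker "actor" draining a thread-safe mailbox of messages.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

class Mailbox {
    std::queue<std::string> q;
    std::mutex m;
    std::condition_variable cv;
public:
    void send(std::string msg) {
        { std::lock_guard<std::mutex> lk(m); q.push(std::move(msg)); }
        cv.notify_one();
    }
    std::string receive() {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [&] { return !q.empty(); });
        std::string msg = std::move(q.front());
        q.pop();
        return msg;
    }
};

int main() {
    Mailbox box;
    std::thread actor([&] {
        for (;;) {
            std::string msg = box.receive();
            if (msg == "stop") break;  // poison pill shuts the actor down
            std::cout << "actor got: " << msg << "\n";
        }
    });
    box.send("hello");
    box.send("world");
    box.send("stop");
    actor.join();
}
```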
I did not mean that I disabled Superfetch, although I do have that disabled as well; that is not what I meant. Superfetch, in oversimplified terms, pre-loads commonly used programs (determined by stats the OS collects) into RAM when the OS starts up, or into the paging file, depending on the hardware and software config. Maybe you thought I meant write caching on the disk? I meant that I disabled disk caching in the sense of caching files to the hard disk. I wasn't referring to disk caches so much as the caching of files to disk (if that makes sense? lol, probably not, I stayed up too late playing Sins!). Example: I have disabled the paging file on the hard disk and ReadyBoost; there was maybe one other, but I don't remember the name hehehe.
And they don't have to wait for drive spin-up... sorry, I switched my wording around on that, thanks for catching it (lol, what was I thinking).
Well, none of the standard Raptors even bother with SATA II (3 Gb/s), as the manufacturer was smart and knew they wouldn't use up the capacity of the older interface anyway. (The VelociRaptor apparently does have SATA II, but again, it doesn't even need it.)
I couldn't agree more, but I got the Raptors so that my level load times in games are much faster. There is a surprising amount of speed difference between loading on a 7200 RPM drive and loading on the Raptor.
Overall OS speed: no change between hard drives. But disabling the paging file and forcing it all onto RAM made a pretty noticeable difference in everything.
I'm gonna move to 64-bit, but it'll probably be Ubuntu.
If the SSD trend catches on (er, when... not if...), I could see us putting a lot less L2/L3 cache on CPUs and adding more cores instead. The cache-miss penalty just isn't that bad anymore, and arbitrating all the cores' concurrent memory accesses almost costs more than the memory access itself. This means that, going forward, CPU cost shouldn't particularly go up even with 8 cores.
What we will probably NOT see, though, is SSD integrated directly onto the CPU; they use different manufacturing processes, so the CPU will probably continue talking to the SSD at the board level. It would be totally awesome if we could integrate persistent memory directly onto the CPU die, because that would mean sub-nanosecond "hard drive" seek times. You can do the math on what that would mean. But I don't think it's going to happen anytime soon.
The effect is the same, however, since an active foreground application will be taking cycles on one core, which will cause the OS to load balance secondary operations onto the additional cores. XP/Vista/7 do an excellent job of spreading the wealth around evenly.
Of course. I was presenting a simple layman's description without going all in.
Agreed. Games are the next big push in taking advantage of multiple cores; in fact, today's games are already doing this. The latest Unreal engine is highly threaded to take advantage of multiple CPUs, and of course SLI technology addresses this for GPUs at the hardware level as well.
All Stardock games are very multithreaded. It's what got me into programming in the first place really.
Our new game, Elemental, has a new graphics engine that is explicitly designed for multiple GPU/core setups.
If money were no object, I would pay for a new type of motherboard to be developed and then buy 1TB of the best RAM and a nuclear generator to keep it all running.
Oh wait...