November 9, 2007
I've been wary of 64-bit on the desktop, as the benefits are usually outweighed by the compatibility problems. I agree that 64-bit operating systems are inevitable in the big scheme of things, but I've struggled to see the relevance of 64-bit for typical desktop and laptop users. It's a novelty, albeit a necessary one for particular niche applications. However, I'm now beginning to think we could see a fairly broad switch to 64-bit desktop operating systems over the next few years-- much sooner than I anticipated.
- 64-bit versions of popular consumer desktop operating systems are commonly available. Both Vista and OS X 10.5 fully support 64-bit apps out of the box, although evidently the OS X kernel is still 32-bit.
- Memory is cheap. Dirt cheap. As of this writing, you can buy 4 gigabytes of quality DDR2 memory for around $120. The memory industry has a nasty habit of switching to newer, faster, more expensive memory types over time, but it looks like this plateau might be here to stay. 4 GB of memory is no longer a rare extravagance for rich users; it's becoming commonplace, even mundane.
- The 32-bit x86 architecture doesn't scale very well beyond 2 gigabytes. If you install 4 gigabytes of memory, you may find yourself wondering -- Dude, Where's My 4 Gigabytes of RAM? Good luck explaining to the average user why their computer says they only have 3 GB of memory, even though they paid for 4. It's a tough sell. And honestly, who has time to listen to a bunch of arcane technical explanations for this bizarre limitation? People just want full use of the memory they paid for.
- Modern video cards do not play well with 32-bit memory limits. Newer operating systems emphasize the importance of good, discrete video hardware. To get the full suite of cool desktop effects, through Aero, Beryl, or Core Image, you need a decent midrange video card. I'd say the average amount of memory on a midrange video card today is 256 megabytes, and in the enthusiast class it's closer to 512 megabytes. I can easily see that doubling over the next two years. That's a massive chunk of the 32-bit address space carved out for required hardware. And if you're a hardcore gamer or multiple monitor enthusiast with more than one video card, it's worse. Much worse.
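The bullet points above come down to simple arithmetic: physical RAM and memory-mapped hardware compete for the same 4 GB of 32-bit address space. A rough sketch of the math (the device sizes below are illustrative guesses, not measurements from any particular machine):

```python
# Why a 32-bit OS on a 4 GB machine reports ~3 GB: memory-mapped hardware
# is carved out of the top of the same 4 GB physical address space.
# All device sizes here are illustrative assumptions.
GB = 2**30
MB = 2**20

address_space = 2**32        # total 32-bit address space: exactly 4 GB
video_memory  = 512 * MB     # hypothetical enthusiast-class video card
other_mmio    = 512 * MB     # rough figure for PCI devices, BIOS, etc.

visible_ram = address_space - video_memory - other_mmio
print(visible_ram / GB)      # 3.0 -- the "missing" gigabyte went to hardware
```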
The switch to 64-bit is interesting because there's a certain air of finality to it. It may be the last bit transition in our lifetimes.
Sure, nobody will ever need more than 640 kilobytes of memory, but this is a whole new ballgame. To put the size of the 64-bit memory address space in context, here's a chart showing the respective sizes of each. Note that the scale is logarithmic.
The transition from 16 to 32 bit increased our address space by a factor of 65 thousand. That's big. We've been in the 32-bit era since about 1992; that address space has been good for about fifteen years, give or take a few. The transition from 32 to 64 bit, whenever we finally make it, will increase our address space by a factor of four billion. Will there be a transition to 128-bit machines and operating systems? Absolutely. But I'm not sure it'll happen while we're still alive.
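The factors quoted here are just powers of two; for the skeptical, a quick check:

```python
# Address-space sizes at each bit width
space_16 = 2**16     # 64 KB
space_32 = 2**32     # 4 GB
space_64 = 2**64     # 16 exabytes

print(space_32 // space_16)   # 65536 -- the 16-to-32 jump: a factor of 65 thousand
print(space_64 // space_32)   # 4294967296 -- the 32-to-64 jump: four billion
```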
You certainly won't be upgrading to 64-bit applications for better performance. Or at least you shouldn't be, unless you enjoy disappointment. 64-bit offers compelling performance benefits on servers, but on desktops, it's a bit of a wash. On one hand, the x86 architecture simply works better in 64-bit mode:
The x86 instruction set was created in the 16-bit era and has accumulated quite a bit of cruft going from 16-bit to 32-bit. Some of that cruft was wisely abandoned during the transition from 32-bit to 64-bit. Applications compiled for x86_64 don't just get larger registers, they get more registers, plus a more modern calling convention and more addressing modes. Every 32-bit x86 application can benefit from these changes, it's just a question of how significant that benefit will be.
On the other hand, stuff is just plain larger in 64-bit land-- your pointers and data structures now take up twice as much room. That 2 megabytes of cache on your CPU won't be able to fit as many things in as it used to.
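To make the cache point concrete, a back-of-the-envelope sketch (4- and 8-byte pointers are the usual sizes on 32- and 64-bit targets; the 2 MB cache figure is from the paragraph above):

```python
# How many pointers fit in a 2 MB CPU cache at each pointer width
cache_bytes = 2 * 2**20

ptrs_32 = cache_bytes // 4    # 4-byte pointers on a 32-bit target
ptrs_64 = cache_bytes // 8    # 8-byte pointers on a 64-bit target

print(ptrs_32)                # 524288
print(ptrs_64)                # 262144 -- half as many fit
```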
Once you factor in the pros and cons, you end up with a 64-bit machine that runs desktop applications a few percentage points faster than the 32-bit machine it replaced. There are some exceptions, of course-- most notably games and audio/video editing-- but on average, performance remains roughly the same for typical desktop applications. It's hard to find a definitive set of benchmarks that tell the entire 64-bit versus 32-bit performance story, but all the ones I've seen show rough parity.
I recently upgraded both my work and home machines to 4 GB of memory. Based on the positive Vista x64 experiences related by coworkers and Scott Hanselman, I took the plunge and upgraded to Vista x64. It was the only way to use anything close to the full 4 GB of memory. I resisted mightily, because I expected 64-bit driver and software problems, but much to my surprise, I've had none. Zero. Zilch. It's been unbelievably smooth. Perhaps it's because I waited a good six months after the initial release of Vista to move to x64, but everything "just works". All my hardware has 64-bit drivers. Many of my applications even come in x64 flavors, and the ones that don't still work flawlessly. I didn't change any of the hardware other than adding memory, but I'd swear my system is more responsive under x64 in daily use. And I no longer run into certain aggravating 32-bit operating system limits.
Of course, my original advice regarding 64-bit operating systems hasn't changed. Unless you have more than 2 GB of memory, there's no reason to bother with 64-bit. But have you priced memory recently? Now that 4 GB configurations are approaching mainstream, it's encouraging to know that 64-bit operating systems are out there, and that they work with a minimum of fuss. It's certainly taken long enough to tackle this problem. Hopefully we can stay with 64-bit for the foreseeable future, and leave that pesky 128-bit problem for our kids to deal with.
Posted by Jeff Atwood
I dual-boot between XP 32-bit and Vista 64-bit on my system, which has 4GB of RAM in it; of course, in the 32-bit OS it only shows 3GB. I was actually running the Vista 64-bit RC, and even then my driver issues were limited to things like scanner functions in my all-in-one printer and drivers for my Razer mouse. All of the mainstream hardware devices had drivers. This is a far cry from what I encountered when I installed the XP 64-bit RC. It reminded me of the old days when I would try to install Linux on a laptop.
Finally viable 64-bit OS with proper driver support... Bring on the applications to make me feel good about my 4GB of RAM!!!
True, the 64-bit platform does not magically increase performance, as many people think. http://mpan3.homeip.net/?blender64
However, I am not quite convinced that servers benefit from 64-bit so much. After all, Crysis is just as good a benchmark as IIS. The req/sec haven't changed much, but it seems the response time is significantly shorter. Any idea why?
Now that the transition to 64-bit seems to be going well, I just hope that software makers step up and start making sure that their software runs well in a many-core world. I could be wrong, but this seems like the next big hurdle.
It's good to see more and more people starting to make the leap to 64-bit without being scared. The more people that migrate, the more pressure to modernize the software ecosystem.
The real threshold will come when OEMs are forced to slap x64 OSes on PCs with 8GB of RAM.
In the meantime, I'm quite content with my 3.581 GB. :)
64-bit on the Windows desktop will never happen until Dell starts shipping 64-bit systems. And Dell won't ship 64-bit systems until people start asking for them (a chicken-and-egg problem). And people won't ask for them until there is some compelling reason to make the switch. That compelling reason doesn't exist today, but the memory limitation of 32-bit is about the closest thing we have right now. Without that, 32-bit OSes would continue along their merry way for a good 10 years or more.
going from 32bit to 64bit increases by a factor of 4 billion not 4 million.
Reimar: The performance increase can easily be explained by the extra SSE registers that became available with the 64-bit architecture, and thus fewer cache misses. Most multimedia applications such as codecs and computer games have (some) hand-optimized SSE code in there for performance reasons. When done well, SSE optimizations can provide a 3 to 4x performance increase. This is a constant factor, since one SSE register can store 4 floating-point values (which is what it's usually used for), diluted only by the parts of the application that [i]don't[/i] use SSE.
It seems like MS made sure everything that has a signed driver for Vista has both 32 and 64 bit versions. Unfortunately I still have some hardware that is not supported under 64 bit XP.
When Vista first came out, at work we found there were no drivers for a few bits of hardware (mostly SATA cards). We managed to install the XP drivers though and they worked flawlessly. Has anyone managed to go the other way, using Vista drivers under XP 64?
64-bit is going to be the second and last reason (after DX10) for the majority of WinXP users to switch to Vista.
"Will there be a transition to 128-bit machines and operating systems? Absolutely."
I think you are being short-sighted, clearly we will need 256-bit pointers in the near future when we want to map every ZFS file system on every IPV6 addressable computer into virtual memory.
Who the hell needs more than 640kb of memory anyway?
You know what happened to 8-bit computing, don't you?
I was using 64-bit Vista. Everything worked fine, even games. Then I went to college where the network dorks require anti-virus software to use the internet. So it's back to 32-bit.
Mac OS X Leopard is x64, BTW. It doesn't even give you an option to install a 32-bit version if your machine is 64-bit capable. It can also use 32-bit drivers even though the OS itself is 64-bit. It doesn't mention its 64-bitness anywhere either because, to be honest, why should a user care? That's how it should be.
So, there are two corollaries:
1. The x86 mass exodus has begun
2. Apple is leading the way once again
but does that mean another full Vista Ultimate licence?
The Vista keys work on 32 or 64 bit versions. The keys are specific to the edition (Home Premium, Ultimate, etc) but not the bitness.
even a 32 bit Linux kernel can be configured so that every application can use (almost) 4 GB, the 2/3 GB limits are Windows-specific
The limits aren't Windows specific, they're tied to the x86 architecture. Sounds like you're referring to PAE mode, which is a hack you can use on the server editions of Windows but not Vista or XP. 64-bit is preferable to PAE hacks, obviously, and *especially* on Linux-- you can just recompile everything from the source code!
Where was Windows in 2004? Where was Fedora 2 in 1996? Windows NT had 64-bit support for the DEC Alpha back then. There wasn't enough use for it on the desktop and it was shelved.
The statement that Leopard's kernel is 32-bit is a little strange. Tiger had no problem addressing 32GB of RAM... which you might expect to be impossible. Obviously there's something clever going on here that I'm yet to grapple with :).
Still worth keeping in mind: even though the kernel is 32-bit, it offers the same advantages as a 64-bit kernel.
I apologize for asking this silly question, but it's been bothering me. I understand that 1 GB of your 4 is used for hardware slots, compatibility, etc. (I'm not sure exactly, but I get the idea), but why is less used when you only have 2 GB of RAM? Or is it imaginary until you actually have 4 GB installed? In which case, how does the computer use addresses for non-existing RAM?
Well as an enthusiastic gamer I have to say:
Stay away from Vista x64.
Yes, I tried it. It's really not worth the trouble. Sure there are drivers, but most of them seem like thrown-together emergency code (just look at nvidia and creative soundblaster). Every other game has random crash issues or just bad performance.
I agree, 64-bit will soon become the new standard, once 8GB of RAM is common. But as of today I don't see any reason to upgrade. I run with 4GB RAM; Windows XP-32 shows I've 3.5GB, so essentially I'm missing half a gig. But I'm pretty sure installing Vista 64 will decrease my overall performance even though I've more RAM available, given the resource hog Vista is.
It's really a vicious circle right now. The switch from 32-bit to 64-bit is more or less tied to the switch from XP to Vista, including a new driver model and adaptation to DirectX 10 for games. This means a lot of software has to be modified or rewritten (including developer frameworks) to perform as well, or at least to be stable enough. Since this doesn't happen overnight, most users, like me, refuse to upgrade, and while the market share of Vista64 stays low, software companies won't make much effort to push for max compatibility.
So yeah, I recommend riding on 32-bit as long as you possibly can, and hope that by the time you have to switch, things will be better ;)
Your link text "nobody will ever need more than 640 kilobytes of memory" points to a page that describes how Bill Gates never said such a thing and how the quote is actually a myth.
even a 32 bit Linux kernel can be configured so that every application can use (almost) 4 GB, the 2/3 GB limits are Windows-specific
The limits aren't Windows specific, they're tied to the x86 architecture. Sounds like you're referring to PAE mode, which is a hack you can use on the server editions of Windows but not Vista or XP. 64-bit is preferable to PAE hacks, obviously, and *especially* on Linux-- you can just recompile everything from the source code!
I certainly don't mean PAE; PAE only allows addressing _more_ than 4 GB of RAM (so it is useless if you have "only" 4 GB of RAM), and adds the NX bit.
In addition, PAE does not (directly) affect applications, only the OS (so whether you can recompile applications or not does not matter for PAE), and I wouldn't really call PAE a "hack"; it can provide better performance than 64-bit mode if you have many, many, many applications that all use less than 1 GB of address space.
The fact that the upper 1 - 2 GB of _virtual_ memory is not freely usable by applications _is_ OS-specific. Windows always reserves at least 1 GB (the default is 2 GB) of each application's address space for special uses; 32-bit Linux can be configured to reserve anywhere between 3 GB and 0 GB (reserving less can mean a performance penalty for some system functions).
64 bit Linux will always leave (almost) 4GB of virtual address space for 32 bit applications to use however they want, since it does not have an (additional) performance penalty here.
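The user/kernel splits under discussion can be summarized with a little arithmetic (these are the typical configurations described in the comments above, not an exhaustive list):

```python
GB = 2**30
virtual_space = 2**32   # per-process virtual address space on 32-bit x86

# user-space portion under each configuration discussed above
splits = {
    "Windows default":  2 * GB,
    "Windows /3GB":     3 * GB,
    "Linux 3G/1G":      3 * GB,
}
for name, user in splits.items():
    kernel = virtual_space - user
    print(f"{name}: {user // GB} GB user / {kernel // GB} GB kernel")
```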
64-bit means 64-bit CPU register length and memory address size.
4GB of memory will not be enough soon, but big integers are not useful in desktop programming, in my experience.
Hey Now Jeff,
I'm so shocked to learn the 64-bit transition may be the last of our lifetime. Your reasoning makes sense; I just never heard that before or thought about it. I agree w/ you on the 4 gigs of RAM. I wish older notebooks had the space to physically place it.
Coding Horror Fan,
@ Mark Smith:
Tiger had no problem addressing 32GB of RAM
It kinda did. It had a 64-bit kernel, but apps could not be 64-bit, because the app libraries were still 32-bit. In other words, you could only access all your RAM by either running more than one memory-intensive 32-bit app, or by writing a console 64-bit app. That's no longer the case; you can now be 64-bit throughout. And you can compile your app for 32 and 64 bit simultaneously -- there's a setting in Xcode. And _most_ (but not all) of the apps that come with Leopard are 64-bit, which you can see in Activity Monitor as terabytes of VM space.
The real bit of kick ass work is that the kernel can use the old drivers. This is something Microsoft failed to figure out how to do.
G'Day from Dublin, Ireland.
You will probably never know how tiring many of those questions seem to me. I recently got a new monitor and was so frightened of it that it remained, unopened, in its box for three days before I dared look in. It took two days research to work out which graphics card my camera needed.
Must study your excellent blog more often.
but why is less used when you only have 2gb ram
If I'm not mistaken, the problem is that if you have 3 gigs of RAM, normally you'll be using the /3GB switch in Windows so that an application can use it. Usually Windows splits the virtual address space into 2 gigs for the application and 2 gigs for addresses to shared system resources. With /3GB, it's 3 gigs for the application and 1 gig for the system.
So if your video card is now taking most -- or all -- of that 1 gig of system address space, you're in big trouble. All of a sudden the OS is being squeezed out. Even though you have more RAM, your PC is actually slower and starving for RAM.
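For reference, that switch goes on the OS entry in boot.ini on 32-bit Windows. A sketch of what the edited line might look like (the partition path shown is the stock default, not from any particular machine):

```ini
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP" /fastdetect /3GB
```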
The real bit of kick ass work is that the kernel can use the
old drivers. This is something Microsoft failed to figure out how to do.
They didn't want to. They wanted a pure 64-bit kernel and an opportunity to drop support for older drivers and APIs.
I don't know how many times I've run out of resources in 32-bit
world -- literally, context menus would refuse to show,
Internet Explorer wouldn't render, I'd have to reboot just to function.
This is not a 32-bit vs 64-bit issue. You were just running out of space in a table called the Desktop Heap, which you can tweak in the registry. It's a fixed-size table of handles in Windows.
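For anyone hitting that limit, the value in question lives here (the numbers shown are a common default; the second SharedSection parameter controls the interactive desktop heap size in KB, and registry edits like this are at your own risk):

```
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
  Windows (REG_EXPAND_SZ):
    ... SharedSection=1024,3072,512 ...
```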
I'm sorry, but I have to ask: for what reason do you think a desktop computer should have more than 2 GB of memory?
I have 2 GB in mine, but I have never seen software that could make _good_use_ of it. IMHO, if a program "needs" that much, it is badly written.
I know in the case of servers it is reasonable, but we are talking about desktop PC's. AFAIK there's only one PC game that is labelled with 2 GB recommended memory size (Crysis, and yes, I think it's badly written).
Peter, what part(s) of Crysis should be optimized to lower the memory requirements?
Honestly, your tone is so arrogant and spoiled as to be horrid.
4GB is commonplace? No, I don't think so.
I have it, but I don't know one other person personally, who does. And I work at the tech of the tech field. So please, nix the overconfidence in your own words.
Secondly, Vista is a pile of dog poo. I tried it and went right back to XP. Of course, that's just on my VM, because who uses Windows for real anymore anyway? Not since Linux came around.
Oh wait, don't like my tone? Sound familiar?
What's all this fuss about 64 bit OS on the desktop? Solaris has been 64 bit on the desktop since 1998.
I have 2 GB in mine, but I have never seen software that could make _good_use_ of it
Virtual machines, of course :-)
It's an interesting notion that 64-bit might be the last transition in our lifetimes. The 64-bit address space sure is vast.
But consider the 32-bit address space. In human terms, a billion addresses is literally unimaginably huge. Yet we're fast approaching the day when 3 or 4 billion addresses won't be enough. That's not because 3 billion is in any sense a small number, though. It's not (in principle) difficult to do most computing tasks in a lot less space than that.
So why is a few GB no longer enough? Largely because memory is cheap, developer time is expensive, and so we have (quite rationally) built a tool-set, an ecosystem, and an engineering culture that regularly trades boatloads of dirt-cheap storage for human convenience.
That's a trend that doesn't seem to be slowing at all. Even if it's never possible to build a machine that actually contains 2^64 bytes of storage, there may be new approaches to OS/software design that gobble up that address space in order to make something else easier.
We've got a similar situation with the IPv4 address space, where an address space that once seemed unimaginably vast was carved up very inefficiently, simply because allocating addresses efficiently is a lot more work.
Linux migrated at least two years ago, Jeff. This post is so biased toward a Windows user's perspective it is hilarious.
Secondly, how about worrying about the 16-bit software... when did that really leave Windows?
I cannot believe the "technically literate" rubbish here. It matters nought to me how good Unix/Linux might be, or how bad Windows might be. Every single one of my clients runs Windows. I never get asked to write software for Unix, or Mac either. I could swap to Unix and feel very smug, but I would starve to death.
Back to the issue. I recently updated my desktop to 64-bit and 8 gigs of RAM. It was overkill, but memory is cheap. I run a 4-core processor, following a similar spec to the machine you built for Scott Hanselman. It is great for running a few virtual machines, which you need when you have clients still running Win 98 and others on XP or Vista.
I think 64-bit and a lot of RAM is future-proofing yourself, especially out here in Australia, where prices are high compared to the US and Europe. I don't want this PC to become outdated too soon.
"By the way, Linux had full support for x86-64 for a couple of years already."
A COUPLE of years?! I was using 64-bit Linux back in 1996 on a DEC Alpha! I think it went 64-bit in 1993. That means Linux has been 64-bit for 14 years. And there are no driver problems. Everything that I have ever used in 32-bit works in 64-bit. This is because the Linux world distributes everything as source. For most things, going 64-bit just means a recompile. Occasionally a minor code tweak is necessary, but those were all done 14 years ago. No driver drama for Linux!
Your assertion about true 64-bit chips is false. You are correct if writing about Intel. You are not, however, when writing about AMD.
I have been caught off guard by the strange responses to this one. For example, per santana: because someone writes Windows-specific posts does not mean he is "just another writer who thinks he knows what's up but doesn't have a clue." It is that he has "a clue" about dealing with Windows. Whether or not you believe Windows should exist is irrelevant -- it does. Like Steve S, I have many moments of disdain for the OS, but I choose to learn as much as I can about dealing with it because there is a reason to. Perhaps you do not have this reason -- and perhaps that would also mean you have no clue, but you think you do?
We are now at the breakpoint of needing 64-bit for the desktop. The standard memory size needs to double 32 times before we reach the limit of 64-bit addressing. Is 18 months a reasonable time for memory to double? Well then, in 48 years most of us will be either dead or retired. 128 bits, you say? Bah! I remember when...
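That back-of-the-envelope figure checks out:

```python
# 18-month doubling cadence, starting from ~4 GB (2^32 bytes) standard memory
doublings = 64 - 32            # 32 more doublings until 2^64 bytes
months = doublings * 18
print(months / 12)             # 48.0 years
```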
It seems a little... wrong to call Vista a popular OS.
The biggest drag on 64-bit adoption is going to be the lack of 64-bit versions of MS Office and, probably more significantly, MS Access/Jet. MS is using the 64-bit transition to try to finally kill off the mdb file in favor of SQL Server. I think corporate America is going to change their mind for them, but in the meantime it's going to be a rough few years for those of us who've written apps that rely on Jet.
By the way, I remember when we were in the 16-bit era with the 8088, reading Intel technical roadmap stuff about 32-bit architectures for the 80386. At that time, having 4 gigs of virtual memory was absolutely unimaginable. I think we'll see a move to 128-bit computing sooner than you think.
It may be the last bit transition in our lifetimes.
I am surprised, Jeff....
The one thing I have learned in this strange field was to never use the words never, last, final, etc...
When we are using protein based bio cube memory modules with laser holographic resonance imaging subsystems we will be cursing your shortsightedness.
I remember being told the 640K was going to be impossible to fill too.
I can see lots of uses for 64-bit applications, as well as 128-bit and beyond, requiring massive amounts of RAM. The "goal" of computing, if you will, is just that: computing. Efficient massively parallel "machines" will be required to perform AI / human interaction models effectively, and that's just to enable a robot to interact with us and do our bidding. Think of the military applications that are undoubtedly being dreamed up in skunkworks-type labs. Although I would much rather have that computing power put to use for games and desktop applications :grin:
16-bit can be worked around with DOSBox, I hear. Otherwise you could probably load up a virtual PC with Windows 95 or Windows 1.0 or whatever floats your boat. If/when I ever go this route, I'll be crushed if I can't run my DOS TIE Fighter (the awesomest of all Star Wars games to this date).
In addition to everything else I do on my computer, I use it as a DVR. Unfortunately, I have yet to find a consumer-grade HD tuner card with a 64-bit driver for Windows.
I think the UNIX/Linux/Solaris move to 64-bit went well, especially as they are all LP64. Solaris and Linux on the desktop on x86-64 are a good thing, very stable, and I find they work great.
Porting Windows 32-bit apps to 64-bit takes some effort; LLP64 isn't the same as LP64, and that hurts portability. Many libraries and apps that people rely on either don't compile or haven't been built with VS2005 yet.
But the transition definitely will happen over the next few years.
Are you referring to how Jeff "The Gatekeeper" Atwood was putting forward unqualified assertions and just generally making a jackass of himself? If that was the case, I could definitely see the parallels between his post and yours.
Fortunately, he's not writing about cryptography this time, so his arguments come off as a bit more cogent ;)
But all seriousness aside, the argument that we'll never switch to 128 bit computing in our lifetimes is an interesting one. It definitely appears true as long as we stick to conventional computing methods, but I wouldn't put too much cash money on it.
I know Bob was joking when he mentioned using virtual memory to address drives on IP-accessible computers, but who knows how plausible that will be 20 years from now? Also, who knows how commonplace distributed computing will become? That could change the whole face of the game.
To be safe, let's just say we'll never see a general switch to 256-bit computing in our lifetimes.
I went with a Vista Ultimate x64 laptop w/2GB RAM in June. The only problem I've encountered is printing through my 32-bit network server; the drivers are there, but the inf file needs to be hacked on the server to recognize the 64-bit drivers. I just haven't gotten around to it.
As to the 4GB of memory and where it would be used: VMs. On my new laptop, a VM instance of XP runs faster than the previous 32-bit laptop does natively. This makes it great for testing apps I develop and previewing beta software, like Orcas (VS 2008) or SQL 2008. With the two gigs on the laptop I can only run one VM instance at a time; I'd love to have 4GB for running 2-3 instances.
See this for making use of your full 4GB of memory on an x86 running a Linux operating system (haven't tried it myself, but it seems legit to me):
I can only assume it does this through further obfuscation of virtual memory addresses, which makes fetching/writing to memory slightly slower than it would be if you were using an architecture like x86-64, which was built to provide access to more RAM.
Mac OS X Leopard is x64
Sorry, Jeff is right. The Mac OS X kernel is still 32-bit. This was mainly done for driver compatibility. However, it isn't all that much of a limit, since Mac OS X has a micro-kernel structure. Only the lowest layers that work directly with drivers are 32-bit. The rest of the OS is 64-bit. This includes memory addressing and register handling, so most of the 64-bit kernel benefits are there. 64-bit apps on the Mac are much faster than the 32-bit implementation. It is also the reason why a single Mac OS can handle 32-bit and 64-bit implementations while Vista has two distinct implementations.
I bet the next Mac OS X release will be 64-bit down to the micro-kernel. I suspect it wasn't done because certain Carbon-heavy companies whose names I won't mention (Adobe and Microsoft!) will already have a heck of a time having to move everything to Cocoa.
I cannot believe the "technically literate" rubbish here. It matters nought to me how
good Unix/Linux might be, or how bad Windows might be. Every single one of my
clients runs Windows. I never get asked to write software for Unix, or Mac either.
Then you need a higher class of clients ;-)
(Look, a smilie! I was just joking!)
Seriously, I've been mainly doing Unix development and really haven't had a problem with finding work or getting a paycheck. The issue is one of "experience". You have mainly Windows experience, and most of your contacts are with Windows developers. Thus, you hear mainly about the Windows jobs. People I know who mainly have Unix development experience rarely hear of any sort of Windows development positions.
I think this divide between Windows vs. *nix development is a sad development, because both sides could learn a lot from each other. For example, if all those Linux people saw how Microsoft makes it easy to set up a Windows network, they'd probably build better utilities. (Then again, maybe not. Anyone who's used "git" knows that Linux people like doing things the hard way.)
And if Windows developers saw how quickly Unix developers are able to work because they're not stuck using Visual Studio, maybe they'd demand Microsoft actually create a fast and efficient development platform. (Then again, maybe not. I can't understand how a developer thinks that an environment so complex that it takes five minutes to load a project into the IDE, or so complex that a tool like IntelliSense has to be created to make development even feasible, is the greatest platform in the world. And they say Steve Jobs serves the Kool-Aid!)
I am continually frustrated by this author's Windows-centric-ness. I don't see how any technically literate individual can support the crap that Microsoft has produced over the past decade. One certainly can't make an objective evaluation of 64-bit hardware performance based only on Windows! I would suggest that the author's disappointment with the 64-bit architectures lies not in the hardware itself but rather in the software he is running. Windows simply has a poor design, and will not be capable of leveraging current hardware technology (maybe by the time 128-bit hardware comes to market MS will have a decent 64-bit implementation, but I wouldn't be holding my breath).

By the way, I have been running x86_64 since 2004, when I got my first Athlon 64 and put Fedora 2 on it. Where was MS in 2004? Not in the 64-bit arena, that's for sure. It's disappointing to hear the author give MS a pat on the back for bringing 64-bit Windows to market 3 years late; given that lag, they had better damn well have their driver interface recompiled.

By the way, I am currently running 16 GB of RAM in my dual-CPU quad-core AMD Athlon 64 workstation, running a quality OS: 64-bit Kubuntu. Designed and built by technically literate individuals, for technically literate individuals, not the backward-thinking drones that Gates hires and builds software for. Since the upgrade from 4 to 16 GB I have seen a massive performance increase. No swapping needed for most of what I do. And that is where memory-hungry apps stand to benefit the most, as I see it. You didn't even mention swapping in your piece! Your classes and structs will tend to be larger, but the bus in and out of the CPU is also wider, so I don't see any negative impact there. As for cache, if you hadn't noticed, the newer chips come with quite a bit more than 2 MB; last time I looked, high-performance Intel chips were shipping with 8 MB. Probably by the time you get your latest project to market, the low-end chips will be shipping with 8 MB.
I seriously hope you realize that Windows is a piece of sh** and start taking that into account in your blog. If not, then in my opinion you are just another writer who thinks he knows what's up but doesn't have a clue.
Holy crap, Santana, aren't you just an elitist!? Why do people continually think that there is, and should be, only one of anything, just because it's built by people, for people? By that logic we would all live in the same style of home, drive the same car, etc. I'm not saying that Windows isn't bloat rot, but it runs the applications that I want to use. That's the world I live in.
Slamming someone's blog because it slants towards one particular OS makes little sense; if you read his biography, you'd realize this blog is bound to lean towards Microsoft, since Jeff is a developer on the Microsoft platform.
It's great that you are a proverbial grandfather of the 64-bit arena and have probably helped pave the way for the rest of us to join you. But just because we are here and we are running Windows, don't crap on our parade.
If for no other reason, Microsoft brings the market with it: peripherals, applications, etc. The more people doing 64-bit, the more vendors will port to other OSs, which even benefits the elitist Linux people who have to write their own drivers for the latest and greatest devices that ship with only Windows/Mac support.
By the way, Linux has had full support for x86-64 for a couple of years already. I've been using linux-x86_64 for more than two years, and _all_ software is 64-bit here, with absolutely no problems with drivers (there are devices for which there will never be new drivers for Windows).
I have been running 64-bit XP with 4 GB of RAM for over a year now, and I have to say that I have not really had many problems with drivers and the like. To be honest, most problems were configuration issues with IIS and .NET.
I really believe that 64-bit is still seen as a bit of a novelty, and most of my colleagues are still ranting that "you can't get drivers..."
I am a developer, and I have to say that under 32-bit XP I was having to reboot my machine once a day as I hit one limit or yet another crash! With 64-bit I can't remember the last time I had to reboot!
Don't forget PAE. Though a PAE-enabled 32-bit OS will NOT run 64-bit apps, it can use more than 4 GB of physical memory.
Holy crap is right. It is that kind of attitude, ever since the split with Abraham, that has had the Muslims and the Christians at each other's throats; throw in the Jews and we can have a real free-for-all. How about racial superiority, do you want to espouse that too?
Ever considered tolerance, live and let live, different strokes for different folks, etc?
I started out writing apps on the Mac in '90; by '92 I went to Windows, not because it was a better platform, but because that's where the market share was, and therefore the customers. Not a week goes by that I don't curse Windows and give thanks for the paycheck it brings me.
"Will there be a transition to 128-bit machines and operating systems? Absolutely."
Actually, not really. Big numbers are easy to make up on a piece of paper, but for an address space to be actually useful, anything bigger than the 16 EB provided by 64 bits has to be backed by physical storage.
Think about how you will build this physical storage and you'll see that you can't. We're already down to a few atoms in DRAM -- there's not much more data we can pack into a given physical area of circuits, given our current knowledge of physics. I don't think even quantum computers would increase storage density enough to make 128 bits worthwhile.
Unless theoretical physics changes dramatically (like the discovery of hyperspace pockets...), we'll never go beyond 64 bits. Not a matter of "enough for anyone" but a matter of "can't build a computer to make use of that space, even with hypothetical manufacturing technology".
I have a few small points: firstly, even a 32-bit Linux kernel can be configured so that every application can use (almost) 4 GB; the 2/3 GB limits are Windows-specific.
Also, we do not currently have any real 64-bit CPUs in terms of address space; they are actually 40/48-bit for now, though switching to more bits (up to 64) will need only changes to the operating system, not the applications (unless the programmers did some really stupid things).
And lastly, 64-bit mode offers more than just 8 additional registers: the 64-bit registers are a big advantage for any application that really needs 64-bit numbers (e.g. they can halve the number of instructions needed for some AES operations, and a whole DES block fits in one register).
And the "needs twice as much memory" really only applies to pointers, which for many applications, I'd guess, make up only a few percent of all data.
From my personal experience: 64-bit mode makes very little difference in most cases; rarely it is a bit slower, and in some equally rare cases it is _a lot_ (about 10-20%) faster, mostly, as you said, in multimedia, like H.264 decoding with FFmpeg.
When I got my OEM Vista Ultimate 32-bit I was so tempted to go 64-bit, but I was worried about driver/app compatibility. Now I find myself wishing for 64-bit with all the good stuff I am hearing, but does that mean another full Vista Ultimate licence? Or is there a cheaper upgrade to 64-bit?
About the performance issues...
I work at AMD (where the 64-bit extensions to x86 were invented) in the group that, among other things, does liaison with the compiler vendors, working with them on code generation, etc., so I have some real basis for what I'm saying...
Your average large C/C++ program will probably run 10%-15% faster just from recompiling it for 64-bit, and the reason is almost entirely that there are twice as *many* registers in the ISA, so the compiler can keep more values in registers and eliminate spills to memory.
A second advantage is that where 32-bit has Lord knows how many different calling conventions (__stdcall, __cdecl, FORTRAN, PASCAL, WINAPI, ...), 64-bit has only one calling convention, and it passes most parameter values in registers, not on the stack.
Sure, your mileage will vary, and there are some pathological cases where performance actually decreases, but on average it's a modest win, about 1 processor speed bin.
Of course, you don't notice much difference in most desktop apps, since most desktop apps are not aggressively optimized for speed. A 3 GHz processor used to surf the web is idle most of the time, waiting for your page to download, so who would notice the difference?
As others have pointed out, only pointers and size_t are expanded to 64-bit; int, long, float, double, short, and char are still the same size they were for 32-bit builds, so the program's .exe size and data space requirements do increase, but only modestly. Your data structures *don't* double in size, unless they're all pointers and size_t.
You could argue that the switch from ASCII to Unicode, which doubled the size of all text strings, probably did a lot more to increase program size and memory footprint, but nobody seems too upset by that.
One of the compelling benefits of 64-bit Windows I didn't see you mention is security. Jeff Jones calls out the following three security benefits unique to 64-bit Vista:
* Hardware NX protection on globally by default.
* Kernel Patch Protection aka Patchguard.
* Mandatory Kernel Module and Driver Signing.
Basically, the introduction of a new CPU architecture with no backward compatibility restrictions gave the OS folks some liberties to do things more securely than they could with 32-bit and its legacy drivers/programs/etc..
There are 64 squares on a chess board and bit-twiddling chess game writers have already figured out how to exploit 64-bit hardware. Hopefully the bit transitions will continue until there are enough bits to represent a 4x4 matrix of adequate precision for my preferred set of games.
(Not related to this article) Do you have a list of favorite Blogs?
And the reason why this is possible (and I didn't know this) is that Mac OS X does not map the kernel into the user address space. An app uses the entire 4 GB of available address space (with quite a bit of it reserved for Apple and third-party libraries, though).
Your assertion about true 64-bit chips is false. You are correct if writing about Intel. You are not, however, when writing about AMD.
I am usually referring to the AMD Programmer's Manual because I have that one in print, so I am even more certain that what I say applies to AMD chips than to Intel chips.
(I assume you mean my statement that x86_64 has only 48 virtual / 40 physical address bits. AMD indicates that the former should be considered an architectural limit (though it is easy to do away with), while the latter is only an implementation-specific limitation.)
In case you wonder where the 40-bit limit probably comes from: that is the number of bits that fit in a standard 64-bit HyperTransport control packet; 64-bit physical addresses would need the extended 96-bit format.
Hey Jeff, I just wanted to let you know that Beryl is no more; it's merged into http://compiz-fusion.org/ . Also, you don't really need a midrange card to run it. I am currently driving a 1920x1080 monitor with the onboard Intel chipset in a Mac mini with no problems, and a coworker is using a 6-year-old Dell laptop with the onboard Intel chipset and running Compiz just fine.
I'm not certain that the underlying assumptions about 128-bit memory hold; addressing space need not be equal to available physical space (we already see this in the 32-bit world with virtual memory).
Many areas of CS, for instance, must deal with convoluted algorithms to fold back sparse arrays into finite contiguous memory space. This is basically memory management done in software (pretty much the least processor-efficient way of doing it); such arcana (and accompanying bloat, testing, and optimizations) could be done away with by letting us address a very large memory space directly, having fast dedicated hardware map it back onto physical memory locations. I hope such desirable goodies will not have to wait for the next generations to reach adulthood.
And my favorite application (I'm biased though obviously), Paint.NET, is full native 64-bit. We take advantage of 64-bits in a few places in order to avoid spending weeks optimizing 32-bit code: instead of futzing around with 32-bit integers, we just use 64-bit longs. This is necessary in many places where we have to sample many pixels from an image and blend them together to create a final pixel (the distortion effects do this a lot).
The memory management in Paint.NET, sadly, uses a design that looks short-sighted in hindsight. All bitmaps are created with one big allocation instead of the tiling or paging scheme that GIMP, Photoshop, et al. use. Because of this, Paint.NET often fumbles on large images, as evidenced by the numerous "out of memory" crash logs I get in my inbox every day from 32-bit users. I was counting on 64-bit adoption gaining traction sooner, so that this wouldn't be necessary, but I've been completely wrong :( Oh well, I'm hoping to do a lot of stuff in v4.0 that will make use of tiling for more than just memory management.
beryl is done. compiz-fusion all the way.
you say: 8-bit 2^8 256 bits
you mean: 8-bit 2^8 256 bytes
I would not suggest upgrading to Vista at all. It is too raw, unfortunately. SP1 and time (when we talk about drivers) should fix that, but right now that OS is not good, and the 64-bit version is twice as poor as 32-bit Vista. So even if you have 4 GB of RAM (which is very unlikely for the usual user, just like having 3 displays :-)), think twice before you upgrade your OS to Vista and, moreover, to 64-bit Vista.
Interestingly, I just wrote about troubles I had with Vista on my blog.
The one place where Vista 64 really shines (IMO) is with SuperFetch. The OS can cache so much stuff in RAM, I find my app-launch speed is, frankly, amazing. With 3GB of addressable space, this just isn't possible to do if you do any serious multitasking -- you run out of memory too soon. I now find myself more frustrated because
1) I have an iPhone, and Apple hasn't gotten around to supporting my damned OS!
2) Most PC configurations (still!) max out at 4G of RAM -- you typically have to go FB-DIMM and Xeon to get any large amount of RAM in your machine
3) Edit and Continue isn't supported in VS 2005 for x64 binaries. Annoying. Anyone know if this has been fixed in VS 2008?
It's really a vicious circle right now. The switch from 32-bit to 64-bit is more or less tied to the switch from XP to Vista, including a new driver model and adaptation to DirectX 10 for games.
Actually, gamers are one of the groups who benefit MOST from 64-bit. Modern games can easily allocate 2 GB of memory in large multiplayer situations.
"Once applications begin to push the 2GB addressing space limitation of Win32 (something we expect to hit very soon with games) or total systems need more than 4GB of RAM, then Vista x64 in its current incarnation would be a good choice."
The following AnandTech article documents the issue in Supreme Commander, Company of Heroes, and Lost Planet.
@ David W.:
Sorry, Jeff is right. The Mac OS X kernel is still 32-bit. This was mainly done for driver compatibility.
How exactly can a "32 bit" kernel allocate TERABYTES of virtual address space and receive 64 bit calls? Could you (and Jeff) enlighten us a bit here?
Actually, you're right and I'm wrong. I've looked into "Mac OS X Internals" by Amit Singh, and here's what he has to say on the topic (caveat lector, this is accurate as of Tiger, Leopard may have changed things):
... the kernel is still 32 bit. Although the kernel manages as much memory as system can support, it does not directly address more than 4GB of physical memory concurrently. To achieve this, the kernel uses appropriately sized data structures to keep track of all memory, while itself using a 32 bit virtual address space with 32 bit pointers. Similarly, device drivers and other kernel extensions remain 32 bit.
End quote. So in fact Mac OS X is a 32-bit OS that appears 64-bit to 64-bit apps. Two other interesting tidbits I've found:
* 32bit PPC processes can use all 64 bit assembly instructions.
* Memory futzing is completely hidden from 64 bit apps, so they can allocate the entire available VM address space, if they so desire.
I've found programming in 64-bit assembler (AMD64) a joy compared to 32-bit, plus the more you do it, the more tricks you find, so you find yourself programming in a completely different way. I think more performance will come the longer compiler vendors have to play with 64-bit code. After all, it took quite a while for 32-bit compilers to get up to speed.
Just as you program C# in a different style to programming C++ for performance, it might take a while to get up to speed with 64-bit, so to speak.
David W, in what way is git "doing things the hard way"? I'm no genius but I find it pretty easy to use, and that's coming from a Subversion background. The only thing that was difficult was adjusting my thinking to the concepts of decentralized source control; the tool itself is pretty easy.
Santana: "I seriously hope you realize that Windows is a piece of sh** and start taking that into account in your blog; if not, then in my opinion you are just another writer who thinks he knows what's up but doesn't have a clue."
That rise to 5% market share's really going to your head, isn't it? Just wait till it's 90% and everyone's talking about how only losers use Linux, with all those viruses people write for it now it's being used by non-geeks who can't protect their kernels.
Terry Pratchett [adapted]: "I need no code on my screen to be a geek. Nor do I need to hate Windows. What kind of creature defines itself by hate?"
The AnandTech link you displayed is only the first in a series of articles about the memory problems for games on 32-bit Vista.
It's because Vista reserves twice the amount of your graphics card's memory in the address space! Strange decision if you ask me...
So if you have a card with 256 MB of RAM you have no problem with the address space, but when the size of the video RAM increases, you will run out of address space: http://www.anandtech.com/showdoc.aspx?i=3044p=1
On Windows XP there is absolutely no problem!
So please when you talk about 64 bits, don't make general statements when they are based on your experiences with Vista 64 bit!
Clearly separate the root problems of 32 bits from problems with the 32-bit implementations of XP, Vista, Linux, or OS X.
I really don't know what everyone is making a fuss about. I've been on XP x64, Vista x64, and Server 2003 x64 for around 3+ years. Rock solid. I would never use a 32-bit OS again. Vista 32-bit is dung and is only there because Intel wanted it supported on legacy processors. Driver support was flaky in the early days; I've ditched XP x64 in preference to Vista because driver support is so much better. I had problems with webcams and scanners, but just went and bought new ones.

I need 64-bit. We write software and require the large memory model for parallel compilations (has anyone seen the RAM footprint of Dev Studio 2008?) and running a mountain of VMs simultaneously. Some of the apps that run under our apps are huge memory hogs (namely dedicated gaming servers and rendering simulators). Our test environment is fully automated, and we use VMware Server on x64. I've got 2 GB in all our laptops, 4-8 GB in our desktops, and 16 GB in our servers. Try booting a dozen virtual machines on a 32-bit box (even with a custom HAL).

Plus, x64 is rock solid, and running 32-bit apps under it means we can isolate them under separate WoW64 instances (like the good old days of OS/2 Warp letting you run Win3.11 apps: if they die, they don't take the OS with them). Memory is dirt cheap, and driver support can be problematic, but choose your hardware carefully and be prepared to optimise (e.g. a lot of the rendering we do is now GPU-based rather than CPU-based). 64-bit rocks :)
That problem is already fixed through a Vista Hotfix. It's a requirement for installing the Crysis Demo on Vista.
I've been using Vista x64 for about 4 months now. Everything is running great, development and games alike.
Wouldn't the "Look how huge 64 bits is!" graphic have more impact if it had a linear scale? The logarithmic scale gives the impression that the size is merely doubling every time, which is absolutely _not_ the case.
"That rise to 5% market share's really going to your head, isn't it? Just wait till it's 90% and everyone's talking about how only losers use Linux, with all those viruses people write for it now it's being used by non-geeks who can't protect their kernels."
...and why would they want to protect their kernels when they don't get viruses anyway? Vista *finally* appears to be relatively immune from virus attack because the user runs as a user; this is what Unix systems have been doing for 30 years!
Note that all of the attacks on Vista machines have been aimed at either drivers or apps that run as (or partly as) administrator; these are the same attacks that Unix systems have been quietly coping with, without anti-virus software.
64-bit software, besides being able to address more memory (and so being faster because of caching): are there any benefits for the lowly ordinary users who have just upgraded to 2 GB? Besides gamers (who always need cutting-edge machines), render farms (which need as much memory/disk/CPU cycles as possible), and developers (who seem to need impressive machines to write relatively simple software ;-)), who actually needs this?
Wouldn't the "Look how huge 64 bits is!" graphic have more impact if it had a linear scale?
Sure, if you can find a way to fit it on *any* screen... I couldn't! The difference is truly astronomical.
Right now I have these apps open: IE, Firefox (a memory hog), MS Word, a few instances of file explorer, a few instances of WordPad, SQL Server and Enterprise Manager, Excel, VS 2005 (another memory hog), my RSS reader, Outlook, a couple of CHM files, Yahoo Desktop Search, Dreamweaver (another memory hog), and 3 more specialized apps. My peak memory usage never exceeded 3 gigs.
I am thinking it's going to be very rare that one needs to switch to x64 to make good use of 4 gigs of RAM, because even reaching 3 gigs, if 3 gigs is XP's limit, is still a long way off for most people.
My advice to everyone is to install an x64 OS as second boot and install all your apps and hardware drivers again under that OS and make sure everything works. Do not upgrade before doing such a test. I am wary myself that everything will work fine. Personally I have a ton of software and hardware installed and I am not going to risk it. My XP runs fine and reliably. If it works, don't fix it. I haven't reached a point where my OS is starving for memory regularly. Occasional disk swapping for virtual memory is not yet a big concern.
@ Peter Juhasz: Multimedia. Software synthesizers, multitracking and mixing. There are a lot of cheap audio software packages out there that do not require an insanely spec'd workstation, but they'd still benefit from the memory because it allows you to load big sample libraries entirely in memory. Ditto with making movies; there are already consumer-level HD cameras and compositing or adding effects will benefit from more memory.
Content production is ramping up; while a DSLR was the domain of professionals a few years ago, amateur enthusiasts are getting access to them now, and working with material like this, fast, requires extra horsepower.
I recall a POV-Ray render of 640 x 480 I got on a CD - the origin was someone's BBS. It took 3 or 4 days for a 386 to churn it out. I think that anyone who wanted to do this more than once would look into getting some extra juice so you wouldn't have to wait for that long; even though the times are now vastly reduced (or the scenes vastly more complex).
I'm not so sure there won't be a shift to 128-bit in our lifetimes. I believe there are platforms that already use 128-bit addressing. If I remember correctly, on these platforms each 128-bit memory address is unique across all machines running that platform, so a running process can be shifted across to another machine without missing a beat. Obviously, that's quite a feat, but I wonder if the same idea won't be used elsewhere in some form --- it would certainly avoid some problems if handles or memory addresses were never recycled for different objects or memory blocks. Of course, we're talking about virtual addresses rather than physical ones, but that's what the page table is for.
I didn't read all the posts, but it looks to me like no one has mentioned shell extensions yet. I used to have lots of 32-bit shell extensions, e.g. always-on-top on right-click of a taskbar tab, open command prompt at current directory, VTF viewer, MD5 checksum, change file date and time. Although converting these to 64-bit is possible, from what I have read it requires quite a bit of work. I am on 64-bit and miss them; hopefully they will be updated as more people move to it.
I nearly forgot my absolute favourite, although this one doesn't work on 32- or 64-bit Vista due to API changes: Folder Size. It shows the folder size as an Explorer column, so you don't have to right-click Properties on a folder, or hover for a tooltip, to find its size.
When I got my OEM Vista Ultimate 32-bit I was so tempted to go 64-bit,
but I was worried about driver/app compatibility.
I started with Vista 64-bit when it first came out. I do mostly software development which means that I use somewhat specialised tools that most people wouldn't ever need. This is a good thing for them, because 64-bit Vista drove me nuts. Nothing that I needed worked on it, and various vendors that I contacted about it indicated that they currently had no plans for 64-bit support. That's not just debuggers, monitors, system utilities, and so on, but also printer drivers, the data cable for my cellphone, my GPS, ... . I stuck it out for nearly nine months, but eventually threw in the towel and went back to XP. It was like going back to an old friend (which I realise isn't said often about Windows :-), everything just worked again, things I'd almost forgotten about were back, it was great. I really, really wanted to give Vista a chance, but found the 64-bit version unusable, at least for what I wanted to do with it.
Moral: If you're planning to move to 64-bit Vista, set up your machine as dual-boot to XP just in case.
Why else are 64-bit applications faster than 32-bit? Here's why:
BOOL WINAPI SetFilePointerEx(
__in HANDLE hFile,
__in LARGE_INTEGER liDistanceToMove,
__out_opt PLARGE_INTEGER lpNewFilePointer,
__in DWORD dwMoveMethod
);
This is how a file pointer is moved to a desired position in a file on Windows. LARGE_INTEGER is 64-bit.
The point here is that all file access is already 64-bit. Even a DVD image file is larger than 32-bit addressing allows. Since applications use files, they should benefit from 64-bit addressing.
I've been running Vista 64 on my home machine, and I was thinking of using my MSDN copies of Vista to replace the 32-bit Vista copy that came with my Dell XPS m1330.
Have you run into any issues running Vista 64 on your m1330? I'm curious if 64-bit drivers are available for all the devices in that laptop. If so, I may do that upgrade this weekend.
One of the major problems with 64-bit is that you can't do mixed-mode (.NET/native) debugging. The ability to step from .NET into native code is a nice feature that I don't want to give up by going 64-bit.
So I have a question for you Vista x64 bit adopters,
How many of you run with multiple monitors under Vista, and what are your experiences? I switched from XP to Vista x64 in a development environment and was pleasantly surprised. Until I actually booted up my machine. Evidently, Nvidia (my video card) will no longer support nView in Vista, a godsend for those of us who like to use multiple monitors as a single virtual monitor. Anyone else have this problem? Any good workarounds?
There is one major problem I've been running into for the last month I've been running Vista x64 - VPN Clients.
You'll notice that even Vista support is limited - and x64 support is almost non-existent.
I will point out that everything else about my experience developing software on the Vista x64 has been great. I just can't work remotely.
Wait, is this really a problem :)?
@Gary: I suggest Ultramon. I use two discrete monitors (laptop + desktop LCD with different sizes) but it works great for that.
To the quantum physics guy:
You're right, we probably won't ever see 2^128 bytes of RAM attached to a machine. (Don't quote me, though; I don't want to be infamously quoted far in the future.)
But I don't think a 128-bit address space is that far-fetched. One of the useful things about a large address space is that you can carve out blocks and give them meaning, even though every byte of the block may not be needed or used. One example is 32-bit Windows DLLs: they load faster when they can use their hard-coded addresses, as no fixups are required. Another example would be giving hardware manufacturers blocks of I/O space.
I definitely agree with you, and I think the software development side of the computer field is not really moving as fast as the hardware side. A lot of users are choosing 64-bit systems, but there is not that much software for them.
As for software, I've found a website where I can find at least something (www.64xsoft.com), but again, not too much. I'd prefer to upgrade all my software to 64-bit, but there is no chance to do that...
Thank you for the article, it was really useful for me.
I wish they (the development companies) would finally start thinking about 64-bit users as well...
I've been using Vista x64 since the beginning. No issues at all. I even run Vista x64 exclusively on my work laptop, which is a 17" MacBook Pro. Driver support is good.
The perfect development combination is Vista x64 + VMware Workstation for Server 2003 images. There are no longer deployment surprises such as opening ports, configuring DTC, etc. I allocate 384 MB of RAM to each server: IIS, SQL Server, App Server. I highly recommend using virtualization, as I have run into little quirks with Visual Studio 2005/2008 on Vista x64.
For Cisco VPN, I use a 256 MB Windows XP VMware image. I use that to remote desktop in and transfer files. We use Subversion instead of SourceSafe, so I don't need a VPN tunnel to access source.
UltraMon works great with Vista x64 for multiple displays. For a virtual desktop, try the fast and open-source VirtuaWin.
"Note that the scale is logarithmic." - did you mean exponential?