The Borg Collective

posted Aug 29, 2013, 11:59 PM by Samuel Gomes   [ updated Feb 12, 2014, 5:57 PM ]

Dear reader, please understand that I am only writing this blog to make people more aware of the technical nature of the topic and to spur some meaningful discussion. My statements here deal with operating system internals and design. If you do not possess core knowledge of the workings and design of operating systems, you may still continue to read and ask questions. Meaningless, stupid, unintelligent comments/sarcasm/statements will be ignored. Moreover, you will only be insulting yourself with such actions. I do not accept responsibility for any effects, adverse or otherwise, that this blog may have on you, your devices, your sanity, your dog, or anything else that you can think of. Read it at your own risk. Now, with that off my chest, let's begin…

There are times when I pull my Windows Phone 8 device out of my pocket and a few of my friends and colleagues go… why? Why Windows Phone 8? Why not Android? To all of them, my standard response is that I do not want to become part of the "Borg Collective". Well, that is the short answer. Here goes the long one…

Please understand that as a company, Google creates excellent services like Google Search, Mail, YouTube, Maps, etc. However, in my opinion, Android and Chrome OS are not among them. They are just a bloody mess with sugar coating on top. I am not against the open source community either. I love and use Ubuntu. I even test Haiku OS and ReactOS on my machines. My primary machine runs Windows 8. Let's face it: 91.56% of the computer-using population runs Windows, and I am no different. I am not a Microsoft fanboy either. I understand where they came from and what they are doing. However, simply dismissing their product over its previous reputation, or for not being open source, is stupidity. Open-source operating systems have been with us for a long time now, and they have really done wonders in the enterprise world. In the consumer world, however, Windows became a hit for a reason. It makes your system more usable and gives developers a variety of standard ways to leverage the hardware for their applications. Yes, there are plenty of ways to screw up Windows. You could install anything from the internet that flashes and is free. And for people who do that, we have something called "Tech Support". Microsoft is trying to change all that with Windows Store apps, like Apple with its App Store and Google with Google Play.

First of all, the "Windows" the general public knows is actually just one part of the modern Windows NT operating system. They usually mean the Win32 subsystem, a layer that sits on top of the NT kernel and provides the user and application interface. NT is still around, known as XP, Vista, 7 and 8. Most people think of "NT" as "Windows NT 4", while in reality the term refers to the whole NT series, which ranges from NT 3.x through NT 5 (2000, XP, 2003) to NT 6 (Vista, 2008, 7, 8). The NT architecture was designed by a team led by David Cutler, the former lead developer of VMS. It took them more than four years to combine the best of UNIX, VMS and OS/2 and create the NT architecture.

The ReactOS team has been around for ages trying to clone the Windows NT architecture. However, reverse engineering anything for which you do not have the source code is hard and time-consuming. Still, ReactOS is in beta (0.3.15 as of this date) and is fairly usable most of the time. I, for one, want to see this project succeed so that the world gets an open source replacement for Windows. The ReactOS project re-implements a state-of-the-art, open, NT-like operating system based on the NT architecture. It comes with a Win32 subsystem, NT driver compatibility and a handful of useful applications and tools. ReactOS combines the power and strengths of the NT kernel - well known for its extensibility, portability, reliability, robustness, performance and compatibility - with Win32 compatibility.

Mac OS X, Linux, BSD and other UNIX derivatives share a common heritage based on a design more than three decades old: a simple basic operating system that has evolved over time into a complex structure. Modern incarnations like Mac OS X put a fancy graphical user interface on top of UNIX to hide system details, but they focus mainly on beginners, and many advanced users are left out in the rain, as most advanced features cannot be accessed from the graphical user interface. Almost all UNIX flavors retain some of the original design flaws, and binary compatibility between various versions is usually non-existent. In theory there are a few UNIX standards like POSIX, but in practice the standards are old and cover only the basic operating system and the terminal environment. Other standards, such as the Linux Standard Base, are often not implemented faithfully. As there is no user interface standard or standard API, most people still have to use command-line applications or fight through the GUI mess. Many UNIX derivatives use the de facto standard X Window System for graphical output, which might well possess one of the worst designs in software history. Still, modern UNIX derivatives are trying to catch up with recent innovations, and some of them already possess important features like access control list support.

What about Android?

Android is a Linux-based operating system designed primarily for touchscreen mobile devices such as smartphones and tablet computers. Google uses a modified Linux kernel for Android. In the recent past, Google has not played fair with the open source community. Read this and this. However, Google has since learnt that you must give back to the community you take from.

Apart from the Linux kernel and a few libraries and tools, Android shares very little with other Linux distributions. Almost all Linux distributions use X as the software system and network protocol that provides a basis for graphical user interfaces (GUIs). Android, like iOS and Mac OS X, does not. This, by the way, completely thrills me. Both Apple and Google realize that X can be a real bitch to deal with. Ubuntu has already announced that it will move away from X gradually, most probably embracing the Wayland display server.

Google (Android Inc.) did all the right things when it came to taking the best of all the open source technologies and putting them together. But the choices it made for the application runtime layer are what tick me off. Dalvik VM and Java! Java bytecode running in the Dalvik VM. Are you FKM? Now, I understand why Google did it. Even if Android were to be re-compiled for a different architecture, your apps would still run, because the Dalvik VM takes care of the translation. And if you wrote a pure Java app, Android takes care of the Java bindings automatically. This is a big plus. But what about performance? Simply put, it sucks! Yes, I already hear people screaming that Dalvik has a JIT and that you can use JNI. More on that later. Dalvik is a register-based VM. The relative merits of stack machines versus register-based approaches are a subject of ongoing debate. Read this. As of Android 2.2, Dalvik has a just-in-time compiler. However, tests performed on ARM devices by Oracle (owner of the Java technology) in 2010, with standard non-graphical Java benchmarks on both Android 2.2 (the first release to include a just-in-time compiler) and Java SE Embedded (both based on Java SE 6), seemed to show that Android 2.2 was 2 to 3 times slower than Java SE Embedded. Java SE uses a stack-based VM. Read this. In my opinion, Google should never have used Dalvik and the Java language from the start, and thus could have avoided the litigation it now has with Oracle.

What about JNI? WTF? People use native code for better performance or for reusing existing code. And that is exactly my point. Why should I have to use both C/C++ and Java for my project? To me, that is just creating a Frankenstein's monster. Moreover, running native code is complicated by Android's use of a non-standard C library (libc, known as Bionic). And unlike Java application development, which is based on the Eclipse IDE, the NDK is based on command-line tools and requires invoking them manually to build, deploy and debug apps. Yes, I know there are third-party solutions. But why third party in the first place? If Google had made the Android runtime native code, we could have all used a GCC-based SDK (GCC does a brilliant job of generating optimized native code). Microsoft solved this problem for WinRT apps by allowing Visual Studio to generate binaries for different platforms at the same time from the same source code. Google should have done the same. I think Java programmers are sloppy. No offence (if you are a Java programmer), but the language and the runtime will convert you into one. Imagine allocating a big chunk of memory and not releasing it. A C/C++ programmer will shudder at the thought. The Java guy will just say, "The GC will take care of it!" What? Imagine playing a first-person multiplayer shooter written in Java. Just when you are about to frag that guy for the win, the GC kicks in! Damn! This is exactly why all big-budget games are written in C/C++ by real programmers. Then again, Java programmers can be found at the drop of a hat. There are too many of them. The fact is, Java programmers have no programming discipline. They are like drivers who have to be constantly monitored by the police (the VM). A C/C++ programmer, on the other hand, knows what to do when he sees the traffic lights… he follows the rules! There are many things that you can do in Java and get away with. I guess that is why you can ship your software faster.
No wonder it is called RAD. But RAD is not the solution for everything. Windows Phone is not perfect either. I hated Microsoft when they said that all apps on Windows Phone 7 had to be written with Silverlight and .NET. That meant I had to use a managed runtime. However, at least on Windows Phone 7 the CLR (read: the .NET VM) is a mature and optimized one. Then Windows Phone 8 came along and Microsoft did the right thing. They allowed apps to be made using native code only, easily harnessing the power of DirectX - something most Windows programmers are familiar with. Microsoft still retained the ability to run Windows Phone 7 apps on Windows Phone 8. Getting the hint, Google? Heck, even Apple knows this! All iOS apps are native code. No wonder all Apple devices feel so fast and fluid. The same can be said about Windows Phone. On two similarly specced devices powered by the Qualcomm Snapdragon S4, Windows Phone feels faster and works better.

There are people who say all devices in the world will start using Android. Seriously, man? What are you smoking? If you said only tablets and smartphones, I might just believe you. But do you use a tablet or smartphone to finish that report for your boss? Can you run an office with Android laptops and desktops? I dare you. I recently had a chance to use an Android 4.2-based TV stick from Rikomagic. The setup experience was absolutely horrible, to say the least. I had to go through multiple hurdles to get my external hard drive working. Android does not make proper use of an external keyboard: there are almost zero keyboard shortcuts, and the Tab key does not move the focus on the screen. There are almost no configuration options for an external mouse. My Microsoft LifeCam does not work at all. All of these things worked beautifully right out of the box on an Ubuntu system. I am now considering putting Ubuntu on the device and using XBMC as a media center. As of now, Android is only good on a smartphone or tablet, and I am pretty sure that is where it is going to remain for the next 20 years unless Google starts doing things right. There are people who call Android a mature operating system. Mature? The only thing mature about it is the Linux kernel. Google, you still have lots of homework to do.

The next mistake Google made when it released the Android OS was to invite everyone to the party! I am talking about Google Play. Any Tom, Dick or Harry can be an Android developer and submit apps to Google Play. There are tons of apps on Google Play, but more than half of them are junk. Even worse, some are malware. If you are using an Apple or Microsoft device (running Windows Phone/RT), you can rest assured that you will not get any malware. Almost no anti-malware programs exist for Apple and Windows Phone/RT devices. Now, can we say the same for an Android device? Only recently has Google started being stricter about what a Google Play app can or cannot do. They should have done this from the beginning. And giving people access to install apps from "Unknown sources" is a bad idea, Google! People say Android has many free apps. Oh yes, it does. But when you install and run those, half the screen is filled with ads. Many times these are the in-your-face type! This is exactly what you are good at, Google. We all love your search engine, YouTube and Maps!

No two Android devices work and feel the same. Take HTC and Samsung Android devices, for instance. Say you are using an HTC phone and decide to buy a Samsung. If you have not used a Samsung phone before, get ready to spend some quality time with your new phone. On the other hand, a Windows Phone user feels right at home even when the hardware manufacturer changes. The worst part is that Google allows hardware vendors to change or modify many parts of the Android UI, which effectively kills the real Android experience. Most hardware vendors seldom update these customized parts of the OS. You are at the mercy of the hardware vendor to get the next version of Android on your device. On Windows Phones, all OS updates are pushed directly by Microsoft.


I really like how the Ubuntu Phone/Tablet (Convergence) platform is shaping up. I was taken aback when Canonical announced that they were switching from Wayland to Mir. However, at least it is not X. Thank god! I had really hoped that Wayland would be the way to go for all Linux distributions. What is really interesting about the platform is that it allows you to write native apps as well as HTML5 + JavaScript apps. This is what Google should have done with Android from the start. The ability to use full OpenGL on this platform also sounds yummy!

Enough ranting for today. I might update this blog with additional stuff as and when it crosses my mind. In the meantime, feel free to leave your comments below.

Windows 8 x64 RTM on my desktop computer

posted Sep 2, 2012, 1:25 PM by Samuel Gomes   [ updated Nov 9, 2012, 12:42 PM ]

Just posting a few screenshots of Windows 8 RTM running on my desktop computer... Enjoy! :)

Nocturnal adventures with my Inspiron 1520 after a glass of Cuba Libre

posted Apr 15, 2012, 8:58 AM by Samuel Gomes   [ updated Sep 2, 2012, 1:34 PM ]

After my 2-year-old son dropped my Inspiron 1520 from the bed, the system had to go through an extensive makeover. The bottom plastics, palmrest, speakers, hinge cover and hinges were all replaced, thanks to the Dell Complete Care warranty. The onsite technician was at my house the very day after I contacted Dell about the issue. Excellent service by Dell! That service also happened to be the last one under the 3-year warranty I had on the system; one week later, the warranty expired. The system worked well after the service. About 3 months after the incident, my laptop developed a strange problem. The system would work well if it was sitting in one place, but if I touched or moved it, it would either restart or freeze. As the notebook was already out of warranty, there was not much that Dell could do. This was clearly a hardware issue, so troubleshooting software made no sense. The problem became worse as the system started to shut down and freeze more frequently. This continued to the point where the system was completely unusable. It would give random blue screens, and PSA diagnostics would either pass or give random error messages.

PSA Error

After trying all possible troubleshooting, I came to a conclusion that something was wrong with the motherboard. Then the inevitable happened. The system completely died. Pressing the power button would just make all the lights flash with nothing on the screen. Then, one stormy weekend night after making myself a glass of Cuba Libre, I decided to check the system for one final time before "putting it down".

Inspiron 1520 ripped

I disassembled the system. It was a cakewalk. In the picture above you can see the various components.


After removing the video card I found this (see the picture above). Observe how the solder joint came loose. I am sure this was a result of the drop, combined with the component's close proximity to the video card. The heat from the video card may have made the solder brittle until the joint eventually snapped.

So I pulled out my soldering iron and got ready for an old-school "smoking lead experience".

I finally managed to solder the damn thing back into place. Please note that this is not the best equipment for the job, and I am no marksman. However, I did the best I could. And bingo...

I have a fully working system now. In fact, I posted this blog from the same system. :)

Permanently overclocked my NVIDIA GeForce 8600M GT (Dell Inspiron 1520)

posted Apr 3, 2011, 4:28 AM by Samuel Gomes   [ updated Aug 4, 2011, 6:18 PM ]

I seem to be on a video BIOS flashing spree. First my Radeon HD 4670, and now my GeForce 8600M GT. Well, I thought of breathing new life into my ageing Dell Inspiron 1520, and here is what I accomplished...
I downloaded an overclocked version of the NVIDIA GeForce 8600M GT BIOS from techPowerUp. After a few hours of research I came to the conclusion that a 590 MHz core clock and a 490 MHz memory clock should be stable on this system. For some strange reason Dell used DDR2 memory for the video cards on the Inspiron 1520, which severely bottlenecks performance, so overclocking the memory and the GPU can do wonders. I downloaded the Windows version of the NVFlash tool from techPowerUp and flashed the OCed video BIOS. Below are some screenshots of GPU-Z and the Windows Experience Index. Notice how the subscores for graphics have changed. :)
GPU-Z with old BIOS
GPU-Z with new BIOS
Windows Experience Index with old BIOS
Windows Experience Index with new BIOS

Permanently overclocked my Sapphire ATI Radeon HD4670

posted Apr 3, 2011, 4:15 AM by Samuel Gomes   [ updated Apr 3, 2011, 4:21 AM ]

Permanently overclocked my Sapphire ATI Radeon HD4670! How? Ripped the video BIOS using GPU-Z. Downloaded a factory-OCed BIOS for a Sapphire HD4670 (core clock 775 MHz, RAM clock 1000 MHz) from techPowerUp. Opened both BIOSes using RBE (Radeon BIOS Editor) and copied the original BIOS identifier over to the downloaded one. Adjusted the fan speed setting in the new BIOS to scale linearly with temperature: 0 °C = 0% speed, 100 °C = 100% speed. Saved the new BIOS and flashed it using ATIFlash. Restarted the system. Result: old GPU core clock = 750 MHz, new GPU core clock = 775 MHz. No overclocking software required. Damn thing overclocked in hardware! Did not stop there, though. Enabled AMD OverDrive in CCC and set the damn thing to auto-tune. Turns out the card is stable at a core clock of 790 MHz and a RAM clock of 1130 MHz. Finally called it a day!

My Dell Inspiron 1520

posted Feb 21, 2011, 2:53 AM by Samuel Gomes   [ updated Sep 26, 2011, 6:29 PM ]

Okay. This one was long overdue. I know I am lazy! Well, here it is... I bought this system sometime during 2008. I haven't had any major issues with it apart from 2 hard drive failures and 1 optical drive failure. I'll give Dell the benefit of the doubt here, as both are mechanical parts and subject to wear and tear. Anyway, let's start with the processor, memory and motherboard first.
Most of my R&D happens on this system now as I have given up my old P4 system. Let's take a look at the system graphics department. Shall we?
Okay. That is the NVIDIA GeForce 8600M GT. A couple of things to note here: the video memory is standard DDR2 running at 400 MHz. This is something Dell should not have done, as it really hurts the GPU's performance and becomes a bottleneck. Nevertheless, the system can easily pull off games like TrackMania Sunrise Extreme at max settings (without AA). My system was ordered with an LCD panel running at 1440x900, so things look kinda nice. I ran Windows Vista Ultimate x64 on this for a while and then switched to Windows 7 Ultimate x64 (beta, RC, RTM... as and when each became available).
As always, I use 64-bit operating systems only. This is the only way you can use the full potential of your x86-64 processor's "long mode" performance improvements.
As you can see, this system gets a Windows Experience Index of 4.7, which is not bad at all. The bottleneck here is the hard drive, which is pretty obvious as it runs at only 5400 RPM. Below is some more interesting stuff.
Well, that's all there is to it. Next up is my custom made all AMD desktop, where I do most of my grunt work. Hopefully, this will be sooner than you think. :)

My old P4 Desktop Computer

posted Jun 18, 2010, 7:56 PM by Samuel Gomes   [ updated Sep 26, 2011, 6:31 PM ]

Okay. This is a custom system that I assembled years ago. It looks pretty decent... black being the consistent color all over. Let's see what we have here. Let's look at the processor, motherboard and memory details first.

Kinda outdated by today's standards? It sure is! I use and abuse this system for my R&D, so that is OK. Let's look at the graphics card details now.


Ah. The trusty old NVIDIA GeForce 7300 GT. Outdated, you say? It sure is! Who cares? It still works. Now let's look at the "other stuff".

Yes. This is Windows Vista Ultimate x64 Edition. Sooner or later the entire world will move to 64-bit computing. Why disgrace a 64-bit processor by running a 32-bit operating system on it? Biased, you say? I consider myself a 64-bit computing evangelist! Make the change today. The choice is yours!

Windows Experience Score of 3.7. Not bad at all considering the age of the system. As you can see... the major hiccup here is the processor. Moral of the story: Pentium 4/D processors are dogs! Don't buy one. Buy a room heater instead!

Here is some more amusing stuff.

Did you notice those multiple CD/DVD drives? Yes, those are virtual drives - except two. Notice the two sound cards? Yes, I use them both at the same time. Also notice how I installed all 64-bit drivers for Windows Vista x64. Which brings us to the conclusion... "Seek and you shall find. Ask and you shall get."

NB: I don't have this system anymore. I sold this junk and bought myself a 22" Samsung LCD. :)

Legacy Free PCs

posted Jun 4, 2010, 5:46 PM by Samuel Gomes

Many people don't know it, but today's PCs--including the system you're using right now--contain elements that have hardly changed at all in the last 20 years. Yes, CPUs are faster, hard drives are bigger, and RAM banks are larger. But in many fundamental ways, your PC isn't very different from the PCs of two decades ago.

Although some of the system elements have been modified over time, almost everything in your PC is a direct lineal descendent of the IBM PC AT--a seminal design that still shapes PC architecture two decades later.

In many ways, the PC's hardware consistency over time has been a good thing, a stabilizing force in the otherwise rapidly changing world of computing. It's been a huge positive for businesses and users because this consistency has made many peripherals completely interchangeable. For decades, we've been able to mix and match printers, keyboards, mice, monitors, scanners, modems, and more, largely without regard to the brand of PC.

Hardware standardization also has helped the bottom line by driving down prices: System and peripheral vendors have had a vast and uniform market from which to draw supplies, and to which to sell products, resulting in the commodity-level pricing that's behind today's amazingly low hardware costs. Overall, the PC AT's legacy has been an enormously positive one.

But it also has had a downside, principally in retarding innovation and slowing hardware advancements. The installed base--that is, the mass of existing, older, in-use hardware--acts like a giant speed brake on the computer industry because businesses and users are loath to give up older equipment that's still functional, even if newer designs would perform better or faster. As a result, new technologies tend to emerge piecemeal and more slowly than they would if hardware vendors could make a clean break with the past.

There's even a joke that made the rounds of the computing industry awhile ago: "Why was God able to create the universe in only seven days? Because he didn't have an installed base to deal with."

Despite this backward drag from the installed base, the Grail of many hardware engineers has long been a totally "legacy free" PC that can employ only fully modern, state-of-the-art, high-speed components and architectures. Such a PC would be faster, more compact, more reliable, and less expensive, as well as easier to manufacture and maintain.

- Fred Langa

The IBM PC shipped in 1981 -- over twenty years ago. The PC offered various expansion capabilities, including a parallel port, a pair of serial ports (on a separate card), and a keyboard port. It also supported a 5.25" 160KB single-sided floppy disk. The 8-bit PC/XT bus slots were expanded to 16-bit ISA slots in the PC/AT in late 1984. Later, IBM shipped the PS/2, whose enduring legacy in the PC universe today is a pair of compact connectors for the keyboard and mouse, and the 3.5" 1.44MB floppy drive.

Recently, I dug out an old Northgate OmniKey keyboard that had been gathering dust in my storage area. It's at least ten years old. I plugged in a PC-to-PS/2 keyboard adapter. It still works. This type of backwards compatibility has been the great strength of the PC over the years, but it's rapidly becoming an Achilles' heel. Various factors have kept these anachronisms in place, such as corporate IT shops that need to support parallel and serial ports, or users with a pile of floppies that contain valuable data. We've even seen ISA slots in a few new systems. And no doubt the Super I/O chip is still using ISA signaling to support legacy I/O.

So it's no surprise that a company like Apple Computer can push interesting new technologies into its hardware and software more quickly than PC manufacturers. But the buzz over "legacy free" PCs is starting to heat up. It began several years ago, with both Intel and Microsoft encouraging PC makers to move away from legacy connections. Back then, the pleas fell mostly on deaf ears, but it's beginning to look like the industry is ready. Dell is starting to ship USB keyboards, Gateway will pay you to delete the floppy drive, and at least one component company -- ABIT -- is shipping a line of "legacy-free" motherboards.

What do we mean by "legacy" here? Specifically, we're talking about a set of I/O options that have been part of the PC architecture for a long, long time.

If you look at "legacy-free" meaning a system that eliminates the entire kit and caboodle of this table, then we're still several years out. PCI and AGP will be around for at least two more years before PCI Express surfaces in force. Even then, don't expect systems to get rid of PCI slots anytime soon. Parallel IDE hard drives will probably be around for a couple more years, but will gradually give way to Serial ATA. Similarly, parallel SCSI will yield to serial-attached SCSI.

Actually, we shouldn't forget that the VGA port is also a legacy standard. In fact, VGA hardware is the only remaining piece of hardware that interacts directly with Windows. There is a move afoot to eliminate VGA, called the Universal Graphics Adapter or "UGA". The firmware-based UGA functionality will be accessible via a UGA driver built into the next version of Windows, codenamed Longhorn. If the graphics chip makers consider removing VGA at that time, we could be completely legacy-free.

- Loyd Case

Did you know that the latest Intel Macs are actually "Legacy Free PCs"? In case you do not believe me, do a Google search for any Intel Mac's hardware specifications.

Why a Legacy Free PC?

Three words... simplicity, stability and evolution.

How can we define Legacy Free PCs now? Here is an overview:

Must Have:

- x86-64 Processor

Must Not Have:

- Serial Ports
- Parallel Port
- PS/2 Ports
- VGA Port
- Floppy Disk Controller
- Game Port

Indeed, as system designers are freed of the constraints of the past, we'll likely see radical PC designs that will not only be faster, smaller, and better than today's designs, but that will make the traditional beige-box PC seem positively antiquated. And I, for one, can't wait!

The AMD Athlon™ 64 Processor Operational Modes (also applies to Intel chips with EM64T)

posted Jun 4, 2010, 5:45 PM by Samuel Gomes

Legacy Mode: Legacy mode is what the processor defaults to when running basic 32-bit software. If you run a 32-bit Linux or Windows installation, you've been running in legacy mode; if you've looked at any performance benchmarks, it should be obvious that not only is the AMD Athlon™ 64 processor more than happy to run this way, it delivers excellent performance while doing so.

Compatibility Mode: Compatibility mode is designed for a 64-bit processor that still needs the capability to run 32-bit applications. The AMD Athlon 64 processor is capable of this natively, which eliminates the need for 32-bit emulation on the hardware level.

Long Mode: True 64-bit Long Mode is intended for a native 64-bit OS environment where the application is also running 64-bit. Windows XP Professional x64 Edition is capable of using both modes, and the AMD Athlon 64 processor is capable of switching between 64-bit Long Mode and Compatibility mode from within the 64-bit OS.

Why Move to 64-bit?

If 32-bit systems offer excellent performance now, why move to a 64-bit OS at all? It’s a question with a multi-faceted answer, depending on your needs.

Increased Memory Addressing: Standard 32-bit computers have a 4 GB limit on addressable memory. 64-bit computing raises the addressable-memory ceiling into the exabyte range, removing this potential bottleneck for years to come.

Increased Computational Power: The AMD Athlon™ 64 processor adds an additional eight General Purpose Registers and eight 128-bit streaming SIMD extension (SSE) registers. Using a 64-bit OS opens these for application-usage for the first time. Increasing the raw computation capability of the processor in turn opens the door to increased visual detail in games, unprecedented scientific modeling, and stronger performance across a wide range of applications.

No Performance Penalty for Running in 32-bit: Unlike computer platforms that emulate 32-bit or x86 compatibility, AMD64 architecture can execute such commands natively. The advantage to this is that even legacy 32-bit code is delivered with no performance penalty. When running in a 64-bit OS, Microsoft’s WoW (Windows on Windows) 64 mode allows the chip to natively access programs; in a 32-bit OS such as Windows XP (standard), no such operating mode is needed. This is in direct contrast to other 64-bit designs, which require emulation (and take a massive performance penalty for executing 32-bit x86 operations).

Moving to 64-bit will not universally improve performance in all applications and test suites, but it won’t degrade performance either. This logically translates into a scenario where the “worst case” possibility is performance equality, and the best-case is a significant performance boost. For end-user, corporation, or major business server, this is a universal win.

This information was taken from a document on
