Picture this: you’re sitting at your desk at home, putting the finishing touches on your last-minute essay (which, of course, you had perfectly valid reasons for leaving so late). You’ve just written a brilliant conclusion to your somewhat hasty rhetoric and are doing a final read-over for silly grammatical errors when your face turns blue. Well, not exactly. The blue comes from the sudden tinge washing over your monitor’s perfect baby skin. Just as your finger hovers over the save button, your computer gives you the infamous “Blue Screen of Death”, or BSOD for short. Then it reboots, and you lose all your unsaved work.
Even if you’re not a uni student, the Blue Screen of Death can hit you, whether at home while playing your favourite game or at work while trying to meet a pressing deadline. If you use a Windows computer (like most of the world, excluding those pesky Apple fanboys) you are at risk of being hit with the Blue Screen of Death, and understanding its possible causes and the troubleshooting steps is key to keeping your computer in one piece. Many a hapless computer has ended up in pieces on the pavement, hurled from a high balcony by a frustrated user.
The first thing to consider is any change you may have made just before your screen turned bright blue. Changes that might not have seemed significant at the time could be behind your computer’s decision to blue screen: updating or installing drivers, applying Windows updates, or plugging in a new USB device. If you performed one of these actions just before the blue screen appeared, chances are it was the culprit. The solution is to undo whatever change you made and see whether you get another blue screen; if you don’t, you’re in the clear! How you undo it varies with what you did, ranging from a simple driver rollback to performing a System Restore or booting into Safe Mode.
If you didn’t make any such changes recently, you may unfortunately have a bigger, more complicated issue to deal with. Application or software crashes generally do not cause a blue screen, so if the cause wasn’t one of the obvious changes listed above, you may have a hardware fault. That means one of the components that make up your computer – most commonly the hard drive, but sometimes the RAM or a motherboard component – has stopped working properly and is causing the blue screens. Depending on the type of computer you have, this can be a very complicated and/or expensive issue to resolve, and unfortunately it is often more time- and cost-effective to just buy a new one!
Ever wonder how big hard drives can get? Storage capacities have increased in leaps and bounds since the home computing boom of the 80s. Back then, you were stuck with boxes and boxes of 1.44MB “floppy” disks, each roughly the size of an average passport. Internal hard drives weren’t much better: 100MB models were the latest and greatest invention and would drive your tech nerd friends wild with envy. Of course, back then you didn’t need that much storage: there was no internet and no peer-to-peer sharing to put a strain on your capacity, and even games were nice and compact, easily fitting on one measly floppy. The 90s brought faster internet, IRC file sharing, and newsgroups, and tech nerds with the savvy and know-how suddenly had access to vast stores of games, music, movies, TV shows, you name it. Storage manufacturers met their demands with recordable CDs and then DVDs, holding 700MB and 4.7GB respectively. Discs were compact and easy to manage, and huge spindles of them soon cluttered the rooms of nerds far and wide. They were accompanied by larger hard drives, with 200GB and more becoming the norm as demand drove prices down.
Enter the 2000s, and the wonderful, wonderful world of torrents. Although tech nerds had a monopoly on torrents early on, everyone else eventually caught on and started downloading gigabytes and gigabytes of free stuff. Again the storage capacity nerds delivered, with even larger internal hard drives breaking the 1TB (~1000GB) barrier, and portable external USB hard drives of similar capacity you could take to your friend’s place. Nowadays 2TB hard drives are relatively affordable, even if (sadly) governments are cracking down on torrent usage.
Now, HGST (formerly part of Hitachi) are getting ready to release the Ultrastar Archive He10, which features a whopping 10TB of storage in a standard 3.5″ internal HDD. The cool thing about this hard drive is that it is filled with helium rather than ordinary air, which improves many facets of its operation. Normally, air rushes around a drive’s internal mechanisms as the platters spin, and because air is relatively dense, the components have to be tough enough to withstand the resulting turbulence and drag, which limits how many platters fit inside and how fast they can spin (RPM). Helium is roughly one-seventh the density of air, which significantly reduces these issues and limitations. Combined with the drive’s other cool technological breakthrough, shingled magnetic recording (SMR), helium lets HGST offer you the largest standard-size hard drive available today.
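To put 10TB in perspective against the media from earlier decades, here’s a quick back-of-the-envelope calculation. It uses decimal (SI) units, as drive manufacturers do; note that a “1.44MB” floppy is a marketing figure (the true capacity is slightly more), so treat these as rough counts.

```python
# Back-of-the-envelope: how many older media does a 10TB drive replace?
# Decimal (SI) units, as drive makers use: 1 MB = 10**6 bytes, and so on.
MB = 10**6
GB = 10**9
TB = 10**12

drive = 10 * TB       # HGST Ultrastar Archive He10
media = [
    ("floppies", 1.44 * MB),  # 3.5" floppy disk (nominal capacity)
    ("CDs", 700 * MB),        # recordable CD
    ("DVDs", 4.7 * GB),       # single-layer DVD
]

for name, capacity in media:
    print(f"{drive / capacity:,.0f} {name}")
```

Nearly seven million floppies’ worth of passport-sized plastic, replaced by one 3.5″ drive.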
When you purchase brand name personal computers these days, you can pretty much expect the operating system to come pre-installed with various utilities from the brand name company. There’ll probably be plenty of branding scattered throughout your operating system reminding you that you have purchased from this brand name. Perhaps you’ll find it on your desktop or even in the Windows splash screen (if you have a Windows 8 operating system). Key point: the brand name company you purchased your computer from can (and often does) make changes to your Windows operating system, but these changes are usually minor and do not significantly affect how you use the computer.
Korean multinational giant Samsung have now been found to have bundled software with some of their laptops which interferes with Windows Update. Windows Update is a built-in feature found in all Windows installations which downloads and installs updates as needed from Microsoft. These updates address a variety of issues in Windows which can relate to performance, stability, security, and compatibility. For the average user who may not be overly computer savvy, it is generally convenient (and recommended) to leave Windows Update configured with the default automatic setting. This allows Windows Update to automatically check for and download updates in the background, and then install them either while you are asleep or when you shut down or restart your computer. This is all done without any user input, and is the easiest and most convenient way to keep your Windows installation up to date.
Brand-name PCs often come with their own “update” software which is supposed to keep your bundled software and certain devices up to date. Samsung’s version is called “Samsung Update” and is bundled with all its PCs. On at least some of its laptops, Samsung have included a component called “Disable_WindowsUpdate.exe” which changes the way Windows Update works on your computer. Despite the scary name, it doesn’t actually disable Windows Update entirely. Instead it changes the default automatic setting (described above) to “Check for updates but let me choose whether to download or install them”. This means that Windows Update will not download or install updates for you automatically.
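Under the hood, this Windows Update mode is just a setting with a handful of possible values (Windows stores it as the AUOptions value in the registry). The sketch below models the reported behaviour as plain Python so the effect is easy to see; the value-to-description mapping follows Microsoft’s documented AUOptions values, while `samsung_tool_effect` is a hypothetical stand-in for what the bundled executable reportedly does, not real Samsung code.

```python
# Windows Update automatic-update modes, keyed by the documented
# AUOptions registry value (illustrative mapping, not a live registry read).
AU_OPTIONS = {
    1: "Never check for updates",
    2: "Check for updates but let me choose whether to download or install them",
    3: "Download updates but let me choose whether to install them",
    4: "Install updates automatically (recommended)",
}

def samsung_tool_effect(current_setting: int) -> int:
    """Hypothetical model of Disable_WindowsUpdate.exe: on every boot,
    whatever the user chose is flipped back to 'check only' (mode 2)."""
    return 2

setting = 4                              # user restores the automatic default...
setting = samsung_tool_effect(setting)   # ...but after the next reboot:
print(AU_OPTIONS[setting])
```

Which is exactly why changing the setting back by hand never sticks.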
You’re probably wondering why people don’t just change the setting back to automatic. The software doesn’t stop you from doing that; however, every time you reboot your computer it runs again and changes the setting right back! The only way to stop this from happening is to manually remove the Samsung software. Samsung have since stated that they will be issuing a fix, but the question remains as to why they bundled such intrusive software in the first place.
Computers nowadays are fast. Back in the old days of the 286, 386 and 486, you were the envy of all the tech nerds in your street if you could claim ownership of a 50MHz machine. “I’ve only got a 33MHz 386!” you could hear them wail. You only had one processor core, of course, so it was purely a matter of who had the higher clock speed. A megahertz duel at dawn, mano a mano, number vs number, the winner easily distinguishable. These days multi-core processors are the thing, and pure clock speed won’t cut it as a measuring stick anymore: there are many other factors, like the number of cores, the cache size, how much work each core does per clock cycle, the list goes on.
So the tech nerds today compare their Intel Core i7 to their friends’ AMD, but on a global scale the battle is fought to decide who has the fastest supercomputer. A supercomputer is fairly self-explanatory: it’s basically a super-fast computer. The scale of the numbers changes dramatically, though, and you start dealing with petaflops as a measure of computing speed. FLOPS stands for floating-point operations per second, and your average computer might be capable of gigaflops (billions of FLOPS). When you start dealing with supercomputers, the petaflop becomes a necessary jump in terminology, measuring quadrillions of FLOPS (a million times a billion, or 10^15).
The 45th twice-yearly TOP500 list of the fastest supercomputers in the world has produced a clear (and expected) winner: the Tianhe-2 in China. Residing in the National Supercomputer Center in Guangzhou, the Tianhe-2 has topped the list every time since its completion in 2013. With an HPL Linpack benchmark score of 33.86 petaflops, the Tianhe-2 is comfortably superior to its nearest competitor, the Titan at the US Department of Energy, which could only manage 17.59 petaflops. A paltry figure in this high-stakes game of who has the fastest supercomputer, in which the main players are, you guessed it, the US and China. Much like the space race of the 60s, there’s a mix of national pride, prestige and rivalry behind this latest technologically driven contest. Although China is clearly top dog at the moment, the US still boasts the most supercomputers on the list at 233. Europe is also a big player on the scene, with 141 supercomputers featuring in the top 500.
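The gaps become easier to grasp once the units are lined up side by side. A quick sketch using the Linpack scores above (the 100-gigaflop figure for a typical desktop is an assumed ballpark for illustration, not a benchmarked number):

```python
# FLOPS unit ladder: each named step up is a factor of 1000.
GIGA = 10**9   # gigaflops: billions of FLOPS
PETA = 10**15  # petaflops: quadrillions of FLOPS

tianhe2 = 33.86 * PETA   # Tianhe-2 HPL Linpack score
titan = 17.59 * PETA     # Titan HPL Linpack score
desktop = 100 * GIGA     # assumed ballpark for an ordinary desktop CPU

print(f"Tianhe-2 vs Titan: ~{tianhe2 / titan:.2f}x faster")
print(f"Tianhe-2 vs a desktop: ~{tianhe2 / desktop:,.0f}x faster")
```

Roughly twice as fast as its nearest rival, and hundreds of thousands of times faster than the machine on your desk.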
Never one to be outdone, the US plans to strike back with a Department of Energy system planned for release in 2017 or 2018. The $425 million investment is expected to yield a whopping 150 petaflops, which would place the US squarely in the lead. There is also the possibility of a quantum computer arriving in the near future, one which IBM has gone on record as saying current TOP500 supercomputers would find impossible to come anywhere near outperforming.
Operating systems and PCs go hand in hand. Whenever you buy a new PC, it (most of the time) comes pre-installed with the latest Windows or Mac OS. If you’re a PC builder who likes hunting for cheap, high-quality components with which to construct your own far superior machine, you need to install an OS before you can use your new extreme gaming or computing rig. Operating systems are a necessary part of any personal computer, and in the past the release of a new Windows OS has meant PC manufacturers could rejoice. Assuming the marketing department did their job, everyone would be clamouring to get their hands on the latest offering from Microsoft, and the easiest way for the average consumer to achieve this has been to buy a new PC with the latest Windows pre-installed. Thus PC manufacturers could sit back and watch their sales boom upon a new Windows release.
In the case of Windows 10, however, the story has been slightly different from what PC manufacturers would have been hoping for. PC sales have actually slumped following the release of Windows 10, and not because the Microsoft marketing department failed to do its job: upgrade figures have been strong, with over 75 million installations in the first month after release. The reason behind this change in fortunes for PC manufacturers is Microsoft’s new policy of offering Windows 10 as a completely free upgrade from users’ existing Windows operating systems. This is in stark contrast to Microsoft’s previous policy of selling each new version of Windows to consumers who wished to install it as an in-place upgrade on their existing system. Microsoft have also invested a lot into making the upgrade process as painless and seamless as possible.
So now, instead of having to pay for a painful, far-from-seamless upgrade of your Windows operating system, you can move to Windows 10 absolutely free through a relatively painless process. No wonder PC sales have slumped: where is the advantage in buying a brand new PC? With ever more reliable hardware components, it is now quite likely that your PC will last you a good five or six years, and improvements in hardware performance (especially CPU processing power) have slowed in recent years too. The last major jump in Intel CPU performance came with the Sandy Bridge architecture, released in 2011. Gone are the huge leaps in CPU power made obvious to consumers by the number of processor cores. Back in the 2000s, improvements were clear and easy to understand: going from single core to dual core to quad core was an obvious progression for consumers to envision increasingly powerful CPUs.