Tech-Talkers: Let’s Talk Tech... (http://www.tech-talkers.com)

Bandwidth Caps: Saving the Cable Companies
http://www.tech-talkers.com/2009/06/bandwidth-caps/
2009-06-25 | Tim Severeijns

This is the second article in a series aimed at analyzing the motives behind the various bandwidth caps that have recently been enacted by numerous Internet Service Providers.

In the preceding article, The Relative Cost of Internet Access, we looked at the differences in cost between various service tiers available from two Internet Service Providers. For those interested in a simple recap, here it is: the cost of a broadband Internet connection in Germany is (slightly) lower than the cost of a comparable connection in the United States. But not only is the pricing more attractive in Germany, the speed of the connection is superior as well. Excluding special offers and discounts, customers in Germany have no trouble signing up for an uncapped connection of 32 Mbit/s down, while U.S. customers looking to spend no more than their German counterparts are limited to only 6 Mbit/s.

So now the question remains: why the hell is an Internet connection so darn expensive in the US, and why are so many ISPs now considering, or, worse yet, actually implementing, bandwidth caps?

In order to answer this question, let’s take a brief look at which ISPs are capping their customers and to what extent they are doing so:

  • Comcast: 250 GB cap on total bandwidth consumption per month.
  • Time Warner Cable: Toying with the idea of total bandwidth caps of about 40-75 GB per month.
  • Cablevision: No explicitly stated cap, although some report that heavy usage is frowned upon.
  • Verizon FiOS: No cap whatsoever.

These four — Comcast, Time Warner Cable, Cablevision, and Verizon — represent practically the entire high-speed ISP industry as it exists today in the United States; so much for competition, right?

Notice anything interesting about the various caps that these four companies are imposing on their customers? Here’s a hint: look at what else they’re invested in.

Comcast and Time Warner both own a variety of actual TV networks. More specifically, Comcast owns E! Entertainment, The Style Network, and G4. Similarly, Time Warner owns a plethora of channels, networks, and even a studio or two: New Line Cinema, HBO, TBS, Warner Bros., Cartoon Network, and the list goes on.

But why should it matter what else these companies do, as long as they can provide us with a digital connection to the outside world? Well, a major part of the problem is that because these companies have been allowed to expand in so many different directions (without any proper oversight from either the government or the executives heading the boards), they are now so big that they are simply too cumbersome to adapt swiftly to the latest industry trends.

If you’ve set up a business model around television stations, networks, and programming for said media, what is the one thing that your business relies on? Television viewership!

And what’s the biggest threat to television at the moment?

Why, it’s the Internet, of course.

So, at the end of the day, it all comes down to traditional television programming versus diverse, à la carte Internet content.

Now, one might observe at this point that there really shouldn’t be a problem here, since Comcast and Time Warner are two businesses that are well invested in both of these markets; that is, they both offer Internet access, and they both offer television services. Though correct, this observation misses a critical point. Companies of this caliber are, relatively speaking, old giants, who have over the years gotten very used to a single, very lucrative business model, which, at the end of the day, boils down to nothing more than the number of people tuning in. Internet service for these companies, in comparison, is only a small branch in a much larger business model. Additionally, offering Internet access is not, by any stretch of the imagination, the same thing as owning a TV network, let alone several of them. With ownership comes the ability to dictate and create content, which is not the case if you’re only acting as a doorkeeper to a vast realm of content and knowledge.

Being used to seeing the majority of their income stem from television-based content, companies like Comcast and, to a larger extent, Time Warner are scared out of their wits; the Internet is stealing away viewers and they have no clue what to do about it. There now exist services such as Hulu and Netflix, which have led thousands upon thousands of people to drastically reduce the time that they spend in front of an actual television — mind you, they’re still spending a lot of time in front of a screen; they’re just not putting their feet up and leaning back.

Due to services like Hulu and Netflix, less and less revenue is streaming into the coffers of Comcast and Time Warner. They might be providing the Internet access, but that really is all that they are doing. They see absolutely no additional income from what the consumer actually does with that access.

The simple fact is that people are watching less and less actual TV, and from the perspective of those invested in both the Internet service and television service industries, the only real short-term solution to this problem is to limit the amount of time that customers can spend using online services such as Hulu and Netflix. This is simply due to the fact that under the current model there is far less money to be made providing content over the ‘Net than through the tube.  For a lot of providers, the easiest way of reducing the time spent with a browser instead of a remote is simply to impose bandwidth caps and/or increase the price of pulling in the bits.
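To put those caps in perspective, here is a rough back-of-the-envelope calculation. The figure of roughly 1 GB per hour of standard-definition streaming is my own assumption, not an official number from Hulu or Netflix, but it gives a sense of scale:

  • 250 GB per month ÷ 1 GB per hour ≈ 250 hours, or roughly 8 hours of streaming per day
  • 40 GB per month ÷ 1 GB per hour ≈ 40 hours, or roughly 1.3 hours of streaming per day

In other words, Comcast’s cap only bites the very heaviest of streamers, while a cap at the low end of Time Warner’s proposed range would pinch anyone who treats Hulu or Netflix as their primary television replacement.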

Of course, Comcast and Time Warner would be the last to admit that they are behind the times and that their coveted revenue models are antiquated and may be approaching extinction. The excuse typically peddled by Comcast and its ilk is that there are users — and by their own admission, way less than 1% — who use exorbitant amounts of data on a monthly basis, and that these customers, in doing so, are spoiling the party for the rest of us. Without getting too far off topic, let me just say that I have a very hard time believing that.

If customers can only consume so many bits per month, then, logically speaking, there should come a point at which they will be forced to stop using the Internet to watch their favorite shows. And if they can’t watch their favorite shows on the ‘Net, customers will have to revert back to watching television instead, which is exactly what the cable companies want — and need.

The Relative Cost of Internet Access
http://www.tech-talkers.com/2009/06/the-relative-cost-of-internet-access/
2009-06-24 | Tim Severeijns

This article is the first in a series aimed at discussing the cost of Internet access around the world, as well as the implications of imposing ever stricter bandwidth caps on customers.

Intuitively, one might assume that as technology advances and becomes more readily available, products and services should become ever more abundant and ever cheaper. One would presume that in the most populous state in America, where the boundaries of technology are continually being probed and pushed back, something as basic as an Internet connection would be relatively cheap, certainly no more expensive than a connection somewhere in Europe, right?

Looking at the numbers, however, this is not necessarily the case, and the reasons for this pricing discrepancy are not immediately apparent. In order to understand why the prices are what they are, one needs to understand who the players are, as well as the state of the industry as a whole.

But, before we get too far into the analysis, let’s start out by looking at the numbers.

Internet penetration in the United States is at roughly 45 percent (according to a census taken last year), and with the number of services and devices that use the Internet increasing daily, this percentage is bound to skyrocket in the coming years. That being said, however, the United States is still way ahead in terms of Internet penetration when compared to the rest of the world, and as such, the rate of adoption in the U.S. is not quite as high as it might be in some other, more rapidly developing, parts of the world. In fact, while the percentage of Internet users in the rest of the world is at around 20 percent of a given population, Internet adoption (in the rest of the world) is growing at a rate of about 395% — compared to a growth rate in the United States of about 228%.

Despite the high penetration, however, the cost of service in the United States is rather steep, even when compared to other developed nations. To illustrate my point, let’s take a look at the cost of a high-speed Internet connection in Germany versus the available offers from Comcast, the most popular Internet Service Provider (ISP) in the United States.

With a quick hop across the pond via Google.de, it doesn’t take all that long to find an appealing high-speed Internet offer in Germany. I found a rather attractive (as you’ll soon see) deal from a company called Kabel Deutschland (Cable Germany, for those who care).

For the purposes of this comparison, I’m only interested in signing up for an Internet connection; in other words, I’m not interested in television or phone service, nor do I want any sort of bundle. So, let’s take a look, shall we…

FlatEasy1k

The most basic service available from Kabel Deutschland at the time of writing is their so-called “Flat Easy 1000” plan. Although this basic plan isn’t likely to impress any Internet addicts, it nonetheless offers customers a download speed of up to 1 Mbit/s and an upload speed of up to 128 Kbit/s, in addition to a few more features (like 6 e-mail accounts, a free modem, et cetera). Again, this isn’t the most impressive package, but it’s probably more than enough if the computer isn’t the center of your world. Neglecting one-time installation costs and the like, the monthly cost for this service comes out to €9.90 (about $13.15).

At this point, none of this is all too impressive; it is certainly possible to find a comparable deal in the United States. In fact, AT&T is currently offering a $10 per month plan for customers who have not had a high-speed Internet connection in the past 12 months. Ignoring special offers, however, AT&T’s cheapest offer is $19.95 for a connection featuring 768 Kbit/s down and 384 Kbit/s up (with a 1-year contract); and Verizon offers 1 Mbit/s down, with 384 Kbit/s up, for $17.99 (if you sign up for a 2-year contract). Verizon’s offer is arguably the closest to that of Kabel Deutschland, but at $17.99 versus $13.15, it is still about 37% more expensive than the German offer.

Where things start to get really interesting, however, is when one starts looking at the available offers for what has been coined “hi-speed Internet.” What quickly becomes apparent is that the cost of an Internet connection really skyrockets in the U.S. as speed increases, while in Europe the prices remain far more reasonable. Signing up for Comcast’s hi-speed connection quickly results in a monthly bill of (just) over $40. Now, that might be okay, depending on the speeds involved. Unfortunately, those speeds are not that impressive at all, especially when you start to consider the offers available in other first-world countries.

Although it is possible to attain speeds of more than 6 Mbit/s, the vast majority of Comcast’s offerings revolve around a 6 Mbit/s download speed. More to the point, Comcast is particularly fond of offering customers what they call “PowerBoost”; depending on the overall demand placed on its network, Comcast is able to offer customers access to a full 12 or 16 Mbit/s (depending on the type of service) for short periods of time.

Going back to Kabel Deutschland, the situation is quite different.

FlatComfort

Their top offer is for a package that includes a 32 Mbit/s connection, with a 2 Mbit/s upload, 60 e-mail accounts, and a free wireless router to top it all off. But, wait, there’s more! Smack-dab in the middle of the page, in red text, it reads: “Kabel Internet mit Flatrate ohne Limit” (“cable Internet with a flat rate, without limit”). Now, you don’t need to know a lot of German to know what that means. Unlike Comcast, Charter, and almost every ISP in between, Kabel Deutschland does not place caps on the amount of data that their customers can use. If such a service were to be offered here in the U.S., it would be “competitively” priced at well over $80, to be sure. However, the Germans are offering all this for a mere €22.90 per month for the first year — that’s $29.60! Thereafter, the price increases to €29.90, which translates to about $42.10.

Although it would seem that after a year of service both Comcast and Kabel Deutschland charge about the same, one ought to consider that Kabel Deutschland’s offer is far superior in speed and value. Not only does the German ISP offer a better connection, but they also guarantee that customers can download (and upload) as much as they please, without having to worry about hitting any bandwidth caps — the same cannot be said for Comcast, with their 250 GB cap.
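For the arithmetically inclined, here’s the two-year comparison spelled out, using the dollar figures quoted above:

(12 × $29.60 + 12 × $42.10) ÷ 24 ≈ $35.85 per month

That’s still comfortably under Comcast’s $40-plus, and it buys a connection several times faster, with no cap in sight.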

Now that we’ve seen the numbers, there’s still the lingering question of why this price discrepancy exists, especially when one considers that Comcast, like most ISPs in America, is also imposing a rather unpopular bandwidth cap on its users. The reasons for and implications of these simple facts will be discussed in the next article in this series.

DTV Transition Delayed ‘Til June 12
http://www.tech-talkers.com/2009/02/dtv-transition-delayed-until-june-12/
2009-02-15 | Tim Severeijns

dtv-transition

Barely a week into the new year, several stories broke suggesting that the National Telecommunications and Information Administration had run out of funding to provide any more coupons for the upcoming digital television transition. Responding to this news, the Obama administration began an immediate campaign to delay the transition, fearing that as many as six million households would be unable to meet the transition deadline.

After a new bill was drawn up, it was quickly rammed through the Senate. Yet, in a surprising turn of events, the bill failed to make it through the House of Representatives. Undaunted by this initial defeat, however, the bill’s supporters resubmitted it, and on the second pass it was voted into law — 264 for, versus 158 against.

So, now that the DTV transition date has been officially moved back, to June 12th, 2009, what does that mean for the average consumer?

Well, for one, there’s going to be an awful lot of seriously confused consumers. For months now, the airwaves have been saturated with public service announcements urging consumers to get ready for the transition to DTV. When the first announcements were made that such a transition was in the works, there was a tremendous amount of confusion regarding the exact purpose of the transition, how it would be implemented, and whether or not services would be lost.

The first batch of public service announcements weren’t very helpful; only those who had read about the issue in the paper, or on the Internet, knew what was going to happen, and how they might be affected, if at all. However, as the advertising campaign continued, a little more effort was put into explaining, in layman’s terms of course, what was supposed to occur and how it would affect those with “rabbit ear” antennas.

At least one thing about all those commercials was consistent: February 17th, 2009.

That was supposed to be the date that all analog over-the-air broadcasts had to cease or be converted over to digital. That date, February 17th, 2009, had been on the books for well over a year, and as we entered the new year, it seemed as if everyone at least knew that something important was to happen on that day.

Not wanting to upset any of its constituents, the government quickly put a financial aid program into place:

Between January 1, 2008, and July 31, 2009, all U.S. households will be eligible to request up to two coupons, worth $40 each, to be used toward the purchase of up to two digital-to-analog converter boxes. The coupons may only be used for eligible converter boxes sold at participating consumer electronics retailers, and the coupons must be used at the time of purchase. (Please note that these coupons will expire 90 days after mailing.) Manufacturers estimate that digital-to-analog converter boxes will sell from $40 to $70 each. This is a one-time cost.

The point of these converters would be to take the newly mandated digital broadcast and convert it back to an analog signal so that it can be processed by older generation televisions that lack a digital tuner.

Those in favor of the coupon program argued that every household in America should be able to receive at least some sort of television signal, so that, in times of emergency, everyone can be kept informed and safe.

On the other side of the fence, the oft-heard argument against the proposed delay was that the government ought to get this transition over with as soon as possible for the benefit of everyone involved: Congress has an economic crisis to deal with; broadcasters are tired of wasting precious revenue-generating airtime to keep the public informed; retailers can’t wait to clear the converter boxes off the shelves in order to start selling other electronics with higher profit margins; broadcasting stations already have plans and equipment in place, and just like everyone else in the industry they, too, are ready to just get this whole transition over and done with, once and for all.

Well, maybe it’s just me, but in my opinion, the government should have had far less of an involvement in this entire process than it has had; the industry could have taken care of this all on its own.

Not only is the delay going to cost the industry a lot of money, it’s also going to cost the taxpayers, who will inevitably have to foot the bill for this ridiculous coupon program. The first coupons were made available at the start of 2008 – that’s more than a year of advance notice for consumers to save up the $40 to $70 for a one-time investment in a converter box, and that’s if they even need one in the first place.

If it’s really so much of a problem that several million households might be without TV for the few days it’ll take to figure out what’s going on, then the government should have told those people to just turn on their damn radios! Remember those? That little box, with the shiny antenna, that makes noise. Putting that message out would have been far easier, and cost way less, than trying to explain where people can go to sign up for a government-issued coupon.

Oh, and speaking of coupons, if those six million households can’t afford a radio, give them a $10 coupon to go get one, not a $40 coupon to get a converter box. So, assuming that there are about 100,000,000 households in the United States, and that about 6% of them need a coupon, issuing a radio coupon instead of a converter-box coupon would save the taxpayers about (100,000,000)(0.06)(40-10), or $180,000,000 – and that’s a conservative estimate! Saving money is so hard, isn’t it?

Converting FLAC to 320kbps MP3 with Foobar
http://www.tech-talkers.com/2009/01/converting-flac-to-320kbps-mp3-with-foobar/
2009-02-01 | Tim Severeijns

foobar2000

Lossless codecs, like FLAC, are great when it comes to preserving all the quality present on the original media. Unfortunately though, they do take up quite a bit of space and most lossless codecs are far from popular in the mainstream. Therefore, it often becomes necessary to sacrifice some of the quality in order to recoup some disk space and increase compatibility — because the day that the iPod supports open source codecs, is also likely to be the day that hell freezes over…

So, you’re stuck with hordes of FLAC files and you want to convert them all into high-quality MP3’s, and, of course, you’d like to do all this without paying a penny, correct? Is that even possible? After all, the Fraunhofer Society currently owns the rights to the MP3 codec, and they’re not shy about cashing in on it.

Fortunately, there is a solution, and both of the tools needed for this operation are available for free!

Ever since I gave up on iTunes and Winamp, I’ve been using a very lightweight and very elegant music player known as Foobar2000, or just Foobar for short. Besides the wonderful job it does playing back music with an absolute minimum of fuss (unlike iTunes), it also features an extensive array of useful plug-ins, the most fundamental of which come pre-installed. One of these built-in tools is the codec converter, which is what we’ll be using to solve our little dilemma.

In order to encode MP3 files, Foobar uses the LAME encoder. For reasons I don’t fully understand, the guys behind this codec have somehow circumvented the need to license the relevant technology from the Fraunhofer Society, and can therefore offer MP3 support for free — sounds like that might be an interesting article all on its own, but I digress.

Anyway, without further ado, here’s the interesting bit:

Once you have Foobar installed and fired up, click on the “Library” menu, and select “Configure” — alternatively, you can just hit Ctrl-P, which will get you to the same place. On the left side of the “Preferences” window that just opened up, expand the entry titled “Tools,” and click on “Converter.”

foobar-pref

From here, look to the right side of this window, and you should see a button labeled “Add New.” Click on that.

You should now see the window pictured below. From the “Encoder” list, select “Custom.” The first text box under the drop-down that you just used should now read “lame.exe” –  again, just like in the picture.

foobar-pref-ii

The settings are almost perfect right from the start. There is, however, one problem. We would like to convert our pristine FLAC files to the highest quality offered for MP3’s, but unfortunately, that’s not what the settings reflect at the moment. In order to rectify this, all that we need to do is replace the text in the “Parameters” box with the following:

-S --noreplaygain -b 320 - %d

From a technical standpoint, that’s all you need to change; your files will now be properly converted to 320kbps MP3’s. The description of this conversion preset, however, won’t reflect the fact that we’ve forced the LAME encoder to use a constant bit rate of 320kbps. But, this is easily fixed as well. At the very bottom of the window, simply update the “Bitrate” field to 320, and set the “Settings” field to “CBR,” short for “Constant Bit Rate.”
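In case you’re curious what Foobar is actually doing with that parameter string: it’s roughly equivalent to invoking LAME by hand from the command line. Here’s a sketch of the standalone equivalent (the file names are placeholders of my own; in Foobar’s case, the decoded audio is piped in via the “-” and the destination path is substituted for the “%d”):

lame.exe -S --noreplaygain -b 320 input.wav output.mp3

The -S flag silences LAME’s progress output, --noreplaygain skips the automatic ReplayGain analysis, and -b 320 forces a constant bit rate of 320kbps.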

That’s it, you’re now ready to convert your FLAC files.

Now, in case you’re new to Foobar, once you’re ready to actually convert a file, all you need to do is add the FLAC files in question to any playlist, select the songs, right-click, and hit “Convert.” The rest is really straightforward…

Oh, almost forgot: The first time that you convert something to MP3, Foobar will ask you where it can find the LAME executable (i.e., the file called lame.exe). If you don’t already have it, you can download it from www.free-codecs.com. Once you’ve downloaded it, simply extract it to any location of your choosing — perhaps to your music folder, so you know where it is for future reference. After the extraction, head back over to Foobar, and when prompted, simply select the executable from the folder that you extracted the .zip file into.

Enjoy!

How to Safely Test-Drive the Windows 7 Beta
http://www.tech-talkers.com/2009/01/how-to-safely-test-drive-the-windows-7-beta/
2009-01-11 | Tim Severeijns

Despite the troubled roll-out, the beta to Microsoft’s next operating system, Windows 7, is finally available. But, unless you are either a die-hard Microsoft fan (possible, though unlikely), or a true geek, you’re probably thinking to yourself, “So what?”

Well, if that’s your first reaction, then I really can’t blame you. Traditionally, trying out beta releases of operating systems has been a real hassle. Either you’re so eager and willing to try out the very latest that you can’t wait to repartition and reformat a drive in order to install the new OS, or you just can’t be bothered.

Wouldn’t it be great if you could just boot your current operating system – be it XP, Vista, or some Unix variant – pop open a window and run Windows 7 in that?

Not only would this be a great alternative to a traditional installation, but it’s also pretty easy to set up!

First off, a few terms. What we’re trying to achieve here is known as “full virtualization,” meaning that we’ll be virtualizing not just a single application or service, but an entire operating system. From now on the operating system that boots up when you turn on the machine will be called the “host,” and the operating system that we’re going to run virtually on that host will be known as the “client.” So, in other words, the client will be running within the host.

A critical aspect to virtualization is the fact that the client will think that it’s the one and only operating system running on the machine; it will, by design, be completely oblivious to the host. From a practical standpoint, this means that the beta to Windows 7 should behave exactly as it would if you had installed it on a fresh machine.

Unfortunately, the complete isolation achieved by virtualization also has a few downsides. First off, Windows 7 won’t have direct access to any of the machine’s hardware; for any action that it wishes to perform, it will have to go through a middleman, the host. In short, this means that your computing experience in Windows 7 won’t be quite as spiffy as it might otherwise be on the host. Additionally, Windows 7 won’t have any access to OpenGL or Direct3D implementations on the hardware level. In layman’s terms, this means that the graphics will, for lack of a better term, suck. You’ll have to forgo the Aero eye-candy, and forget about playing solitaire at a decent framerate — I’m serious.

With all the warnings out of the way, let’s get started.

There are several applications that will provide us with the capabilities that we need, but we’ll go with Sun Microsystems’ VirtualBox, since it’s free. You will find the appropriate download link here.

Besides VirtualBox, you will also need the ISO image of Windows 7 so that you can actually install it. Microsoft already has an entire page dedicated to Windows 7, located here. At the top of the page, you should be able to see a link that reads “Download the Windows 7 Beta.” Clicking on this link should start you on your way towards getting the beta, as well as the key that you’ll need if you don’t want Windows 7 to expire after 30 days. Since we’ll be using VirtualBox, make sure to get the 32-bit version of Windows 7; 64-bit clients aren’t completely supported yet in VirtualBox, but it’s coming soon, I hear.

Once you’ve downloaded and installed VirtualBox, go ahead and run it.

After the application starts, click on the blue star icon labeled “New”; this will start the “Create New Virtual Machine Wizard.” Sun has done a great job with VirtualBox, and as such, the Wizard is pretty straightforward and easy to use. Nevertheless, I’ll share some of the more important details to get this all to work.

The second page of this Wizard will ask for the VM name and the OS type. The name can be whatever you like, so “Windows 7 x86,” for instance. The OS type, however, is more important. The operating system is obviously “Microsoft Windows,” so leave that as it is. As for the version, select “Windows 2008” – not “Windows 2008 (64 bit).”

The third page is where you choose the amount of RAM that the client will be able to access. The general rule of thumb is that you want to allocate about 40% of the total physical memory in your system. Too little, and the client won’t run properly, just like in real life; too much, and the host won’t be all too pleased. I would strongly recommend at least one gigabyte, the minimum recommended by Microsoft. On to the next page!
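Actually, one quick aside before we move on, just to make that rule of thumb concrete: on a laptop with 2 GB of physical RAM, 40% works out to about 800 MB, which falls short of Microsoft’s recommended minimum of 1 GB; in that case, I’d round up to the full gigabyte and simply close other programs on the host while the client is running. (The numbers here are purely illustrative; adjust them to your own machine.)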

On the fourth page, you’ll be asked to select, and if necessary, create a virtual hard disk to install the client onto. Go ahead, and hit the “New…” button. This should bring up another Wizard, the “Create New Virtual Disk” wizard. I left all the settings in this Wizard at their default values, and that seems to be working just fine for me. These would be: dynamically expanding storage, the default virtual drive file location, 20.00 GB in size, and that’s it. Once you’re done with this new Wizard, you should end up back at the first Wizard. Just hit “Next.”

You should be seeing a summary page, so if everything is satisfactory, click “Finish.”
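Incidentally, if you prefer the command line, VirtualBox also ships with a utility called VBoxManage that can do everything the Wizard just did. Below is a rough sketch of the equivalent setup; the VM name and sizes mirror the choices made above, but the exact flags vary between VirtualBox versions (and a few steps, like attaching the disk and the DVD image, are omitted here), so treat this as a guide rather than gospel:

VBoxManage createvm --name "Windows 7 x86" --register
VBoxManage modifyvm "Windows 7 x86" --ostype Windows2008 --memory 1024
VBoxManage createhd --filename "Windows 7 x86.vdi" --size 20000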

Going back to the main “Sun xVM VirtualBox” window, you should now see your newly created virtual machine on the left-most side of the window. Virtually speaking, we have just assembled your brand-new machine from all the various parts, and we are now ready to boot it up for the first time.

Double click on the virtual machine to start it.

You should now see a new window, and a new Wizard, the “First Run Wizard.” Click “Next.”

The second page will ask you to select the installation media; in our case, that would be the ISO image of Windows 7 that we just downloaded. So select the “Image File” bullet, and click on the browse icon. This will open the “Virtual Media Manager.” From this new window, click on the “Add” icon and browse to the 2.44 GB (if it’s the 32-bit version) ISO image that you just downloaded. Once you’ve selected that image, you should end up back in the “First Run Wizard.” Click “Next” once more.

You should now be seeing another summary page. If all the listed information is correct, click “Finish.” To go back to the earlier analogy: you’ve just booted your machine for the first time, the Windows installation disc is in the drive, and you’ll now be asked to install Windows.

After Windows 7 installs itself, you can safely shut down the client, just as you would with any “real” computer, and the next time you start her up, she’ll behave exactly as you would expect. Any changes you make will be saved for the next session. So, if you want to install Firefox or OpenOffice in Windows 7, you can, and the next time you boot into Windows 7, everything will be exactly as you left it.

Enjoy, and remember, it’s a beta…

Far Cry 2 and the annoyances of SecuRom
http://www.tech-talkers.com/2008/10/far-cry-2-and-the-annoyances-of-securom/
2008-10-30 | Tim Severeijns

Not thirty minutes ago, the UPS truck came to a screeching halt outside my apartment to deliver the package that I’ve been eagerly awaiting the entire week. Trying hard to contain my enthusiasm, I carefully opened the DVD-sized box that Amazon shipped it in, and there it was: Far Cry 2, Ubisoft’s latest and greatest — or so they claim…

The reviews have been pretty positive so far, with Metacritic giving it a very respectable aggregated score of 88%, but consumers seem to have a different opinion. One need only glance at the user reviews on Amazon.com to know why: SecuRom, a DRM scheme that can only be described as malware.

A sizable number of gamers seem to have gotten it into their heads that if they do nothing but give DRM-laden games one- or two-star reviews on Amazon, the developers might get a clue and stop hassling honest users with DRM. I suppose that this approach might yield some results, but I’m not too optimistic. As long as these large game studios are run by naive, elderly executives and over-cautious shareholders, I foresee little hope of getting an A-list title on the shelves without the inclusion of some sort of obnoxious scheme to protect the game from piracy.

I firmly believe that the issue is one of utter ignorance and naivety. The inclusion of a DRM scheme, like SecuRom,  in a game is meant to deter and prevent piracy, which is all fine and well; I, like the vast majority of consumers, have absolutely no problem supporting artists and developers, nor do I believe that studios should sit idly by while their hard work is being sold on the black market. I do, however, take issue with the manner in which SecuRom approaches this problem.

The reason I’m writing this post is that I’ve just about had it with SecuRom and its absurd approach to anti-piracy. Having just unwrapped the game, I popped the disc into my DVD drive and fired up the installer. All okay so far; the game, weighing in at a very moderate 3.2 gigs, installed without a hitch. However, as soon as I double-clicked the executable, SecuRom intervened:

Okay, fair enough, I guess this is a reasonable objection, since I do have one application that I suppose might facilitate piracy, namely PowerISO. After all, the application does allow for the mounting of ISO images on virtual drives, which would be a crucial step in installing pirated content. So, fine, this is the first time in a while that I’ve had DRM bitch at me, so I was willing to play along. After kicking PowerISO off my computer and rebooting my rig, I tried again: but no, I got the exact same error!

A few minutes with Google revealed that I’m not the only one experiencing trouble, and that the issue is actually fairly widespread. In fact, the guys responsible for this poorly implemented mess have already acknowledged their mistake and posted a fix, which is nothing more than a modified executable.

At this point, though, I don’t know what I find more aggravating: the fact that studios have absolutely no trust in their own customers, or that they apparently have no issue with shipping defective merchandise! The game has been out for barely a week, and already a patch is needed to even get the damn thing running — this is beyond absurd! Where was Quality Assurance on this one?

Update: So far I’ve only logged a few hours with the game, and I’ve had no further issues with SecuRom, but, then again, I haven’t actually reinstalled PowerISO yet…

Play-Testing the Battlefield: Bad Company Beta
http://www.tech-talkers.com/2008/04/play-testing-the-battlefield-bad-company-beta/
2008-04-09 | Tim Severeijns

Battlefield Bad Company

Anyone who’s even remotely into first-person shooters on the PC has probably heard of, or come across, the Battlefield series from Electronic Arts. The first one was released way back in September of 2002, and its captivating online multiplayer scheme enthralled thousands. So, it wasn’t much of a surprise when EA followed up on the success of 1942 with its next installment, Battlefield Vietnam. Although it didn’t fare quite as well with the critics as its predecessor did, it wasn’t a failure by any stretch of the imagination.

The Battlefield series had already established itself as a franchise capable of delivering, but the true show of EA’s potential came in June of 2005, with Battlefield 2, the first true sequel. Instead of transporting the player back into yet another war of the past, the game was modernized into a fictional, present-day conflict between the Chinese, an undisclosed Middle Eastern nation, and of course the good ol’ US of A. Once more, the game was a stunning success.

But then, perhaps due to their success-induced high, Electronic Arts let down a sizable portion of its fan base with their next effort, Battlefield 2142. Jumping from the varied theaters of the Second World War, into the claustrophobic jungles of Vietnam, and then to the present was a progression that seemed logical and apt to many. The series started out with the most significant military conflict of the last century, and then it took players in a logical progression through all the major engagements that the United States’ military has seen.

The main reason that 2142 never really caught on was probably that players suddenly found themselves more than 100 years in the future, in a conflict with weapons and vehicles that seemed incredibly out of place in a series that had been focused on realistic military engagements.

Luckily, though, the upcoming entry in the series, Battlefield: Bad Company, is a true return to form — and what a form it is! Reinvigorating a series is one hell of a challenge, and as such, Electronic Arts decided that there is no better way to field-test a game than to actually put it out in the field. The beta, available to those who have access to the beta codes, has been out for several days now, and I must say that the multiplayer component looks really promising.

The last few weeks, I haven’t played much else besides Call of Duty 4, which is another series that started out by pitting players against the Nazis. As such, I’ll probably make quite a few comparisons to that game.

The beta only includes two maps so it’s pretty obvious that the purpose of the beta is to vet the fundamentals, like gameplay, instead of play-testing a series of maps. When it comes to gameplay, though, Bad Company does quite a few things spot on, leaving only a short list of gripes.

The fundamentals of the series are still present, and perhaps stronger than ever. Players are divided into two opposing teams, each tasked with destroying the enemy’s depots while preventing the enemy from doing the same. Once killed, players will continue to respawn into the game until the reinforcement reserves are depleted.

One of the changes that veterans of the series will notice immediately upon joining a game is that players are no longer presented with a map from which to choose a spawn point; you either spawn from a set insertion point, or next to your teammates; your choice. This change might have been made to simplify the game for the console, or it might represent a premeditated departure from the status quo. Either way, the change didn’t have an adverse effect on my experience.

The major new addition to the gameplay this time around is that environments can be completely destroyed; everything from trees and terrain to entire buildings can be demolished. This seemingly tiny change has devastating effects on gameplay. No longer can a single, well-positioned sniper pick off hordes of enemy soldiers while simply hiding behind a door post. Traditionally, a sniper positioned in such a manner would have free rein until some foolhardy soul made a mad dash for the door and engaged the sniper in close-quarters combat. No more, though. Now, you’re free to be as creative as possible with your shots and kills. Let’s see here: you can still go for the direct, zig-zag-as-you-go approach toward the front door; or you can, for example, sneak around to the back of the building, blow out the rear wall with a well-placed grenade, and make a dramatic entry that way.

Unfortunately, the one thing you can’t do in the scenario I just described is take out the poorly supported ceiling above the sniper once the walls are gone, so that the rebar-studded chunks of concrete crush or impale him. The most damage you can do to any one of the buildings is to reduce it to a smoldering cocoon of support walls and ceiling. In other words, if you were hoping to unite every tank, chopper, boat, and Humvee on your team to launch a joint attack aimed at decimating anything and everything in your path, think again.

A destructible environment not only makes the game a lot more fun when you’re the one doing the shooting, but it also produces some truly awesome scenarios. At one point, having already taken out several enemy combatants with an M24 sniper rifle, I was about to fire off another round when I heard an approaching tank. Within seconds this metal beast had rounded the corner of the building ahead of me. As the turret cranked its way slowly toward me, I knew, then and there, that I was fucked. And I was right…but the effect was awesome. That tank may have ended my short spree, but seeing the building that I was in blown to smithereens was definitely worth it.

Even though the beta provided me with some truly awesome moments, the experience wasn’t completely without fault. What follows is a short list of some of the gripes and observations, both positive and negative, from my play-testing. Granted, some of the things listed may not necessarily be issues of real concern, but bear with me:

  • One of the first things I noticed was that whenever I died, the game kicked me back to a menu telling me that I had to wait a certain number of seconds before I could respawn. It wasn’t so much the waiting that bothered me, but rather that this menu took me out of the experience. The menu also allowed me to change my weapon class, along with a few other things, but I would have preferred it if, when I died, my view switched to that of a teammate, with a small countdown somewhere off to the side of the screen, and maybe a message telling me that I’d have to press Y, for example, to change my class. Not only did this menu take me out of the experience visually, it also killed the audio; I heard nothing of the ongoing battle until I was able to rejoin.
  • I loved the way the sniper scopes his rifle. Instead of immediately switching to the scoped view, as is done in so many games nowadays, there’s an actual transition animation where the rifle is brought up to the character’s face. This simple touch really lent itself to the authenticity of the role. Unfortunately, though, my high hopes for the sniper were immediately brought back down again when I noticed that the sniper seemed to have arms (and nerves) of steel, allowing him to eliminate all vibrations from his aim. I realize that this has been the norm with the entire series, but I feel that perhaps packing a little more punch with each round would go nicely with an unsteady hand. On the positive side though, reload animations for the bolt action look pretty neat, if perhaps a little mechanical.
  • Another neat touch is the fact that whenever a player suffers trauma severe enough to lower his or her health to 30% or less, that player’s vision becomes impaired. The concept is really neat, but the execution is slightly lacking. To me it seems a little hackneyed: the player’s vision is simply flooded with a red haze, nothing all too special. It would be cooler if the player’s vision got worse and worse as his or her health deteriorated. One might start out with perfectly clear vision, go to a red hue at around 30%, and then add a slight blur and maybe some camera wobble as the Grim Reaper approaches.
  • Of course, since it is the Battlefield series, there are plenty of vehicles as expected; everything from jeeps and tanks to boats and choppers. I don’t have all too much to mention about the vehicles, since the series has always done this pretty well, but I will mention that I am starting to grow slightly weary of the whole arcade style approach to vehicles. I understand that a console game needs to be accessible to a large audience, but it couldn’t hurt to make the vehicles handle a little more realistically. Maybe I’ve been playing a little too much Crysis as of late, but I want to be able to shoot out tires to bring a car to a potentially spectacular halt — ya know, stuff like that.
  • Maybe it’s just me, but I’m not too fond of the grenade control scheme. I want to be able to pull out a grenade and throw it immediately; I don’t want to have to worry about how long I have to hold down the trigger to be assured that the grenade will go far enough. I want to be able to pull the trigger once, and just have it go. I’m perfectly content to have to aim higher if I want the grenade to go farther. Oh, and another thing: please, please increase the effective radius of a fragmentation grenade. There have been plenty of occasions where I have thrown a grenade to within a yard or two of my target, only to find that after it went off, my target was making me his target.
  • Graphically speaking, the beta isn’t all that spectacular. If you’ve been following the development of this game then you have, like me, probably also seen a good portion of the screenshots that can be found floating around the Web. I’m sad to say, though, that the screenshots don’t quite live up to the hype — but, hey, when isn’t that the case?
A Lil’ Guide to Installing Ubuntu
http://www.tech-talkers.com/2008/03/a-lil-guide-to-installing-ubuntu/
2008-03-28 | Tim Severeijns

If you’ve taken the time to read the last two posts, then surely you must be aware that the time has now come to finally install Ubuntu; and if not, then oh well, just look at the pictures or something.

In my last two posts, I’ve attempted to explain the pains that I went through trying to get the best out of my new laptop. The problem originally started with my realization that I am by no means a fan of Vista. Postpartum depression urged me to seek out XP’s familiar settings. Much to my dismay, though, XP would not install on my new laptop, so that was the first hurdle I had to surmount.

Not only was I interested in a return to XP, I was also curious about Linux, particularly Ubuntu. So the obvious choice, or at least a logical one in my mind, was to dual-boot. In the previous two articles, I explained how to install XP on a new machine intended for Vista, as well as how to prepare a system for dual-boot operation. Okay, so now that we’re all up to speed once again, let’s take the plunge once more, shall we…

If this first step doesn’t seem blatantly obvious, then maybe Linux isn’t for you, but in order to do anything further you’ll need the Ubuntu installation disc. So, head on over to Ubuntu.com and download the latest version — 7.10 at the time of writing. Once armed with the image file, you’ll need to burn it to a disc. If you already own it, I would recommend using Nero, but if not, InfraRecorder is an excellent, not to mention free, alternative.

Once the disc has been burned and finalized, pop it into the machine you wish to persuade into conversion — and cross your fingers, for the moment of truth is at hand. Linux is often labeled as being far more stable than Windows, which is true for the most part, but the main problem is getting it to run in the first place. Not all hardware likes the flavor of Linux. If you’re unlucky, you may have in front of you a machine cursed with unsupported hardware. If this is the case, you may experience issues ranging from the benign, such as a nonfunctional webcam, to real show-stoppers, like incompatible graphics cards or display devices. Curing any of these issues may require long and inquisitive searches of some of the less frequented fringes of the net. Fortunately, though, there always seems to be a plethora of willing and able Linux hobbyists to help you out.

Oh, and if the CD doesn’t boot right away, make sure that your BIOS is set to boot from the CD-ROM drive…

Okay, so assuming that the CD boots properly, you should soon see the Ubuntu desktop — but, hey, wait a minute! What happened? Nothing was installed, or was it?

The CD you just popped into your machine is what is known as a Live-CD, meaning that it contains a bootable version of the operating system. Allowing the user to experience the full operating system without having to install a single file is a great way of demonstrating the potential of a product that many might otherwise be too hesitant to install.

“Okay,” you say, “but I want to install it, not just demo it.” Well, see that little icon on the desktop, the one that says “Install”? That’s where you wanna be if you’re interested in giving Ubuntu a serious test-drive.

Starting up the Installer, you should see a dialog box that will guide you through the installation process in seven rather easy steps. The first three steps are real no-brainers: simply select your preferred language, the appropriate time zone, and your keyboard layout.

Step four is where things start to get a little more exciting. Depending on how many drives and partitions your machine has, this step may be more or less complicated. Since I’m writing this as part of a series in which I explain how I managed to dual-boot XP and Ubuntu on a Vista-shipped laptop, the path I chose to follow in step four of the Ubuntu installer is quite specific to my particular needs. In the previous article, I explained how GParted was used to partition my drive into a Windows partition and a Linux partition accompanied by a SWAP partition. If you followed the same procedure as I did, and used GParted to prepare the drive, then all you’ll have to do is select the correct partition and move on. Since we already used GParted for most of the heavy labor, it makes little sense to reformat or resize the partition again — unless, of course, you made a mistake the first time around. Not to sound too repetitive, but step four is where you really want to pay attention. It’s very important that you select the correct drive to install Ubuntu onto; otherwise, you’ll be in serious danger of losing existing, potentially important, data.

Looking at the screenshot below, we see that the system I used to grab the screenshots has two physical hard drives: HDA and HDB. We can also tell that the first drive, HDA, has three partitions on it: HDA1, HDA2, and HDA3. The one we’re after is the one that is formatted as EXT3, so in this case that would be HDA2. To double check that this is the right drive, take a look at the size of the partition. If this number corresponds to what you specified in GParted then you’re good to go. Note that there is no need to check any of the boxes unless you want to reformat the drive again, which really can’t hurt. So, once again, if you want to reformat, check the box, otherwise just highlight the drive. And in case you are wondering, SDA1 is a removable thumb drive.

If you have decided to reformat any of the drives again, make sure to also hit the “Edit partition” button. This should bring up another dialog box, asking you to specify the new size of the partition as expressed in megabytes, what format to use, and where to set the mount point. If the drive that you are partitioning is meant to house an installation of Ubuntu, it’s critical that you use the EXT3 format and set the mounting point as a single forward slash. If you’re confused, refer to the picture below.
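As an aside, for those curious what that mount point actually means: once Ubuntu is up and running, the mapping between partitions and mount points lives in /etc/fstab. Assuming the layout from the screenshots above, with HDA2 as the EXT3 root partition and HDA3 as the swap partition (your device names may well differ), the relevant entries would look roughly like this:

/dev/hda2   /      ext3   defaults,errors=remount-ro   0   1
/dev/hda3   none   swap   sw                           0   0

The installer generates this file for you; I only mention it because knowing where the mapping ends up makes the whole mount-point business a lot less mysterious.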

Once you’re done with the “Edit Partition” dialog box, hit “OK,” and then click “Forward” once you’re back at larger dialog box.

Step five is particularly useful for those of us who have decided to dual boot, since Ubuntu will scan existing partitions for any other operating system from which it might be able to port over any important documents and folders. I always prefer to start fresh, and since Gutsy Gibbon is capable of reading Windows partitions, I will leave this step up to you. If you’re not interested, just proceed to the next page.

Even though there appear to be 7 steps, number 6 is really the last one that requires any work. The purpose of this step is to help you set up a user account, and it’s pretty straightforward. As such, the only notes to make here are that your login name cannot be capitalized and that…uhm, oh yeah, don’t forget your password.

Series: Dual-Booting: XP and Ubuntu

Part I | Part II | Part III

Partitioning with GParted
http://www.tech-talkers.com/2008/02/partitioning-with-gparted/
2008-02-10 | Tim Severeijns

After I got my new laptop, it didn’t take me all too long to determine that I wasn’t all too fond of Vista. Instead of hastily ditching Vista and running right back into XP’s familiar settings, I thought I’d try out Linux. Having heard myriad glowing reports of its user-friendliness, Ubuntu seemed like the obvious choice. However, if I dedicated my entire system to Ubuntu, there’d always be the problem of gaming; Linux isn’t exactly a hotbed for interactive entertainment. So, why not dual-boot?

There are several approaches to dual-booting a system with XP and Linux: you can nuke the drive, partition it, and then reinstall both operating systems; install Linux first, then partition and install XP; or install Windows first and then get Linux working afterwards. In this article, I’m going to take the last approach, but no matter which way you choose to go, an essential tool to have in your arsenal is GParted.

GParted is a nondestructive UNIX-based partition editing application used to create, destroy, resize and move entire partitions and their file systems. The application is widely available on many Linux systems, but more importantly, it also comes as a live-cd. What that means is that you can download the application image, burn it to a disc and boot from it, without having to touch the rest of your system.

For the purposes of dual-booting, I’ll describe how to use GParted to shrink an existing Windows partition, so that the freed up space can then be reformatted as EXT3 and SWAP partitions to host a fresh installation of Ubuntu’s Gutsy Gibbon:

  • First things first: since you’re going to be making critical changes to your hard drive, make sure that the data on that drive is properly backed up and secure. Even though GParted is designed to be nondestructive, you never know what might happen; the wrong button might get pressed, or the power might fail — so back up first!
  • Okay, now that we’ve gotten that out of the way, let’s start. In order to do anything you’ll need the program, so go get the GParted Live-CD image first and burn it to a disc. After the burning process finishes, pop the CD back in and reboot. Note: if GParted doesn’t load, then you’ll have to amend your machine’s boot order in the BIOS first.
  • When booting from the Live-CD, you’ll be asked to input some system information, such as hardware configuration, language and keymap – don’t worry, it’s easy. A few seconds into the boot process, you’ll see a list of possible boot configurations. In my experience, the first option, the one that reads “Gparted-liveCD 0.3.4-11 (auto-configuration),” works just fine, so go ahead and select that. After a few more loading cycles, you’ll be asked to select the appropriate keymap and, after that, your language; the numbers you should be entering are 41 and 33, respectively. This should take care of the setup; now just wait for the distro to finish booting.
  • Now that everything is loaded, you'll want to start by shrinking the existing Windows partition. To do this, select the partition, either from the list or from the graphical representation, and then hit the button labeled “Resize/Move.” This will bring up a smaller window from which you can choose by how much to shrink the existing partition. Notice that there are three input fields in this window; enter the desired new size of the partition in the second field, or just drag the arrows at either end of the graphic. Since GParted aims to keep your existing data intact, the minimum size an existing partition can be shrunk to is the amount of data already on it. The other two fields describe how much of the new free space will appear before the resized partition and how much will appear after it. The layout of the partitions isn't crucial to getting both operating systems running, but a sensible arrangement keeps things tidy. Once you're comfortable with the changes, hit “Resize/Move”; don't worry, the changes aren't permanent yet.
  • Once you're back at the main interface, you'll notice that the existing partition is smaller and that it is now bordered on one or both sides by unallocated, dark gray space. To create any further partitions, which you'll need to do if you want to install Linux, simply select that dark matter and click “New.” Creating a new partition is just as simple as shrinking an existing one; the only difference is that you'll also have to select the filesystem type. Since the new partition will host a Linux install, select EXT3. When specifying the size of the new partition, make sure you leave about 512MB to 2GB for a Linux swap partition; think of swap as an extension of your RAM, but on the hard drive (there's a small sizing sketch after this list).
  • To create the swap partition, repeat the previous step, but choose LINUX-SWAP as the filesystem type instead of EXT3. With the size set, confirm the dialog and return to the main interface.
  • Up to this point, GParted hasn't touched your hard drive; everything so far has just been planning. So, once you're confident that you've set everything up correctly, press the “Apply” button to actually commit the changes. Hitting “Apply” will bring up a cautionary message telling you that you'd better be sure about what you're doing. Once you dismiss that message, GParted will get to work and the operations you specified will commence. Depending on the size of your hard drive, the amount of data on it, and how many new partitions you're creating, GParted may be at work for a couple of minutes or a couple of hours.
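To make the sizing arithmetic from the last few steps concrete, here's a minimal Python sketch of the layout planning that GParted's dialogs walk you through. Every figure in it is a hypothetical example, not a recommendation:

    # Hypothetical layout math for a dual-boot disk; all figures are examples.
    # GParted does this arithmetic for you in its Resize/Move and New dialogs.
    MIB_PER_GIB = 1024

    disk         = 160 * MIB_PER_GIB  # total disk size, in MiB
    windows_used = 40 * MIB_PER_GIB   # data already on the NTFS partition
    windows_new  = 70 * MIB_PER_GIB   # target size for the shrunken NTFS partition
    swap         = 2 * MIB_PER_GIB    # the article suggests 512MB to 2GB of swap

    # GParted won't let you shrink a partition below the data already on it.
    assert windows_new >= windows_used, "can't shrink below the used space"

    ext3 = disk - windows_new - swap  # whatever is left over goes to Ubuntu
    print("NTFS (Windows): %d MiB" % windows_new)
    print("EXT3 (Ubuntu):  %d MiB" % ext3)
    print("Swap:           %d MiB" % swap)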

Once it's done, GParted will kick you back to its main interface window, from where you can go to the main menu and quit. If everything went according to plan, you should now have at least three partitions: one formatted as NTFS, another as EXT3, and a third to serve as SWAP. At this point you're free to install whatever Linux distro you fancy.
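Incidentally, if you want to double-check the kernel's view of the new layout before rebooting out of the live environment, and if the Live CD happens to ship with Python (an assumption on my part; a plain `cat /proc/partitions` in a terminal does the same job), a one-liner will do it:

    # Minimal sketch: print the kernel's partition table on a Linux system.
    with open("/proc/partitions") as f:
        print(f.read())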

However, if you're new to Linux, I would highly recommend trying Ubuntu, which is very user friendly, stable, and extremely customizable. For help on installing Ubuntu, be sure to check back in a couple of days for the last installment of my dual-booting series, where you'll find detailed instructions on what to do and what not to do…

Series: Dual-Booting: XP and Ubuntu

Part I | Part II | Part III

How to Install XP on Vista Laptops http://www.tech-talkers.com/2007/12/how-to-install-xp-on-vista-laptops/ 2007-12-29T04:53:16Z Tim Severeijns If you've recently purchased a new laptop, chances are that it came with Windows Vista installed on it. Now, depending on how easily you adapt, that might be a bit of a problem. Even though Vista has been out for about a year now, many users are still ...

If you've recently purchased a new laptop, chances are that it came with Windows Vista installed on it. Now, depending on how easily you adapt, that might be a bit of a problem. Even though Vista has been out for about a year now, many users are still reporting a hard time with the new operating system. Some consumers just can't get all of their peripherals to work properly, others are experiencing horrendous performance issues, while still others simply hate the changes Microsoft made.

One solution to this dilemma would be to take Apple’s advice and just “upgrade” back to XP. If you’ve already gone ahead and tried this approach, though, you may have noticed that this causes another problem — quite a serious one actually. XP refuses to install!

So what's the problem, you ask? Well, let's think about this chronologically, shall we? Windows XP is old. No wait, let me rephrase that: XP is ancient! This means the software and drivers included with the installation package are just as old; six years old, to be precise. Back in 2001, when XP debuted, Microsoft expected motherboards to interface with hard drives via an IDE cable. But, six years down the road, PC and laptop manufacturers have all dumped IDE in favor of SATA, which is faster, allows for hot swapping of drives, and ensures better data integrity and reliability. This all sounds wonderful, until you consider that Microsoft never included any SATA drivers with its XP installers. So now what?

Fortunately, there is a fairly easy way to rectify this problem. The necessary SATA drivers aren’t included with the standard installation disc, so we’ll have to add, or slipstream, them using a neat little tool called nLite:

  • First off, we’ll need to download and install nLite, a freeware application developed and hosted by Dino Nuhagic. The latest version can be found at www.nliteos.com. I got everything up and running with version 1.4.
  • Next, we’ll need to locate and download the appropriate SATA driver. Since it took me quite a bit of time and effort to locate the necessary drivers, I’ll do my part to simplify the task by hosting the driver I used here.
  • The only other thing that we need now is an XP installation disc – hopefully a legal copy – into which we’ll slipstream the SATA driver.

Okay, now that we have the right software and the correct driver, let's get started. Since we're going to be adding a component to the standard Windows installation files, we'll have to extract the files from the CD. While doing so, it's important to keep the file structure intact. By far the easiest way to get all the right files onto your system is to insert the CD, head over to My Computer, right-click the CD directory, hit copy, and then paste it to the desktop.
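If you'd rather script that copy than drag folders around, here's a minimal sketch; the drive letter and destination folder are assumptions, so adjust them for your machine:

    # Minimal sketch: mirror the XP installation CD to a folder on the desktop,
    # keeping the file structure intact. The drive letter and destination path
    # are assumptions; adjust them for your system.
    import os
    import shutil

    cd_drive = "D:\\"                                     # hypothetical CD drive
    destination = os.path.expanduser(r"~\Desktop\XP_CD")  # hypothetical target

    shutil.copytree(cd_drive, destination)
    print("Copied installation files to " + destination)

Once that is done, you'll be ready to get started with nLite: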

  • Once nLite is launched, go ahead and skip the first screen, which only contains information about the program.
  • The second screen will ask you to locate the Windows installation, which will be on the desktop if you followed the instructions above verbatim. Once you’ve located the correct directory, hit OK in the selection window and wait for nLite to analyze the directory. If you’ve identified the correct directory, you should be able to see the Windows version, what service pack you have, the version number and installer size. If you don’t see this information, chances are that you’ve identified the wrong directory.

[Screenshot: nLite-1-Large]

  • The third page isn't really that important, so skip ahead to the Task Selection page; that's where you'll need to start paying attention, since it's where you tell nLite what to do with the Windows installation files. Depending on which version of the Windows XP installation disc you have, you might want to alter a few of the settings. For the purposes of this article, however, I'm going to assume that we're dealing with a standard Microsoft installation disc that already includes Service Pack 2, and that we're not interested in slipstreaming any other components, hotfixes, addons or tweaks. So, with all that in mind, go ahead and check the third box down, labeled “Drivers,” as well as the very last one, labeled “Bootable ISO.” Click Next when done.

[Screenshot: nLite-2-Large]

  • We should now be at the Drivers page; this is where you point nLite at the driver it should slip into the installation. In the bottom right-hand corner of the window, right above “Next,” you should see a button labeled “Insert.” Click it, and then select “Multiple Driver Folder.” Doing so brings up yet another window, in which you'll have to locate the folder containing the SATA driver you downloaded earlier. Clicking OK in this window, as well as in the next, should bring up a list of possible SATA drivers. Now, if you know exactly what hardware you have, go ahead and select the appropriate driver. If you don't know exactly what you need, you might want to try the fourth one from the bottom; that's the one that worked for my HP DV6500T. Clicking OK again should kick you back to the Drivers page; go ahead and click Next.

[Screenshot: nLite-5-Large]

  • This next page is really straightforward. If you feel that you've set everything up correctly, hit OK and watch nLite get to work. Okay, well, maybe there isn't that much to see…
  • After nLite finishes working its magic, proceed to the next page, where you'll have to complete one last task. Now that nLite has analyzed and modified your XP installation files, it wants to know what to do with the newly created files. Your two best options are either to burn a new XP installation disc directly from within nLite, or to have the program create an ISO image that you can then do with as you please. In my case, I went ahead and created an image (just in case I happen to lose the CD at some point), which I then burned to a disc with Alcohol 120% (plenty of other applications, such as Nero, will do the exact same thing).

[Screenshot: nLite-7-Large]

If you successfully followed the steps outlined above, you should now have a brand new installation disc for Windows XP. The only remaining step is to insert the disc into your laptop and reboot. If everything went according to plan, your machine will recognize the disc and start the installation process, which should proceed exactly as it otherwise would. If you have any remaining questions, please feel free to post a comment down below, and I'll try my best to assist.
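One small habit that has served me well, since I'm keeping the ISO around as a backup: record a checksum of the image, so a future copy or burn can be verified against the original. A minimal sketch, with a hypothetical filename:

    # Minimal sketch: compute a SHA-256 checksum of the slipstreamed ISO.
    # The filename is a placeholder; point it at your own image.
    import hashlib

    def sha256_of(path, chunk_size=1 << 20):
        """Hash the file in 1 MiB chunks so a large ISO needn't fit in RAM."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    print(sha256_of("xp_sp2_slipstreamed.iso"))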

Note: This slipstreaming process will only work if you want to install Windows 2000, Windows XP, or Windows Server 2003.

Series: Dual-Booting: XP and Ubuntu

Part I | Part II | Part III
