Early adopters have long known the perils of jumping on new systems before the bugs and kinks have been worked out. Luckily, most of these problems are relatively minor and can be fixed with software updates, but every now and then a whopper comes along that shakes up the whole industry.
This is what happened recently with Intel and the launch of its new CPU, codenamed Sandy Bridge. The chip made waves when it was first announced by promising consumers a way to escape the budget squeeze of buying a dedicated graphics card. It did this by combining the graphics processing unit (GPU) and the central processing unit (CPU) on a single chip. Integrated graphics are nothing new, but Sandy Bridge took the idea a step further by pushing graphics performance toward that of mid-range dedicated cards, potentially saving consumers hundreds of dollars.
A Major Flaw Forces a Recall
The problems began when computer manufacturers realized that a key component that controls devices such as hard drives and DVD drives was flawed, causing those devices to degrade and eventually fail. Though few of the chips had made it into consumers' hands, many large computer manufacturers had already ordered them in bulk. Estimates place the cost of the recall at around $700 million, with even more in lost revenue.
Even though the flaw was caught before many affected systems reached consumers, the lesson is worth remembering: be careful when adopting new technology early.