Intel 'preparing' to put an end to user-replaceable CPUs (zdnet.com)
72 points by KeepTalking on Dec 3, 2012 | 71 comments


Well, what's AMD's stance on this? Hopefully they - at least - will continue to make CPUs that the "modder" community can take advantage of. If not, that would be a real shame. I've been an AMD fan since my 386DX/40, and would love another reason to continue supporting them.

Otherwise, one has to feel that this is just another battle in the "War on General Purpose Computing". Sure, PCs will - for now - remain "general purpose" computing devices, but you have to consider this one more step down the path of locking them down, and making them less accessible and hackable.


Came here to say something along these lines - if Intel steps out of the ring, then it's basically handing AMD a monopoly on CPUs for the modding/custom-built community, is it not?


If Intel's offering holds the performance crown (and/or the performance-per-$ crown) I'd have thought they'd hold onto a lot of the enthusiast market. A lot of modding, after all, is oriented towards getting performance.


Who's to say that modding is dead if you just have to buy your motherboard with the CPU welded on to it?


Looking at AMD's history, I would guess they will copy Intel a year or two later. (Remember Slot A?)


> A switch to BGA would mean that the processor could no longer be fitted into socket where it could be removed or replaced, and instead would be soldered to the motherboard much like processors for notebooks and tablets are nowadays.

This is completely incorrect. They could solder all the CPUs, but you can absolutely socket BGA chips. Are there any indications that they're going non-upgradable outside of the switch to BGA?


you can absolutely socket BGA chips

Eh, not really. Well, you can, but it's ugly, expensive, and usually only done for test fixtures and such.


I really don't see why they have to be ugly or expensive. They're absolutely expensive now, but that has a lot to do with the way in which they're used (test equipment is crazy expensive) and very little to do with what they actually are. As for ugly, well, ZIF sockets aren't exactly lovely, they're just really effective.


BGA sockets are only used for prototyping; they're expensive compared to other kinds of sockets.


Are they expensive because of some inherent property, or because they're only used for prototyping and are thus done in small runs? Having spent several hundred dollars on a little plastic BGA socket that could be made for pennies, I'm inclined to say that it's the insanely small production runs right now.


BGA balls are made of solder, and they'll cold-flow under pressure. So any socket meant for durable use in production equipment (as opposed to prototype or test purposes) is going to require a lot of small, delicate, and expensive spring contacts that provide a secure connection with very little force.

Also, sockets in general are perennial problem areas for signal integrity and EMC.

It is probably best to think of the motherboard as a monolithic hunk of material, going forward. Too many reasons for the trend to go in that direction.


I don't know too much about LGAs vs BGAs, but the Wikipedia article mentioned that sockets are "less reliable". How does the reliability compare between BGAs in a socket vs LGAs in a socket? The Wikipedia article mentioned that BGAs can move heat around better; is this lost when a socket is used?

I hope Intel doesn't go the socket-less route. This seems like it would alienate the Linux server market for high-reliability servers that depend on CPU hot-plugging. Perhaps this only applies to the consumer market?


BGAs in a socket fail after about 5 years of sitting untouched, or after maybe a dozen insertion cycles if you do move things.


Generally, a soldered IC is more physically durable and gets some more heatsinking as a result.


This just seems like a crazy decision from Intel's point of view. Won't it make them incredibly beholden to motherboard makers? The explanation that they're jealous of, and trying to take over, a low-markup commodity market strikes me as weird.

Right now, most computer users grab the cheapest motherboard that will work for them, and one of the places they spend extra cash is splurging on more CPU than they need. Now, motherboard makers will be able to pair the faster-stepping CPUs with 'markup' motherboards and capture a lot of that windfall to themselves, no? Doesn't this price-insensitive enthusiast market send a lot of cash Intel's way?

Maybe Intel is worried about the health of x64 white box component makers and is intentionally sending them a windfall?


Intel has methodically integrated more of the system on the CPU package. With the memory controllers, integrated GPU, and PCIe controllers already on the CPU, and the VRM and the rest of the north bridge following in Haswell, there will be nothing of value left on the motherboard. Broadwell-era motherboards will be dumb connector boards worth $50 at most. That's why they can be sold integrated.


Oh, that's not nearly as big a deal then.

Still, it's kind of an annoyance just because the form factor of the computer is still so closely tied to the motherboard, even if the board itself is trivial. Are they going to get a ton smaller and just be CPU+RAM with cable ports for PCIe/USB busses?


You cannot really do full-speed PCIe buses on cable ports.

I expect that Intel will do what AMD and Nvidia are doing presently with GPUs, and what they used to do with their chipsets: Sell the CPUs into the various motherboard manufacturers, and let them sell the complete package to the consumer. They can implement any form factors they like.

Of course with a sufficiently punitive contract that Intel retains control of everything.


Isn't that what Thunderbolt is? PCIe lanes with displayport?


Not really. You aren't plugging a video card into a Thunderbolt connector.

PCIe 2.0 x32 is 16 GB/s; Thunderbolt is 2.5 GB/s.
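
Back-of-the-envelope, with the per-lane and per-channel figures I'm assuming for PCIe 2.0 and first-gen Thunderbolt:

    # Assumed figures: PCIe 2.0 ~500 MB/s per lane (5 GT/s with 8b/10b encoding),
    # first-generation Thunderbolt = two 10 Gbit/s channels.
    pcie2_per_lane_gbs = 0.5                  # GB/s per lane
    pcie2_x16 = 16 * pcie2_per_lane_gbs       # a typical graphics slot
    pcie2_x32 = 32 * pcie2_per_lane_gbs
    thunderbolt_gbs = 2 * 10 / 8              # 20 Gbit/s total -> GB/s

    print(f"PCIe 2.0 x16: {pcie2_x16:.1f} GB/s")       # 8.0
    print(f"PCIe 2.0 x32: {pcie2_x32:.1f} GB/s")       # 16.0
    print(f"Thunderbolt:  {thunderbolt_gbs:.1f} GB/s") # 2.5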


"As far as the PC OEMs are concerned, killing off the PC upgrade market would be a good thing because it would push people to buy new PCs rather than upgrade their existing hardware. The PC industry is currently stagnant, partly because consumers and enterprise are making existing hardware last longer."

Maybe the market is stagnant because there really is no need to upgrade. I have a 2009 MacBook Pro and it's fine. In the late '90s and early 2000s I wouldn't go more than 1-2 years without an upgrade. I still use an AMD Phenom in my desktop, though I may build a new machine just because of gaming. Nothing is really pushing the limits of modern hardware. Most software is moving to the web and doesn't require the latest processor to run.


If the stated claim is true (killing off upgrades), then their assumption that people will buy new PCs needs to be substantiated. For example, when having to upgrade, many will consider going for a tablet or similar form factor instead, which won't come from the PC OEM, and when it does it will be a far lower-margin item.

As one example my parents upgraded from their Windows PC to an iPad.


Isn't this going to introduce lag into Intel's sales pipeline?

In this new model, Intel will be unable to simply ship new chips to distributors to go to system builders. They'll have to ship all their CPUs to "the land where things are soldered" (Asia, presumably) and ship everything back again.

I think AMD may have just been given a gift of 6 weeks.


As a build-your-own-box type guy myself, this doesn't particularly bother me, for two reasons.

First, it's extremely rare (in my experience, anyway) to want to upgrade the CPU of an existing machine without also upgrading the motherboard. Usually by the time CPUs have advanced far enough to justify upgrading, there's also new connectors, faster memory interfaces, etc. out as well that would require a new mobo to take advantage of. If you skip the mobo upgrade and just buy the CPU, you risk just moving the system's performance bottleneck around.

Second, installing the CPU onto the motherboard is easily the trickiest, most delicate part of building your own PC. In an age when everything else plugs together with idiot-proof plugs and sockets, CPUs still have a forest of fragile pins on the bottom that can easily be bent or broken. You also have to manually add thermal protections like fans, heat sinks, thermal paste, etc., which all require selection and installation; if the CPU came pre-installed on the mobo, you could skip all that hassle completely.


> CPUs still have a forest of fragile pins on the bottom that can easily be bent or broken

Not really - the pins have been on the socket for some time now (for Intel at least - I haven't built anything AMD for some time, I'm afraid) and are much, much harder to damage. The levers on the sockets have gotten much better, as have the HSF mount points, and inserting a CPU and attaching the HSF has not made me feel like I'm going to break the motherboard for many years (let alone actually broken it - I can remember boards where you had to jam a screwdriver into the heatsink retention bracket and apply pressure towards the motherboard... I'm getting cold sweats just thinking about it).

> You also have to manually add thermal protections like fans, heat sinks, thermal paste, etc., which all require selection and installation

Thermal paste is optional; if the CPU and heatsink make good contact, it doesn't make much difference. You don't have to select anything if you buy retail boxes instead of OEM, which come with HSFs already selected for you in the box, and with thermal gunk pre-applied.

I do agree that it's rare to upgrade the CPU and not motherboard, but you could say the same thing about most of the components in my PC (I tend to upgrade all at once).

As for the trickiest, most delicate part of building your own PC... honestly, if you're installing the CPU into the motherboard before putting it into the case, I would have to disagree with you there.


>Thermal paste is optional; if the CPU and heatsink make good contact, it doesn't make much difference.

Err... it makes a massive difference. Try running your brand new Xeon workstation without TIM: your PC will shut down or blow up within seconds of doing anything. Try living in Australia, or I guess Texas, and doing CAD/rendering work or playing games in summer. If you don't have aircon you'll need aftermarket cooling or your CPU will throttle.

Intel used cheap thermal paste to stick the CPU to its heat spreader on their latest chip offerings instead of soldering it on. This is between the CPU and the heat spreader, not between the heat spreader and the heatsink. That alone caused load temperatures to shoot up by more than 10 degrees Celsius (18 °F). In addition to that, I've reduced some of my friends' load temps by 30 degrees just by reapplying thermal paste properly and reseating the HSF.


I have run computers without TIM, for extended periods of time, and it is perfectly possible. In fact, most people put far, far too much TIM on, which gives you far worse thermal transfer than no TIM at all!

All the TIM is meant to do is fill in the microscopic gaps between the heatsink base and the top of the CPU, so it fills in the gaps where air would otherwise be. However, with a decently smooth HS and CPU, there should be a lot of direct contact between the two surfaces, and metal <-> metal transfers heat better than metal <-> thermal goo <-> metal.
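
A toy way to see it: the heat path is a chain of thermal resistances in series, so die temperature is roughly ambient plus power times the total resistance. The numbers below are invented purely to illustrate the series model, not measurements:

    # Toy series thermal-resistance model: T_die = T_ambient + P * sum(R_i).
    # All resistance values (degC per watt) are invented for illustration.
    power_w    = 95.0   # CPU package power
    t_ambient  = 25.0   # case air temperature, degC
    r_heatsink = 0.25   # heatsink-to-air resistance

    interfaces = {
        "thin, well-applied TIM": 0.05,   # fills the microscopic gaps
        "no TIM (air gaps)":      0.60,   # trapped air conducts heat poorly
        "a thick blob of TIM":    0.30,   # paste itself conducts worse than metal
    }

    for label, r_interface in interfaces.items():
        t_die = t_ambient + power_w * (r_interface + r_heatsink)
        print(f"{label:24s}: ~{t_die:.0f} degC under load")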

If you were able to get a 30 degree drop then they probably either had far too much TIM on, or the heatsink was not making good contact with the CPU.

However, all that said, I think I have now undermined my own point that the CPU is not the most complex part of building a system (not that having it integrated into the motherboard will make heatsink selection & attachment any easier).


> Not really - the pins ... for Intel at least ... are much, much harder to damage.

Having built all my machines with AMD CPUs, I'll say that bent pins can be common - at least enough to justify taking care when installing the CPU on the mobo. For what it's worth, I've bent a couple myself, and have helped a couple of friends straighten some of theirs out as well. They're not that hard to bend.


The trouble is with repairing old hardware. Recently I had a motherboard die on an old system, and needed to go to eBay for a replacement. With forced bundling, replacement parts will be more expensive and harder to find.

For the motherboard manufacturer, they'll need to buy CPUs and estimate the relative demand of each, instead of leaving that to the consumer or OEM.


The biggest problem is stock - you're going to have to stock 5 times as many types of motherboards now.

Which is impractical, so the net result is fewer motherboards to choose from.
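
Rough illustration of the combinatorics, with made-up board and CPU counts:

    # Hypothetical counts: with soldered CPUs, every supported pairing is its own SKU.
    board_models = 12    # one vendor's desktop lineup
    cpu_models   = 5     # CPU SKUs the vendor chooses to offer

    skus_separate = board_models + cpu_models   # boards and CPUs stocked independently
    skus_bundled  = board_models * cpu_models   # one SKU per board/CPU pairing

    print(f"separate parts: {skus_separate} SKUs")   # 17
    print(f"pre-soldered:   {skus_bundled} SKUs")    # 60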


It will, however, make hardware debugging harder. Being able to swap the CPU into another motherboard comes in handy.


As a person who has worked for a few hardware OEMs in the past in various roles from phone jockey to sustaining engineer, this bothers me greatly.

First, if one part fails (which is not extremely rare) you'll have to replace both parts.

Second, you'll still have to deal with thermal protections like HSFs, thermal paste, etc. You won't get to skip all that hassle, and you can still damage the CPU.

You're not saving much effort, will have fewer options when putting the PC together, will likely pay more, and the toughest parts of putting together a PC will still be present.


A-yup, couldn't agree more. It's already common practice to buy bundled CPUs and boards. We might as well be buying them as one solid piece.


I run a site dedicated to people who build their own computers. Like you, I am not terribly concerned by this move. In my experience, the number of people who upgrade CPUs while keeping a given motherboard, or upgrade motherboards while keeping the same CPU, is very limited. CPUs typically don't depreciate in value that much, and by the time someone does decide to upgrade, availability is limited to the point that prices often go up rather than down. If you wait a year or two, it's often more cost effective to just upgrade both.


The pins are sort of fragile, but it's usually a fairly smooth operation to plug in a CPU. I wonder if there are reliable statistics from people doing this in large numbers.

Installing the cooler can be much more tricky depending on the retention system. Pre-installing them could have a number of benefits. But it'd be a shame to lose the ability to choose your own cooling system, more so than having the CPU soldered on, in a way.


And if you want to add more CPUs to your 2, 4 or 8 socket motherboard?


Multi-socket mobos and CPUs command a significant price premium. I, personally, when building a new system will probably just build a 6-core single socket system, rather than buy a two socket mobo that can take two quad-core processors. Maybe spend the money saved on more memory, faster hard drive, etc.


I suspect Xeons will remain socketed. As a side benefit, Intel probably won't mind if people spend extra money on principle to get a socketed processor.


Two points that I haven't seen covered yet:

1. Waste. If some component on your motherboard goes, you're on the hook for a new CPU (and vice versa). This seems tremendously wasteful. Maybe bigger repair shops will support mail-in refurbishing? Will people take advantage of that? Or just buy new for convenience?

2. Competition. Smaller motherboard vendors won't be able to sell direct anymore. I'm wondering if anyone can comment on how bad of a thing this is.


I don't think I understand #2 - wouldn't the small vendor just build and sell their boards pre-populated with an Intel CPU part? They'd have to carry more inventory, but not a crazy amount more.


Yeah, that's a good point. This introduces a new inventory consideration, namely CPU model demand (they'll have to determine the best distribution of processor models across each product). Probably not too big a deal.


This was pretty thoroughly debunked as an issue on Slashdot last week.


I skimmed it at +5 and didn't see anything like that, care to be more specific?


One point that should worry Intel is, if enthusiasts move to ARM, and begin to take their friends and family with them...

I've never upgraded a chip, but I've built plenty of computers, and mixing and matching was important for me.

I worry that we'll have fewer options down the road.


Most ARM chips are BGA.


Though as far as I'm aware there's no inherent reason for this - apart from cost.

One of the things that has made the PC so awesome since the early '90s has been the vast array of hardware available - you can just pull something out, plug something else in, and boom, it runs faster and better, and it's easy. You don't have to wait 12 months for a "new version of The PC". It's an open platform, and we all love open platforms, don't we?


> Though as far as I'm aware there's no inherent reason for this - apart from cost.

Cost and size. The bread-and-butter market for ARM is consumer electronic devices, where size really matters. Other chip packaging that's easier for hobbyists to work with would be a huge waste of space by comparison.

There are some ARM chips available in DIP packages, but I suspect that selection will always be relatively limited, since demand will always be relatively limited.


The ARM chips available in DIP packages are microcontrollers that you are more likely to see in a toy or gadget than a general purpose or multimedia computer. In embedded systems where they are used, you probably can't swap them out for another part anyway because of code changes needed, different pinouts, etc.


When was the last time you upgraded a CPU on a laptop? Laptops already outsell desktops, and by 2015, tablets will outsell desktops.

http://www.inquisitr.com/76157/tablets-to-overtake-desktop-s...

I can't see why Intel would continue to worry about an increasingly shrinking part of the industry.


The difference now is that consumers aren't driving PC sales -- Enterprises are. Consumers are buying tablets.


I didn't know that, I was just parroting the linked article, which stated that:

"Unfortunately Intel doesn’t care about the enthusiast, and unsurprisingly they have moved on. ARM chips are now the focus for that crowd, and they are taking the mainstream geeks with them. "

http://semiaccurate.com/2012/11/26/intel-kills-off-the-deskt...

and it linked to:

http://pocketnow.com/android/droid-overclocking-reaches-30gh...

Taking that article at face value, I speculated that it would indeed drive people off.


This whole thing has a faint whiff of DOJ anti-trust investigation on it. Maybe it's wishful thinking.


How? There's nothing anti-competitive here. Just because we've been able to swap CPUs for years doesn't make it a legal obligation. After all, I can't upgrade the processor in my phone, tablet, or graphics card.


Moving to non-replaceable CPUs seems unnecessarily hostile to Intel's biggest fans.


The whole desktop/ATX ecosystem is about to drown, isn't it?


Who upgrades a CPU without upgrading their motherboard too?

The benefits are negligible, as next-gen RAM, a next-gen peripheral bus, etc., all require a next-gen motherboard.

More of a concern is custom-built shopping, where it is harder for a small vendor to stock a batch of mobos and a variety of chips, and some chips go obsolete before sale (and so their mobos would be lost too...)

But hardware is really, really good these days (as the article notes). Runs cooler, less stress, less failure, far overpowered for most use-cases, less need to upgrade.

Enjoy our modern bounty and pay a bit extra for a new mobo every few years.


The problem is that when you pick a motherboard, you also pick the CPU you want. Now, motherboard makers are unlikely to carry such a wide array of upgrade choices. For example, when they introduce a new cheap motherboard, they will offer a low performance CPU for it. A high end motherboard will offer a high end CPU, and so on. You might have one or two CPU choices for the motherboard you want to buy, as opposed to every possible choice like now.


I think I can live with that. I like building my own PCs, but choosing a motherboard is perhaps my least favorite part of the process. There are just too many choices out there, and the number of variables to consider is downright unholy. I might welcome anything that helps me cut down the list of options more quickly, including being able to toss out every model that doesn't have a CPU I'm interested in.

Alternatively, if there's serious demand for it I'm sure the market will come up with a solution. Motherboards with BGA-compatible sockets, perhaps. Or PGA sockets designed to accept adapters you can solder a BGA CPU into.


Hmm.. We seem to differ.

MB choice today seems to be rather simple for me: What _socket_ do I care for, plus what kind of slots and ports do I need (memory, pci-e, random ideas about the minimum required numbers of USB ports, whatever).

But for me the socket came always first. I upgraded my CPU quite often and would like to keep that option in the future. In the past there was no need to throw away your board for a faster (clock speed) CPU and it made sense to me that way.


Well, it is already pretty easy to throw out any motherboards that don't correspond to the processor you've chosen. On most PC part picker websites such as ebuyer you can select only motherboards with the correct socket. I don't like what Intel are planning to do as I absolutely love the flexibility of desktop PCs. This move could end them a lot sooner.


> Well, it is already pretty easy to throw out any motherboards that don't correspond to the processor you've chosen.

Not really. You can use it to throw out motherboards that aren't compatible with the category to which your processor belongs. Which is probably a very loose category - "Motherboards which support socket AM3 CPUs", for example. That's just not a very fine filter - probably only vaguely more specific than "Motherboards which were released in the past 48 months."


No, it's pretty good as a first filter. Let's say I want an i5 or an i7, but I must have a bunch of PCI-e slots and 4 eSATA ports. You start with a motherboard with the right socket and go from there. Perhaps I can get away with the most inexpensive processor right now, but I must have these other features. This will probably work out to an inexpensive processor on an expensive motherboard. Right now, you can do that.

In a world where the cheap processor is married to the low-end motherboard and the expensive processor to the high-end motherboard, you couldn't. To get the higher-end features you would have to buy the expensive processor, even if you didn't need it.
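
Today's two-step filter, sketched as code (the board data and field names are made up):

    # Hypothetical part-picker filter: socket first, then required features.
    boards = [
        {"name": "Budget-B75",     "socket": "LGA1155", "pcie_slots": 2, "esata": 0, "price": 60},
        {"name": "Workhorse-Z77",  "socket": "LGA1155", "pcie_slots": 4, "esata": 4, "price": 95},
        {"name": "Enthusiast-X79", "socket": "LGA2011", "pcie_slots": 5, "esata": 4, "price": 240},
    ]

    def candidates(socket, min_pcie, min_esata):
        return [b["name"] for b in boards
                if b["socket"] == socket
                and b["pcie_slots"] >= min_pcie
                and b["esata"] >= min_esata]

    # A cheap LGA1155 chip (e.g. an i5) on a feature-rich board: possible today.
    print(candidates("LGA1155", min_pcie=4, min_esata=4))   # ['Workhorse-Z77']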


Actually, in building my own system, choosing the MB was my favorite part of the process; I always chose high quality and board component reliability/longevity over anything else.

I still have a few questions though. What will this mean for aftermarket heatsinks/coolers? I never run with the stock Intel/AMD heatsinks, will these coolers be permanently bolted on just like the processor?

And for motherboards featuring high-end processors, what will this mean for scenarios where the MB fails but the processor is still good? Will we just have to scrap the two altogether?


> The problem is that when you pick a motherboard, you also pick the CPU you want.

I think the issue here is that builders/gamers will be pushed into spending more. Right now you can get a < $100 ASRock board, pair it with an i5 3570K and a decent video card, and you're off. Now you're more likely to have to grab a > $180 board if you want a top-end CPU.


I've upgraded CPUs a number of times. It got a bit problematic when they changed the socket every other year, but it's still nice to be able to spend $50 to upgrade an old computer with a better CPU rather than buying a whole new machine.

Another circumstance I had was that I had a motherboard fail. So I replaced the motherboard but kept all the other components including the CPU.


Your experience may be different, but I've never seen a case where a $50 CPU yielded a significant upgrade.

The new CPUs tend to change sockets before prices drop that low. With memory bandwidth being the major bottleneck, a new CPU in the same system never seemed to give more than 10% improvement anyway.
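
One way to see why: if most of the runtime is memory-bound, an Amdahl-style estimate caps what a faster CPU can buy you. The fraction and speedup below are illustrative guesses, not measurements:

    # Amdahl-style estimate: only the CPU-bound fraction benefits from a faster chip.
    memory_bound_fraction = 0.7   # share of runtime limited by RAM bandwidth
    cpu_speedup = 1.3             # a 30% faster CPU in the same board/RAM

    overall = 1 / (memory_bound_fraction + (1 - memory_bound_fraction) / cpu_speedup)
    print(f"overall speedup: {overall:.2f}x")   # ~1.07x, i.e. well under 10%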


I've done it only when I had a few machines which needed bleeding-edge CPUs (and price didn't matter), and other machines. I'd buy high end and cheap CPUs when putting both systems online, then in a year (assuming the motherboard/socket was still current), I'd put the old high-end CPU into the cheap machine, and buy a new high-end CPU for the important machine.


Being forced to buy a new motherboard every three years could wind up being very costly.

First, after a few years, a new motherboard will probably require a new type of RAM, which can be quite expensive to fully populate.

It's also possible that the new motherboard will not have the same type of peripheral slots as the old motherboard did, requiring you to buy all new peripherals when you upgrade.

Finally, buying a new motherboard will force you to "upgrade" to a new BIOS, like UEFI -- which can bring in a whole host of problems that the old BIOS did not have (such as not easily being able to boot your existing operating system).


> First, after a few years, a new motherboard will probably require a new type of RAM, which can be quite expensive to fully populate.

Since memory controllers are now embedded in the CPU, not on the motherboard, RAM support depends on the CPU, not the motherboard.

> It's also possible that the new motherboard will not have the same type of peripheral slots as the old motherboard did, requiring you to buy all new peripherals when you upgrade.

Peripherals are going PCI-e-only, which is designed for infinite future compatibility. It won't be replaced in a decade, if ever.


Yes, the move to UEFI worries me a whole lot more than the CPU being soldered to the board. I haven't upgraded a CPU in a very long time.


Don't worry: software still manages to bloat the hardware into uselessness every few years.



