loki

Moving multi-sockets to PCIe cards?


I remember talk, roughly ten years ago, of CPUs being moved to add-on cards...

so I was just thinking...

IF drivers get a lot better:

AMD has good GPU tech in their APUs, and the next-gen APUs should have DDR4 support, ~65 W TDPs, and be SoCs.

What do you think of the idea of being able to put in several of these APU cards, each with its own DDR4, with the GPUs running in CrossFire?

The drawbacks I'd see are:

1. Price.

2. I doubt the main APU will have enough PCIe lanes to support more than one or two of these, unless each card added could supplement the lanes to move the data around.
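To put rough numbers on the lane concern, here's a back-of-the-envelope sketch. The lane counts are assumptions typical of a mainstream consumer platform, not any specific APU's spec:

```python
# Back-of-the-envelope PCIe lane budget for the multi-APU-card idea.
# Both figures below are assumptions for a typical consumer platform.
TOTAL_SLOT_LANES = 16   # lanes a mainstream APU exposes to slots (assumed)
LANES_PER_CARD = 8      # link width we'd want per APU card (assumed)

max_cards = TOTAL_SLOT_LANES // LANES_PER_CARD
leftover = TOTAL_SLOT_LANES % LANES_PER_CARD
print(f"{max_cards} APU cards at x{LANES_PER_CARD}, {leftover} lanes left")
# -> 2 APU cards at x8, 0 lanes left
```

So even with a generous x8 per card, a 16-lane consumer platform tops out at two cards, which matches the one-or-two guess above.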


It'd be the 'end' of 2+ sockets on one mobo:

more heterogeneous computing power in a smaller box.

The 'cards' would be powered within the 75 W that the PCIe slot spec can deliver.
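For reference, here's how that 75 W slot limit combines with auxiliary power connectors. The figures are the commonly cited PCIe limits (75 W slot, 75 W 6-pin, 150 W 8-pin), just expressed as a quick calculation:

```python
# PCIe card power budget: 75 W from the slot itself, plus optional
# auxiliary connectors. Commonly cited per-connector limits.
SLOT_W = 75
AUX_W = {"6-pin": 75, "8-pin": 150}

def card_power_budget(connectors=()):
    """Total watts available to a card given its aux power connectors."""
    return SLOT_W + sum(AUX_W[c] for c in connectors)

print(card_power_budget())                    # slot only: 75 W
print(card_power_budget(("6-pin",)))          # 150 W
print(card_power_budget(("6-pin", "8-pin")))  # 300 W
```

A slot-powered-only APU card would therefore have to fit under 75 W, which is why a ~65 W TDP part is the interesting case here.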


ASRock kinda dabbled in this concept with the 939Dual motherboard, back around 2005.

[Image: the 939Dual's AM2 upgrade riser board]

The conventional CPU socket was Socket 939, but a separate board with an AM2 socket could be attached to the mainboard by way of a special slot. That extra board also had its own DDR2 slots, while the mainboard had DDR1. It's pretty damn cool if you think about it, if in a weird, sorta useless way.

Edited by FAPTurbo

PCIe x16 CPU card

[Image: a PCIe x16 CPU card]

This is an industry-standard card available today, much like blade servers but aimed at the telco industry and other industry-specific vertical markets. Not cheap, but it allows a very dense system: the chassis usually has 10 to 20 PCIe x16 card slots, plus a specialty slot that supports and lets multiple cards work together.

Overall not price-reasonable in today's disposable market, with price leaders going so cheap.


dfelt, now imagine that without the power connector and the side chip (southbridge?), and with laptop RAM.


It'd only really appeal to enthusiasts, and even then, just a subset. Perhaps 10 years ago, when software still outpaced hardware, the concept would have made more sense. But nowadays even a Core i3/Pentium or an AMD A-Series will have power to spare for years in applications that don't involve rendering graphics or CAD. If that power is needed, that's what easily upgradeable GPUs are typically for.

Plus, with the PCIe x16 and SATA-3 buses nowhere close to being saturated by even SSDs or GPUs, you can buy a board and a decent CPU and hang onto them for 5 years. An upgradeable CPU slot doesn't make much sense when it'd likely be cheaper to purchase a new mobo/CPU several years down the line anyway.
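A quick sanity check on that saturation claim. These are nominal per-direction link rates with line-encoding overhead factored in; the SATA SSD figure is an assumed typical value, and real-world throughput is lower still:

```python
# Nominal per-direction bandwidth in GB/s (decimal), encoding included.
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 16 lanes.
pcie3_x16 = 8e9 * 16 * (128 / 130) / 8 / 1e9   # ~15.75 GB/s
# SATA-3: 6 Gb/s line rate with 8b/10b encoding.
sata3 = 6e9 * (8 / 10) / 8 / 1e9               # 0.6 GB/s
sata_ssd = 0.55                                # typical SATA SSD (assumed)

print(f"PCIe 3.0 x16 link: {pcie3_x16:.2f} GB/s")
print(f"SATA-3 link: {sata3:.2f} GB/s, an SSD uses ~{sata_ssd / sata3:.0%}")
```

So a SATA SSD does in fact sit right at the SATA-3 ceiling, but a PCIe 3.0 x16 slot has enormous headroom, which is the part of the argument that holds up.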

Edited by FAPTurbo


If you just want to be able to add a second CPU to your rig, I guess... maybe. But you might run into licensing issues if you run Windows, and how many games and regular applications can actually use a second CPU anyway?


Personally, I think it's best to just buy a workstation system that has the dual or quad CPU sockets on the motherboard. You can get a heck of a deal from Dell on a quad-core, dual-CPU workstation.



CheersandGears.com - Founded 2001