Hi, Leo. I have an Acer Aspire X1301 Desktop running 64-bit Windows 7. I
have an NVIDIA GeForce 9200 card installed. I bought a BFG Nvidia 9800 GTX (512
MB) graphics card to upgrade. When I took the cover off, I saw my graphics card
is not plugged into a PCI slot; it’s just a small silver metal box. It’s the
same for my sound card, etc. Can I never upgrade?
In this excerpt from
Answercast #61, I look at the issues (and opportunities) around adding a new
video card to a system.
No, you can upgrade. There’s no problem with that.
What you’re seeing is that many manufacturers now place the video circuitry
(rather than a separate video card) directly on the motherboard. And that’s
actually been true even longer for the sound hardware that you’re seeing.
So, what you have is a motherboard that has an integrated video card and an
integrated sound card.
Adding instead of replacing
That doesn’t mean you can’t upgrade them. In fact, it gives you a little bit
more flexibility than you might expect.
If you have an open PCI Express (PCIe) x16 slot (in other words, if you’ve got
a place to plug in that new card that you purchased – the 9800 GTX is a PCIe
card, not PCI), plug it in.
What you’ll find is that Windows will (in all likelihood) detect that it’s
there – and then let you use it as potentially a second monitor. Whether it
becomes your primary (or your only) display really depends on how you
configure it. When you right-click on the Desktop and click on (I
believe) Screen Resolution, it will let you start to choose
which display will be your primary, and how big it will be, and whether or not
it’s even used – so there’s no problem here at all.
You can also (if you want to) disable the drivers for the motherboard-based
video. In most computers, you can actually disable it via the BIOS. But in all
honesty, in a situation like this, I’d be really tempted to leave it running.
Even if you don’t plug anything into it, it’s not going to be a drain on your
system as long as you’ve got enough RAM.
The neat thing about it, like I said, is that you’re one step away from a
dual-monitor system without really even trying. You can have a monitor plugged
into both: the one on the motherboard circuitry and the video card that you’re
adding.
You can use the video card that you’re adding for whatever software you’re
running that has higher graphics requirements. I would assume this is probably
a situation where you’re about to run a game or something of that sort. You can
use it by itself or, if you like, get yourself another monitor and you’ve got a
dual-monitor setup.