More display adaptor troubles. LOL.

aimeeandbeatles

Okay, I use an ATI Radeon Xpress 200 series with the latest drivers. The only thing I installed recently was Zoo Tycoon.
So anyways, I was trying to open Civ4. It didn't open properly, so I killed the process, which made the bottom half of the screen disappear. (It happens every so often; no big deal.) So I rebooted, got the usual display adaptor troubles, and tried uninstalling the adaptor. Rebooted, it kicked in, and then the screen went blank with a flashing underscore in the corner. Think of a DOS prompt. Nothing typed. (Although I heard the sounds of programs opening behind it; I leave my speakers on permanently now.)

What should I do now? Try to reinstall the driver? I'm wondering if Zoo Tycoon has anything to do with it -- I just learned it uses SafeDisc, I think.

Thanks.
 
Sounds like your on-board graphics doesn't support your saved resolution. Try booting into Safe Mode to re-enable your graphics adaptor and reinstall your drivers.
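If you want to see what resolution Windows is actually driving before you touch anything, here's a rough sketch that just reads the primary display size; it assumes a Windows machine with Python installed, and the Win32 call it uses is the standard one.

```python
# Rough sketch: print the current desktop resolution on Windows, so it can
# be compared against what the on-board graphics is known to support.
# Assumes Python running on Windows; GetSystemMetrics is a standard Win32 call.
import ctypes

user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(0)   # SM_CXSCREEN: primary display width in pixels
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN: primary display height in pixels
print("Current desktop resolution: %dx%d" % (width, height))
```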
 
I'll try it tomorrow after school. Thanks. :)
Although I haven't changed my resolution recently. Strange.
 
The thing is, your computer might have been running at a resolution that your card could handle but the on-board couldn't. It's doubtful that Civ fried your card or on-board graphics, but it is possible... Wow. Board is a weird word. Boaarrrd.
 
Reinstalled the driver. Got the thing working.
Thank you. :)

P.S. I figured out why it happened -- earlier that day, a fuse blew, knocking the computer's power off. For some reason, my display adaptor doesn't like power outages. Thought of it this morning.
 
Okay, another fuse blew. I didn't have any problems (the first boot after an outage is all right) so I reinstalled the drivers.
Should it be all right?
 
More problems: I hit the reset button (which cut the power), but when I tried to reinstall the adaptor, it said I didn't have DirectX 8, so it couldn't install.

I googled it, which gave me DirectX 8.1, and installed that, but the installer didn't like it either. What can I do now?

EDIT: Hold on, gonna try something.
 
Ha, turns out Safe Mode also disables DirectX.

Why didn't I think of that before? God, I'm stupid!
 
Ah, crap -- it appeared to install properly, but it didn't.

Any suggestions now?

EDIT: Attempting to uninstall and then reinstall -- I hadn't been bothering with the uninstalling part before. Wish me luck.

ANOTHER EDIT: Uninstalled both the drivers and the ATI Control Panel. Now when I click on the adapter installer, it starts to install (InstallShield thingy loading) but then nothing happens...
 
Still not working.

A few things --

1. A few days before this, I accidentally hit the reset button on my computer, which switched the power off instead of going through a proper reboot. (The display adaptor doesn't like that.)
2. I rebooted the computer to try to troubleshoot a program. On the shutdown menu, it said it needed to install some updates, so I did that. Then the adapter went weird. (I remember this happening on the last fresh install, but not as badly as this.)
3. I don't have any other hardware drivers installed except my printer, which never gave me problems before. So I don't think that's it. (I remember once a webcam giving me problems.)

EDIT: I'm such a noob! It was either ATI Tray Tools or an outdated driver. (I uninstalled Tray Tools then downloaded the driver again.)
 
Having more problems. Nothing happened recently -- I'm wondering if it's something other than the drivers. Any ideas?

EDIT: I tried to uninstall and delete everything so I could do a fresh install of the driver. But when I rebooted, Windows automatically detected the adapter again.
 
I haven't had any fuse/reset button issues since the last time -- so I don't think it's that. And I don't smell anything funny.
 
If it autodetects and you'd rather install your own driver in your own time, simply cancel the autodetect. You can double-check the status in Device Manager.
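For what it's worth, the same status Device Manager shows can also be pulled from the command line. This is just a sketch, assuming a Windows box with the built-in wmic tool; the property names are standard WMI ones.

```python
# Minimal sketch: query the display adapter's status without opening Device
# Manager. Assumes Windows with the built-in "wmic" tool on the PATH.
# ConfigManagerErrorCode 0 means the device is working properly; any other
# value matches the error code Device Manager would show.
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller",
     "get", "Name,Status,ConfigManagerErrorCode"],
    universal_newlines=True,
)
print(output)
```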
 
It kicks in before I get to cancel. Is there a way to disable the autodetect?

EDIT: Googled it. It said to disable Plug and Play. Would this work for the adapter?

ANOTHER EDIT: My monitor's plug and play. Will this mess with it?

STILL ANOTHER EDIT: Found something cool -- more googling brought me to something called "Shell Hardware Detection." I wonder if this'll work.
 
You should never have a driver problem with a monitor. They all act the same without drivers; the drivers only (occasionally) enable extra features.
 
Oh, it's one of those :D

I've never needed to disable plug and play. If the driver is wrong and it's installed, you could go into it in Device Manager and use Update Driver. (Other cases involve Add/Remove Programs.)

I agree with cutlass. Never mind the monitor and what it's doing until you've finished; then set it to get the refresh rates fixed up.

Shell hardware detection is something else (removable drive detection).
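A quick way to confirm that for yourself, assuming the standard service name ShellHWDetection and Windows' built-in sc tool:

```python
# Quick check of the Shell Hardware Detection service, which handles
# removable-drive/AutoPlay detection rather than display adapters.
# Assumes Windows; "ShellHWDetection" is the standard service name.
import subprocess

# Current state (running/stopped)
print(subprocess.check_output(["sc", "query", "ShellHWDetection"], universal_newlines=True))
# Configuration (startup type, display name)
print(subprocess.check_output(["sc", "qc", "ShellHWDetection"], universal_newlines=True))
```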
 
I tried to update it (the display adapter) in Device Manager -- it said there weren't any updates. But when I checked the ATI website, there were updates.
 
Aha, I think I figured out exactly what went wrong.

I was installing the 200M series drivers instead of the 200.
 
I'm kinda flying blind here but hope I can help. When using Device Manager to update the drivers, you can ask Windows not to search automatically and then browse to the drivers yourself.

On the other hand, more often with display drivers you can uninstall the driver package using Add/Remove Programs, then install the new one, rebooting between each step for good measure.
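If it helps, here's a rough sketch for double-checking that the old ATI entries really are gone from Add/Remove Programs before installing the new package; it assumes Python on Windows and just reads the standard Uninstall registry key.

```python
# Rough sketch: list Add/Remove Programs entries that mention "ATI", so you
# can confirm the old driver package and Control Panel are actually gone
# before installing the new one. Assumes Windows; reads the standard
# Uninstall registry key with the stdlib winreg module.
import winreg

key_path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as uninstall:
    count = winreg.QueryInfoKey(uninstall)[0]  # number of subkeys
    for i in range(count):
        subkey_name = winreg.EnumKey(uninstall, i)
        with winreg.OpenKey(uninstall, subkey_name) as subkey:
            try:
                name = winreg.QueryValueEx(subkey, "DisplayName")[0]
            except OSError:
                continue  # entry has no display name; skip it
            if "ati" in name.lower():
                print(name)
```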


EDIT: Just saw your last post. No worries.
 