Windows XP Service Pack 3 = Malware

1) At this point I'm not sure what I'm trying to say here
What I am trying to say is that it shouldn't matter which OS you run your tests in. If it does, then the hardware design is faulty.

3) What is the normal useful lifespan vs estimates of CPUs?
When run within spec, a CPU can last years, maybe even decades. I still have old P3s and even some P1s that should still run just fine.
On the other hand, I have seen CPUs run at the upper thresholds of what I consider 'safe' temperatures (upper 60s and low 70s Celsius) start to artifact and error out after mere months. From what I've heard, CPUs at 80+ C can begin to error out after several hours.

The upper threshold for CPUs, like the 105 C rating on the i7s tested, exists so the CPU doesn't outright fry.



Is a Cinebench 10 score of 8,612 decent?

It's decent.
 
Regarding the time-consuming defrag: I take the view that prevention is better than cure! :)

Way back in the age of Win95, I advocated this little tip: in System Properties, set the minimum and maximum swap file size to about 3 times the system RAM. This has two effects. While the system is fresh (such as on the first run) it will be a few milliseconds slower because it's accessing more disk space. After the system has aged (read: been used once) the computer will be many times faster than it would have been if you hadn't done this.

By forcing a fixed-size swap file, every time the computer uses virtual memory (which is every session), the exact same space is used on the HDD. This contrasts with the default setting, which will use a different area each time, interspersed with updated system files and new documents. That represents a source of significant fragmentation :crazyeye:
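Out of interest, the same knob still exists on the NT line. Here is a minimal sketch, assuming an NT-based Windows and Python's standard winreg module (purely illustrative, not part of the original tip), that reads the page file configuration straight from the registry; equal initial and maximum sizes are what a fixed-size file looks like:

[CODE]
# Minimal sketch: read the page file configuration on NT-based
# Windows. winreg is in Python's standard library.
import winreg

KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    # REG_MULTI_SZ entries look like "C:\pagefile.sys 3072 3072":
    # path, initial size in MB, maximum size in MB. Equal numbers
    # mean a fixed-size file, as in the tip above.
    entries, _ = winreg.QueryValueEx(key, "PagingFiles")

for entry in entries:
    print(entry)
[/CODE]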

I am not an Apple or *NIX fanatic. I am a provider of solutions and nothing more :)

So it seems Windows can be fixed, but it's still not ideal. *nix takes a similar approach of using a separate partition for the swap file. The *nix approach has the advantage of being able to reserve the sweetest parts of a HDD before installing a system, as well as allowing the swap file to change size without risking fragmentation.
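On the *nix side, if you want to see which arrangement a given box actually uses, a tiny sketch (assuming Linux, where /proc/swaps lists the active swap areas) shows the partition-versus-file distinction directly:

[CODE]
# Tiny sketch, Linux only: /proc/swaps lists each active swap area
# with its type ("partition" or "file"), size, usage, and priority.
with open("/proc/swaps") as swaps:
    print(swaps.read())
[/CODE]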

It's not about changing the size once or twice, or even ten times. It's that Microsoft would have it changing continuously and then use clever algorithms to try and clean up the mess. I must reiterate that prevention is better than cure.

Now let's look at NT vulnerabilities, such as users failing to disable everything MS installed to fragment that HDD.

If you don't disable things such as ActiveX and MS Office macros, then you are exposed to serious threats. So, being wise users, we switch these off. In other words, we pay for and install upgrades and immediately switch them off - yeah, that's value for money!

If you don't switch them off, you're apparently an ignoramus who deserves system failure. Not everyone in my family is keen to learn Windows administration, and I think it's unfair to blame them for visiting a website.

One of the weakest things about NT derivatives is that attacks on them prey on ignorance. For example, malicious spoofing requires the fake pop-up (or whatever) to appear familiar. Perhaps the fake uses icons that look familiar, phrases that sound familiar, and colours that match your NT theme. The Mac novice will see these and think, "Why is there a dubious-looking Microsoft pop-up asking me to install updates for Windows on my Mac?" - obviously the Mac novice is in a much stronger position than the Windows novice!

So the NT community strikes back with extremely powerful anti-spyware, anti-malware, anti-virus and recovery applications, plus hardened firewalls. They started with a marginally slower system, they unintentionally slowed it down further thanks to that fragmentation issue, and then they intentionally go and install bucket-loads of defensive applications that bring their top-end computer to a c-r-a-w-l.

No amount of spyware has ever slowed down a computer as much as installing one of the comprehensive NT-defending suites (McAfee, Norton or whatever..), and to make matters worse, each such suite is better at defending against certain types of malware, so people install a second (Ad-Aware, ZoneAlarm, etc.) :eek:

You are all sooooo wrong... I'm not an Apple fanatic! I'm just being an independently-minded realist. If you would like to taste my bias, please install the free super-OS from Haiku.org, which is a product that I am contributing to :D



You forgot Darwin! :mad: :lol:

Good effort :goodjob:

Sadly, however, all of these distributions lack a good display system. X11 was intended for remote desktop administration and it does that very well. The cost is that X11 has a longer message path for many display operations and this makes it less efficient for desktop applications.

Furthermore, X11 itself lacks cycle-saving widgets. Those were put into competing desktop environments (KDE, GNOME, etc.), and applications written for one set of widgets won't work with another, so you need multiple desktop environments, and you soon begin to lose the beautiful efficiency that made the BSDs so sweet.

MacOS does provide an answer to this. Apple have done an excellent job of standardising their display system. While it is closed-source and not free, that might be why it's also standardised *shrug*

The nearest open-source competitor is Haiku - check it out! :)


That's like saying diarrhea is better than constipation. Both are :):):):).


Most people do not have a choice.

There are client-restricting applications for network security, and these are often paired with an anti-virus suite. If your computer does not satisfy your fickle employer's administrative desires, you might be denied access to your employer's WLAN :eek:

People are typically Windows users, so they download and install the required anti-virus package (which might be McAfee or some other big name). Their licence is paid for by their employer.

The same extends to most universities. Soon we will have no freedom at all! :cry:

Yeah, that's a lot of old Windows 98 tricks, lol; you fail to understand a modern OS. As for the swap file: if you want truly responsive, buy 8-12 GB of RAM and turn it off. I've been running without one for 6 months with no problems, thanks to 8 GB of RAM. Even Apple and Linux use a swap file, and both of those systems are easier to crack than Windows; the market share doesn't exist, so there's less of a target for spyware and malware.

Also, working in the PC field repairing infected systems, I can honestly say your idea about ActiveX is totally nuts: no ActiveX control can install or download without user permission, plain and simple. It's morons that always click Allow on everything. But then again, if Windows wanted to be idiot-proof, we'd hire Steve Jobs to pamper and baby us, and make sure we get sued for tweaking or changing anything.



As for CPU temperature, it comes down to how it's made, basically. There are several different methods used to make CPUs and GPUs. A modern Intel CPU will run stable and fine at 75 C, while a modern AMD chip gets flaky over 70 C. But a GPU can hit 105 C before losing it. Lifespan is 5 years for CPUs; that is what the companies aim for. Anything more than 5 years is above spec, and most will do this. Under 5 years means you're either running out of spec, overheating, or it's a bad chip.
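If you'd rather measure than guess, here's a sketch for watching your own sensors; it assumes Linux plus the third-party psutil package, since psutil's sensors_temperatures() isn't available on Windows:

[CODE]
# Sketch: dump hardware temperature sensors. Linux only; assumes
# the third-party psutil package (pip install psutil).
import psutil

for chip, sensors in psutil.sensors_temperatures().items():
    for sensor in sensors:
        label = sensor.label or chip
        # high/critical are driver-reported thresholds and may be
        # None on some chips.
        print(f"{label:<20} {sensor.current:5.1f} C "
              f"(high={sensor.high}, critical={sensor.critical})")
[/CODE]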
 
Yeah, but it's relevant, and there was a lot of bad advice in this thread.
 
Well, I'd like to point out that you are wrong about Linux and Mac being easier to crack. So far it has been shown that, while possible, exploits on Linux and Macs are far fewer. The ease with which Windows can be infected and exploited is pretty mind-boggling.

You would also be wrong about ActiveX. There was a point when Microsoft made controls automatically download and install without any user input. That has since been mitigated, but on older systems it can still happen.

There is no such thing as idiot-proof. Someone will always be stupid enough to break something.
 
Wrong. Even with Windows 98, when ActiveX started with IE 5.0, you had the option to allow or deny; it was never automatic. Also, it's been shown repeatedly how easy it is to break into a Mac and Linux; you don't need a lot of exploits, just one, to bring it down, and because Apple and the Linux community continue to ignore how easily they can be brought down, it does make them less secure overall. Let MacOS get market share to rival Windows, or even come close, and then it will be as if a million Mac fanbois screamed out in terror and were suddenly silenced.
 
Ease is not only about how fast or how surely you can crack a system, but also how much you can exploit it.

Linux exploits are also patched out fairly quickly. Once a security hole is disclosed, a workaround or a patch is found pretty fast.
 
Only in the distros targeted at businesses and large corporations; for other users, not so often or likely, unless you can compile the code yourself so it can be used in your distro. Sorry, but Linux and Mac for home users are grossly insecure, and it wouldn't take much of a programmer to bring either to their knees.
 
Businesses are where the vast majority of Linux installs are.

Home users also increasingly have the option of updates, which can include patches for exploits.

Yes, actually you can: set the size to 0-0 MB.

Even if you are correct, turning off the pagefile is not a good idea. There are programs that will not work at all without a pagefile.
 
So then please tell me why my computer is doing just fine without it. The page file is called upon only when the computer runs out of available memory; with 8 GB of RAM you will not run into such a problem in day-to-day use. I personally use this method and also do 3D modeling with no page file, with Photoshop open at the same time for doing my UV mapping, along with a model viewer for the game I export the models to, so I can see how they look rendered in the model format the game uses and see what needs tweaking. Still haven't gotten an out-of-memory error.
 
Dunno if Windows Task Manager shows the number of page faults, but in case it does: each page fault is an access of the page file (in general).

Windows is built in such a way that it practically requires a page file. As such, setting it to 0.0 MB does nothing, like Zelig said.
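If your Task Manager doesn't show that column, here's a sketch that lists the counters; it assumes the third-party psutil package, whose memory_info() exposes a num_page_faults field on Windows. Bear in mind the counter includes soft faults that are satisfied from RAM, so on its own it isn't proof of page file traffic:

[CODE]
# Sketch: print cumulative page faults per process on Windows.
# Assumes the third-party psutil package (pip install psutil).
import psutil

for proc in psutil.process_iter():
    try:
        mem = proc.memory_info()  # has num_page_faults on Windows
        print(f"{proc.name():<30} {mem.num_page_faults:>12}")
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue  # some system processes are off-limits
[/CODE]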
 
Want to see if you have a page file? It's very easy, lol. Check the size of your disk space against the actual data on it: reveal hidden files and folders and protected system files (except pagefile.sys), select everything, and hit Properties. Now compare sizes; if both line up, you have it.

Also, first move the page file from drive C to drive D (this requires 2 partitions or 2 drives). Now disable the page file on drive C and reboot; the page file is now on drive D. Now go into Disk Management, unmount volume D, then restart.

Go back to System Properties: it hasn't turned paging back on for drive C, because it's looking for drive D, which no longer exists. At this point, go into the registry and delete the entry for the page file on drive D, then reboot and remount volume D.

The page file is deactivated and you just fooled Windows, and no, it has no ill effect. The page file exists because most computers don't support enough system RAM to run without it; it stores data that can't fit in memory and is of low priority to your current tasks.

What's the effect? You will consistently be using your memory, and the only seek time on the hard drive is for accessing data to be loaded into memory. Without enough system memory, though, this will cause a system crash.

And like I said, the page file is used only when system memory is not sufficient. Virtual memory is another story and is not the page file. Windows creates a 1 GB hidden volume called virtual memory; this is not disclosed in the install, and the size can only be changed by playing inside the registry. This is why a Windows 32-bit system can only really access 3 GB of RAM, not 4 GB: because 1 GB is virtual memory on a partition you can't see, one that only shows up with third-party programs that can examine the disk closely. But destroying that partition will cause the OS not to boot, because that virtual partition is part of the EM386 that allows computers to operate in extended mode while processing in real mode on the CPU itself.
 
You really have your facts wrong. A paging file is always used. The system pages data in memory out to disk once it has been idle for long enough. This allows processes that are actively using memory to have access to the fast RAM rather than the slower disk. Like I said, check your Task Manager. If it shows the number of page faults, you will see that even with your supposed lack of a paging file, the processes are still getting page faults. If there were no page file, where would the pages be going? Nowhere, according to you. Your processes would quickly consume all of your memory if that were true.

Virtual memory is also exactly what the paging file is a part of. Virtual memory maps physical RAM and disk paging together so that multiple processes can each use the same RAM space. Wiki has a pretty good overview of what virtual memory is and how it works. You might want to check it out.
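One way to see the relationship for yourself: the commit limit Windows reports is roughly physical RAM plus the total page file size, so shrinking or removing the page file shrinks that limit. A sketch, assuming Windows and Python's standard ctypes module calling the Win32 GlobalMemoryStatusEx API:

[CODE]
# Sketch: query physical RAM vs. the commit limit via the Win32
# API GlobalMemoryStatusEx. Windows only; ctypes is stdlib.
import ctypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_ulong),
        ("dwMemoryLoad", ctypes.c_ulong),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),  # commit limit
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

GB = 1024 ** 3
print(f"Physical RAM : {status.ullTotalPhys / GB:.1f} GB")
print(f"Commit limit : {status.ullTotalPageFile / GB:.1f} GB")
[/CODE]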

Windows x86 can also access more than 3 GB. I have seen 3.6 GB of RAM available; 3.3 is pretty common. The shortfall below 4 GB is address space reserved for hardware, not some hidden partition.
 
This statement is false.



This statement is also false.


No, you're confusing virtual memory with the page file, which isn't what MS is referencing with virtual memory. Reread my post; it explains further, so you can better understand what the virtual memory they are talking about is.

Remember, all CPUs execute Windows in 386 extended mode while processing data in real mode. Real mode is 16-bit, while extended mode supports 32-bit, but with massive delays in processing because of the inherent problems of the x86 design. So a virtual memory partition is created to handle this; on modern computers it is a 1 GB virtual memory partition. The virtual memory is part of EM386: it creates a virtual memory address space on the local disk to support more than 1 GB of memory, because x86 at its heart is 16-bit in its execution, and it allows extended-mode applications to transfer between extended and real mode, thus eliminating the inherent delay of operating in extended mode alone.

This was done with the 486 and saw use in Windows 3.1. Before that, you either used real mode, which was 16-bit, or extended mode, which was 32-bit, but extended mode was a good 200% slower to execute data. The 486 allowed the computer, with the right programming, to switch between real and extended mode, and this is the virtual memory Microsoft refers to. Without this virtual memory address space, Windows won't work, and you can't turn it off; if you do, with a low-level disk tool, Windows will fail to work.
 
Even if you are correct, turning off the pagefile is not a good idea. There are programs that will not work at all without a pagefile.

I can confirm this. On this computer, I had the pagefile set very low temporarily (I forget exactly why), and when I went to open an older game, it told me that the "swap file" was set too low.
 
You really have your facts wrong. A paging file is always used. The system pages data in memory out to disk once it has been idle for long enough. This allows processes that are actively using memory to have access to the fast RAM rather than the slower disk. Like I said, check your Task Manager. If it shows the number of page faults, you will see that even with your supposed lack of a paging file, the processes are still getting page faults. If there were no page file, where would the pages be going? Nowhere, according to you. Your processes would quickly consume all of your memory if that were true.

Windows x86 can also access more than 3 GB. I have seen 3.6 GB of RAM available; 3.3 is pretty common.

No, that's your confusion, I'm afraid. With no page file, if you consume your system RAM, Windows hangs with an out-of-memory error, you know, the error we used to see with Windows 95 and 98. With 8 GB of RAM, if you have a page file turned on, you will see no use of it from Windows, because you are not using all 8 GB. I can have 4 web browsers open, a 3D modeling app, Photoshop, and WMP playing a DVD, and memory usage shows 6.2 GB of the 8 GB I have, and that includes other background processes. I have gotten an out-of-memory error once; that was a bad site I visited that overwhelmed Firefox with popups, sending me out of memory, but it would have brought any PC with a page file on to a halt as well.
 