I read this piece about "code rot" this morning. Curious.
Every Windows PC I have ever owned has seemed to get slower over time - and that includes servers. Defragging the disk or rebooting sometimes speeds it up a bit, but mostly not. People presumably assume this is just how computers are supposed to be.
Well, it isn't.
Apple kit doesn't seem to have this problem, nor, apparently, does Linux. I recall that back in the '90s our trusty NetWare server ran for months on end without a reboot or any loss of performance.
So is it a real effect, or are we all imagining it? What's your experience?
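One way to move this from anecdote to data is to run the same fixed workload at intervals and log the timings: if a machine is genuinely degrading, the numbers should drift upward over the months. Below is a minimal Python sketch of that idea - the file name perf_log.csv, the benchmark sizes, and the whole approach are illustrative assumptions on my part, not anything from the piece. It is crude (it will pick up cache effects and background load), but run on a schedule it would at least show a trend.

```python
import csv
import datetime
import os
import tempfile
import time


def cpu_benchmark(n=2_000_000):
    """Time a fixed CPU-bound loop; returns seconds elapsed."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return time.perf_counter() - start


def disk_benchmark(size_mb=64):
    """Time writing a temp file and reading it back; returns seconds elapsed.

    Crude by design: OS caching and other activity will add noise, so only
    the long-term trend across many logged runs is meaningful.
    """
    chunk = os.urandom(1024 * 1024)  # 1 MB of random bytes
    start = time.perf_counter()
    with tempfile.NamedTemporaryFile(delete=False) as f:
        for _ in range(size_mb):
            f.write(chunk)
        path = f.name
    with open(path, "rb") as f:
        while f.read(1024 * 1024):
            pass
    os.unlink(path)
    return time.perf_counter() - start


if __name__ == "__main__":
    row = [
        datetime.datetime.now().isoformat(timespec="seconds"),
        f"{cpu_benchmark():.3f}",
        f"{disk_benchmark():.3f}",
    ]
    # Append to a CSV so runs scheduled over weeks or months can be compared.
    with open("perf_log.csv", "a", newline="") as f:
        csv.writer(f).writerow(row)
    print("timestamp, cpu_s, disk_s:", *row)
```

Scheduled weekly (Task Scheduler on Windows, cron elsewhere), a log like this would show whether the slowdown is measurable or whether we're all just imagining it.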