I read this piece about "code rot" this morning. Curious.
Every Windows PC I have ever owned has seemed to get slower as time goes by - and that includes servers. Sometimes defragging the disk and/or a reboot speeds it up a bit - but mostly not. People presumably think this is how computers are supposed to be.
Well, it isn't.
Apple kit doesn't seem to have this problem, and neither, apparently, does Linux. I recall back in the 90s our trusty NetWare server went for months on end without a reboot or any loss of performance.
So is it a real effect, or are we all imagining it? What's your experience?