During a recent email conversation with Damien Mulley on the risks posed to dial-up users, I also discussed with him how the rapid advancement of technology is making older technology more dangerous.  The following is a summary of that conversation, and hopefully it will provide you with some food for thought.

Despite the media splashes regarding the latest and greatest operating systems, database platforms and applications, many companies are at risk from technology that has been around for many years.  Most large companies still rely on old technology, such as Digital VAX servers, to run proprietary business-critical systems.  Some of these systems have been in place since the 90s or indeed the 80s (I know I could have said the last century but that makes ME feel too old).  The number of people with skills in those areas is rapidly dwindling as they retire or indeed die.  On top of that, some of these vendors, Digital for example, no longer exist, so their operating systems and software are not being developed, maintained and patched as regularly as more modern systems, leaving these older platforms more vulnerable.

To compound the issue, many of these old proprietary systems were initially designed and set up when the Web was still a glimmer in Tim Berners-Lee's eye.  Because these systems were designed with no connections to external networks, they were by default that bit more secure: the only people who could reach them were on the company's own internal network, via expensive terminals.  In addition, these systems kept all the data centrally, not distributed across desktops in spreadsheets or personal databases.  But now, with the demand for networks and connectivity, they are invariably connected to the Internet in some manner, with their data scattered to every device on the network, and they face the threats and dangers that brings.

Obsolete hardware also places these older systems at risk.  Some critical business systems may be running on equipment for which spare parts are becoming increasingly hard to find, resulting in prolonged outages while parts are sourced and/or people with the skills to repair the machines are found.

Software manufacturers also continue to rapidly release new versions of their software which can unwittingly force companies into using obsolete, unsupported and as a result insecure systems.  A company that has proprietary in-house systems with a database platform at its core may not be able to fully test the latest release of the database software in time to ensure that it does not break their application or any other dependent systems.  IT managers will focus on keeping the system running rather than taking it down to apply a patch. 

Any issues identified in testing the new release of the database software could force the application to be rewritten which again has to be fully tested.  Depending on the organisation, the skills and resources available and the availability of test systems, this whole cycle can take quite a long time to complete.  I know of companies whose core systems are running on operating systems or database platforms (or indeed both) that are no longer supported by the vendors.  The result is that key systems that are being used in banking environments, financial companies, hospitals or systems managing critical infrastructure are potentially at risk of being more easily compromised because they are running on outdated and as a result unpatched systems.  I know of at least one financial company that is running a critical system on Windows NT Server 4.0.

Of course the counter argument to the above is that new technology is also making things less secure.  Everything now comes with Bluetooth/Wi-Fi enabled so you can connect to anything, anyone, anywhere.  But of course anything and anyone can connect to you.  Web 2.0 is advancing at a rate of knots, with new systems and ways of sharing information exploding around us.  Social networks, Google Docs, etc. enable easier sharing of information, but how secure are these?  We may be running Web 2.0 at the moment but unfortunately we are still at Security 1.0 – if even that.

However, no matter what vintage your systems are, you still need to ensure that they are available and secure.  Our role as information security professionals is not to deploy and use the latest and greatest technology, but to ensure that the business can continue to function in a secure and safe manner.  So to counter the above:

  • Ensure you do a thorough risk assessment and identify the main risks facing those archaic systems.
  • Identify ways you can control those risks such as segmenting your network, controlling traffic to these systems from specific workstations or blocking unwanted traffic using firewalls.
  • Present your findings to the business highlighting the risks, what is currently in place and what needs to be put in place.
  • Get the business to sign off and accept the risks based on the controls you are able to get in place.
  • Ensure you have updated your business continuity plan to factor in how to recover older systems should they fail.
  • Ensure you update your incident response plan in the event those systems are breached.
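As a rough sketch of what the segmentation and traffic-control step above might look like in practice, the iptables rules below restrict access to a legacy host so that only one approved workstation can reach it. The addresses, the port, and the LEGACY chain name are all illustrative assumptions, not taken from any real environment:

```shell
# Dedicated chain for traffic destined for the legacy host
# (10.0.5.20 is an illustrative address for the old system).
iptables -N LEGACY
iptables -A FORWARD -d 10.0.5.20 -j LEGACY

# Allow only the approved admin workstation (10.0.5.10) to
# reach the legacy host's terminal service (telnet, port 23).
iptables -A LEGACY -s 10.0.5.10 -p tcp --dport 23 -j ACCEPT

# Log and drop everything else so unwanted traffic is visible
# in the logs as well as blocked.
iptables -A LEGACY -j LOG --log-prefix "LEGACY-DENY: "
iptables -A LEGACY -j DROP
```

The same idea applies whatever firewall product sits in front of the system: a default-deny rule for the legacy host, with narrow, documented exceptions for the workstations that genuinely need access.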
