Week’s Technology News – 27th February 2015

Boards acknowledge cyber risk on their 2015 agenda

Back in 2013, following a KPMG report that cyber leaks at FTSE 350 firms were putting the UK’s economic growth and national security at risk, the heads of the UK intelligence agencies MI5 and GCHQ asked leading businesses to take part in a Cyber Governance Health Check.  The results were a stark wake-up call.

As we reported in our blog on 19th December, board engagement is pivotal to the success of any cyber security plan – and to thwarting the eye-popping 80% of 2014’s attacks that were preventable.

The 2015 Cyber Governance Health Check has just been published and reveals that 88% of companies now include cyber risk on their risk register, with more than 58% anticipating increased risk over the next 12 months.  However, only 21% say their boards receive comprehensive information, and only 17% regard themselves as having a full understanding of the risks. This is clearly insufficient in light of ever-tightening data security and compliance requirements.

You do not have to be a FTSE 350 company to want the continued trust of clients and the comfort of up-to-date data security measures.  So wake up and smell the budding roses of 2015 and do your own health check review now:

  • Re-evaluate what your organisation’s unique crown jewels are (key information and data assets), as they may have changed in the last 12 months.
  • Review risk from any 3rd party suppliers and avoid contractual complacency – get into active compliance.
  • Be proactive about risk and create a competitive advantage over rivals.
  • Arrange a ‘pen test’ and get in shape to be security fit for purpose in 2015 (a simple warm-up sketch follows this list).
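
On the pen-test point above: before commissioning a formal engagement, a lightweight warm-up is simply to see which common services your own hosts expose. Here is a minimal Python sketch of that idea – the hostnames and port list are our own placeholders, it is no substitute for a professional test, and it should only ever be pointed at systems you are authorised to probe.

    # Minimal sketch of a warm-up check before commissioning a formal pen test:
    # probe a handful of common ports on hosts you own. The hostnames and port
    # list are placeholders -- only scan systems you are authorised to test.
    import socket

    HOSTS = ["intranet.example.local", "webshop.example.local"]  # hypothetical hosts
    COMMON_PORTS = [21, 22, 23, 80, 443, 445, 3389]              # FTP, SSH, Telnet, HTTP(S), SMB, RDP

    def open_ports(host, ports, timeout=1.0):
        """Return the subset of ports that accept a TCP connection."""
        found = []
        for port in ports:
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    found.append(port)
            except OSError:
                pass  # closed, filtered or unreachable
        return found

    if __name__ == "__main__":
        for host in HOSTS:
            print(host, open_ports(host, COMMON_PORTS))

Anything unexpected that shows up here is a question for your security team – the real value comes from the professional test that follows.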



Windows Server 2003 is dying – but Windows Server 2012 will offer an elixir

Windows Server 2003 reaches end of life on 14th July 2015, when Microsoft ceases support. The effect will be severe for the many businesses still running this server in their data centres, leaving them exposed to cyber attack unless considered steps are taken now to plan an upgrade.

Microsoft’s own recent survey confirmed that there are still some 22 million instances of WS2003 running.

Organisations clearly need to plan their migration strategy – and quickly – if they are going to protect their infrastructure. End of support means no patches, no safe haven and no compliance.  Any company continuing to run WS2003 beyond July will fail regulatory compliance audits, which could result in losing commercial contracts. So delay is not only expensive but highly risky.

Windows Server 2012 R2 brings real advances to the data centre: integrated virtualisation of compute, storage and networking, along with enterprise-class scalability and security.  The cloud options of Microsoft Azure and Office 365 will deliver applications faster, increase productivity and flexibility – and take away risk.

Security implications

  • Software and hardware compatibility – if you are running a mixture of physical and virtualised servers, then priority should go to addressing physical assets, as most WS2003 licences are tied to the physical hardware.
  • Compliance with many industry requirements has moved from a best-practice ‘good to have’ to a mandatory requirement – there is no longer an option.
  • Payment Card Industry Data Security Standard (PCI DSS) v2 and v3 – you will be unable to provide the assurance levels needed to meet PCI requirements.
  • UK Government – connecting to the Public Services Network (PSN), whether through an assured connection or via an Inter Provider Encryption Domain (IPED), will be a headache if updates cannot be applied securely.
  • Industry standards – frameworks such as ISO 27001:2013 and the Cloud Security Alliance controls all require you to ensure your systems and applications are up to date.
  • Disaster recovery and resilience – how do you restart servers that are no longer supported? If DR is key to your business, then migrating is a necessity, albeit potentially an expensive one.

Planning to move

  • Integrate your servers and their lifecycle into your strategy and risk management process.
  • Check what the servers do for you and carry out a data mapping, flow and services exercise.
  • Identify your core assets and rate them against confidentiality, integrity, availability and likelihood of compromise to help future design and investment decisions (a minimal scoring sketch follows this list).
  • Create fit-for-purpose security architecture within your cloud (e.g. should you need to retain rarely used legacy data, create security zones using layered firewalls, ingress and egress controls, file integrity checks and protective monitoring).
  • Test – lots – and then get a 3rd party certified security professional to conduct an ethical hack.
  • Failure to plan is planning to fail – do not let your business suffer by putting your head in the sand.
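
To make the asset-review step above a little more concrete, here is a small, purely illustrative Python sketch that turns confidentiality, integrity, availability and likelihood-of-compromise ratings into a rough migration-priority ranking, with physical WS2003 boxes bumped up the queue. The CSV layout, column names and weights are our own assumptions, not a prescribed methodology.

    # Illustrative sketch only: turn confidentiality / integrity / availability /
    # likelihood ratings (1-5 each) into a rough migration-priority ranking.
    # The CSV layout, column names and weights are assumptions.
    import csv

    WEIGHTS = {"confidentiality": 1.0, "integrity": 1.0,
               "availability": 1.0, "likelihood": 1.5}

    def risk_score(asset):
        """Weighted sum of the four ratings; a higher score means migrate sooner."""
        return sum(float(asset[field]) * weight for field, weight in WEIGHTS.items())

    def prioritise(path):
        """Read an inventory CSV (name, physical, confidentiality, integrity,
        availability, likelihood) and rank assets: physical WS2003 hosts first,
        then by descending risk score."""
        with open(path, newline="") as f:
            assets = list(csv.DictReader(f))
        return sorted(assets,
                      key=lambda a: (a["physical"].lower() == "yes", risk_score(a)),
                      reverse=True)

    if __name__ == "__main__":
        for asset in prioritise("ws2003_inventory.csv"):   # hypothetical inventory file
            print(f'{asset["name"]:<25} physical={asset["physical"]:<4} '
                  f'score={risk_score(asset):.1f}')

Adjust the weights to reflect your own risk appetite – the point is simply to give the board a defensible ordering, not a precise measure.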


This week’s technology news – 20th February 2015


Microsoft enjoys gold in Europe

Microsoft’s VP of Legal &amp; Corporate Affairs, Brad Smith, announced on 16th February 2015 that Microsoft had become the first major cloud provider to adopt ISO 27018 – the world’s first international standard for cloud privacy.

This follows the EU data protection authorities’ endorsement of Microsoft’s gold standard for cloud privacy back in 2014 (see our blog of 17th April 2014).  The new ISO standard creates a uniform, international approach to protecting the privacy of personal data stored in the cloud.

Smith is clearly pleased:  “The British Standards Institute (BSI) has independently verified that in addition to Microsoft Azure, both Office 365 and Dynamics CRM Online are aligned with the standard’s code of practice for the protection of Personally Identifiable Information (PII) in the public cloud”.

Where standards affect business assurance and safeguards to industry, this new ISO is commercially important: ISO 27018 assures enterprise customers that their privacy is protected – and that their data will not be used for advertising.

According to Smith, Microsoft may only process the identifiable data its customers provide and is obliged to tell customers where their data is held and who else is using it (for example, any third parties processing it). Additionally, a company offering cloud services must notify the client if a government requests disclosure of PII.


Google’s CIE says “Don’t get lost in the digital Dark Age”

Vint Cerf, Google’s Chief Internet Evangelist, a “father of the internet” and holder of the U.S. National Medal of Technology, addressed the American Association for the Advancement of Science (AAAS) annual conference in San Jose last week.  His talk aired concerns that all the images and documents we have been saving on computers will eventually be lost – and that future generations will have little or no record of the 21st Century as we enter what he describes as a “digital Dark Age”.

This would occur as hardware and software become obsolete (backward compatibility is not always guaranteed): old formats of documents, presentations or images may not be readable by the latest version of the software, or may not be retrievable from ageing external hard drives.

“The key here is when you move those bits from one place to another, that you still know how to unpack them to correctly interpret the different parts. That is all achievable – if we standardise the descriptions…. We have various formats for digital photographs and movies, and those formats need software to correctly render those objects.  Sometimes the standards we use to produce them fade away and are replaced by other alternatives and then software that is supposed to render images can’t render older formats so the images are no longer visible”.

“Over time, we accumulate vast archives of digital content, but may not actually know what it is.”  As it is unclear what will prove to be the most important data of our generation, Cerf argues that we should preserve as much as possible.

“The solution is to take an X-ray snapshot of the content and the application and the operating system together, with a description of the machine that it runs on, and preserve that for long periods of time. And that digital snapshot will recreate the past in the future.” Cerf calls this digital form ‘Digital Vellum’, to be held on servers in the cloud and accessible as required because the descriptions have been standardised.

Whilst there is no guarantee of Google still being around in the year 3000, the notion is that the captured X-ray snapshot is transportable from one place to another: it could move from, say, Google’s cloud to another cloud, or back onto a personal machine.
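
To make the “standardised description” idea a little more concrete, here is a toy Python sketch – our illustration, not Cerf’s actual Digital Vellum design – that stores a small, machine-readable description alongside a file so that future software stands a chance of interpreting the bits. All the field names are assumptions.

    # Toy illustration (our sketch, not Cerf's Digital Vellum): store a small,
    # machine-readable description alongside a file recording what format it is
    # and what can render it today. All field names are assumptions.
    import hashlib
    import json
    import pathlib
    import platform

    def describe(path, format_name, rendering_software):
        """Write <path>.description.json recording format, software and a checksum."""
        data = pathlib.Path(path).read_bytes()
        description = {
            "file": path,
            "format": format_name,                       # e.g. "JPEG", "ODF text"
            "rendering_software": rendering_software,    # what can open it today
            "captured_on": platform.platform(),          # the machine it was saved on
            "sha256": hashlib.sha256(data).hexdigest(),  # detect silent corruption
        }
        pathlib.Path(path + ".description.json").write_text(
            json.dumps(description, indent=2))

    if __name__ == "__main__":
        describe("holiday_photo.jpg", "JPEG", "ImageViewer 4.2")  # hypothetical inputs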


See video:  http://emp.bbc.co.uk/emp/embed/smpEmbed.html?playlist=http%3A%2F%2Fplaylists.bbc.co.uk%2Fnews%2Fscience-environment-31458902A%2Fplaylist.sxml&title=Net%20pioneer%20warns%20of%20digital%20’Dark%20Age’&product=news

When just one drop IS enough

American company Nanobiosym has shown off its latest mobile diagnostic device, ‘Gene Radar’, which can perform real-time testing on a drop of blood, saliva or other bodily fluid to detect disease.

Using a nanochip in a mobile device, the company claims it provides gold-standard analysis at the DNA/RNA level, replacing the mountainous PCR processing that went before it in medical profiling with a far more efficient approach to viral scanning.  A mobile scanner that can detect whether a person has Ebola, HIV or the flu virus in less than one hour is hugely significant. The technology can be deployed in wearables, smartphones and notebooks, and apps for self-diagnosis are also being developed apace.

Nanobiosym is one of several US companies chasing healthcare business in this sphere, alongside Corgenix (a Microsoft Gold Service Partner) and Nanomix.  Nanobiosym’s CEO, Dr Anita Goel, is passionate about the opportunity for this new technology to truly democratise healthcare and take it to the people, especially in third world countries that lack an industrialised history of, and infrastructure investment in, healthcare.

The personalisation and mobility of this healthcare offering is very exciting. It brings together physics, biomedicine and nanotechnology to diagnose conditions and is viewed by Goel as having the potential to cut the cost of diagnosing some conditions by up to 99% – surely of interest to healthcare boards around the globe, where budgets are forever being squeezed.

The development is eye-catching when, in the West, traditional HIV screening costs around $200 with results taking two weeks – and up to six months in Africa.  The outbreak and spread of Ebola dominated world headlines in 2014 and its impact is still being felt.  The new technology being developed by these companies can detect the disease at very low levels, before a patient even shows symptoms.  In practical terms, scanning for this and other diseases at airports, say, could help contain outbreaks, advise those affected and start proactive treatment, even benefiting future generations.

The company is waiting for approval from the US Food and Drug Administration (FDA) before offering the device for sale.  With diseases like Ebola, it would be a straightforward tick for border agencies keen to control the movement of those affected. However, app-based detection of other diseases such as Parkinson’s or Alzheimer’s carries the health warning that a patient’s very knowledge of the disease could alter their life, decisions and outlook if it were detected pre-symptomatically while there is still no cure.


See video:  http://goo.gl/FcBXoD