MULTIFACTOR - Too Little, Too Late?

By Peter Rietveld, Domain Architect Identity & Access Management, Ahold Delhaize [Euronext: AD]

Over the last few years, we have been led to believe that the password is rapidly fading into the past and that multifactor, or even passwordless, authentication is the current standard. The real world, however, and particularly the enterprise, is still struggling. How come?

Traditionally, authentication solutions are closed solutions, tightly integrated with what they secure. The core mistake is that the authentication factor itself is the center of all attention: "our biometrics" or "our OTP" is the central storyline of any product. The market offers a plethora of solutions, ranging from science-fiction-like biometrics that read the human heartbeat, via dongles and smartcards, to message-based solutions using mobile phones. They all claim to be strong, yet they all have their downsides or specific weaknesses. The devil is, as always, in the details, and corporate usage is not for the faint-hearted.

For instance, with smartcards, every user needs both a card and a card reader. Biometrics likewise require a sensor on every device, be it for fingerprints, movement, or iris scans. Besides the obvious cost of all these readers, the reader must work with every device that can be used to gain access. So if you rely on the movement sensors in a smartphone, you have no solution for laptops. No access then?

For a smartphone-based one-time password, now very much in vogue, the user must not run the OTP application on the same device used to log in, as in that case there is only one factor. Hence some OTP (one-time password) vendors add device recognition to the mix: the authentication layer also verifies the device used and can block users who generate the soft OTP on the very device they are logging in from.

Today the authentication of choice would be behavioral biometrics plus a time-based one-time password on the smartphone, but that may change next year. On the one hand, security is a driving force; on the other, there is a push from usability. The security requirements change as soon as a new hack or trick appears, and suddenly the fancy feature is a weak link.
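To make the mechanics concrete, here is a minimal sketch of how a soft OTP generator computes a time-based one-time password per RFC 6238, the algorithm behind most authenticator apps. The Base32 secret shown is the RFC's own test key, not a production value:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, period=30, digits=6, now=None):
    """RFC 6238 time-based OTP using HMAC-SHA1 and RFC 4226 truncation."""
    key = base64.b32decode(secret_b32, casefold=True)
    timestamp = time.time() if now is None else now
    counter = struct.pack(">Q", int(timestamp // period))  # 8-byte big-endian step
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T = 59 s
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", digits=8, now=59))  # 94287082
```

Note that both sides only share a static secret and a clock; this is exactly why the phone holding that secret must not be the same device that presents the code.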

Yet the prime hurdle is the human user. Not all users appreciate being mandated to use a private device to log in to a work application, or being forced to use biometrics; lawsuits have been waged, and lost, over this. Another major hurdle is general usability in a less tech-savvy user base, when the users are not just well-educated office workers but ordinary people: blue-collar workers, the elderly, people with low literacy. The days when IT was just office automation are long gone.

Usability is just as directly affected by the growing diversity of connecting devices, first triggered by BYOD and MDM, and now moving on to RPA and IoT. Users today have multiple devices with different technologies, so we must support multiple solutions at the same time. This is the diversity of the IT 'edge' landscape, where cars log on to the network to synchronize the driver's agenda. Of course we can 'enroll' cars too, yet bootstrapping security is extremely error-prone. In the enterprise, it is scalability and variation that block MFA.

This takes us beyond the traditional boundaries of IAM, which is not concerned with devices or non-person accounts. The central paradigm of most authentication, verifying a human user, is no longer valid. The central approach, forcing a single strong method on all users, is just as invalidated today: we cannot force biometrics on everything accessing our systems, as robots don't do biometrics. We must build backdoors, and secure them too.

The capital mistake in MFA today is considering managed devices as secure ('something you have') and unmanaged devices as insecure. A device may be something 'you have' in the physical world, but on the wire the only device 'authentication' is a file on the device, probably a PKI certificate. By MFA theory this would not even qualify as a possession factor, since a file can be copied.

And as a TLS handshake does not perform a DNS lookup for the client, it would at best be an assumed identification. Behind this also lurks an echo of 1990s security thinking, in which a 'managed' device is secure and an unmanaged device is not. A corporate device managed by the IT department today may very well be a Windows 7 machine running IE9 for the support of legacy applications. The only thing 'secured' there is the job of the IT staff.

With the advent of the OATH standard, the pattern started changing in the right direction: products are evolving towards a pluggable model supporting different authentication modules, and towards a dynamic model of conditional access that leverages combinations of methods to ensure adequate assurance levels. The core of the solution we need for multifactor is support for pluggable arrays of factors and a threshold mechanism to weigh their combinations. Anything short of this isn't a solution, just a product.
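As a sketch of that threshold idea (a hypothetical illustration, not any vendor's API; the factor names, weights, and threshold values are all assumptions), each pluggable factor contributes an assurance weight, and access is granted only when the factors that verify successfully add up to the threshold a resource demands:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Factor:
    name: str
    weight: int                      # assurance contribution of this factor
    verify: Callable[[dict], bool]   # checks the evidence presented

def assurance(factors: list, evidence: dict) -> int:
    """Sum the weights of every factor whose verification succeeds."""
    return sum(f.weight for f in factors if f.verify(evidence))

def access_granted(factors: list, evidence: dict, threshold: int) -> bool:
    return assurance(factors, evidence) >= threshold

# Illustrative factor array; real verifiers would call out to credential stores.
factors = [
    Factor("password", 1, lambda e: e.get("password") == "correct horse"),
    Factor("totp",     2, lambda e: e.get("totp") == "492039"),
    Factor("device",   1, lambda e: e.get("device_id") in {"laptop-42"}),
]

# Password plus a recognized device (weight 2) falls short of a
# high-value threshold of 3, which forces an extra factor such as the OTP.
print(access_granted(factors, {"password": "correct horse",
                               "device_id": "laptop-42"}, 3))  # False
```

The point of the model is that no single factor is privileged: a robot can reach the same threshold with a certificate and a network attestation that a human reaches with a password and an OTP.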
