European Identity & Cloud Awards 2015
The European Identity & Cloud Awards 2015 were presented by KuppingerCole at the 9th European Identity & Cloud Conference (EIC). These awards honor outstanding projects and initiatives in Identity & Access Management (IAM), Governance, Risk Management and Compliance (GRC), as well as Cloud Security.
Jun 17, 2015: How to Cope with Challenging Identities in a Converged World
Over the past years, the term "Identity Explosion", describing the exponential growth in the number of identities organizations have to deal with, has gained currency. We introduced the need for a new ABC: Agile Business, Connected. While agility is a key business requirement, connected organizations are a consequence of both the digital transformation of business and of mobility and the IoT. As a consequence of this rapid evolution, we also have to transform our understanding of identities and access.
Consent – Context – Consequence
by Martin Kuppinger
Consent and Context: They are about to change the way we do IT. This is not only about security, where context is already of growing relevance. It is about the way we have to construct most applications and services, particularly the ones dealing with consumer-related data and PII in the broadest sense. Consent and context have consequences. Applications must be constructed so that these consequences can be acted upon.
Imagine the EU comes up with tighter privacy regulations in the near future. Imagine you are a service provider or organization dealing with customers in various locations. Imagine your customers being more willing to share data – consent with sharing – when they remain in control of data. Imagine that what Telcos must already do, e.g. in at least some EU countries, becoming mandatory for other industries and countries: Handing over customer data to other Telcos and “forgetting” about large parts of that data rapidly.
There are many different scenarios where regulatory changes or changing expectations of customers mandate changes in applications. Consent (and regulations) increasingly control application behavior.
On the other hand, there is context. Mitigating risks is tightly connected to understanding the user context and acting accordingly. The days of black and white security are past. Depending on the context, an authenticated user might be authorized to do more or less.
Simply put: Consent and context must have consequences in application behavior. Thus, application design (and this includes cloud services) must take consent and context into account. Consent is about following the principles of Privacy by Design. An application designed for privacy can be opened up if the users or regulations allow. This is quite easy when done right, and far easier than, for example, adapting an application to tightening privacy regulations. Context is about risk-based authentication and authorization or, in a broader view, APAM (Adaptive, Policy-based Access Management). Again, if an application is designed for adaptiveness, it can easily react to changing requirements. An application with static security is hard to change.
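To make the idea of adaptive, policy-based access management concrete, here is a minimal sketch of a context-driven authorization decision. The risk factors, weights, and thresholds are invented for illustration; a real deployment would derive them from organizational policy and a far richer context model.

```python
# Sketch of adaptive, policy-based access management (APAM):
# the decision depends on the user's context, not a static yes/no rule.
# All factors, weights, and thresholds below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Context:
    device_known: bool      # is the request from an enrolled device?
    network_trusted: bool   # corporate network vs. unknown network
    geo_unusual: bool       # login from an unusual location?

def risk_score(ctx: Context) -> int:
    """Aggregate a simple numeric risk score from context signals."""
    score = 0
    if not ctx.device_known:
        score += 40
    if not ctx.network_trusted:
        score += 30
    if ctx.geo_unusual:
        score += 30
    return score

def authorize(ctx: Context, sensitive: bool) -> str:
    """Return 'allow', 'step-up-auth', or 'deny' based on risk and resource."""
    score = risk_score(ctx)
    if score >= 70:
        return "deny"
    if score >= 30 or sensitive:
        return "step-up-auth"   # e.g. require a second authentication factor
    return "allow"
```

The point of the sketch is the shape of the decision: the same authenticated user may be allowed more or less depending on context, and an application built around such a policy function can react to changing requirements by changing the policy, not the application.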
Understanding Consent, Context, and Consequences can save organizations – software companies, cloud service providers, and any organization developing its own software – a lot of money. And it’s not only about cost savings, but agility – flexible software makes business more agile and resilient to change and shortens time-to-market.
Venom, or the Return of the Virtualized Living Dead
by Matthias Reinwarth
The older among us might remember a family of portable, magnetic-disk-based storage media with typical capacities ranging from 320 KB to 1.44 MB: the floppy disk. These were introduced in the early 1970s and declined in the late 1990s, so today’s generation of digital natives has most probably never seen this type of media in the wild.
Have you ever thought it possible, in 2015, that your virtual machines, your VM environment, your network and thus potentially your complete IT infrastructure might be threatened by a vulnerable floppy disk controller? Or even worse: by a virtualized floppy disk controller? No? Or that the VM you are running at your trusted virtualization provider might, for the last 11 years, have been in danger of being attacked by an admin of a VM running on the same infrastructure?
But this is exactly what was uncovered this week with the publication of a vulnerability called VENOM, CVE-2015-3456 (VENOM actually being an acronym for “Virtualized Environment Neglected Operations Manipulation”). The vulnerability was identified, diligently documented, and explained by Jason Geffner of CrowdStrike.
Affected virtualization platforms include Xen, VirtualBox and QEMU, but it is the original open source QEMU virtual floppy disk controller code, which has been re-used in several virtualization environments, that has been identified as the origin of the vulnerability.
As a floppy disk controller is still typically included in a VM configuration by default, and the issue lies within the hypervisor code, almost any installation of the identified platforms must be expected to be affected, no matter which underlying host operating system has been chosen. Although no exploits had been documented prior to the publication, this should be expected to change soon.
The immediately required steps are obvious:
- If you are hosting a virtualization platform for yourself or your organization, make sure that you’re running a version that is not affected, or apply the most recent patches. A patch for your host OS and virtualization platform should already be available. And do it now.
- In case you are running one or more virtual machines at providers using one of the affected platforms, make sure that appropriate measures have been taken for mitigating this vulnerability. And do it now!
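The first step above amounts to checking whether the installed hypervisor version already contains the fix. A sketch of such a check follows; the version numbers are illustrative only, since distributions backported the VENOM patch to different version strings, and in practice you should consult your vendor's advisory rather than compare upstream versions.

```shell
#!/bin/sh
# Sketch: compare an installed QEMU version string against the first
# upstream release assumed to contain the FDC fix. Both values below are
# illustrative; real fixed versions vary per distribution backport.
installed="2.2.1"   # e.g. parsed from: qemu-system-x86_64 --version
fixed="2.3.0"       # illustrative upstream release with the fix

# sort -V orders version strings numerically; if the installed version
# sorts strictly before the fixed one, assume it still needs patching.
lowest=$(printf '%s\n%s\n' "$installed" "$fixed" | sort -V | head -n 1)
if [ "$lowest" = "$installed" ] && [ "$installed" != "$fixed" ]; then
  status="needs patching"
else
  status="patched"
fi
echo "$status"
```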
More importantly, this vulnerability again puts a spotlight on the reuse of open source software within other products, especially commercial products or those used widely in commercial environments. Much like the Heartbleed bug or Shellshock, this vulnerability once more proves that blindly relying on the assumed quality of open source code cannot be considered appropriate. This vulnerability has been out in the wild for more than 11 years now.
Open source software comes with the great opportunity of allowing code inspection and verification. But just because code is open does not mean that code is secure unless somebody actually takes a look (or more).
Improving application and code security has to be on the agenda right now. This is true for both commercial and open source software. Appropriate code analysis tools and services for achieving this are available. Intelligent mechanisms for static and dynamic code vulnerability analysis have to be integrated effectively within any relevant software development cycle. This is not a trending topic, but it should be. The responsibility for achieving this is surely a commercial topic, but it is also a political topic and one that has to be discussed in the various OSS communities. VENOM might not be as disruptive as Heartbleed, but the next Heartbleed is out there, and we should try to get at least some of them fixed before they are exploited.
And while we’re at it, why not change the default for including floppy disks in new VMs from “yes” to “no”, just for a start…
Detecting and Averting Threats to Privileged Access
With the most recent security incidents, system administrators and privileged credentials are moving further into the center of attention. The unending stream of headlines about data theft at companies and government agencies shows that these are not isolated cases, but a problem that every organization must confront.
The Future of Federation
Federated authentication is the bedrock of secure cloud access control. It enables organisations to extend their business operations beyond their network boundaries, join identity repositories from multiple sources, and access multiple service providers using the same authentication environment.
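The core of that federation model is a trust decision: a service provider accepts an authentication assertion only if it was issued by an identity provider it has federated with. The following is a minimal conceptual sketch; the issuer names are invented, and a real implementation (SAML or OpenID Connect) would also verify the assertion's signature against the issuer's published keys.

```python
# Conceptual sketch of the federation trust decision. Issuer URLs are
# invented examples; signature verification is deliberately omitted and
# only noted in the comment below.
TRUSTED_ISSUERS = {
    "https://idp.partner-a.example",
    "https://idp.partner-b.example",
}

def accept_assertion(assertion: dict) -> bool:
    """Accept an authentication assertion only from a federated issuer.

    In a real deployment the assertion's signature would additionally be
    verified against the issuer's published keys (e.g. SAML metadata or
    an OIDC JWKS endpoint) before any claims are trusted.
    """
    return (
        assertion.get("issuer") in TRUSTED_ISSUERS
        and not assertion.get("expired", False)
    )
```

The same authentication environment thus serves users across organisational boundaries: adding a partner means adding a trusted issuer, not provisioning a new credential store.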
100%, 80% or 0% security? Make the right choice!
by Martin Kuppinger
Recently, I have had a number of conversations with end user organizations, covering a variety of Information Security topics but all having the same theme: There is a need for certain security approaches, such as strong authentication on mobile devices or secure information sharing. But the projects have been stopped due to security concerns: the strong authentication approach is not as secure as the one currently implemented for desktop systems; some information would need to be stored in the cloud; etc.
That’s right, IT Security people stopped Information Security projects due to security concerns.
The result: There still is 0% security, because nothing has been done yet.
There is the argument that insecure is insecure: either something is perfectly secure or it is insecure. However, following that path, everything is insecure. There are always ways to break security if you just invest sufficient criminal energy.
It is time to move away from our traditional black-and-white approach to security. It is not about being secure or insecure, but, rather, about risk mitigation. Does a technology help in mitigating risk? Is it the best way to achieve that target? Is it a good economic (or mandatory) approach?
When thinking in terms of risk, 80% security is obviously better than 0% security. 100% might seem even better, but can in practice be worse, because it’s costly, cumbersome to use, etc.
It is time to stop IT security people from inhibiting improvements in security and risk mitigation by setting unrealistic security baselines. Start thinking in terms of risk. Then, 80% security now, at fair cost, is commonly better than 0% now or 100% at some point in the future.
Again: there never will be 100% security. We might achieve 99% or 98% (depending on the scale we use), but cost grows exponentially; it tends toward infinity as security approaches 100%.
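That cost behavior can be illustrated with a toy model. The formula below is an assumption made purely for illustration (cost proportional to 1/(1 − s)), not an industry figure, but it captures the argument: each step toward 100% multiplies the cost again, and the cost diverges at the limit.

```python
# Illustrative cost model, not an industry formula: assume the cost of
# reaching security level s grows as 1/(1 - s), diverging as s -> 100%.
def security_cost(s: float, base: float = 1.0) -> float:
    """Modelled relative cost of reaching security level s, with 0 <= s < 1."""
    if not 0.0 <= s < 1.0:
        raise ValueError("security level must be in [0, 1)")
    return base / (1.0 - s)

# Under this model, 80% security costs a few multiples of the baseline,
# while each further step toward 100% multiplies the cost again.
for level in (0.0, 0.8, 0.98, 0.99):
    print(f"{level:.0%} security -> ~{security_cost(level):.0f}x baseline cost")
```

Whatever the true curve looks like in a given organization, the risk-based conclusion is the same: the marginal cost of the last few percent dwarfs the cost of getting to a solid 80%.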
Leadership Brief: Facing the Future: Identity Opportunities for Telcos - 71287
by David Goodman
Telco operators are encountering challenges and opportunities that are shaping the future direction of communications. Faced with the erosion of revenues from the rapid encroachment of the OTT (over-the-top) players into their traditional market strongholds, operators are realising that data represents their most significant asset to provide added value to their customers in the future. Key to this transformation will be how operators manage users’ digital identity data better and position themselves as secure identity brokers/providers in a highly competitive market.
Understanding and Dealing with Macro-Level Risks that Affect your Institution’s Risk Profile
The concept of "think globally, act locally" takes on new meaning in the context of the business risks arising from the IoT, the cloud, and other networked information systems. The local instances of information functions on which businesses increasingly rely are part of data and identity “supply chains”: hybrids of technology and policy that are themselves increasingly part of vast global networks, where individual businesses often perceive a loss of leverage and control and an increase in risk. In effect, federated and cloud-based data and identity functions are being outsourced, like shipping, payroll, accounting, and other company functions that have previously been outsourced to global networks.
Assessing and Mitigating Cloud Risks
The modern reality is that even the most technologically conservative companies are considering shifting some of their valuable assets to the cloud. However, since anyone with a credit card can purchase cloud services with a single click, organisational governance and controls are frequently circumvented. This can create various challenges for organisations that wish to adopt the cloud securely and reliably.
This session will lead you through various approaches on how to assess and mitigate risks for onboarding cloud solutions.
Mike Small - Cloud Risk Assessment
When moving to the use of cloud services, it is most important to take a risk-based approach. However, the process involved is often manual and time-consuming; a tool is needed to enable a more rapid and consistent assessment of the risks involved. This session describes why a risk-based approach to the use of cloud services is needed. It introduces the Cloud Rapid Risk Assessment Tool developed by KuppingerCole to help organizations assess the risks around their use of cloud services in a rapid and repeatable manner.
Olga Kulikova - Dynamic Control Selection Framework for Onboarding Cloud Solutions
This talk proposes a data-driven selection of organisational, technical, contractual and assurance requirements, so secure usage of cloud solutions within the enterprise can be guaranteed. The importance of data oriented control selection is outlined and key control domains are introduced.
Mario Hoffmann - Dynamic Certification of Cloud Ecosystems
Cloud ecosystems are dynamic and flexible enablers for innovative business models. Some business models, especially for the European cloud market, however, still face challenges in security, privacy, and trust. A common approach among cloud providers addressing these challenges is proving one's reliability and trustworthiness through audit certificates. Basically, audit certificates are based on national and/or international as well as business and/or governmental compliance rules. The most prominent certifications in cloud computing are the "Open Certification Framework (OCF)" of the Cloud Security Alliance, EuroCloud's "Star Audit", and the "Certified Cloud Service" provided by TÜV Rheinland, as well as more general certifications following ISO 27001, BSI Grundschutz, ENISA, and NIST.
This session discusses the state of the art of auditing and certifying cloud ecosystems and how current certification catalogues and schemes have to be enhanced to meet future requirements - requirements such as dynamic certification, on-demand-audits, and automatic monitoring and evaluations.
Stefan van Gansbeke - One Step Closer to the Unhackable Enterprise
The threat landscape has become rougher and more wicked. Governments are desperately trying to fight cyber threats, but their efforts will never satisfy the needs. As a company, community, or individual, you remain a vulnerable target. Applying a layered information security strategy can effectively reduce your risk exposure. The key steps to becoming a security-intelligent company, better protected against the next attack, are: define your drivers and long-term security goals; involve your stakeholders; engage your customers, employees, and suppliers; communicate clearly; and achieve your targets by implementing the security roadmap.
Recruiting Customers, Suppliers and Even Competitors to Help Reduce Risk
Various types of shared economic interests and risks create communities of interest (COIs) where separate organizations work together, as in myriad supply chains worldwide. How can COIs come together in structured settings such as technical and policy standards initiatives, government programs, markets, and other regulatory and self-regulatory contexts to identify common needs and to design, develop, and deploy mutually acceptable solutions?