Biometrics and privacy: A positive match
How organizations can use biometric technologies and protect individuals’ privacy in the journey to high performance
Contents
Biometric technologies have arrived
Introduction
A biometrics primer
Biometrics and privacy—Checks and balances
Addressing privacy concerns
Stakeholders in the privacy debate
A positive match
Biometric technologies have arrived
Increasingly adopted as a means to improve security, convenience, and inclusion in society, biometric solutions are helping organizations gain greater efficiency at reduced cost. But this steep adoption curve is raising legitimate concerns over how sensitive biometric data will be used and safeguarded.
Protecting individuals’ privacy is a balancing act. On the one hand, frequent media reports on high-profile data breaches, and probing questions from data privacy regulators and public figures, set the tone for a more privacy-conscious world. On the other hand, individuals are increasingly willing to share their private data, a result both of trust that their data will not be misused and of changing cultural attitudes.
In short, citizens are seeking better, more efficient services while being assured that their basic privacy rights are protected. The result is a shift of onus onto organizations that develop or use biometric technologies to understand and address privacy concerns. So how can organizations reap the benefits offered by biometric solutions while preserving—or enhancing—the individual’s right to privacy?
Introduction
Originally captured from criminals or suspects, biometric records were a breakthrough for police investigations and have been used at scale since the 1980s. More recently, however, the benefits of using biometrics to improve security, convenience, and inclusion in society—more efficiently and at reduced cost—are being widely recognized. Today, biometric solutions are being used by the general public in areas such as:
• Border control and immigration (ePassports, border control gates, residence, visa or asylum permits)
• Civil ID (identity cards, health insurance, social benefits schemes, voting rights)
• Banking (Automated Teller Machines, private banking services, mobile banking)
• Shopping (customer personalization, VIP/loyalty enablement, segmentation)
• Policing and security (investigations, custody, watchlisting, surveillance)
• Travel and transportation (automation, frequent flyer recognition)
The number of deployed biometric solutions is growing fast. Indeed, the biometrics market is estimated to grow from $5 billion in 2010 to $17 billion by 2017, a compound annual growth rate of around 18.5 percent.¹ As the adoption curve of new technologies steepens, so does the volume of highly personal biometric data in use worldwide, leading to an increase in privacy concerns. Consequently, every stakeholder in the biometrics business is being drawn into the privacy debate.

¹ TechSci Research, “Global Biometric Systems Market Forecast & Opportunities, 2017,” July 2012
A biometrics primer
The term “biometrics” refers to the automated recognition of individuals based on their physiological and/or behavioral characteristics. Physiological traits commonly used include face, fingerprints, irises, hand or finger vein patterns, and DNA. Behavioral traits may include signature, keystroke dynamics, and gait. Some traits, such as voice, can be both physiological and behavioral.

By using biometrics, it is possible to confirm an individual’s identity based on “who they are,” rather than by documentation—“what they have”—or passwords—“what they know.” These factors can also be combined (“multifactor”) to gain higher levels of security.

A biometric solution will always involve at least one, and often all three, of the following processes:
• Enrollment—the capture and storage of the biometric samples against which all subsequent identity comparisons will be performed
• Verification—the validation of a person’s claimed identity through a one-to-one comparison between the “live” captured biometric data and the corresponding biometric sample data acquired in the enrollment phase; this stage is also known as authentication
• Identification—the determination of an individual’s identity through a one-to-many comparison of a “live” captured biometric sample against the entire enrollment database, without the subject having to claim an identity.

Biometric information is captured as digital images, known as “samples,” and converted to “templates” that can be mathematically compared to each other by various biometric algorithms. Biometric samples and/or templates may be stored by a system at any of the above stages, depending on its design. It is generally accepted that biometric traits cannot be easily transferred between people, and thereby represent a highly secure unique identifier.

An increasing number of biometric solutions involve more than one biometric technology, such as face recognition and hand vein patterns. While these “multi-modal” biometric systems store more biometric data than their uni-modal counterparts, they are generally more accurate and fraud-resistant than single biometric solutions, more future-proof, and permit broader access by all members of the population.
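The distinction between verification (one-to-one) and identification (one-to-many) can be sketched in a few lines of code. This is a hypothetical illustration only: real biometric matchers use modality-specific template formats and scoring algorithms, not the toy feature vectors, Euclidean distance, and assumed threshold used here.

```python
import math

THRESHOLD = 0.8  # assumed decision threshold for declaring a "match"

def match_score(template_a, template_b):
    """Toy similarity between two templates (1.0 means identical)."""
    return 1.0 / (1.0 + math.dist(template_a, template_b))

def verify(live_template, enrolled_template):
    """Verification: one-to-one comparison against a claimed identity."""
    return match_score(live_template, enrolled_template) >= THRESHOLD

def identify(live_template, enrollment_db):
    """Identification: one-to-many search over the whole enrollment database,
    with no identity claimed by the subject."""
    best_id, best_score = None, 0.0
    for person_id, enrolled in enrollment_db.items():
        score = match_score(live_template, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= THRESHOLD else None
```

Note that identification must scan (or index) the entire database, which is why large-scale systems invest heavily in fast search, while verification only compares against a single enrolled record.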
Automated border control gates aim to reduce queues and maintain security at Amsterdam Airport Schiphol’s international terminal
Ranked as Europe’s fourth largest airport, Amsterdam Airport Schiphol processes more than 50 million passengers a year and more than 130,000 tonnes of cargo per month. With expected annual passenger growth of approximately 5 percent, but no anticipated increase to its more than 400 border guards, the Netherlands’ Ministry of the Interior and Kingdom Relations, the Royal Military Police and the Schiphol Group approached Accenture to launch a trial automated border control system at Schiphol airport. Accenture initiated the first phase of a Self-Service Passport Control project in August 2011; instead of European Union travelers having their passports manually checked by a border agent, they can enter a gate and place their passport on an automated device. In a matter of seconds, the device can
read and authenticate the passport while a camera located on the automated border control gate takes a picture of the traveler’s face. The photograph is checked in the system for a biometric match with the picture stored on the ePassport’s chip. The passport data is also used for background checks to identify potential fraud or authentication issues. If all the ePassport checks, background checks and identity verification are successful, the gates open and the traveler is free to continue on their journey. Within the first six weeks of the trial, more than 210,000 passengers had been through the e-gates. What is more, in the first weekend of their use, the gates identified discrepancies in four travelers’ passports that resulted in their being excluded from crossing the border.
US-VISIT enhances security and reduces immigration fraud
Accenture is supporting the United States Visitor and Immigrant Status Indicator Technology (US-VISIT) program to implement a virtual border, which aims to extend immigration and border management to points within the physical borders of the United States and beyond. Decision makers are then better able to distinguish security risks from legitimate individuals, and to stop them before they reach the United States or identify them while they are inside the United States. Accenture assumed operational responsibility for the legacy Automated Biometric Identification System (IDENT) in 2004, and it is now one of the largest biometrics-based programs in the world:
• Handles more than 214 million identifications and verifications to date
• Returns ID matches in less than 10 seconds
• Performs searches across 140 million unique identities in seconds
• An automated 10-fingerprint system has found 45,000 watch-list matches that a two-print system could not catch.
Unique ID Authority of India Aadhaar program protects privacy
The Unique Identification Authority of India’s (UIDAI) Aadhaar program will provide a unique identification number for the nation’s 1.2 billion citizens. The aim is to use the program as an identification framework for various government schemes and to provide financial inclusion for socially disadvantaged citizens. The Aadhaar program is being rolled out over the next decade and aims to process hundreds of thousands of identity validation requests each second against the world’s largest database of individuals. The UID uses multiple types of biometric data for identification, including iris scans, fingerprints for all 10 fingers, and multiple facial images. Since 2011, around 200 million citizens have been enrolled, making it the world’s largest biometrics-based database. The system processes around one million enrollments a day at its peak, using the services of three different biometric service providers, including an Accenture-led consortium.
Biometric benefits
Biometrics can offer various benefits, including:

Enhanced security—the IEEE,² among others, refers to biometrics as the “strongest” of the three authentication factors mentioned previously, due to:
• Strength and non-repudiation—Users select passwords that they can easily remember, and/or write them down. Tokens can often be duplicated, borrowed or stolen. Unlike a password, it is rare that the user will have the option of choosing a “bad” (insecure) biometric feature to enroll. Biometric features are unique, meaning biometric authentication offers limited deniability.
• Usage motivation—Often users are the weakest link in a security process—they may not be interested in, nor see the need for, some security measures. Biometrics can increase adherence to security policies by being less inconvenient and offering a positive user experience. This improved experience can motivate users to take full advantage of the security mechanisms, instead of attempting to circumvent them.

Increased convenience—in general, citizens are not interested in security; rather, they want to go about their business with minimal inconvenience. Biometrics can help them achieve this by enabling faster, more streamlined, and more natural user interaction with a system, mimicking how people recognize and trust people they know. Furthermore, users cannot forget their biometrics, or leave them at home.

Reduced cost—incorporating biometric recognition into a system comes with a certain investment cost; when deployed correctly, it will offer opportunities to reduce operational costs through simplified processes, self-service automation (decreasing the need for manual work), and reduced maintenance (such as removing the need to maintain or replace smartcard tokens and readers).

Greater inclusion—in many developing countries, people live without official identities, or the ability to prove them. Access to many services relies upon the provider being able to identify a person; without an “official” identity it is hard to open a bank account, apply for a loan, or prove eligibility for government benefits. Effective identification of citizens brings benefits for governments, allowing them to maintain service delivery, help optimize service provision, and reduce fraud in areas such as healthcare, food subsidies, border control, or taxation. In short, inclusion-related benefits derived from biometric systems can accrue to both governments and their citizens.

² Institute of Electrical and Electronics Engineers, http://www.ieee.org
The promise of privacy
Privacy can be defined as a person’s right to control the use of their personal information: what data is collected, how it is acquired, how it will be used, and who has access to it. According to Article 12 of the Universal Declaration of Human Rights, privacy is the right of an individual to be free from unsolicited intrusion:

“No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks.”
Article 12, Universal Declaration of Human Rights

However, a steady stream of new technologies has constantly challenged our cultural and legal attitudes towards privacy. With each technological development, fresh concerns arise, along with new ways to address them. For example, historically, privacy concerns were raised around the use of photography or telephony; today, these concerns typically focus on identity theft, data aggregation, online profiling, electronic surveillance, social media—and, indeed, the use of biometrics.
Fears about new technologies have been allayed by a combination of regulation, practice and familiarity. Examining these aspects in more detail:

Regulation—The past 25 years have seen several waves of data privacy regulation around the world, with mature markets revising their legal regimes and emerging markets passing new laws. All these laws regulate—to a greater or lesser extent—how organizations and governments handle personal data, and all are based on common principles, such as:
• Transparency and control—the individual must be told about the use of their personal data and must be able to control that use
• Purpose limitation—the personal data can only be used for the original purpose
• Security—the personal data must be protected against unauthorized or accidental disclosure
• Data quality—the personal data collected must be appropriate for the task in hand, accurate and kept up to date.
Practice—Practice has changed. Governments and organizations are much more aware than they used to be, not only of the danger of ignoring data privacy requirements, but also of the reputational and commercial benefits of handling personal data wisely.

Familiarity—Individuals have had to familiarize themselves with an ever-changing world: global outsourcing of business processes, the Internet and social media have all been embraced and absorbed by a responsive public. Indeed, biometrics is “just another frontier” in the changing technology landscape.
Biometrics and privacy— Checks and balances So how can organizations reap the benefits offered by biometric solutions while preserving or enhancing the individual’s right to privacy?
Checks...
Citizens are rightly concerned that their biometric data is stored safely and used only for the purpose intended. The introduction of biometric solutions often triggers a range of privacy questions, from age-old Orwellian associations around Big Brother and surveillance states, to more practical questions such as:
• Why is my data needed?
• How will my data be used in the future?
• How reliable is biometric identification?
• How will my data be safeguarded?
• What if my fingerprints are stolen?

...and balances
On the other hand, by providing increased security to systems that hold sensitive data, such as medical or tax records, biometrics may help to increase the privacy of individuals. Similarly, implementing biometric security for retail payments, banking, or other services could help to reduce the impact of identity theft. As a consequence, many now regard biometrics as a “privacy-enhancing technology” (PET). So where should we draw the line? Let us explore further the relationship between biometrics and privacy, considering the various stakeholders in the privacy debate, their specific perspectives, and the possible approaches that could be applied.
Addressing privacy concerns
The privacy concerns commonly raised about biometrics fall broadly into two groups: personal invasiveness, and process and usage.

Personal invasiveness
• Societal or cultural—biometric collection can be seen as undignified, perhaps even stigmatizing, for those from whom the biometric is collected, depending on the method (for example, “criminalizing” them, in the case of fingerprint capture, or perhaps being religiously or traditionally unacceptable in the case of face capture).
• Invasiveness or hygiene—capture techniques for some biometrics are relatively invasive and sometimes even uncomfortable (for example, retinal scanning, or DNA collection). Many capture techniques require the user to at least touch something, obvious exceptions being voice, iris, face, and some specialized vein or fingerprint devices.
• Trust or control—some individuals perceive a loss of freedom if the use of biometrics becomes widespread. Citizens question whether their governments will securely store their biometric data, use it correctly, and allow them to correct or otherwise control it. In addition, they question whether their right to anonymity will be safeguarded.
• Revealing additional personal information—there is a possibility of identifying additional personal data from biometric information; for example, fingerprint scans can show traces of dermal disease or worn papillary structures caused by hard labor.

Process and usage
• Function creep (also known as scope creep)—describes the situation when individuals’ biometric information is used for purposes other than those initially intended (and agreed to by the users), perhaps due to sharing between external systems or entities. For example, consider a welfare system that requires a finger scan to enroll. Let us assume that guarantees were made to the user at enrollment that the finger scan is being collected solely for the purpose of guarding against benefits fraud—checking that the user is not already registered for welfare. If the finger scan were subsequently used for another purpose (perhaps law enforcement), this would constitute “function creep,” and can cause significant public unease.
• Data leaks—leaks occur when the entities handling the biometric data do not protect it sufficiently, and it “leaks” outside their organizations (for example, there are regular reports of organizations notifying their customers that their name, address, password, or worse, banking or credit card data, has been compromised). This concern is not specific to biometrics, but extends to any sensitive data. However, people may be more concerned when their biometric data goes missing because, unlike a PIN code, it cannot be “reissued.”
• International data exchange—it is easy to imagine how biometric data could be exchanged with, or transferred for processing to, countries with weaker or nonexistent data privacy legislation. Concerns about the subsequent uses of that data, and about the existence of real protection and rights for individuals, are all too real.
• Spoofing—the act of fooling biometric systems to either impersonate someone else or falsely go undetected, sometimes achieved with false biometrics (for example, fingers made of gelatine, contact lenses, and so on).
• Irrevocability—the inability to change biometric features (“I only have 10 fingers”), as they are permanent (or semi-permanent), and the ensuing risk to the individual if their biometric data is lost or stolen.
In addition to biometric-specific concerns, we should also acknowledge a general “suspicion of any new technology” that many citizens hold.
Stakeholders in the privacy debate A range of stakeholders have valid—and sometimes conflicting—concerns when it comes to matters of privacy. The main stakeholders include:
Citizens and consumers
Given the highly personal nature of biometric information, individuals are understandably concerned about the capture and use of this data, and that of their families, friends, and communities. These concerns will, in turn, determine the take-up and eventual success of any biometric solution. In the real world, user acceptance of biometrics, which often determines the difference between success and failure, manifests on a scale ranging from sabotage or boycott to enthusiasm. Clearly, for a biometric solution to be most effective, users need to be comfortable with the privacy implications and understand—and value—the benefits on offer.
Governments
Governments are both major producers and consumers of citizen identity data; as public authorities, they also have a responsibility to protect the privacy of those they represent. Governments use biometrics to provide efficient and secure access to citizen services, through reliable identification of individuals. The public sector is currently the largest market for biometric applications worldwide, in areas such as identity cards, social benefits, immigration and border control, and voting. Governments are typically bound by the same privacy laws as private sector organizations. If anything, they face a greater degree of scrutiny from regulatory bodies such as Data Privacy Commissions, due not only to the vast amount of personal information to which they have access, but also to fears of a “surveillance state.”

Privacy regulators
Typically, privacy regulators are publicly-funded independent authorities set up to enforce data privacy laws, individuals’ rights and organizations’ duties. Regulatory oversight comes in different forms, ranging from data privacy commissioners and government ministries (for example, labor or trade), to province- or state-level bodies (for example, in the United States, Canada, Germany, and Switzerland), and other bodies (for example, the Financial Services Authority in the United Kingdom and the Federal Trade Commission in the United States). Recently, these bodies have been involved in assessing and promoting the privacy of citizens’ biometric data. Their powers are significant, and there are instances where such authorities have prevented specific biometric systems deemed “disproportionate” (overly invasive relative to the benefit they deliver) from going live.
Privacy advocate organizations
A number of not-for-profit privacy-advocate organizations are active on a broad range of privacy topics, and are usually highly vocal where biometric or other identification technologies are concerned. They typically cite the “intrusion into civil liberties,” costs, and identity fraud risks as reasons for discouraging the adoption of biometrics, particularly the growth of biometric databases against which individuals can be matched.
Private enterprises
Increasingly, businesses wish to recognize their customers, using individuals’ identity and biometric data for various legitimate purposes. Biometric technologies are seen as a powerful tool to help improve customer service levels, as they enable automated and efficient identification of individuals for services such as personalization, self-service provision, and fraud prevention. With this adoption comes a need to handle the data securely, legally and fairly. Private enterprise has long understood that the security of the customer data it holds is paramount. This responsibility is compounded by the inclusion of biometric data, due to its great sensitivity and the highly international environment in which these enterprises operate.
Clear direction—and enforcement—from governance bodies is key to helping private enterprise use and handle biometric data appropriately. That said, it is essential that the standards and regulations applied should be proportionate, to avoid overextending implementation costs.
Technology suppliers
Technology suppliers have an interest both in the general health and growth of the biometric industry, and in supplying their biometric software or hardware products. The biometric marketplace is highly dynamic, with vendors operating on rapid innovation cycles to bring the latest technology to market and reduce implementation costs. In this environment, it is often hard for customers who lack detailed technical knowledge to differentiate between real issues and “scare tactics,” or between valid privacy-protection strategies and “snake-oil” products. Careful, informed assessment is required.
Managing mitigations
Taking account of the many concerns discussed might discourage organizations from implementing biometric solutions altogether. However, a number of potential mitigations can help address those concerns, including:
Procedural
Procedural mitigations include those measures that can be taken to reduce the impact or likelihood of privacy intrusion through good design or operational process.

Biometric modality selection
A basic question for any system using biometrics, but one for which the answer is too often assumed, is: which biometric modalities should the system use? As noted previously, certain modalities (such as fingerprints) raise greater privacy concerns than others (such as signature), as illustrated in Figure 1. In this instance, adoption of less-privacy-sensitive modalities such as signature, face, or voice biometrics can be an effective and straightforward measure to address privacy concerns, provided they are compatible with the solution objectives.

Quantifying biometric privacy impacts
Various bodies worldwide have developed frameworks that attempt to quantify the privacy risks associated with new systems, products or services.
• Many data privacy commissioners, such as those in Canada, Ontario and the United Kingdom, encourage the use of privacy impact assessments (PIAs) and have published frameworks³ designed to help project managers identify privacy risks, avoid inadequate mitigation strategies, inform their communication strategy, and instill confidence in their solution. PIAs are increasingly widely used, for example, by all federal bodies in the United States, and may soon be required by the proposed European data protection regulation for certain types of data processing.
• An alternative is the BioPrivacy Application Impact Framework (AIF)⁴ created by the International Biometrics Group (IBG). The BioPrivacy AIF uses 10 “Technology Risk Ratings” across four categories (Does the system perform verification or identification? Is it overt or covert? Are the technologies used based on behavioral or physiological traits? Are the underlying databases searchable?) to provide an overall privacy risk rating for the system.

A key factor in these assessments is the concept of “proportionality”: personal information, including biometrics, should only be collected if truly required, and its collection and use must be appropriate for the goals to be achieved. For example, for mission-critical systems, collection and storage of multiple biometrics, with a correspondingly significant privacy impact, may be acceptable; for convenience systems, it likely will not.
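The shape of such an assessment framework can be sketched as a simple weighted questionnaire. The questions echo the AIF categories mentioned above, but the weights and rating bands below are invented for illustration and are not IBG’s actual BioPrivacy ratings.

```python
# Hypothetical risk weights per yes/no question; higher means riskier.
RISK_QUESTIONS = {
    "performs_identification": 3,  # 1:N search is riskier than 1:1 verification
    "covert_capture": 3,           # capture without the subject's knowledge
    "physiological_trait": 1,      # physical traits are harder to change
    "searchable_database": 2,      # central, queryable template store
}

def privacy_risk_rating(answers):
    """Combine yes/no answers into a coarse overall privacy risk rating."""
    score = sum(weight for q, weight in RISK_QUESTIONS.items() if answers.get(q))
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

For instance, an overt 1:1 verification system with no central database would score "low," while a covert 1:N surveillance system against a searchable database would score "high," mirroring the proportionality judgment described above.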
Figure 1. Biometric modality options
Impact assessment frameworks are very helpful in terms of the structure they bring to the privacy discussion, and in developing an understanding of the degree of privacy risk that a specific biometric system presents. Furthermore, it is worth noting that completion of a particular assessment framework may be mandatory for some biometric implementations in certain geographies.

Privacy by design
The “privacy by design” principle refers to the embedding of privacy and security measures throughout the technology life cycle, from early-stage design to deployment, use and ultimate disposal. Here, engineers and privacy professionals are brought together at the early stage of development to “code” privacy requirements into every aspect of the product or service. For example, default settings on websites should be constructed so that users do not have to change anything to protect their privacy. Transparency and control should be built into processes to empower individuals to understand the uses and disclosures of their personal data, and they should be equipped with easy-to-use tools to exercise control over their data. The concept, championed by Dr. Ann Cavoukian, the Ontario Information and Privacy Commissioner, in the mid-1990s, and now enjoying widespread adoption, relies heavily on the idea that businesses can obtain a competitive advantage and realize commercial gains through sound data privacy practices.
Privacy by design lends itself well to the development of biometric solutions. By focusing early in the design phase on the protection of the biometric data, as well as transparency and control for the user, developers can mitigate or eliminate many of the common apprehensions expressed by other stakeholders. Biometric data can be protected at all stages of use (including collection, storage, transmission, and matching), with access to biometric system functions and data limited to defined personnel under specific conditions, and with explicit controls on usage and export built into the system. Separation of biometric data from other personal information is also an example of a privacy by design measure, as discussed in the following section on data anonymization.

Data anonymization
Data anonymization refers to the enforced full or partial separation of biometric and other personal data, such as name or related identity numbers. In principle, this means that if the biometric database is lost, users’ privacy is not, since there is no way to relate the biometrics to individuals, given the absence of a global and complete biometric identification service. In practice, the effectiveness of this measure depends on how the links between biometric and other personal data are held, and how the databases are hosted and secured; if all can be compromised, then the measure is worthless.

Scope control
Firm control of a biometric solution’s scope is key to addressing concerns around function creep: the use of data for a purpose beyond that originally declared. As well as being a leading practice implemented by effective project managers, scope control of systems that process personal data is increasingly mandated by legislation, as described in the following section on regulatory measures.

Auditing, accountability, and oversight
Any biometric system will require a strict governance structure to maintain correct and lawful use. Additional scrutiny by appropriate organizations, such as the regional privacy regulator, may also be required, depending on the jurisdiction and the nature of the system itself.

Inspection and redress processes
Increased focus is being placed on the rights of citizens and consumers to view and correct their personal data, and this applies also to biometric data. In practice, it can be difficult for individuals to inspect and make sense of their biometric traits: are individuals aware of their own iris image, for example? However, the principle is still valid, and we expect end-users to become increasingly aware of biometric technologies, and their own biometric characteristics, in the future.
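The data anonymization measure described earlier, separating biometric templates from other personal data, can be sketched as two stores linked only by an opaque random token. This is a minimal illustration under assumed names (`biometric_store`, `identity_store`, `enroll`): a real design would place the stores on separately secured and access-controlled systems.

```python
import secrets

biometric_store = {}  # token -> template; contains no names or ID numbers
identity_store = {}   # person_id -> personal details plus the link token

def enroll(person_id, personal_details, template):
    """Store the template under a random token; keep the link beside the PII."""
    token = secrets.token_hex(16)  # opaque link, meaningless on its own
    biometric_store[token] = template
    identity_store[person_id] = {"details": personal_details, "token": token}
    return token

def template_for(person_id):
    """Authorized lookup: resolve the link token, then fetch the template."""
    return biometric_store[identity_store[person_id]["token"]]
```

The point of the separation is that a compromise of the biometric store alone yields anonymous templates; relating them to individuals additionally requires the separately held link data, which is why the security of that link determines the measure’s real value.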
Educational
Users’ confusion, misunderstandings, myths or misconceptions can often blight system adoption, and myths surrounding biometrics abound. To an extent, there is a duty of care incumbent on system builders to protect their users. As a consequence, effective and clear communication on what a system will be used for, how biometric data will be collected and handled, and how individuals can address their concerns, is key to successful user uptake, yet often overlooked.
Technological
An ever-increasing range of technological capabilities is available to help systems developers enhance the privacy protection of their solutions, beyond traditional data security measures such as encryption.

Revocable/cancellable/one-time algorithms
The main difference between biometric authentication and conventional authentication methods, such as username and password, is that biometric traits are intrinsically linked to our bodies and cannot be replaced. Various capabilities have been proposed to mitigate this specific risk, some more effective than others. In principle, these capabilities provide mechanisms to enable biometric templates to be revoked when they are believed to be compromised, either at the request of the user or of the biometric data processor. Once a template is revoked, it is no longer valid for use in the system. A natural consequence of revocation is the need for a biometric template reissuance process, which enables the user to access the system again by means of a newly issued template. To date, there are few known instances of deployed systems using this family of algorithms, most probably because of the additional cost and complexity they bring to a system’s design compared with the perceived risk they address.

Privacy-protecting algorithms
Privacy-protecting algorithms, also known as “template protection algorithms,” are specialized biometric algorithms that create encrypted templates from biometric images. The original images cannot be recovered from these templates, protecting the privacy of their “owners,” yet the templates can still be used for accurate biometric matching. Template protection algorithms are available commercially from a range of vendors. Implementing such algorithms can help protect the system against potential attacks, and can improve user or regulator acceptance where appropriate education is provided. The key disadvantage of privacy-protecting algorithms, aside from the additional cost and complexity they bring, is vendor lock-in: the encrypted templates can only be matched by a vendor-specific matching algorithm. Consequently, they remove the system builder’s ability to select from different, potentially more accurate, efficient, or cost-effective matching algorithms and strategies.

Spoof/liveness detection
Spoofing attacks occur when an imposter seeks to be recognized as a different, legitimate user by replicating that user’s biometric characteristics (or, less frequently, when imposters seek to obscure their own biometrics so that they are not recognized as themselves).
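As an illustration, the revocable-template and template-protection ideas above can be combined in a BioHashing-style construction: a user-specific salt drives a one-way random projection of the biometric feature vector, so revoking a compromised template is simply a matter of issuing a new salt. The sketch below is illustrative only; the function names, the 64-bit template size, the matching threshold, and the projection scheme are our own assumptions, not any vendor’s actual algorithm.

```python
import numpy as np

def make_template(feature_vec, user_salt, n_bits=64):
    """Derive a cancellable binary template from a biometric feature
    vector via a salted random projection (BioHashing-style sketch).
    The salt seeds the projection, so revocation means issuing the
    user a new salt; the raw biometric never has to change."""
    rng = np.random.default_rng(user_salt)
    # Random projection matrix derived from the user's salt.
    proj = rng.standard_normal((n_bits, feature_vec.size))
    # Binarise: keep only the sign of each projected coordinate
    # (a lossy, one-way step -- the features cannot be recovered).
    return (proj @ feature_vec > 0).astype(np.uint8)

def match(t1, t2, threshold=0.25):
    """Match two templates by normalised Hamming distance."""
    return np.mean(t1 != t2) <= threshold

rng = np.random.default_rng(0)
enrolled = rng.standard_normal(128)                    # stand-in for extracted features
fresh    = enrolled + 0.1 * rng.standard_normal(128)   # same user, noisy re-capture
imposter = rng.standard_normal(128)                    # different user

salt = 42
assert match(make_template(enrolled, salt), make_template(fresh, salt))
assert not match(make_template(enrolled, salt), make_template(imposter, salt))
# Revocation: a new salt invalidates the old template for the same user.
assert not match(make_template(enrolled, salt), make_template(enrolled, salt + 1))
```

Because the stored template is only the sign pattern of a salted projection, the original feature vector cannot be reconstructed from it, yet noisy re-captures of the same trait still match under the distance threshold. The vendor lock-in concern noted above applies here too: templates produced with one projection scheme can only be matched by the same scheme.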
Biometric fraud detection and anti-spoofing capabilities are improving continually, and their increasing implementation in live systems should reassure users that their privacy will not be compromised. Even if citizens’ biometric data is stolen from a database, it cannot be used in a replay-style attack where effective anti-spoofing measures are in place.
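One common anti-replay pattern behind such measures can be sketched as a nonce-based challenge-response. The sketch below assumes a trusted sensor that performs liveness detection and matching on-device and then attests the result to the server; the function names and message format are our own illustration, not a specific product’s protocol.

```python
import hmac, hashlib, secrets

def issue_challenge():
    # Server side: a fresh, single-use nonce per authentication attempt.
    return secrets.token_bytes(16)

def sensor_attest(nonce, device_key):
    # Trusted sensor: after its own liveness check and on-device match
    # succeed, it attests the result, cryptographically bound to the nonce.
    return hmac.new(device_key, b"LIVE-MATCH-OK" + nonce, hashlib.sha256).digest()

def server_verify(nonce, attestation, device_key):
    # Server recomputes the expected attestation for *this* nonce only.
    expected = hmac.new(device_key, b"LIVE-MATCH-OK" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, attestation)

device_key = secrets.token_bytes(32)  # shared with the sensor at enrolment

n1 = issue_challenge()
a1 = sensor_attest(n1, device_key)
assert server_verify(n1, a1, device_key)       # genuine, fresh response accepted

n2 = issue_challenge()
assert not server_verify(n2, a1, device_key)   # captured response replayed: rejected
```

Because every authentication is bound to a nonce that is never reused, stolen biometric data or an intercepted response from an earlier session is useless to an attacker in a later one.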
Regulatory
Regulators not only set and enforce rules for how personal data should be processed, but are also starting to move toward a new approach whereby organizations are rewarded for the effectiveness of their data privacy compliance programs. Evidence of this approach can be found in the increased promotion of data privacy impact assessments, the adoption of the “accountability principle” in many new pieces of regulatory guidance and legislation, and the principle of “privacy by design” outlined above. The new approach can serve not only to reduce the administrative burden for companies and regulators, but also to improve privacy outcomes for individuals: organizations are able to spend less time retroactively redesigning solutions or responding to incidents, and more time proactively preempting privacy issues, training staff, and embedding programs throughout the organization. For guidance on privacy impact assessments, see: http://www.ico.gov.uk/for_organisations/data_protection/topic_guides/privacy_impact_assessment.aspx
Figure 2: Concerns and possible mitigations
[Matrix, not reproduced here: concerns (function creep, under process and usage; spoofing, under technology integrity; revealing additional personal information) mapped against possible mitigations (application of privacy regulations; clear end-user education; inspection and redress; auditing, accountability, and oversight; privacy by design; quantifying privacy impact; biometric modality selection).]
The matrix summarizes the relationship between concerns and possible mitigations; the green color indicates a mitigation that can address a specific concern. As illustrated, the most effective measures are some of the simplest: careful biometric modality selection and clear end-user education.
A positive match
Like any new technology, biometrics is subject to privacy concerns, and since biometric data relates to a specific individual, the concerns are personal and sensitive. Fortunately, there are many ways to address these concerns: technology solutions can help, but good system design, involving technology architects, business and privacy specialists, privacy advocates, and end users, together with appropriate procedural, educational, and regulatory measures, can protect privacy even more effectively. Given these complex privacy issues, what of the future? Will organizations processing biometric data be increasingly challenged to justify their use and handling of this information? Will privacy issues become more acute as biometric systems become more widespread and the public becomes more aware? Or will citizens and consumers choose to accept the use of biometric data as an essential ingredient of modern life?
Accenture believes people are becoming increasingly comfortable with biometric technologies, just as they have adopted credit cards, mobile phones, e-commerce or, more recently, social media. We expect fewer irrational fears as biometrics become better understood and supported by an improved framework of standardized, regulated ways to handle the privacy of biometric systems. While biometric technologies may not be a cure-all, they do offer benefits, whether processing passengers more quickly at airports or making it harder for fraudsters to steal an identity or access systems that hold personal data. Indeed, privacy and biometrics complement each other in better serving citizens and society, and may well be a positive match that, ultimately, helps organizations achieve high performance.
Find out more
If you would like to know more about Accenture’s work with biometrics and privacy, please contact:
Bojana Bellamy, Director of Data Privacy, Accenture, [email protected]
Cyrille Bataller, Director, Accenture Technology Labs Europe, [email protected]
Mark Crego, Global Border and Identity Management Lead, [email protected]
Alastair Partington, Border and Identity Management Product Lead, [email protected]
About Accenture Technology Labs Accenture Technology Labs, the dedicated technology research and development (R&D) organization within Accenture, has been turning technology innovation into business results for more than 20 years. Our R&D team explores new and emerging technologies to create a vision of how technology will shape the future and invent the next wave of cutting-edge business solutions. Working closely with Accenture’s global network of specialists, Accenture Technology Labs help clients innovate to achieve high performance. For more information, please visit: www.accenture.com/accenturetechlabs.
About Accenture Accenture is a global management consulting, technology services and outsourcing company, with 257,000 people serving clients in more than 120 countries. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world’s most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. The company generated net revenues of US$27.9 billion for the fiscal year ended Aug. 31, 2012. Its home page is www.accenture.com.