Retail Payments Risk Forum

Portals and Rails

September 29, 2014

Let's Talk Token, Part II: Distinguishing Attributes

Several weeks ago, Portals and Rails embarked on a series of posts on tokenization. In the first installment, we defined tokenization and distinguished between a merchant-centric enterprise tokenization solution and payment tokens generated as an issuer-centric end-to-end solution. Since that first post was published, payment tokens have jumped front and center in the payments community with Apple's introduction of Apple Pay, which uses tokenization. Also, the Mobile Payments Industry Workgroup just released a detailed white paper recounting its recent meeting on the current tokenization landscape in the United States.

In today's installment, we look at some distinguishing attributes of the end-to-end token initiatives currently under way and consider their impact on mitigating risk in payments transactions.

  • Token format: Common ground exists in the payments industry on the token format. The end-to-end token solution relies on the creation of a token, known as a device account number (DAN), to initiate a payment in place of the original primary account number (PAN). To mitigate operational risks and make use of the existing messaging rules and applications associated with the payment transaction, it is imperative that the DAN preserve the format of the PAN. In other words, DAN generation should be as random as possible while still conforming to the PAN's structure, so that basic card or account validation rules continue to work (see the first sketch after this list).

  • Token type: Payment tokens can be dynamic or static. Dynamic tokens are valid either for a single transaction or for a limited number of transactions occurring in a very short time. By the time a fraudster intercepts a dynamic token, it has likely already expired, so the fraudster can't use it. However, dynamic tokens have a slight downside: they can work against loyalty programs as well as some back-end fraud detection systems. Because each transaction carries a different DAN, merchants and processors cannot consolidate information from multiple transactions for an individual cardholder.

    On the other hand, static tokens are multi-use, so they allow merchants to connect the token user with past transactions. But given their multi-use nature, they are not as secure as dynamic tokens. For additional security, each transaction with a static token can include an additional element: a uniquely generated cryptogram (see the second sketch after this list).

  • Device coverage: Tokens can be created and stored either on a secure element on a mobile phone or in the cloud. Much industry discussion focuses on which approach is more secure, but the approach also affects which devices can access the token. Storing tokens only on secure elements limits them to mobile phones, a restriction that does not address the significant volume of card-not-present payments that consumers conduct on computers and other devices. Alternatively, storing a token in the cloud would allow any connected device (mobile, tablet, laptop, or desktop computer) to access the token, so all e-commerce transactions would be covered.

  • Token service provider: A number of parties can play the critical provider role. The provider is ultimately responsible for generating and issuing the DAN, maintaining the DAN vault, and mapping the DAN to the PAN for presentment to the issuer that ultimately authorizes the transaction. A network, issuer, processor, or other third-party provider can perform this role. We can make a case for any of these parties to play it, but the critical risk mitigation factor is that the merchant should never see the PAN, which prevents a breach of payment card data within the merchant's systems.
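
To make the format-preservation point concrete, here is a minimal Python sketch of generating a random, Luhn-valid DAN. It is illustrative only: the token BIN, length, and generation logic are assumptions for this example, not the EMVCo specification that actual token service providers follow.

```python
import secrets

def luhn_check_digit(partial: str) -> str:
    """Compute the Luhn check digit to append to a partial account number."""
    total = 0
    # Once the check digit is appended, the last digit of `partial` sits in a
    # doubled position, so double every second digit from the right, starting
    # with the rightmost digit of `partial`.
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def generate_dan(token_bin: str = "490000", length: int = 16) -> str:
    """Generate a random token matching a 16-digit PAN's format and passing
    the Luhn check, so existing card validation rules still work."""
    body_len = length - len(token_bin) - 1  # leave room for the check digit
    body = "".join(secrets.choice("0123456789") for _ in range(body_len))
    partial = token_bin + body
    return partial + luhn_check_digit(partial)

print(generate_dan())  # 16 digits, Luhn-valid, different on every call
```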
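
And here is a sketch of the cryptogram idea: binding a static token to a single transaction with a keyed MAC. This is a simplified stand-in for the EMV-style cryptogram, whose actual key management and data elements are defined by the network specifications; the field names and formats below are assumptions.

```python
import hashlib
import hmac
import secrets
import time

def transaction_cryptogram(device_key: bytes, token: str,
                           amount_cents: int, merchant_id: str) -> str:
    """Bind a static token to one transaction with a keyed MAC.

    The verifier, which shares `device_key`, recomputes the MAC over the
    same fields; a captured cryptogram is useless for a different amount,
    merchant, or time window.
    """
    nonce = secrets.token_hex(8)        # unpredictable per-transaction value
    timestamp = str(int(time.time()))   # limits the replay window
    message = "|".join([token, str(amount_cents), merchant_id, nonce, timestamp])
    mac = hmac.new(device_key, message.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{timestamp}.{mac[:16]}"

# Example: the same static token yields a different cryptogram per payment
key = secrets.token_bytes(32)
print(transaction_cryptogram(key, "4900001234567895", 2599, "MERCH-001"))
```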

To date, a standards body controlled by the largest global card networks and a company representing the largest global banks have driven most of the payment tokenization standardization efforts. Although these organizations have advocated for public discussions and input in an open environment, some critics argue that the management of standards development should be left to an open-standards body such as X9 or ISO. Tokenization efforts and standards will continue to evolve, as tokenization may play a critical role in mitigating payment risk in the future. Still, security challenges will remain even with its adoption. In the next installment of this tokenization series, we will examine the risks that a tokenized payments environment won't resolve, as well as the risks that will be all new.

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


September 29, 2014 in authentication, fraud, mobile payments

September 22, 2014

New ACH Return Rate Threshold on the Horizon

In a December 2013 post, we asked the question, "Is it the right time for lower ACH return rate thresholds?" We can now say that the answer is "yes." The voting membership of NACHA-The Electronic Payments Association recently approved a NACHA Operating Rule amendment that will reduce the unauthorized debit return rate threshold.

The process of returning payment transactions is a pain point for receiving financial institutions, which incur the costs of exception processing, including handling customer service inquiries and the returns themselves. Unauthorized transactions are also a pain point for customers who have experienced such postings to their accounts. For financial institutions that originate transactions on behalf of businesses and third-party customers, ongoing and proactive monitoring of return rates can help them quickly identify potential problems and determine whether those problems have been addressed.

The NACHA Operating Rule amendment will reduce the threshold for returns of unauthorized debit entries from 1 percent to 0.5 percent, effective September 18, 2015. An originating depository financial institution will be subject to possible reporting and fines if it has an originator or third-party sender whose return rate for unauthorized debits exceeds the threshold.

As NACHA states in its information on the new rule, this 0.5 percent threshold is still more than 16 times the 2013 network average return rate of 0.03 percent for unauthorized debit entries. The new threshold will continue to emphasize the importance of institutions' focusing on high return rates and working with their customers to bring any excessive rates down. The amendment also establishes a review process for when returns for "administrative" or "overall" reasons exceed certain levels: 3 percent for administrative returns and 15 percent for overall returns. Administrative returns include debits returned for reasons such as a closed account, an invalid account number structure, or an account number that does not correspond to an existing account. Overall returns for ACH debits include unauthorized and administrative reasons as well as others, such as insufficient funds and stop payments.

Unlike the unauthorized return threshold, breaching return rate levels for administrative and overall return reasons will not result in an automatic requirement to reduce the return rate or undergo a rules enforcement proceeding. Instead, exceeding these return rates will lead to a process to determine if the origination practices of a given originator or third-party sender need to be modified to achieve lower exception levels.
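
As a back-of-the-envelope illustration of how an originating institution might screen a customer against these levels, consider the following Python sketch. The volumes, the return reason codes grouped under each category, and the simple one-period calculation are all assumptions for the example; NACHA's rule defines the actual measurement window and code groupings.

```python
# Hypothetical one-month origination figures for a single originator
forward_debits = 120_000        # ACH debit entries originated
unauthorized_returns = 720      # e.g., return codes R05, R07, R10
administrative_returns = 3_100  # e.g., R02 (closed), R03 (no account), R04 (invalid number)
total_returns = 14_500          # all return reason codes combined

rates = {
    "unauthorized": unauthorized_returns / forward_debits,
    "administrative": administrative_returns / forward_debits,
    "overall": total_returns / forward_debits,
}
thresholds = {"unauthorized": 0.005, "administrative": 0.03, "overall": 0.15}

for category, rate in rates.items():
    status = "EXCEEDS" if rate > thresholds[category] else "within"
    print(f"{category}: {rate:.2%} ({status} the {thresholds[category]:.1%} level)")

# Output: unauthorized 0.60% EXCEEDS the 0.5% threshold; administrative
# (2.58%) and overall (12.08%) stay within their respective review levels.
```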

The timeframe for implementing this rule gives originating financial institutions the chance to look carefully at their current return monitoring processes, determine whether any customers are near these return rate levels, and put into place practices that would address problem areas. Will this new rule affect your due diligence processes? Does your current monitoring already show that your customers' return rates are lower than the new thresholds?

By Deborah Shaw, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

September 22, 2014 in ACH, debit cards, regulations

September 15, 2014

Let’s Talk Token: Authenticating Payments

It's challenging to have a conversation about EMV cards (cards with chip technology), given their well-documented fraud-mitigation shortcomings, without diving into a conversation on tokenization. And these conversations just intensified with Apple's announcement that its soon-to-be-launched mobile payment application will use tokenization. Tokenization of payment card data can provide an additional layer of security to EMV cards for in-person payments, and it mitigates fraud risks that these cards don't address in the non-face-to-face environment.

I recently spoke at a forum on EMV cards, where it became evident to me that there is a high degree of confusion in the payments industry, especially within the merchant community, about tokenization. Multiple standards initiatives around a new tokenization framework are currently under way, so Portals and Rails is embarking on a series of posts on tokenization. In this first installment, we define tokenization and distinguish between tokens generated within the merchant's environment (an enterprise solution) and payment tokens generated as an end-to-end solution. A future post will compare the various end-to-end payment tokenization initiatives that have been announced to date.

In the data security and payments environment, tokenization is the substitution of sensitive data with a surrogate value representing the original data but having no monetary value. For payment cards, tokenization refers to the substitution of part or all of a card’s PAN, or primary account number, with a totally randomized value, or token. A true token cannot be mathematically reversed to determine the original PAN, but a token service provider in a highly secure environment can subsequently link it to its associated PAN.
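
A minimal sketch of this idea in Python follows. It shows why a true token cannot be reversed: the surrogate value is drawn at random, so the only way back to the PAN is a lookup in the vault. A real token service provider's vault is, of course, encrypted, access-controlled, and governed by standards; the class and method names here are illustrative assumptions.

```python
import secrets

class TokenVault:
    """Toy token vault: random surrogates mapped back to PANs only by lookup."""

    def __init__(self) -> None:
        self._token_to_pan: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        # Draw a random digit string of the same length; because it is not
        # derived from the PAN, it cannot be mathematically reversed.
        token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
        while token in self._token_to_pan:  # extremely unlikely collision
            token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder can recover the original PAN
        return self._token_to_pan[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert vault.detokenize(t) == "4111111111111111"
```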

Tokenization of payment credentials has been around since the mid-2000s, driven primarily by the 2004 issuance of the Payment Card Industry Data Security Standard (PCI-DSS), which defines merchant requirements for protecting cardholder data. Merchants have historically stored PANs for a variety of reasons: to reconcile settlements, perform incremental authorizations, handle chargebacks, and identify cardholder transactions for loyalty programs. With tokenization, merchants can remove PANs from their data environment and replace them with tokens, thereby reducing their PCI-DSS compliance requirements. However, this enterprise solution still requires that the PAN enter the merchant environment before tokenization takes place.

Under the tokenization initiatives currently under way from the Clearing House and EMVCo, a financial institution would issue a token replacing a cardholder's PAN to the person's mobile handset, tablet, or computer before a digital payment transaction is initiated. The merchant, rather than receiving the cardholder's PAN to initiate a transaction, would receive a token value associated with that PAN; the token would then be de-tokenized outside the merchant's environment to obtain the necessary authorization and complete the transaction. The merchant never has knowledge of the cardholder's PAN, and that is the significant difference between these tokenization initiatives and the enterprise solution in how payment credentials are handled.

The Clearing House's and EMVCo's concepts for payment tokenization are similar in many ways, but they also have differences. A future post will delve into the end-to-end tokenization initiatives and consider the impact on mitigating risk in payment transactions.

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

September 15, 2014 in cards, chip-and-pin, EMV

September 08, 2014

Seeking a Successful Biometric Solution

As an earlier post noted, advances in technology have spurred the implementation of various biometric authentication methodologies in the consumer market. But as people are discovering, not all methodologies are equally suited for all applications. Those implementing such applications have to consider risk level, cost, operating environment, and targeted population. They also have to evaluate a number of other factors to determine whether one biometric is better suited than another for an intended application. These factors include but are not limited to:

  • Uniqueness. While the biometric doesn't always have to be unique to every individual on the planet, the probability that two people share a particular characteristic should be low enough to prevent an unacceptable number of false acceptances (when one person is wrongly authenticated as another). For example, fingerprints are considered to be unique to every individual, but current smartphone fingerprint readers use such low-resolution scanners that the probability of a false acceptance is one in 44,000. This rate is most likely sufficient for many applications, but a high-dollar transaction may require supplemental authentication (see the sketch after this list).
  • Universality. The targeted characteristic must be present in the overall population, with only a few exceptions. Only a couple of biometric elements, such as DNA and facial recognition, can provide complete population coverage. Hand geometry and vein recognition, for example, won't work on people who are missing fingers or other body parts.
  • Permanence. The characteristic should not change over time. Even though people can alter almost any physical characteristic through medical procedures, such alterations to the characteristic being considered for biometric authentication should be infrequent among the population, and the alteration procedure should be relatively expensive.
  • Collection ease. The more invasive the collection of the biometric sample, the more resistance people will have to it. People tend to view facial and voice recognition and fingerprinting as noninvasive but retinal scans as highly invasive—a light beam scans the back of the person's eye, which can be very uncomfortable.
  • Performance. The biometric element must support the creation of a template that is accurate and quickly obtained while also providing minimal database storage requirements. A system that takes a long time to authenticate someone during peak usage periods will encounter user dissatisfaction and possibly decreased productivity.
  • Accuracy. Individuals should not be able to fool the system. It should verify, for example, that a fingerprint actually belongs to the live person presenting it, that a spoken phrase is live and not recorded, and so on.
  • User acceptance. Even when people must use certain biometric authentication systems as a condition of their employment, the technology should have a high level of acceptance, with minimal cultural, religious, collective bargaining, or regulatory implications.
  • Cost-effectiveness. As with all risk management practices, the cost of implementing and operating the system must be commensurate with the risk exposure for using a less secure authentication system.
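
To illustrate how the uniqueness and accuracy trade-offs above are typically quantified, here is a small Python sketch that estimates false acceptance and false rejection rates at different match thresholds. The score values are invented for the example; a real evaluation uses large labeled datasets. (For reference, the one-in-44,000 figure cited above corresponds to a false acceptance rate of roughly 0.002 percent.)

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """Estimate FRR and FAR at a given match threshold.

    Scores above the threshold count as a match: a genuine score at or
    below it is a false rejection; an impostor score above it is a false
    acceptance.
    """
    frr = sum(s <= threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s > threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

# Invented similarity scores for illustration
genuine = [0.91, 0.84, 0.88, 0.79, 0.95, 0.90]   # same person vs. enrolled template
impostor = [0.12, 0.33, 0.08, 0.41, 0.27, 0.61]  # different person vs. template

for t in (0.5, 0.7, 0.85):
    frr, far = error_rates(genuine, impostor, t)
    print(f"threshold={t}: FRR={frr:.2f}, FAR={far:.2f}")
# Raising the threshold drives FAR down at the cost of a higher FRR.
```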

As you consider the possibility of implementing a biometric authentication methodology for your customers, I hope you will find these evaluation elements helpful.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

September 8, 2014 in authentication, biometrics, innovation