March 07, 2011
Moving to chip-and-pin: The cost of foresight versus the price of hindsight
As I watched the dramatic events in the Middle East unfold over the past few weeks, I realized that revolution may be the only form of change in the world today that takes less than five years. This seems particularly true in the payments industry, where managing new technology is at the heart of the change process.
It has taken six years to implement Check 21 in the United States. Meanwhile, Canada has established a five-year plan to move to chip-and-pin technology for payment cards. The United Kingdom has announced a plan to eliminate checks in seven years, with a five-year checkpoint. In Europe, the goal of achieving seamless cross-border payments services, as codified in the Payment Services Directive, is in its fourth year of implementation, and talk has turned to setting another round of deadlines mandating actual payment traffic, as opposed to mere technical readiness.
The common thread in each of these as-yet-uncompleted initiatives is that they are all actually under way. They have a start date and an anticipated finish date, a known goal toward which all participants are driving. When each effort was initiated, someone, or perhaps many "someones," determined that there was a compelling societal, if not individual-participant, business case for moving toward a somewhat distant vision.
Today in the United States, however, in the wake of an economic crisis that has created a backlog of payments and IT initiatives, new investments seem stalled under the jaundiced eyes of senior financial planners. In essence, key projects whose deliverables we know we will need in five years are in danger of never getting started. Why? Because a business case based on today's experience is hard to construct, and funding for projects with better short-term results may be given precedence over far more strategic long-term projects with better net-present-value results.
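The tension between short-term payback and long-run net present value can be made concrete with a little arithmetic. The sketch below uses purely hypothetical cash flows and a hypothetical 8 percent discount rate (none of these figures come from the industry discussion above); it simply illustrates how a strategic project with deferred savings can beat a quick win on NPV even though it looks worse in the first few years.

```python
# Illustrative only: all cash flows and the discount rate are
# hypothetical assumptions, not actual industry figures.

def npv(rate, cash_flows):
    """Net present value, where cash_flows[t] occurs at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# A quick-win project: small up-front cost, modest near-term savings.
quick_win = [-10, 6, 6, 0, 0, 0]

# A strategic project (think of a multiyear technology migration):
# large up-front investment, with savings that only materialize in
# later years as anticipated losses are avoided.
strategic = [-100, 0, 0, 25, 50, 75]

rate = 0.08  # assumed cost of capital
print(f"Quick-win NPV: {npv(rate, quick_win):.1f}")
print(f"Strategic NPV: {npv(rate, strategic):.1f}")
```

Under these assumed numbers the strategic project has the higher NPV, yet a planner screening on two- or three-year payback would fund only the quick win, which is precisely the dynamic described above.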
A case in point may be the effort to move from magnetic stripe card technology to the more fraud-resistant chip-and-pin technology now being deployed throughout the rest of the world's developed nations. My colleagues and I have written about this issue previously, and some very smart friends of mine in the industry have assured me that current card losses, and I assume current all-in costs of card fraud management and mitigation, are just not bad enough to create a positive business case for change, particularly for the large issuing banks that are potential market movers.
My problem, however, lies in the fact that the business case should not be based on current costs, but rather on the anticipated costs five years from now when implementation is likely to occur and depreciated investment costs actually come on line. Of course, the $64,000 question is, "What costs of fraud should we forecast for 2016?"
Frankly, no one actually knows that number, but we do know that it is very likely to be much higher than today's if the United States is the last developed country on the planet to move away from mag-stripe cards. The problem is further complicated by the fact that best estimates of fraud cost growth should be augmented by less quantifiable "soft" costs that also loom in the distance. For example, if other nations decide to no longer dual-provision their cards with mag stripes in order to prevent the migration of fraud from the United States, what costs will U.S. banks incur to continue to provide services to their globetrotting customers? Will we be the ones now having to dual-provision our cards with chip-and-pin? Several U.S. financial institutions have already announced plans to do that very thing. Additionally, with no planned changes in sight, U.S. banks will be tempted to invest in bridge technology to mitigate the growing cost of mag-stripe fraud, thereby inflating the multiyear cost picture with interim investments.
What then should we do? Perhaps we should follow the lead of the U.K. Payments Council's efforts to signal the end of checks as a payment instrument. That is, establish a long-term roadmap for desired change by picking a reasonable future date for a move to chip-and-pin, set some known interim checkpoints for further reflection, and begin an orchestrated process of educating merchants and other key players on their options for making the change. With such a target in place, all parties—merchants, issuers, acquirers, processors, card brands, suppliers—could then make better interim investment choices aimed at minimizing long-run costs while maximizing short-term benefits to their customers.
One of my favorite movies is Field of Dreams, in which the owner of an Iowa cornfield devotes some of his acreage to the fanciful construction of a baseball field on which the spirits of great players from the past gather each night to play. The movie's famous line, "If you build it, he will come," may be the answer for some of our complicated payment investment decisions. Who then should make the call? Absent a probably not-so-welcome mandate from Congress or a government agency, the job falls to enlightened market forces eager to control their own destiny. Many groups, like the Smart Card Alliance, the Merchants Advisory Group, and others, have begun to lay out multiyear roadmaps. My hope is that huddling around these and other ideas in the very near future might be the best way to proceed. Without such efforts at collaboration, I have a gut feeling that five years from now we may, as an industry, be reflecting on the fact that, regardless of the end date, we should have started sooner.
By Rich Oliver, executive vice president of the Atlanta Fed and director of the Retail Payments Risk Forum