The Third Age of credit

Society is beginning to wake up to a tremendous shift in one of the most fundamental underpinnings of how we live our lives: the credit system. Though it’s not commonly known, credit infrastructure has existed about as long as civilization itself. In one way or another, credit systems have always formalized the one essential basis for relationships between people: trust.

Over millennia, the way credit looks, feels and is used has changed dramatically. Today, buoyed by a plethora of new technologies and a golden age of abundant data, credit is undergoing its most radical change yet. But it is being pulled in many directions by competing forces, each with its own vision for the future.

In the beginning, credit was highly personal and subjective — this persisted for thousands of years. Over the last century, a miracle happened: Driven mostly by statistical modeling, credit became for the first time “objective.” Yet today, the cracks in that system are beginning to show, and we now stand on the brink of another revolution — the “Third Age” of credit.

We are on the verge of an exponential leap. The last year has witnessed a Cambrian explosion in credit innovation, unveiling hundreds of possibilities for the future of credit. Unlike the last two ages, credit of the future will be personal, predictive, self-correcting and universal.

The First Age: credit as trust

Modern anthropologists paint a picture of early agricultural society as a community of unsophisticated barterers, trading goods and services directly. In this picture, there is no room for a credit system: I trade what I have and you want for what you have and I want. But, as the anthropologist David Graeber points out in his excellent history of credit, Debt: The First 5,000 Years, this account of early civilization is a myth.

The barter system has one major fault, known as the problem of the double coincidence of wants. If I am a chicken farmer and I want to buy shoes from a cobbler, then my only hope is to find a cobbler who wants some of my chickens. If no cobbler in my town wants chickens, then I have to find out what the cobbler does want and begin bringing third parties into the transaction until all wants are fulfilled.

Today, we have a simple solution to this problem — money. Though it’s not conventionally viewed this way, money is actually a form of credit. The radical innovation of money was to introduce one third party into every transaction: the government. When the farmer doesn’t have anything that the cobbler wants, he pays the cobbler in dollars; the dollars give the cobbler a deferred opportunity to buy what she wants later. All of this is possible because people trust that the value of a dollar will remain the same, and that trust comes from the fact that the government vouches for each dollar’s value. When you accept money as payment, you are giving the government credit for its claim that the money you accept can be redeemed for (about) the same value at a later date.

People take this feature of money for granted, but even today, it’s not ubiquitous — take the example of the three-tier pricing phenomenon in Zimbabwe: The government released bond notes pegged 1:1 to the U.S. dollar, but shops accepted actual U.S. dollars at a premium to the notes (meaning a purchase would be less expensive in U.S. dollars than in bond notes). This is the literal embodiment of Zimbabwe’s citizens not giving their government any credit, and it also led to strange discrepancies in bitcoin prices in the country.

Money is an amazing financial instrument for so many reasons. It is a medium of exchange. It is a store of value. It is highly divisible. It is fungible across many uses. It is universally coveted. It is liquid. But early societies didn’t have anything resembling modern money, so instead, they used credit. (See a timeline of payments over the course of civilization here.)

Credit has existed as long as human economies have. Some of the earliest writings discovered by archaeologists are debt records. (The writer John Lanchester profiles the history of credit excellently in When Bitcoin Grows Up.) But credit had a lot of issues: How do you give credit to a stranger or foreigner you don’t trust? Even for those you do trust, how do you guarantee they will pay you back? What is the right amount to charge on a loan?

Early debt systems often answered these questions by formalizing rules such as debtors going into slavery or forfeiting their daughters. These conditions artificially constrained debt, meaning that, for most of human history, economies didn’t grow much; their size was capped by a lack of credit.

So, for the first 10,000 years or so, credit was useful… but imperfect.

The Second Age: credit as algorithm

This all changed in 1956. That year, an engineer and a statistician launched a small tech company from their San Francisco apartment. That company, named Fair, Isaac and Co. after its founders, came to be known as FICO.

As Mara Hvistendahl writes, “Before FICO, credit bureaus relied in part on gossip culled from people’s landlords, neighbors, and local grocers. Applicants’ race could be counted against them, as could messiness, poor morals, and ‘effeminate gestures.’” Lenders would employ rules such as, “prudence in large transactions with all Jews should be used,” according to Time. “Algorithmic scoring, Fair and Isaac argued, was a more equitable, scientific alternative to this unfair reality.”

It’s hard to overstate how revolutionary FICO really was. Before multivariate credit scoring, a banker couldn’t tell two neighbors apart when pricing a mortgage. The move to statistical underwriting — a movement that had roots as early as the 1800s in the U.S. — had a snowball effect, inspiring lookalike algorithmic credit systems around the world. Credit is all about risk, but until these systems were developed in the middle of the 20th century, risk-based pricing was almost entirely absent.
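
To make the mechanics concrete, here is a minimal sketch of multivariate scoring and risk-based pricing, the idea FICO pioneered. The features, weights and pricing rule below are invented for illustration; FICO’s actual model is proprietary and different.

```python
import math

# Hypothetical multivariate scoring and risk-based pricing. The features,
# weights and pricing rule are invented for this sketch; FICO's actual
# model is proprietary and different.

WEIGHTS = {
    "on_time_payment_rate": -3.0,  # more on-time payments -> lower default risk
    "utilization": 2.0,            # higher utilization -> higher risk
    "years_of_history": -0.1,      # longer history -> lower risk
}
INTERCEPT = -1.0

def default_probability(borrower: dict) -> float:
    """Logistic model: combine several variables into one risk estimate."""
    z = INTERCEPT + sum(WEIGHTS[k] * borrower[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def risk_based_apr(borrower: dict, base_rate: float = 0.05,
                   risk_premium: float = 0.30) -> float:
    """Price the loan from the risk estimate instead of one flat rate."""
    return base_rate + risk_premium * default_probability(borrower)

# Two "neighbors" with different histories now get different prices.
neighbor_a = {"on_time_payment_rate": 0.99, "utilization": 0.2, "years_of_history": 12}
neighbor_b = {"on_time_payment_rate": 0.80, "utilization": 0.9, "years_of_history": 2}

for name, borrower in [("A", neighbor_a), ("B", neighbor_b)]:
    print(name, round(default_probability(borrower), 3), round(risk_based_apr(borrower), 3))
```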

Famously, Capital One founder Richard Fairbank launched IBS, his “information-based strategy.” As he noted, “First, the fact that everyone had the same price for credit cards in a risk-based business was strange. […] Secondly, credit cards were a profoundly rich information business because, with the information revolution, there was a huge amount of information that could be acquired about the customers externally.”

Today, algorithmic credit is ubiquitous. Between 90 percent and 95 percent of all financial institutions in the U.S. use FICO. In the last year alone, FICO released new credit scores in Russia, China and India using novel sources of data like utility bills and mobile phone payment records. Banks around the world now implement risk-based pricing for every kind of credit.

Thousands of startups are finding new ways to apply this same concept of statistical modeling. WeLab in Hong Kong and Kreditech in Germany, for example, use up to 20,000 points of alternative data to process loans (WeLab has provided $28 billion in credit in four years). M-Pesa and Branch in Kenya provide developing-world credit using mobile data, Lendable does so using psychographic data and Kora does so on a blockchain. Young peer-to-peer lending startups like Funding Circle, Lending Club and Lufax have originated more than $100 billion in loans using algorithmic underwriting.

Yet this global credit infrastructure is not without its significant drawbacks, as Americans found out on September 7, 2017, when the credit bureau Equifax announced a hack that exposed the data of 146 million U.S. consumers.

The fallout from the massive breach sparked conversations on credit, forced us to re-evaluate our current credit system and finally inspired companies to look beyond the Second Age. White House cybersecurity czar Rob Joyce opined that the time has come to get rid of Social Security numbers, which are intimately tied to credit scores and can’t be changed even after identity theft.

Today, we are held hostage by our data. We become vulnerable by being forced to rely on insecure SSNs and PINs that can be stolen. We have no choice in how that information is used (more than 100 billion FICO scores have been sold).

FICO also doesn’t take into account relevant factors such as income or bills, and in some cases only reflects poor payment history and not on-time payments. And on top of that, 50 percent of a person’s score is dependent on their credit history — inherently biasing the system against the younger borrowers who should be leveraging credit the most.

Lastly, as Frank Pasquale writes in The Black Box Society, credit scoring is opaque. This creates disparate impacts on different groups. Algorithms accidentally incorporate human biases, making loans more expensive for minorities. Building credit often requires adherence to unknown rules, such as rewarding “piggybacking” off of others’ credit — a structure that perpetuates economic inequality.

Maybe the Equifax hack was a good thing. It was a jarring reminder that a credit system reliant on historical statistical modeling, opaque algorithms and insecure identifiers is still far from perfect. Were the hackers really Robin Hood in disguise, freeing us from our hostage-like dependence on an outdated scoring system?

The time has come to move beyond the weaknesses of the modern credit regime, and technology is today taking the first step.

The Third Age: credit as liberation

What does a new world of credit look like?

In the last year there has been a Cambrian explosion of new ideas to drive modern credit forward. It is too early to tell which system(s) will win out, but the early indications are truly mind-blowing. Credit is on the precipice of an exponential leap in innovation, which will reshape the world of financial inclusion. It will become more personal, predictive instead of reactive, and instantaneous.

One of the most revolutionary aspects of the future of credit is that it will increasingly come to look like cash (and cash, conversely, like credit). Consumers won’t have to request credit; rather, it will be automatically allocated to them in advance based on many factors, such as behavior, age, assets and needs. It will be liquid, rather than disbursed in fixed tranches. And as it becomes increasingly commoditized, in many cases it will be close to free.

Customers will have one form of payment for all purchases that automatically decides on the back-end what the best type of funding is, cash or credit, optimizing for efficiency and low fees. Imagine Venmo, credit cards, checks, PayPal and cash, all rolled into one payment method.
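
As a rough illustration, here is a minimal sketch of how such a back-end “funding router” might decide between cash and credit. The wallet structure, interest assumptions and decision rule are hypothetical, not a description of any existing product.

```python
from dataclasses import dataclass

# Hypothetical back-end "funding router": for each purchase, pick the cheapest
# source of funds among a cash balance and a credit line. The account structure,
# interest assumptions and decision rule are invented for illustration.

@dataclass
class Wallet:
    cash_balance: float
    credit_available: float
    credit_apr: float           # annual rate if the purchase revolves
    expected_revolve_days: int  # how long a credit balance is expected to be carried

def cost_of_credit(wallet: Wallet, amount: float) -> float:
    daily_rate = wallet.credit_apr / 365
    return amount * daily_rate * wallet.expected_revolve_days

def route_payment(wallet: Wallet, amount: float) -> str:
    """Return which funding source the router would draw on for this purchase."""
    if wallet.cash_balance >= amount:
        return "cash"  # cash is free, so prefer it when it covers the purchase
    if wallet.credit_available >= amount:
        return f"credit (estimated interest cost {cost_of_credit(wallet, amount):.2f})"
    return "declined"

wallet = Wallet(cash_balance=120.0, credit_available=2000.0,
                credit_apr=0.15, expected_revolve_days=20)
print(route_payment(wallet, 80.0))   # -> cash
print(route_payment(wallet, 600.0))  # -> credit (estimated interest cost 4.93)
```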

People will no longer have multiple credit lines, such as separate credit cards, student loans and mortgages. People will have a guaranteed “credit plan” available to them, all linked into one master identity or profile.

Physical instruments like dollar bills and plastic cards will be phased out and live only in museums. Biometric identifiers like fingerprints will be all you need to make a purchase. Prices will become infinitesimally divisible, optimized in some cases for fractional cent values. Denominations and different currencies will become background features.

In the future, people will be paid in real time (Walmart is experimenting with this now), instead of waiting two weeks for a paycheck. Payday loans as an industry will evaporate. WISH Finance is building an Ethereum-based blockchain for cash flow-based underwriting. It’s easy to see this applied to consumers: get real-time credit based on your regular pay and expenses.
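
Here is a minimal sketch of what consumer-side, cash flow-based underwriting could look like: a credit line recomputed continuously from streamed pay and recurring expenses. The figures and the rule of thumb are invented for illustration; this is not WISH Finance’s actual method.

```python
# Hypothetical cash flow-based underwriting on the consumer side: recompute an
# available credit line in real time from streamed pay and recurring expenses.
# The rule of thumb (a multiple of free cash flow) and the figures are invented.

def available_credit(recent_daily_income: list, recurring_monthly_expenses: float,
                     multiple: float = 1.5) -> float:
    monthly_income = sum(recent_daily_income) / len(recent_daily_income) * 30
    free_cash_flow = max(monthly_income - recurring_monthly_expenses, 0.0)
    return round(multiple * free_cash_flow, 2)

# Earnings streamed in daily instead of arriving as a biweekly paycheck.
daily_pay = [140, 0, 155, 160, 0, 150, 145]
print(available_credit(daily_pay, recurring_monthly_expenses=2200))  # -> 1521.43
```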

In the next phase, credit will revolve around the individual. Right now we live in a world of gatekeepers: Centralized data aggregators, such as credit bureaus, act as intermediaries to credit. That gatekeeper advantage will increasingly be eroded by individually permissioned data (a concept known as self-sovereign identity). This is consistent with trends in cross-border work and globalization: In an atomized world, the individual is the core unit and will need to take her information with her, without reliance on third parties. It could reduce some $15 billion in annual fees paid to access data and make information more secure by eliminating single points of failure.

One-size-fits-all scores like FICO will become disaggregated. Credit is a relational system: Our credit indicates our standing relative to a wide network. But people shouldn’t be represented by averages. Credit will become more multivariate, using machine learning and breaking apart the contributing factors and weights that make up FICO (the company where I work, Petal, is doing this to democratize credit cards).

It makes little sense to set single credit benchmarks — such as the 350 to 850 score range — irrespective of age, so consumers will be compared to their cohort. Per Experian, the youngest people have the lowest credit scores. However, youth is when people should be borrowing the most, both to build credit and because they should be saving cash for spending later in life.
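
As a toy illustration of cohort-based scoring, the sketch below reports where a borrower stands within their own age group rather than on one absolute scale. The cohort score distributions are made up for illustration; real ones would come from bureau data.

```python
from bisect import bisect_left

# Hypothetical cohort-relative scoring: rather than judging everyone on one
# 350-850 scale, report where a borrower stands within their own age group.
# The cohort score distributions below are made up for illustration.

COHORT_SCORES = {
    "18-25": [580, 600, 615, 630, 645, 660, 675, 690, 710, 740],
    "55+":   [640, 670, 690, 705, 720, 735, 750, 770, 790, 810],
}

def cohort_percentile(score: int, cohort: str) -> float:
    scores = COHORT_SCORES[cohort]  # assumed to be sorted ascending
    return 100 * bisect_left(scores, score) / len(scores)

# The same absolute score reads very differently depending on the cohort.
print(cohort_percentile(690, "18-25"))  # -> 70.0 (strong for a young borrower)
print(cohort_percentile(690, "55+"))    # -> 20.0 (weak among older borrowers)
```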

Credit will become contextual. Your maximum available credit will fluctuate based on ever-changing factors such as payroll and bills. It also will be specific to purchases: You will receive different levels and costs of credit based on the value and type of the asset you’re buying. For instance, credit to buy a crib for your newborn may be cheaper than credit to buy a trip to Vegas. Illiquid assets will be automatically usable to secure credit, as Sweetbridge is doing. (The founders of Kora point out that the problem is not that the poor don’t have wealth, it’s that their capital is locked up.)
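
A minimal sketch of contextual, purchase-specific pricing might look like the following. The categories, rate adjustments and collateral discount are invented for illustration, not how Sweetbridge or any lender actually prices credit.

```python
# Hypothetical contextual pricing: the cost of credit depends on what is being
# bought and whether an illiquid asset is pledged against the purchase. The
# categories, rate adjustments and collateral discount are all invented.

BASE_APR = 0.18
CATEGORY_ADJUSTMENT = {
    "durable_good": -0.06,  # e.g. a crib: retains value, lower risk
    "travel":        0.04,  # e.g. a Vegas trip: consumed immediately
    "education":    -0.08,
}

def contextual_apr(category: str, amount: float, collateral_value: float = 0.0) -> float:
    apr = BASE_APR + CATEGORY_ADJUSTMENT.get(category, 0.0)
    if collateral_value >= amount:  # purchase fully secured by pledged assets
        apr -= 0.05
    return round(max(apr, 0.0), 4)

print(contextual_apr("durable_good", amount=300))                    # -> 0.12
print(contextual_apr("travel", amount=1500))                         # -> 0.22
print(contextual_apr("travel", amount=1500, collateral_value=5000))  # -> 0.17
```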

Credit will be psychographic and predictive. It won’t be enough to look backwards at your past behavior — your creditworthiness will change dynamically as you move around, make purchases and stay active. It will be dynamically assigned to specific needs (like ink if you buy a printer) before you realize you have them.

Naturally, talking about the future of credit, we have to talk about blockchains. They will have three early uses:

  • Funds disbursal: It will become much cheaper to disburse credit and accept payments using services like Stellar. There will be no latency from banks having to verify transactions against their own accounts.

  • Underwriting: Data will be aggregated into universal profiles (like those being built at uPort and Bloom) from a wide variety of sources, such as credit bureaus, phone bills, academic transcripts and Facebook. As mentioned, these will be self-sovereign, and make it much easier for credit providers to underwrite borrowers.

  • Contract enforcement: Smart contracts will be self-enforcing, automatically collecting debt payments, adjusting themselves if someone is credit-crunched in the short term and refinancing if customers can consolidate or lower their APRs (a simplified sketch of this logic follows this list). The universal ID and contract will keep people from “running to Mexico” with their credit funds.
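
To give a flavor of that third point, here is a plain-Python sketch of the logic a self-enforcing credit contract might encode. The thresholds and rate rules are invented, and a real deployment would be written in a smart-contract language such as Solidity rather than Python.

```python
# A plain-Python sketch of the logic a self-enforcing credit contract might
# encode: collect a payment automatically, ease it when the borrower's verified
# income dips, and refinance when a cheaper rate is available. The thresholds
# and rules are invented; a real deployment would live in a smart-contract
# language such as Solidity, not Python.

class CreditContract:
    def __init__(self, balance: float, apr: float, monthly_payment: float):
        self.balance = balance
        self.apr = apr
        self.monthly_payment = monthly_payment

    def collect(self, verified_monthly_income: float) -> float:
        """Auto-collect the monthly payment, halving it during a cash crunch."""
        payment = self.monthly_payment
        if verified_monthly_income < 2 * self.monthly_payment:
            payment *= 0.5  # temporary hardship adjustment
        self.balance += self.balance * self.apr / 12  # accrue one month of interest
        self.balance -= payment
        return payment

    def maybe_refinance(self, market_apr: float) -> None:
        """Switch to a cheaper rate automatically when one is on offer."""
        if market_apr < self.apr:
            self.apr = market_apr

loan = CreditContract(balance=5000, apr=0.20, monthly_payment=400)
print(loan.collect(verified_monthly_income=3000))  # -> 400 (full payment)
print(loan.collect(verified_monthly_income=600))   # -> 200.0 (reduced payment)
loan.maybe_refinance(market_apr=0.12)
print(loan.apr)                                    # -> 0.12
```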

In the future, credit (and capital) will be automatically allocated to people based on predictive AI. Better risk pricing will continue to drive down the rates at which consumers can borrow, toward 0 percent. The federal funds rate has been around 1 percent for the last couple of years — in 1980 it was 18 percent! A combination of machine learning and what Bain calls “A world awash in money,” with larger investors hunting for lower returns, will continue to drive these rates down.

At a higher level, blockchain protocols like Dharma will set up smart contracts for the credit economy that allocate capital in the most efficient way. Credit will not rely on active investment managers to lend or borrow: Any capital not currently tied into a contract will be programmed to continuously search for the highest risk-adjusted return — including provision of credit.
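
Conceptually, capital “searching for the highest risk-adjusted return” can be pictured with a toy allocator like the one below. The candidate loans, yields and default probabilities are invented for illustration and are not part of the Dharma protocol.

```python
# A toy picture of idle capital "searching" for the best use: rank candidate
# loans by a simple risk-adjusted return (gross yield minus expected loss).
# The opportunities and all of the numbers are invented.

def risk_adjusted_return(gross_yield: float, default_prob: float,
                         loss_given_default: float = 0.9) -> float:
    return gross_yield - default_prob * loss_given_default

opportunities = {
    "prime consumer loan":    (0.07, 0.01),   # (gross yield, probability of default)
    "small business line":    (0.12, 0.06),
    "subprime consumer loan": (0.24, 0.20),
}

best = max(opportunities, key=lambda name: risk_adjusted_return(*opportunities[name]))
print(best, round(risk_adjusted_return(*opportunities[best]), 3))  # -> small business line 0.066
```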

Credit providers, at scale, will experience massive network effects. “Network effects” describe the condition in which networks become more valuable to users as more users participate. This doesn’t traditionally apply to credit: You don’t accrue any benefit just because other people have the same credit card as you. But in the future it will: More data points within credit networks will enable better underwriting, which will create fairer pricing, creating a virtuous cycle of data. User experiences and pricing will benefit tremendously as a result. Initiatives like the U.K.’s Open Banking will accelerate this trend.

Tom Noyes calls this The Democratization of Data. In a world of smaller, local data sets that collaborate (80-90 percent of all our current behavior is local), bridging disparate data gaps will increase credit participation to 100 percent (currently, only about 71 percent of Americans have credit cards).

And these are just some of the more probable, routine ideas. Futurists like Daniel Jeffries envision currencies with built-in features to incentivize different behaviors — like saving versus spending — and universal basic income tokens to decentralize financial inclusion. Platforms like Bloom, which now has 100 applications being built on it, are reimagining credit at the protocol level. These systems are tackling first-principles questions, such as whether the future can be entirely meritocratic, or whether people can create trust with no prior data.

We are living in the prologue to the Third Age. It’s hard to tell exactly how the future of credit will play out, but from where we stand, we can see that it will represent the biggest departure from the past in credit’s history, and today we are just taking the first steps.



