Summers’ Lending Club makes money by bypassing the Equal Credit Opportunity Act

August 29, 2013

Don’t know about you, but for some reason I have a sinking feeling when it comes to the idea of Larry Summers. Word on the CNBC street is that he’s about to be named new Fed Chair, and I am living in a state of cognitive dissonance.

To distract myself, I’m going to try to explain better what I started to explain here, when I talked about the online peer-to-peer lending company Lending Club. Summers sits on the board of Lending Club, and from my perspective it’s a logical continuation of his career of deregulating, or simply bypassing, vital regulation to enrich himself.

In this case, it’s a vehicle for bypassing the Equal Credit Opportunity Act, which the FTC enforces. The law isn’t perfect, but it “prohibits credit discrimination on the basis of race, color, religion, national origin, sex, marital status, age, or because you get public assistance.” It forces credit scores to be relatively behavior-based, like you see here. Let me contrast that with Lending Club.

Lending Club also uses mathematical models to score people who want to borrow money. These act as credit scores. But in this case, they use data like browsing history or anything they can grab about you on the web or from data warehousing companies like Acxiom (which I’ve written about here). From this Bloomberg article on Lending Club:

“What we’ve done is radically transform the way consumer lending operates,” Laplanche says in his speech. He says that LendingClub keeps staffing low by using algorithms to screen prospective borrowers for risk – rejecting 90 percent of them – and has no physical branches like banks. “The savings can be passed on to more borrowers in terms of lower interest rates and investors in terms of attractive returns.”

I’d focus on the benefit for investors. Big money is now involved in this stuff. Turns out that bypassing credit score regulation is great for business, so of course.
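
To make the mechanics concrete, here’s a minimal sketch of what an algorithmic screen like the one Laplanche describes might look like. Everything in it is hypothetical: the feature names, the weights, and the data are invented for illustration, since the actual model is proprietary. The only number borrowed from the article is the 90 percent rejection rate, which here just sets the cutoff.

```python
# Hypothetical sketch of an algorithmic borrower screen. The features and
# weights are invented for illustration; they are NOT Lending Club's model.
import math
import random

WEIGHTS = {
    "debt_to_income":       -2.0,   # traditional, behavior-based
    "late_payments_2yr":    -0.8,   # traditional, behavior-based
    "years_of_credit":       0.05,  # traditional
    "browsing_risk_signal": -1.5,   # bought from a data broker (hypothetical)
    "social_graph_score":    1.2,   # e.g. friends' repayment history (hypothetical)
}
BIAS = 1.0

def repayment_score(applicant):
    """Logistic score: higher means the model thinks repayment is more likely."""
    z = BIAS + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def fake_applicant(rng):
    """Generate a synthetic applicant; real inputs would come from data vendors."""
    return {
        "debt_to_income":       rng.uniform(0.0, 0.6),
        "late_payments_2yr":    rng.randint(0, 5),
        "years_of_credit":      rng.uniform(0, 20),
        "browsing_risk_signal": rng.uniform(0, 1),
        "social_graph_score":   rng.uniform(0, 1),
    }

if __name__ == "__main__":
    rng = random.Random(0)
    pool = [fake_applicant(rng) for _ in range(10_000)]
    scores = sorted(repayment_score(a) for a in pool)
    cutoff = scores[int(0.9 * len(scores))]   # reject the bottom 90 percent
    accepted = sum(1 for a in pool if repayment_score(a) >= cutoff)
    print(f"cutoff = {cutoff:.3f}; accepted {accepted} of {len(pool)} applicants")
```

Note that nothing in a pipeline like this stops the bought-in features from counting at least as heavily as your actual repayment behavior.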

For example, such models might look at your circle of friends on Facebook to see if you “run with the right crowd” before loaning you money. You can now blame your friends if you don’t get that loan! From this CNN article on the subject (hat tip David):

“It turns out humans are really good at knowing who is trustworthy and reliable in their community,” said Jeff Stewart, a co-founder and CEO of Lenddo. “What’s new is that we’re now able to measure through massive computing power.”
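
Here’s a rough sketch of what a “social graph” feature like the one Lenddo describes could look like under the hood. To be clear, the specifics below are my own guess for illustration, not Lenddo’s or Lending Club’s actual method. The point is that averaging over your friends imports their histories, and their demographics, into your score, which is exactly how protected-class information can sneak in through the back door.

```python
# Hypothetical illustration of a social-graph "trust" feature: score an
# applicant partly by the repayment history of their friends. This is a guess
# at the general idea, not any company's actual algorithm.
from statistics import mean

# Toy social graph and invented repayment histories.
FRIENDS = {
    "alice": ["bob", "carol", "dan"],
    "bob":   ["alice", "erin"],
}
REPAYMENT_RATE = {
    "alice": 0.95, "bob": 0.40, "carol": 0.90, "dan": 0.85, "erin": 0.30,
}

def social_trust_feature(applicant, prior=0.5):
    """Average the repayment rates of the applicant's friends.

    Note what this does NOT measure: the applicant's own behavior. Because
    friendship networks are segregated by race, class, and geography, a
    feature like this can act as a proxy for exactly the attributes the
    ECOA puts off limits.
    """
    rates = [REPAYMENT_RATE[f] for f in FRIENDS.get(applicant, []) if f in REPAYMENT_RATE]
    return mean(rates) if rates else prior

if __name__ == "__main__":
    for person in ("alice", "bob"):
        print(person, round(social_trust_feature(person), 3))
```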

Moving along from taking out loans to getting jobs, there’s this description of how recruiters work online to perform digital background checks on potential employees. It’s a different set of laws being arbitraged this time, but it’s exactly the same idea:

Non-discrimination laws prohibit employers from asking job applicants certain questions. They’re not supposed to ask about things like age, race, gender, disability, marital, and veteran status. (As you can imagine, sometimes a picture alone can reveal this privileged information. These safeguards against discrimination urge employers to simply not use this knowledge to make hiring decisions.) In addition to protecting people from systemic prejudice, these employment laws intend to shield us from capricious bias and whimsy. While casually snooping, however, a recruiter can’t unsee your Facebook rant on immigration amnesty, the same for your baby bump on Instagram. From profile pics and bios, blog posts and tweets, simple HR reconnaissance can glean tons of off-limits information.

Along with forcing recruiters to gaze with eyes wide shut, straddling legal liability and ignorance, invisible employment screens deny American workers the robust protections afforded by the FTC and the Fair Credit Reporting Act. The FCRA ensures that prospective employees are notified before their backgrounds and credit scores are verified. Employees are free to decline the checks, but employers are also free to deny further consideration unless a screening is allowed to take place. What’s important here is that employees must first give consent.

When a report reveals unsavory information about a candidate, and the employer chooses to take what’s called “adverse action” – like deny a job offer – the employer is required to share the content of the background reports with the candidate. The applicant then has the right to explain or dispute inaccurate and incomplete aspects of the background check. Consent, disclosure, and recourse constitute a straightforward approach to employment screening.

Contrast this citizen-empowering logic with the casual Google search or to the informal, invisible social-media exam. As applicants, we don’t know if employers are looking, we’re not privy to what they see, and we have no way to appeal.

As legal scholars Daniel Solove and Chris Hoofnagle discuss, the amateur Google screens that are now a regular feature of work-life go largely unnoticed. Applicants are simply not called back. And they’ll never know the real reason.

I think the silent failure is the scariest part for me – people who don’t get jobs won’t know why.

Similarly, people denied loans from Lending Club by a secret algorithm don’t know why either. Maybe it’s because I made friends with the wrong person on Facebook? Maybe I should just go ahead and stop being friends with anyone who might put my electronic credit score at risk?

Of course this rant is predicated on the assumption that we think anti-discrimination laws are a good thing. In an ideal world, of course, we wouldn’t need them. But that’s not where we live.

Categories: data science, finance, modeling
  1. TMW
    August 29, 2013 at 7:43 am

    Yet at a time when many, many consumers can’t get loans from traditional sources, or are asked to pay interest rates above 25% on credit card debt, Lending Club and others like them are bridging part of the gap between those who need liquidity and those who can provide it.

  2. Abe Kohen
    August 29, 2013 at 8:03 am

    So if I understand correctly, Lending Club has applied algorithmically what small-town bankers have been doing for years, that is, knowing their customers and the likelihoods of repayment or default from prospective customers. Is that a bad thing?

    • August 29, 2013 at 8:22 am

      Yes it is indeed if you think bankers are prone to being racist or small-minded. And since I believe the average person isn’t perfect, I don’t have any reason to think bankers are better than the average person.

      And, oh wait, speaking of racist bankers, have you been reading the news lately? Wells Fargo is paying up because of this very issue. http://www.csmonitor.com/Business/2012/0712/Wells-Fargo-to-pay-175M-in-discrimination-lawsuit

      Let’s get real. And let’s not idealize the past. There’s a reason anti-discrimination laws were passed.

  3. TMW
    August 29, 2013 at 9:04 am

    So bankers are racist, and by association so are banks. Now someone wants to use other information to lend money…match up those who have it to lend (i.e., individual investors who are tired of low returns on bank products and bonds) with those who need it…and somehow that’s racist too? If you hate the banking industry so much, why are you hating on those who want to disrupt it?

    • August 29, 2013 at 9:15 am

      Wow, so I have to like something because it’s unfriendly to something I don’t like?

      The truth is, I don’t hate the _concept_ of banking, I just think it’s being done poorly. And I don’t think newcomers who come along and offer bank-like products but skirt important regulation related to discrimination are necessarily an improvement. Not for everyone, at least. Of course it’s an improvement for the people making money.

  4. TMW
    August 29, 2013 at 9:28 am

    Well…and a lot of the people making money are the retired individuals who are the source of Lending Club’s funds. Just because some set of people you look down on are making money doesn’t mean that many others in society aren’t benefiting. Retired people who have their money earning 1% and are likely going to get squeezed hard should inflation rear its ugly head again have a chance to take more of the spread a bank makes and put it in their pocket through LC and the like. I haven’t followed your blog for long but I would think you would support a mechanism that allows individuals to interact more directly as debtors/creditors. The people taking back more control from the institutions?

    • August 29, 2013 at 9:47 am

      Poor reasoning. Just because some good people are benefitting from a system doesn’t mean the system is just.

  5. TMW
    August 29, 2013 at 9:54 am

    I’d say the world’s too complicated a place for any system to be “just” in everyone’s eyes anymore.

    • August 29, 2013 at 4:46 pm

      Feel free to read other blogs if you don’t like rants against injustice.

  6. August 29, 2013 at 10:10 am

    Wow, that’s a lot of pushback for this post. I for one think banking is a horrible idea that was created some 500 years ago and has been a cancer on human progress ever since. The idea of unregulated banking, with only the rich and powerful deciding the winners and losers, is a cancer that will kill society.

    • Abe Kohen
      August 29, 2013 at 10:36 am

      Do you really think there would be a Google, an Apple, a Microsoft, pharmaceutical companies, without capital? What you call cancer is what allowed this country (and others) to grow and give us a comfortable lifestyle unimaginable by our forebears.

      • P
        August 29, 2013 at 12:56 pm

        Capital and banking are distinct. Capital and equity along with economies of scale are hard to disagree with.

  7. August 29, 2013 at 11:10 am

    The title of this post does not imply, but states as fact, that Lending Club bypasses federal law in its underwriting process. Yet you provide no proof to back up that claim. Just because Lending Club uses algorithms to screen prospective borrowers for risk does not imply that it is ignoring the Equal Credit Opportunity Act.

    Again, where is your proof? To make a statement as fact without providing a shred of evidence to support your claim does your blog and your readers a major disservice.

    • August 29, 2013 at 4:49 pm

      Peter,

      Great question.

      How would I prove it, though? The models are private.

      My experience in internet advertising modeling suggests that there’s about a 100% chance that a given model uses attributes that are forbidden under the ECOA. I can talk more about that, but I think the “who is your social network on Facebook” question kinda speaks for itself.

      Tell you what: if those guys are willing to have me go in and look at their models and verify that they’re not violating the spirit of ECOA, I’m happy to do it in my spare time.
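
      To be concrete, here’s the flavor of check I’d want to run – a minimal sketch with made-up numbers, since a real audit would need their actual features and decisions. It borrows the “four-fifths rule” from employment law as a rough yardstick for disparate impact.

      ```python
      # Minimal disparate-impact sketch (made-up numbers, not Lending Club data).
      # The four-fifths rule is used only as a rough yardstick: if one group's
      # approval rate falls below 80% of the most favored group's rate, the
      # model deserves a much closer look.
      approvals = {            # group -> (number approved, number of applicants)
          "group_a": (720, 1000),
          "group_b": (450, 1000),
      }

      rates = {g: approved / total for g, (approved, total) in approvals.items()}
      best = max(rates.values())

      for group, rate in sorted(rates.items()):
          ratio = rate / best
          flag = "FLAG" if ratio < 0.8 else "ok"
          print(f"{group}: approval {rate:.1%}, ratio to best {ratio:.2f} [{flag}]")
      ```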

      Cathy

      • August 29, 2013 at 4:56 pm

        Again, so, why state something as fact if you cannot prove that it is true? Even if you “believe” there is a 100% chance I still don’t think you should state it as fact without one piece of evidence. I follow this industry very closely and I haven’t heard any mention of Lending Club using Facebook data in their underwriting.

        • August 29, 2013 at 5:03 pm

          As far as I know they haven’t said anything one way or the other. Why would they address this? It makes them look bad. I am accusing them of something I believe to be true. I agree that I haven’t proven it, but I’d like to have that discussion with them, publicly, at any time.

  8. CP
    August 29, 2013 at 12:31 pm

    Math Babe, think about it a different way. All these bankers have tons of cash, they have no obligation to part with their money. But they do. They ‘lend’ the money to someone who is in need. The borrower gets to use that money to enrich themselves in a way that allows them to pay back the money. The lender in turn gets a little growth on their capital.

    Imagine you are a VC and any person comes to you with a startup idea. Aren’t you going to look at their business plan and make sure it is sound? Aren’t you going to look for some indication that this person can not only envision what he/she wants to do but is also capable of executing on that vision? After all, if the money does not grow, biggest loser is the VC firm.

    Yes, using blanket conditions like race, gender, religion, etc. is not ethical at all. Frankly, with the sudden rise in all the social data that is freely available, our society needs to have a conversation about what is okay to use and what is not. If using Facebook data is okay for a lending agency, why is Facebook not liable for publicly sharing all that info about someone? Why aren’t we trying to figure out what is mine and what is Google’s or Facebook’s property?

    Background checks are routinely done prior to hiring someone. A credit check is routinely done before lending to someone. These were established ways of identifying risky candidates…not anymore. If we want to regulate what people can use to make decisions about a candidate, we must also think about what data is publicly available about that person.

    After all, aren’t Google and Facebook monetizing what they know about us in some way?

    • August 31, 2013 at 1:01 am

      CP – You misunderstand how bankers lend out money. They do not necessarily have ‘tons of cash’ just hanging about to be lent out. All money is currently based on debt and is created out of thin air by moving a bank balance up and giving the borrower a cheque book. No bank lends out its assets or its depositors’ assets. It simply creates ‘cheque money’. For example, if you received a mortgage of $100,000 @ 10% for a year, the bank would simply create the money by raising the balance in your new account to $100,000. At the end of the year, when you repay the mortgage, the bank’s profit is $110,000 less their paperwork expenses, or about $109,000. If the borrower only repaid 50% of the mortgage, the bank would only make ($50,000 – $1,000) $49,000 on the loan. What the bank loses is the “opportunity” to make another $60,000! Consequently, even lending to demi-deadbeats is profitable for a bank. By using social markers, and not financial markers like income and debt repayment history, a bank is making credit available by social class rather than by the creditworthiness of the applicant. This practice of deciding creditworthiness based on social factors MAY indicate ability to pay, but it will not indicate willingness to repay nor the real creditworthiness of an applicant, which is why it is an objectionable practice: it discriminates against individuals based on their social class or their perceived social class.

  9. August 29, 2013 at 3:11 pm

    Silent failure: “people who don’t get jobs” (or credit) “won’t know why.”

    This has been true for a long time. Does regulation make silent failure more or less likely? Based on the regulation-aware hiring committees that I’ve been on, I’d say far more likely. Requiring employers to justify rejecting a qualified applicant almost guarantees that the reason provided will be sanitized.

    Silent failure has always been a feature/bug of labor and credit markets. Most people just won’t admit that they found an applicant “domineering,” “insecure,” “incompetent,” “untrustworthy,” “insincere,” “intimidating,” “weak” or that they didn’t fit into the organization’s culture. Regulations intended to remedy historical inequalities have exacerbated today’s culture of silent failure and fear among employers and creditors. As long as the fear is there, rejected applicants can never be sure they are getting honest feedback. There are no easy answers to this. If you want honest invaluable feedback, you need to ask for it in a way that eliminates the fears of the employer or creditor.

    So, in my mind, the problem with Lending Club and others who mine social networks for insights into credit and employment decisions isn’t that they circumvent relatively ineffective regulations and anti-discrimination laws. It’s far more serious than that.

    These “mining” organizations create transactions based on asymmetric information that instils fear and distrust into society. They know everything about their applicants/customers/investors but the applicant/customer/investor knows almost nothing about the opaque transaction. More importantly, the applicant doesn’t know what the “mining” organization knows about them. The power imbalance will inevitably be leveraged in unhealthy ways – servicing abuses, requests to “mine” wealthy or influential friends for business purposes, etc….

    Unlike Abe’s small-town banker (comment #1), Lending Club investors have absolutely no incentive to build desirable communities or to thwart profitable but socially destructive investments. Not being a visible part of a community, these investors and Lending Club have little or no reputational risk – at least not compared to a small-town banker, whose practices are far better known in the community.

  10. wb
    August 29, 2013 at 3:27 pm

    >> Turns out that bypassing credit score regulation is great for business, so of course.
    so what you’re saying is that race, religion, etc. do improve credit forecasts?

    • August 29, 2013 at 3:29 pm

      I think it might. But I also think that’s a product of other racist elements of society.

  11. August 29, 2013 at 8:33 pm

    The comment “support a mechanism that allows individuals to interact more directly as debtors/creditors” – good idea, but they don’t want you to interact with them… sad and maddening, but it is what it is. I see it in my healthcare writings. Pay attention to what Cathy says here, as it’s true. Also, commenter Greg Taylor above knows his stuff.

    Credit agencies mining social data, yeah they are, I blogged it a while back..

    http://ducknetweb.blogspot.com/2013/05/credit-agencies-mining-social-networks.html

    I used to program, and years ago I integrated a medical records program I created with billing software that I did not write. I was going back through, verifying my queries and matching on unique IDs, and I thought I either had a bug or the doctor’s office had some data input errors… it turns out there were no bugs and no data input errors; the payment algorithms were just that low, ok? That’s what got me on my mission to research and educate myself, as I came from an entirely different focus in the logistics area, where all the algorithms and software were miles ahead. How many times have you seen on the news that some patient was denied care by an insurer? We still see it, and the State of California just pushed United Healthcare to finally cover speech therapy when it was health related, like after a stroke. They would deny those claims right and left with their algorithms. Ok, enough with that example, but this is why you need to listen to what Cathy says, because she’s right, and when you have “been there, seen that” there’s nothing else anyone can say.

    Today I wrote about a new “low” with Humana and Lilly Pharma and their agreement to mine and sell research data. United’s already doing it, and this conflicts directly with the FDA Sentinel Initiative, as insurers found out they could make a buck, basically discounting any research data the government produces.

    When it comes to “flipping algorithms for profit,” folks sadly query, market and sell anything that will fly. FICO is a good example: they sell an analytics service that “scores” you using your credit data, web data and other sources they won’t reveal, which they say predicts whether or not you will take your prescriptions… bunk. I know, and it got so bad that FICO broke down and bought a medical revenue cycle company so they could supposedly keep their integrity in this area and keep selling those analytics to drug and insurance companies.
    I wrote a series of blog posts, the Attack of the Killer Algorithms :), focused on everyday events like this, to try and bring some awareness, as so much of the data is getting flawed too with the selling epidemic side of all of this. Lending Club has to be buying data from somebody out there, and I have had my little blog campaign going that we need to license (and excise tax) the data sellers, so we have some record of who is selling what kind of data to whom, plus a federal site where consumers can look it up. We can’t stop it, but this would be a key response to the epidemic that’s going on out there.

    Your data is all over the web and you have no clue how it got there, and on top of that there’s a good chance there are flaws in it too. As an example, Walgreens makes about a billion a year selling data alone; those are the folks to excise tax, as they sell tons of data. Banks too make billions. At least with a license we could give regulators and law enforcement some kind of leg to stand on to fine, take away a license, and investigate some of this activity. You need a license to get married, sell real estate, practice medicine and so on, and yet this field is like the wild west with nobody watching. I have written to the FTC a number of times. One of their lawyers answered me once, so when I have an update on this topic he gets it :). It’s about the best I can do.

    So we have Lending Club sitting around buying and mining data… well, they are contributing to the problem, and this is why manufacturing is having such a hard time, as everyone wants to flip algorithms for money. Once the code is done and in place it’s on autopilot and the money rolls in without having to hire employees, start a plant, or whatever.

    Mr. Summers is definitely a contributor to the problem of epidemic data selling and the flaws and inequality it creates, and yes, we don’t need him running the Fed, as God only knows what would roll out of his head there. He could turn the entire Fed into a data mining operation :). A while back I chatted with a banker incognito and we both agreed that banks are not much more than software companies that control and move a lot of money, and as long as they have the code and models to control it, we don’t stand a chance. Sorry to get on such a big soapbox here, but it goes back to my past in seeing this, and I’m in there too, getting fleeced with all of this. From what I have read I don’t think Larry Summers is smart enough to see the data selling epidemic as it is (most are not), but, being all about “Larry,” he’s going to play it to line his pocket as long as people throw money in his direction.

    You deserve to know why you were rejected and what kind of algorithm (in my case) did it, and until this is somehow addressed, the functions keep running on servers 24/7, moving and taking our money, one algorithm at a time.

  12. savanarola
    August 30, 2013 at 3:48 am

    There is another set of laws involved here, 15 U.S.C. 1681 et seq., the Fair Credit Reporting Act. It sounds as though Lending Club functions as a credit rating agency: it assembles essentially a dossier of information on an individual and uses it to make credit decisions. In effect, it is collapsing different functional roles in the statute into one entity. Under the definitions in the FCRA, that comes with obligations – and they specifically apply to these “silent failure” situations. If you are denied credit on the basis of a score attached to your name, you are supposed to be able to discover that score and correct anything incorrect in it. It will be interesting to see how they get around this. Given the current state of the law, you probably waive all rights when you apply through their website somehow – we’re perilously close to that being ok in this country at this point. But I do find it interesting. I can’t see how it would fly under the FCRA.

  13. ssanborn2013
    September 6, 2013 at 4:28 pm

    On behalf of Lending Club I would like to officially make two corrections: First, LendingClub’s program is operated in conjunction with an FDIC-regulated bank and complies with applicable regulations, including ECOA. Second, each declined applicant is provided a notice of the reasons why they could not be approved for a loan, in accordance with applicable regulations. Would you consider making a correction to your post?

    • September 6, 2013 at 5:38 pm

      Interesting.

      I wasn’t actually claiming that Lending Club does anything illegal, more that the proprietary model which you guys use bypasses the spirit of the law. In other words, the fact that you follow the laws doesn’t surprise me.

      On the other hand, it does surprise me that you can give a given person such a notice, considering that you do use a hybrid of measurements on individuals to determine their creditworthiness.

      Does this mean that you are willing to demonstrate that your proprietary model completely avoids using race, color, religion, national origin, sex, marital status, age, or public assistance status? If so I’d be impressed indeed and would certainly write a follow-up post.

  14. September 6, 2013 at 6:45 pm

    The amount of flawed data out there is on the rise for sure, and again, without knowing what all the sources are, you don’t know what they are using. Banks do it, companies do it, and we don’t know. Even when you as a consumer request a copy of your credit information, you don’t get all the stuff that someone inquiring about you gets… 60 Minutes proved that in a story they did a while back, so why would Lending Club act any differently in that respect?

    http://ducknetweb.blogspot.com/2013/02/60-minutes-confirms-super-attacks-of.html

    And for those who don’t qualify, and maybe for those who do, guess what, there’s data for sale :). Believe me, nobody – banks, companies – wants consumers to know about the billions they make selling data; Walgreens is good for about a billion a year, so add it up if you will. That could be a side business here as well. I looked at the site’s privacy statements and they resemble all the others out there, with their third-party statements and a few other items about selling data… so again, if you don’t get a simple, easy-to-read privacy statement and instead get pages of legal mumbo jumbo, I assume there’s a very strong chance that data selling is going on, and to whom and where we don’t know.

    Samsung, which I posted about today, just joined the ranks in the mobile app department. SAP is trying to get cell phone data from Verizon with their proposition and is willing to split the additional profits with them, versus Verizon doing their own data selling… crazy stuff…

    Something I wonder about too is how accurate the reporting is on what they make selling data… a good question, I think, as if one were to model profitability along that line, then risk for investors could come out looking a lot better too… just me thinking out loud there. You almost can’t venture very far anywhere today without considering the data selling impact, as it has turned into a very, very profitable side business for banks, companies… you name it, and they don’t want you to know how much they make :). I know I’m a broken record on this, but it is to the point where it is creating economic disruptions and is why manufacturing is hurting: why build factories and hire people when you can just put some algos out there, mine the data, hire a query master, figure out your marketing to sell your new queries as valuable analytics (whether they have value or not), and off they go. Sad that this is the reality of what’s happening today.

    • Abe Kohen
      September 9, 2013 at 7:37 pm

      Maybe Cathy wants to do a story about Acxiom’s “AboutTheData.com,” which, judging from what they do expose, is full of errors.

      • September 9, 2013 at 10:14 pm

        I’m up for that for sure! I talk about flawed data all the time, and I commented somewhere along the line about Acxiom’s site where they say you can see your data. They are out of their minds to expect the public, on their own dime, to be the free labor that fixes all their errors after they have already pulled in a few million or billion selling it.

        They tried that with doctor referral sites and it didn’t work, as doctors were not going to fix the mistakes the data folks make. The marketing trying to get them to do so was a rip-off and kind of exaggerated too, telling MDs they could take over their web reputations and get it right… no interest, as they don’t have time, like we don’t, to go through and fix everything others reported wrong about us. I got onto all of this about 3 years ago with the MDs and had a nice interview with the AMA, as they got intrigued when I found my former doctor, who had been dead for 8 years, on just about all of the sites, so how often do they update their data (grin)?

        The other folks maybe update a bit more frequently than that, but still not often enough, and why do you think you have such a hard time fixing credit information? That’s overhead and they want to make money, so it’s a very low priority. I know I sound like a broken record, but everywhere I look at data and the movements of the business world, with tangible suffering, it all keeps pointing back to data selling; there are huge profits in it, and as long as the government and public don’t get it and can’t see behind closed server doors, it continues. It’s more than just the brokers: we have banks, insurance companies, device companies, you name it, all flipping algorithms for profit, and they don’t care about the accuracy or the impact on consumers when it reaches the bottom end.

        I think Acxiom is a little aware that the public is getting a bit smarter, so they are playing defense here with “allowing” you to see what they have; otherwise they would not divulge it in a thousand years.

  15. johngee34
    September 25, 2013 at 10:18 pm

    Too much power corrupts; absolute, unregulated power corrupts absolutely! There is no rhyme or reason for institutional racism – it just exists.
