
How is math used outside academia?

September 7, 2012

Help me out, beloved readers. Brainstorm with me.

I’m giving two talks this semester on how math is used outside academia, for math audiences. One is going to be at the AGNES conference and another will be a math colloquium at Stony Brook.

I want to give actual examples, with fully defined models, where I can explain the data, the purported goal, the underlying assumptions, the actual outputs, the political context, and the reach of each model.

The cool thing about these talks is I don’t need to dumb down the math at all, obviously, so I can be quite detailed in certain respects, but I don’t want to assume my audience knows the context at all, especially the politics of the situation.

So far I have examples from finance, internet advertising, and educational testing. Please tell me if you have some more great examples; I want this talk to be awesome.

The ultimate goal of this project is probably an up-to-date essay, modeled after this one, which you should read. Published in the Notices of the AMS in January 2003 by Mary Poovey, it explains how mathematical models are used and abused in finance and accounting, including how Enron booked future profits as current earnings and how it manipulated the energy market. From the essay:

Thus far the role that mathematics has played in these financial instruments has been as much inspirational as practical: people tend to believe that numbers embody objectivity even when they do not see (or understand) the calculations by which particular numbers are generated. In my final example, mathematical principles are still invisible to the vast majority of investors, but mathematical equations become the prime movers of value. The belief that makes it possible for mathematics to generate value is not simply that numbers are objective but that the market actually obeys mathematical rules. The instruments that embody this belief are futures options or, in their most arcane form, derivatives.

Slightly further on she explains:

In 1973 two economists produced a set of equations, the Black-Scholes equations, that provided the first strictly quantitative instrument for calculating the prices of options in which the determining variable is the volatility of the underlying asset. These equations enabled analysts to standardize the pricing of derivatives in exclusively quantitative terms. From this point it was no longer necessary for traders to evaluate individual stocks by predicting the probable rates of profit, estimating public demand for a particular commodity, or subjectively getting a feel for the market. Instead, a futures trader could engage in trades driven purely by mathematical equations and selected by a software program.
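For concreteness, the pricing machinery the quote describes can be sketched in a few lines, assuming the standard textbook Black-Scholes formula for a European call (generic background, not code from Poovey's essay):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    # S: spot price, K: strike, T: years to expiry, r: risk-free rate,
    # sigma: volatility of the underlying -- the "determining variable"
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# an at-the-money one-year call at a 5% rate and 20% volatility
price = black_scholes_call(100, 100, 1.0, 0.05, 0.20)  # ≈ 10.45
```

Note what makes Poovey's point vivid: every input except sigma is directly observable, so in practice the whole pricing debate collapses into a debate about volatility.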

She ends with a bunch of great questions. Mind you, this was in 2003, before the credit crisis:

But what if markets are too complex for mathematical models? What if irrational and completely unprecedented events do occur, and when they do—as we know they do—what if they affect markets in ways that no mathematical model can predict? What if the regularity that all mathematical models assume effaces social and cultural variables that are not subject to mathematical analysis? Or what if the mathematical models traders use to price futures actually influence the future in ways the models cannot predict and the analysts cannot govern? Perhaps these are the only questions that can challenge the financial axis of power, which otherwise threatens to remake everything, including value, over in the image of its own abstractions. Perhaps these are the kinds of questions that mathematicians and humanists, working together, should ask and try to answer.

  1. aaronschumacher
    September 7, 2012 at 10:02 am | #1

    By educational testing, do you mean VAM? I think that would be a great case; lots of different (but often largely similar) models, with confusion over what the real meaning of the results is and much hanging on how the results actually get used.

  2. September 7, 2012 at 10:05 am | #2


    Yes that’s what I mean, sorry I should have been more precise.


  3. Ashim
    September 7, 2012 at 10:18 am | #3

    Perhaps a note along the lines of Nassim Nicholas Taleb’s Fooled by Randomness / The Black Swan, saying that the statistical modelling of events in finance as games of chance doesn’t work very well. Also I think there is an article on statistical modelling in biology vs. mathematical models in biology, titled something like “Why Most Published Research Findings Are False”. People who study PDEs would be happy to hear that PDE models are considered by some people to be better than statistical models.

  4. Deane
    September 7, 2012 at 10:26 am | #4


    I think it might be useful to distinguish between situations where math is used as a relatively static set of tools, applied in a more or less algorithmic fashion, and situations where a mathematician is needed on an ongoing basis to analyze new situations, either by adapting existing tools and algorithms or by developing new ones.

    I’ll let others discuss specific applications, but I just wanted to make one philosophical remark. My experience (mostly with valuation models for fixed-income securities and interest rate derivatives) has been that my most useful role as a mathematician has been to guide others in making sure that the mathematical tools being used are appropriate and are being used properly. For many reasons that I’m sure you know, this usually means avoiding unnecessary complication and sophistication and keeping things as simple as possible. I am often steering people away from the fancier mathematical tools they have heard about and showing them how things can be done more simply and directly, using ideas and tools that even non-mathematicians can understand. Of course, there are also times when I have to show people that what they are doing is too simplistic and they really need to use something more sophisticated.


  5. Steve Stein
    September 7, 2012 at 10:32 am | #5

    You really really need to talk to Loren Shure. (HCSSiM ’71, now at MathWorks)

  6. Steve Stein
    September 7, 2012 at 11:10 am | #6

    How about sports? How do you determine how to value a player? Tom Tango is among the leading practitioners nowadays. You could peruse the wiki that’s set up for his work “The Book – Playing the Percentages in Baseball”. For instance, this: http://www.tangotiger.net/wiki/index.php?title=Linear_Weights
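Linear weights is one of the simplest fully specified models of this kind: a player's offensive value is just a dot product of his event counts with per-event run values. A sketch with approximate coefficients (the exact run values vary by era and dataset, so treat these numbers, and the season line, as illustrative):

```python
# approximate per-event run values from the sabermetric literature
run_values = {"BB": 0.33, "1B": 0.47, "2B": 0.78, "3B": 1.09,
              "HR": 1.40, "out": -0.27}

def runs_above_average(events):
    """Dot product of event counts with their run values."""
    return sum(run_values[e] * n for e, n in events.items())

# a hypothetical season line
season = {"BB": 60, "1B": 100, "2B": 30, "3B": 3, "HR": 25, "out": 400}
raa = runs_above_average(season)  # ≈ 20.5 runs above average
```

The political angle is the same as with VAM: the model is linear and transparent, but the coefficients are estimated, and careers ride on them.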

  7. Lew
    September 7, 2012 at 11:49 am | #7

    Taleb’s “Black Swan” and some of his writing available online have more details on both financial modeling and problems with the standard Black-Scholes model.

  8. September 7, 2012 at 12:00 pm | #8

    music makes mathematics

    Hippasus of Metapontum, a pupil of Pythagoras around 500 BC,
    was a musician, mathematician, and philosopher, all of these belonging together.
    He asked this question:
    if I have two tones that are one octave apart,
    we know they sound good together, like the two Sirens of Odysseus.
    Which tone lies in the middle?
    What is the middle between these two tones?
    He answers his question three times; the result is astonishing and wonderful.

    The first middle is called the arithmetic middle, because we calculate it:
    from the whole string we take a part away,
    and to the half string we add the same part.
    The part that works is 1/3:
    the whole string is 66 cm long, minus 1/3 (22 cm) gives 44 cm;
    the half string is 33 cm long, plus 1/3 (11 cm) gives 44 cm;
    so 44 cm is my middle note (ratio 3:2).
    The second middle is called the harmonic middle, and we get it by simply
    dividing the half string in two: between the whole string (4/4) and the half string (2/4) lies 3/4 of the string; half of 33 cm is 16.5 cm, so 33 + 16.5 = 49.5 cm (ratio 4:3).
    The third middle is called the geometric middle and is built like this:
    I take the length of the half string and multiply it by some factor to get my new middle length.
    I multiply the middle length once more by the same factor to get the length of the whole string.
    So starting from the half string, I multiply twice by the same factor to reach the whole string:
    (length of half string) × factor = middle,
    (middle) × factor = (length of whole string) = 2 × (length of half string).
    The factor that works is the square root of 2 (1.41…).

    And now something strange happens: the square root of 2 cannot be written as a ratio of whole numbers, as our two other middle tones can.
    We can only approximate this number, about 1.41; we can never write it down exactly. When playing this note, we can hear that: we play around a note.
    We call this number irrational, because we cannot grasp it with a ratio.
    And whenever we make music, we play around this tone, around this irrational number, and that is one reason music is so beautiful.
    The combination of tones whose ratios are whole numbers with tones that are irrational makes music so exciting.
    This was also the first time in human history that people thought about numbers as numbers, and from here Western mathematics evolved.

    The fact that music evokes Greek mathematics seems so unthinkable that until now (2007) nobody has dared to write it down. (Professor Kittler)
    And this mathematics tells us that fundamental insolubilities exist:
    the geometric middle in music can only be approximated. And that makes music so beautiful.

    From “Musik und Mathematik I”, Professor Friedrich Kittler
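The three middles in the comment are the three Pythagorean means, which are easy to verify directly (note that under the modern naming convention the 44 cm fifth is the harmonic mean of 66 and 33, and the 49.5 cm fourth is the arithmetic mean):

```python
from math import sqrt

whole, half = 66.0, 33.0  # string lengths in cm

arithmetic = (whole + half) / 2               # 49.5 cm, ratio 4:3 (the fourth)
harmonic = 2 * whole * half / (whole + half)  # 44.0 cm, ratio 3:2 (the fifth)
geometric = sqrt(whole * half)                # ≈ 46.67 cm, ratio √2 (irrational)
```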

  9. David Kemnitzer
    September 7, 2012 at 12:10 pm | #9

    A couple of obvious uses of math in public policy:
    1. Pollution control…models of dispersion, concentration, transformation, degradation of chemicals in the environment
    2. Climate change
    3. Actuarial predictions that “inform” debates about Social Security and Medicare
    4. Budget and deficit projections (have you noticed that the debates about budgets, debt, and deficits all use 10-year figures — in other words, FICTION — nowadays?)

  10. somedude
    September 7, 2012 at 12:44 pm | #10

    Well I think that cryptography counts. They use elliptic curve cryptography nowadays.

    Furthermore, a nice one is buying and selling oil or gas as a middleman. Buying a lot gets you the stuff cheap, but if the winter is warm you may not sell much and can still take a loss.
    Some bigger players can also store the stuff, which makes for some nice models using statistics and probability theory.

    Same mathematical principle if you want to make a train schedule. Trains have a certain spread in the time it takes to go between two stations. People like trains to both be on time and to run often, and you need to strike a balance when you design your train schedule.
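The oil-middleman tradeoff above is essentially the classic newsvendor problem; a small simulation sketch (all prices and the demand distribution are made up for illustration):

```python
import random

random.seed(1)
buy, sell, salvage = 60.0, 100.0, 20.0  # illustrative prices per unit

def profit(q, demand):
    sold = min(q, demand)
    return sell * sold + salvage * (q - sold) - buy * q

def winter_demand():
    # warm winters mean low demand, cold winters high (illustrative model)
    return max(0.0, random.gauss(1000, 300))

def expected_profit(q, trials=20000):
    return sum(profit(q, winter_demand()) for _ in range(trials)) / trials

# grid-search the order quantity; newsvendor theory says the optimum is the
# demand quantile at the critical ratio (sell-buy)/(sell-salvage) = 0.5,
# i.e. the median demand of 1000 in this setup
best_q = max(range(0, 2001, 50), key=expected_profit)
```

The same expected-cost-versus-expected-shortfall structure shows up in the train-schedule example: padding the timetable is "ordering" more slack than the typical delay requires.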

  11. September 7, 2012 at 12:58 pm | #11

    Fascinating discussion, I wish I had the current mathematical experience to provide specific examples, but I agree with Ashim that medical statistics must be a goldmine of radical abuse, especially in market-driven pharmacological “research”, both in corporate labs and in corporately-funded academic labs. The corruption involved, often conveniently ignored in the Main $tream Media, seems to only emerge from occasionally-revealed records of a proliferation of lawsuits or unredacted regulatory rulings. #OccupyPharma

    • Ashim
      September 7, 2012 at 1:35 pm | #12

      I was referring to the fact that many statistical results are simply due to “chance”. No clue about corruption in Medical Stats.

  12. limeade17
    September 7, 2012 at 1:13 pm | #13

    We use math to help big health insurance companies go through their claims data and figure out where certain providers seem to have unbelievably sick patients (and thus, conveniently, higher reimbursement rates). Then they sit these doctors down for a talking-to, with some handy graphs and charts and lawyers.

  13. Geoff
    September 7, 2012 at 2:53 pm | #14

    ++ to the crypto comment above.

    And, a fun application might be self-driving cars — e.g., CMU’s Red Team, Stanford’s Junior, and Google’s self-driving car. Going from sensor data to a model of the external world is a pretty interesting problem, as is discovering a collision-free and law-abiding path in real time.

  14. September 7, 2012 at 3:12 pm | #15

    You might look at all the math that needs to be understood to create and support MATLAB and the various add-on products where I work, MathWorks. Here, many mathematicians have careers making new numerical algorithms for applications as well as making existing algorithms accessible to technical people who have expertise outside of computing but need to understand data.

  15. Geoff
    September 7, 2012 at 4:14 pm | #16

    Also, there are a number of (to me) beautiful results about fast algorithms for various learning-related problems. Some examples:

    The log-barrier method for solving linear programs. Before the 1980s, it was not known whether there were guaranteed-efficient algorithms for this problem, and the discovery of fast methods led to an explosion of applications (Google CPLEX and look at their promotional material for a list of some). The proof that this method works is quite interesting: it blends Newton’s method with path-following in a way which (to me at least) seemed unlikely to work, but in fact converges globally and rapidly.

    Locality-sensitive hashing: for finding approximate nearest neighbors quickly in high dimensions, which is a component of many statistical algorithms. (E.g., find the people who liked similar movies to me, in order to make recommendations for what I might want to watch.)

    The lasso and compressed sensing: a particular convex relaxation of a non-convex optimization problem works with high probability, even though as an optimization relaxation it seems pretty bad. The results include better image reconstruction (e.g., from things like CAT scans) and statistical learning from tiny amounts of data (e.g., a million gene activity measurements per individual in a microarray, and only a few hundred observed individuals we can use to figure out which genes are relevant).
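Of the three, locality-sensitive hashing is the easiest to sketch: the random-hyperplane variant for cosine similarity needs only the standard library (the vectors and dimensions here are made up):

```python
import random

def sketch(v, planes):
    # sign pattern of dot products with random hyperplanes;
    # nearby vectors agree on most bits with high probability
    return tuple(sum(x * w for x, w in zip(v, p)) >= 0 for p in planes)

def hamming(s, t):
    return sum(a != b for a, b in zip(s, t))

random.seed(0)
dim, n_planes = 5, 16
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]

a = [1.0, 0.9, 0.1, 0.0, 0.2]
b = [1.0, 0.8, 0.2, 0.1, 0.2]    # nearly parallel to a
c = [-1.0, 0.1, 0.9, -0.5, 0.0]  # far from a

# a and b should collide on far more hash bits than a and c
close = hamming(sketch(a, planes), sketch(b, planes))
far = hamming(sketch(a, planes), sketch(c, planes))
```

Bucketing items by their bit signature then turns nearest-neighbor search into a hash lookup, which is the whole trick behind the movie-recommendation example.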

  16. September 7, 2012 at 4:52 pm | #17

    Models used to map social networks and glean information from them might lead to an interesting discussion. Not sure where you can get good examples though.

  17. Zebrina
    September 7, 2012 at 6:33 pm | #18

    I’m a geophysicist. We use stochastic modelling and inversion algorithms to predict accumulations of oil and gas in the subsurface.

  18. September 7, 2012 at 9:28 pm | #19

    Hey, another thing that may be of interest: The application of logic, category theory and whatnot in functional programming and programming in general.

    Disclaimer: I started programming when I was 10 and have spent ~15 years on that now. About 5-6 years ago, I took the functional programming path, and now dependent types are dragging me ever deeper down into the foundations of math. Back then I was like “yuck, math!”; now that’s changed. I still fail at calculus 101. My knowledge has huge gaps, I may explain things oddly here and there, and I tend to ramble on and on. Anyway, here goes:

    Every type system induces a logic, hence every programming language with types has a corresponding logic. If I have a function that takes an A and produces a B upon execution (and that function passes the type checker), I have a constructive proof of A ⇒ B in that logic. (The function need not even be used / executed, the fact that it typechecks is enough.) Most of these logics are inexpressive and wildly inconsistent (just look at C or C++). They exist, nonetheless.

    If the logic *is* consistent, then no program written in that language can crash. This is a very nice property to have for reliability and it’s also a stepping stone in the defense against hacking and exploitation. (Another side effect is that it may suddenly make sense to write proofs as programs in that language, because proofs are just tests that are run on all possible inputs at once, without actually running anything.) By restricting the feature (sub)set of a language that one uses, one can get a more sane logic, up to (relative) consistency.

    Functional programming languages usually have a quite expressive and relatively sane logic in their type system. Haskell has some unsafe functions that break things (but you have to explicitly import them) and non-termination will also break the logic (take f : A → B ≔ x ↦ f x (for any A, B)). There might even be a soundness proof for Haskell somewhere (under the assumption that you stay clear of those things), and I could keep rambling about Haskell: Types and functions form a category, most interesting things that somehow influence control flow form (endo)functors or natural transformations (and people (programmers!) start to look for improvement when they don’t), … but that’s enough, you can easily search for more.
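A toy version of this propositions-as-types idea can even be written in Python's type-hint syntax (Python's type system is nowhere near a consistent logic, so this is purely a mnemonic sketch): tuples play "and", functions play "implies", and a function that typechecks is a constructive proof.

```python
from typing import Callable, Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def fst(p: Tuple[A, B]) -> A:
    # proof of (A and B) implies A
    return p[0]

def swap(p: Tuple[A, B]) -> Tuple[B, A]:
    # proof of (A and B) implies (B and A)
    return (p[1], p[0])

def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    # chaining implications: from A => B and B => C, conclude A => C
    return lambda a: g(f(a))
```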

    I think that with Haskell, Scala, and F#, functional programming has hit the mainstream. Heck, there are multiple Haskell web frameworks, people write games in Haskell, there’s the “Real World Haskell” book (http://book.realworldhaskell.org/)… and then there’s also https://ocaml.janestreet.com/ with tons of information for OCaml, taken from real world (financial) programming. Oh yeah, and there are blog posts on category theory in JavaScript.

    Concurrency is more and more of an issue with growing amounts of data, especially now that you can no longer just make processors faster, and functional programming brings with it a mindset that makes it much easier to parallelize a lot of things. Map/reduce is everywhere on the web.

    Functional idioms (which are just math in disguise) ease maintenance. Math is old and time-tested, computers and programming are comparatively young. There are many structures in math that help with working in a clear and efficient way, and in a functional language you can pretty much just copy them into your code.

    People rely on types as documentation, as a safety net, as a tool that allows monkey-coding while guaranteeing that stuff does what it should. (Done that myself, it works!) If you’re using functional programming languages, you can’t avoid being exposed to logic and a bit of category theory. Stupid-simple data structures are the equivalent of common logical constructions (and, or, …), monads (the main abstraction for effectful computations in Haskell) come from category theory, and many interesting data structures form (endo)functors. Sooner or later, you’ll have to use these. It’s also efficient to use these abstractions because (once you’ve understood them) they make it easier to think about the code, and it’s efficient to learn more and more math because that makes it easier to program. It also makes it easier to test and verify code. Pragmatism drags you on.

    Types are a drug. Once you’ve grokked the possibilities, you probably can’t stop. (I couldn’t.) At some point in time, you might want to get rid of Turing completeness to make the logic sound. (All practically useful computations terminate, so you don’t need – and don’t want – nontermination. Plus, there are category-typehackery things that allow sensible non-terminating programs while keeping the bad stuff out: Flip the arrows in your data and get codata.) Once you’re past that point, you’re a lost cause, far removed from the real world. You’ll want ever more expressive types. Dependent types allow values in types, hence functions as well, which means that you can encode any proposition as a type. (The previous parameter is prime. This parameter, taken from the carrier type(/set) F_(2^32), denotes a valid file handle, belonging to a file that was opened before, with sufficient permissions for the action. …) If there is a value of a given type, the corresponding proposition is true – if your logic is consistent. Suddenly, you’re at the foundations of math, brutally confronted with undecidability, staring into the nothingness, juggling axioms, amazed by independence proofs and inconsistencies. Well, at least I am.

    Now back up, but stop at Turing completeness: Somewhere on the boundary between academia and hacking there’s LANGSEC (http://langsec.org/occupy). To them, exploits are constructive proofs of the insecurity of a system, programs that run on the “weird machine” formed by the input handling routines and any bugs or functions present in them. They show how to use theory to find exploitable bugs, and they show how to use theory to avoid whole classes of bugs. Hackers are picking up speed (and theory). Writing secure software will require more and more theory somewhere in the system, and some of it is bound to leak through to the mere ‘mortal’ programmers.

    Interesting times ahead!

    (If anything of what I’ve written is relevant to you, ping me and I’ll happily explain more or answer questions.)

  19. September 7, 2012 at 10:54 pm | #20

    See the INET interview with Judy Klein:
    The Rules of War and the Development of Economic Ideas

  20. September 7, 2012 at 11:00 pm | #21

    See McCorduck’s Machines Who Think after 25 Years: Revisiting the Origins of AI by Philip Mirowski of University of Notre Dame. You might find his essay of use. Available at:

  21. lew
    September 8, 2012 at 4:30 pm | #22

    Just ran across this, thought it might be interesting in this context.
    Dan Kalman, “Elementary Mathematical Models: Order Aplenty and a Glimpse of Chaos”.

    The language of mathematics has proven over centuries of application to be an indispensable tool for the expression and analysis of real problems. With numerical, graphical, and theoretical methods, this book examines the relevance of mathematical models to phenomena ranging from population growth and economics to medicine and the physical sciences. In a book written for the intelligent and literate non-mathematician, Kalman aims at an understanding of the power and utility of quantitative methods rather than at technical mastery of mathematical operations. He shows first that mathematical models can serve a critical function in understanding the world, and he concludes with a discussion of the problems encountered by traditional algebraic assumptions in chaos theory. Though models can often approximate future events based on existing data and quantitative relationships, Kalman shows that the appearance of regularity and order can often be misleading. By beginning with quantitative models and ending with an introduction to chaos, Kalman offers a broad treatment of both the power and limitations of quantitatively-based predictions.

  22. Polytrope wants a cracker
    September 9, 2012 at 3:30 am | #23

    This looks interesting, a declassified 1967 paper that discusses how the U.S. Navy used Bayes’s Theorem to predict the location of submarines:
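The core of that Bayesian search method is a simple posterior update over a grid of cells: a failed search of a cell downweights that cell by the detection probability and renormalizes. A toy one-dimensional sketch (grid size, detection probability, and the greedy search policy are all illustrative, not from the declassified paper):

```python
p_detect = 0.8          # chance a search finds the sub if it is there
belief = [1 / 5] * 5    # uniform prior over five ocean cells

def update_after_failed_search(prior, searched):
    # Bayes: P(sub in i | empty search of cell s) ∝ prior_i * P(empty | i)
    post = list(prior)
    post[searched] *= (1 - p_detect)
    z = sum(post)
    return [p / z for p in post]

# greedily search the most likely cell; suppose three searches come up empty
for _ in range(3):
    target = max(range(5), key=lambda i: belief[i])
    belief = update_after_failed_search(belief, target)
```

With a uniform prior the ties break left to right, so cells 0, 1, and 2 get searched and the belief mass shifts toward the unsearched cells, which is exactly the behavior the Navy exploited.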

  23. Francisco
    September 9, 2012 at 12:53 pm | #24

    - Visual effects is a good area that is often overlooked. Notices of the AMS has a nice overview here: http://www.ams.org/notices/201005/rtx100500614p.pdf. In a nutshell, one has computational fluid mechanics for simulating various sorts of fluids (water, flames, smoke, etc.), continuum mechanics for flesh simulation and plastic deformation, various types of models for cloth simulation, and so on.

    - Image processing. This has become increasingly sophisticated in recent years and now incorporates non-linear functional analysis, wavelets and PDEs. There are many applications including image de-noising, restoration, de-blurring, edge detection, etc.

    These are more “input-output” type uses rather than “modelling for prediction” type uses (following the distinction made by Deane).

  24. September 9, 2012 at 7:04 pm | #25

    How about M&S of DIME(FIL)/PMESII used by DOD to gain a deeper understanding of environmental conditions and the cultural landscape? One such discussion exists in Human Terrain Systems, where data mapping is underway using multi-disciplined teams of sociologists, economists, psychologists, anthropologists, etc., to determine cultural influences affecting Afghan decision-making, strategy, and alliances. The purpose is to provide context to better inform USG and US military policy in the region; notice I said inform policy, not decide policy.




  25. September 10, 2012 at 8:25 am | #26

    All sorts of engineering design depends on using mathematical calculations. Whether building bridges, water supplies, buildings, electrical systems, mechanical devices, waste treatment and disposal, or any number of things, math is essential for determining what materials to use and what methods and sizes of materials will accomplish the job efficiently and effectively.

  26. jeg3
    September 10, 2012 at 10:22 pm | #27

    Fraud Detection (needed for current events):
    “Math Tree May Help Root out Fraudsters: Applying Algorithm to Social Networks Can Reveal Hidden Connections Criminals Use to Commit Fraud”

  27. Sara Billey
    September 15, 2012 at 1:23 am | #28

    As a mathematician, I would like to understand the model that Zillow uses to estimate home prices. I think they are doing a much better job than my local government. I know some people in both camps and neither group will tell me their model. Does anyone know what is being used? At least some of the government models should be publicly available.

    Does RealDirect do estimates? If so, do they explain how?

  28. Constantine Costes
  29. nicolas
    February 13, 2013 at 1:55 pm | #30

    “what if markets are too complex for mathematical models?” I like how one can sincerely ask a question everyone knows the answer to.

    oh really? and by using asynchronous series we underestimate correlation? hmmm… never thought so.. interesting idea, but let’s just call that unestablished theory, you look funny anyway.

    sorry to be so smug, but really..

Comments are closed.
