NHacker Next
In Math, Rigor Is Vital. But Are Digitized Proofs Taking It Too Far? (quantamagazine.org)
umutisik 1 hour ago [-]
With sufficient automation, there shouldn't really be a trade-off between rigor and anything else. The goal should be to automate as much as possible, so that whatever well-defined, useful thing a theory can produce comes out faster and more easily. Formal proofs make sense as part of this goal.
_alternator_ 45 minutes ago [-]
Let’s not forget that mathematics is a social construct as much as (and perhaps more than) a true science. It’s about techniques, stories, relationships between ideas, and ultimately, it’s a social endeavor that involves curiosity satisfaction for (somewhat pedantic) people. If we automate ‘all’ of mathematics, then we’ve removed the people from it.

There are things that need to be done by humans to make it meaningful and worthwhile. I’m not saying that automation won’t make us more able to satisfy our intellectual curiosity, but we can’t offload everything and have something of value that we could rightly call ‘mathematics’.

justonceokay 5 minutes ago [-]
> mathematics is a social construct

If you believe Wittgenstein, then all of math is more and more complicated stories amounting to 1=1. Like a ribbon that we figure out how to tie in ever more beautiful knots. These stories are extremely valuable and useful, because we find equivalents of these knots in nature—but, boiled down, that is what we do when we do math.

seanmcdirmid 42 minutes ago [-]
Automating proofs is like automating calculations: neither is what math is, they are just things in the way that need to be done in the process of doing math.

Mathematicians will just adopt the tools and use them to get even more math done.

quietbritishjim 38 minutes ago [-]
I don't think that's true. Often, to come up with a proof of a particular theorem of interest, it's necessary to invent a whole new branch of mathematics that is interesting in its own right, e.g. Galois theory for finding roots of polynomials. If the proof is automated then it might not be decomposed in a way that makes some new theory apparent. That's not true of a simple calculation.
ndriscoll 10 minutes ago [-]
This is literally the same thing as having the model write well factored, readable code. You can tell it to do things like avoid mixing abstraction levels within a function/proof, create interfaces (definitions/axioms) for common ideas, etc. You can also work with it interactively (this is how I work with programming), so you can ask it to factor things in the way you prefer on the fly.
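Factoring a proof through named lemmas really is the same move as factoring code through helpers. A minimal Lean 4 sketch (hypothetical lemma name, core tactics only):

```lean
-- A named helper lemma acts like an interface for a common idea...
theorem double_eq_two_mul (n : Nat) : n + n = 2 * n := by
  omega

-- ...so later proofs can invoke the idea by name instead of re-deriving it.
example (n : Nat) : (n + n) + (n + n) = 4 * n := by
  rw [double_eq_two_mul]
  omega
```

The second proof would go through with `omega` alone; the point is that routing it through the named lemma keeps the abstraction level uniform, which is exactly what you'd ask a model to do.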
jhanschoo 10 minutes ago [-]
There are areas of mathematics where the standard proofs are very interesting and require insight (often new statements, definitions, and theorems created for their sake), but the theorems and definitions themselves are banal. For an extreme example, consider Fermat's Last Theorem.

Note, on the other hand, that proving standard properties of many computer programs is frequently just tedious and should be automated.

storus 47 minutes ago [-]
There are still many major oversimplifications at the core of math, which make its correspondence with the real world weird. For example, if you want to model human reasoning, you need to step away from binary logic and its "weird" material implication, a neat shortcut that lets math be formalized but doesn't map well to actual reasoning. Then you might find out that, e.g., medicine uses counterfactuals instead of material implication. Logics that tried to make implication more "reasonable", like relevance logic, are too weak to allow the formalization of math. So you either treat material implication as correct (getting the incompleteness theorem in the end), which makes you sound autistic among other humans, or you can't really do rigorous math.
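The "weirdness" of material implication is easy to exhibit in a proof assistant. A small Lean 4 sketch of the vacuous-truth behavior (core tactics only):

```lean
-- A false antecedent makes any implication true: "ex falso quodlibet".
example (P : Prop) : False → P := fun h => h.elim

-- So from an absurd hypothesis, anything in linear arithmetic follows:
example (h : 2 + 2 = 5) : 1 = 2 := by omega
```

This is precisely the behavior that is convenient for formalization but alien to everyday reasoning, where "if the moon is cheese, then 1 = 2" is not something anyone would assert.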
YetAnotherNick 56 minutes ago [-]
The thing is, if something is proved by automatically checking a million different cases, it's hard to extract anything from the proof that can be reused in other proofs.
jl6 37 minutes ago [-]
Imagine a future where proofs are discovered autonomously and proved rigorously by machines, and the work of the human mathematician becomes to articulate the most compelling motivations, the clearest explanations, and the most useful maps between intuitions, theorems, and applications. Mathematicians as illuminators and bards of their craft.
tines 22 minutes ago [-]
But in this future, why will “the most compelling motivations, the clearest explanations, and the most useful maps between intuitions, theorems, and applications” be necessary? Catering to hobbyists?
ndriscoll 2 minutes ago [-]
Very far in the future when AI runs everything, of course math will be a hobby. In the nearer term, it seems apparent to me that people with stronger mental models of the world are able (without even trying!) to formulate better prompts and get better output from models.
layer8 18 minutes ago [-]
Mapping theorems to applications is certainly necessary for mathematics to be useful.
tines 12 minutes ago [-]
Sure, applications are necessary, but why will humans do that?
layer8 5 minutes ago [-]
I agree (https://news.ycombinator.com/item?id=47575890), but the parent assumes that AI will lack the ability.
layer8 20 minutes ago [-]
The question is whether the capabilities that would let AI take over the discovery part wouldn’t also let them take over the other parts.
johnbender 2 hours ago [-]
I’m confused by the calculus example, and I’m hoping someone here can clarify: why can’t one state the needed assumptions of a roughed-out theory as hypotheses that still need to be proven? That is, I’m curious whether the critical concern the article is highlighting is the requirement to “prove all assumptions before use”, or instead the idea that sometimes we can’t even identify the blind spots as assumptions in a theory before we use it.
pavpanchekha 1 hour ago [-]
In calculus the core issue is that the concept of a "function" was undefined but generally understood to be something like what we'd call today an "expression" in a programming language. So, for example, "x^2 + 1" was widely agreed to be a function, but "if x < 0 then x else 0" was controversial. What's nice about the "function as expression" idea is that, generally speaking, these functions are continuous, analytic [1], etc., and the set of such functions is closed under differentiation and integration [2]. There's a good chance that if you took AP Calculus you basically learned this definition.

The formal definition of "function" is totally different! This is typically a big confusion in Calculus 2 or 3! Today, a function is defined as literally any input→output mapping, and the "rule" by which this mapping is defined is irrelevant. This definition is much worse for basic calculus—most mappings are not continuous or differentiable. But it has benefits for more advanced calculus; the initial application was Fourier series. And it is generally much easier to formalize because it is "canonical" in a certain sense: it doesn't depend on questions like "which exact expressions are allowed".
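The two notions are easy to contrast in Lean, where both of the examples above are just definitions (a sketch using `Float` as a stand-in for the reals):

```lean
-- "Function as expression": a closed-form rule.
def f (x : Float) : Float := x * x + 1

-- The once-controversial piecewise case: a perfectly ordinary function
-- under the modern "any input-to-output mapping" definition.
def g (x : Float) : Float := if x < 0 then x else 0
```

Under the modern definition both are equally legitimate; the historical controversy only makes sense under the "function as expression" view.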

This is exactly what the article is complaining about. The non-rigorous intuition preferred for basic calculus and the non-rigorous intuition required for more advanced calculus are different. If you formalize, you'll end up with one rigorous definition, which necessarily will have to incorporate a lot of complexity required for advanced calculus but confusing to beginners.

Programming languages are like this too. Compare C and Python. Some things must be written in C, but most things can be more easily written in Python. If the whole development must be one language, the more basic code will suffer. In programming we fix this by developing software as assemblages of different programs written in different languages, but mechanisms for this kind of modularity in formal systems are still under-studied and, today, come with significant untrusted pieces or annoying boilerplate, so this solution isn't yet available.

[1] Later it was discovered that in fact not every such function is analytic, but that wasn't known for a long time.

[2] I am being imprecise; integrating and solving various differential equations often yields functions that are nice but aren't defined by combinations of named functions. The solution at the time was to name these new discovered functions.

coldcity_again 1 hour ago [-]
That's very helpful and clear, thank you
zitterbewegung 2 hours ago [-]
I think the future of Lean as a tool is mathematicians using this or similar software and having it create corresponding Lean code. [1] This is an LLM that outputs Lean code given a mathematical paper. It can also reason within Lean projects and enhance or fix Lean code.

[1] https://aristotle.harmonic.fun

j45 18 minutes ago [-]
Are digitized proofs the equivalent of the calculator, back when the calculator was new?
riverforest 2 hours ago [-]
Rigor is the whole point of math. The moment you start asking if there is too much of it you are solving a different problem.
woopwoop 2 hours ago [-]
Rigor is not the whole point of math. Understanding is. Rigor is a tool for producing understanding. For a further articulation of this point, see

https://arxiv.org/abs/math/9404236

1970-01-01 51 minutes ago [-]
This conflates rigor with proof. Proof is the outcome of the argument you are making; rigor is how carefully and correctly the argument is made. You can understand something without rigor, but you cannot prove it.
storus 1 hour ago [-]
Rigor is one solution to mutual understanding that Bourbaki came up with, and it in turn made math inaccessible to most humans: it now takes regular mathematicians over 40 years to get to the bleeding edge, often exceeding their brain's capacity to come up with revolutionary insights. It's as if math were forced to run on assembly language even though more high-level languages were available and more apt for the job.
cbdumas 59 minutes ago [-]
> It's like math was forced to run on assembly language despite there were more high-level languages available and more apt for the job.

I'm not a mathematician, but that doesn't sound right to me. Most math I did in school is composed of concepts many, many layers of abstraction away from its foundations. What did you mean by this?

storus 52 minutes ago [-]
My math classes were theorem, lemma, proof all day long; no conceptualization, no explanation; low-level formulas down to axioms. Sink or swim: figure it out on your own or fail.
meroes 2 hours ago [-]
If rigor is the whole point, why are we so focused on classical math (e.g. classical logic) and not the wider plurality?
gzread 1 hours ago [-]
It seems you have never tried to prove anything using a proof assistant. It will demand proofs for things like x<y && y<z => x<z, and while it should have that built in for natural numbers, woe betide thee who defines a new data type.
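A minimal Lean 4 sketch of exactly this pain (hypothetical wrapper type `MyNum`; `Nat.lt_trans` is in core):

```lean
-- For the built-in Nat, transitivity already exists:
example (x y z : Nat) (h1 : x < y) (h2 : y < z) : x < z :=
  Nat.lt_trans h1 h2

-- But define a new data type...
structure MyNum where
  val : Nat

instance : LT MyNum where
  lt a b := a.val < b.val

-- ...and nothing comes for free: even transitivity must be proved by
-- hand (here by delegating to the underlying Nat order).
theorem MyNum.lt_trans {a b c : MyNum} (h1 : a < b) (h2 : b < c) : a < c :=
  Nat.lt_trans h1 h2
```

Every such "obvious" fact about the new type is a proof obligation until you either prove it or derive it from existing structure.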
ux266478 2 hours ago [-]
Rigor was never vital to mathematics. ZFC was explicitly pushed as the foundation for mathematics because Type Theory was too rigorous and demanding. That mathematicians are now coming around to TT is a bit of funny irony lost on many. Now we just need to restore Logicism...