265
submitted 1 year ago by troyunrau@lemmy.ca to c/math@lemmy.world
[-] Aesthesiaphilia@kbin.social 35 points 1 year ago

I mean you've just translated from a language most people don't speak to a different language most people don't speak

[-] Zeth0s@lemmy.world 10 points 1 year ago* (last edited 1 year ago)

A simpler language many people know (math) to one of the infinite dialects of a language most people don't speak.

The left representation is definitely more readable and understood by more people

[-] beefcat@lemmy.world 0 points 1 year ago* (last edited 1 year ago)

I don't know about that, I know a lot of successful programmers who never took calculus.

The barrier to entry for programming is considerably lower today than it was even 15 years ago. Lots of kids, myself included back in the day, were learning basic control flow in languages like C, Python, or JavaScript long before taking advanced math courses in high school.

[-] Zeth0s@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

Where I grew up, summation at least is taught in every high school. The final exam in many high schools (mine included) must include at least some exercises on integrals, which are just infinitesimal sums.

If someone went to high school, there's a 90% chance they know these symbols. Very few of them can program.

Programming doesn't require math, but scientific computing, algorithms, and HPC do require an understanding of linear algebra, as computers "think" in linear algebra

[-] beefcat@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

It was never required in my school district, where the minimum requirement was Algebra 2.

But the popularity of this post kind of proves my point. There are a lot of programmers out there who readily understood the for loops on the right, but not the sigma notation on the left. Pretending their experience is invalid cuts us off from a potential avenue to help more people understand these concepts.

[-] maegul@lemmy.ml 21 points 1 year ago* (last edited 1 year ago)

Part of what's going on here is that math notation is ... not good. Not for understanding, readability, or explanation. Add in the prestige that surrounds being "good at math" and "being able to read that stuff", and you get an unhealthy amount of gatekeeping.

Whenever I've found someone break down a set of equations into computer code, it's been a wonderfully clarifying experience. And I think it goes beyond just being better at code or something. Computer code, more often, is less forgiving about what exactly is going on in the system. Math, IME, often leaves some ambiguity or makes some presumption in the style of "oh, of course you'd need to do that". Whereas if you're going to write a program, it all needs to be there, explicitly.

I recommend Bret Victor's stuff on this: Kill Math

[-] Zeth0s@lemmy.world 12 points 1 year ago* (last edited 1 year ago)

It's absolutely the opposite for me. Math language is extremely good at summarizing extremely complex logic in a few lines. We have huge ML projects with a lot of logic that can be summarized with either 10 lines of math or 100 lines of English with overwhelming cognitive complexity.

Math is the best language we have for logic.

This meme is the proof: the left representation is more concise and clearer than the for loop, and therefore makes it easy to represent much more complex logic, while for loops quickly become unreadable (map and reduce are, for instance, more readable)

[-] RagingNerdoholic@lemmy.ca 1 points 1 year ago

Concise, yes. Clearer, definitely not.

[-] kuzcospoison@lemmy.ml 8 points 1 year ago

It's funny: with the increase in use of numerical models, so much math has been turned into computer code. Derivatives and integrals, too, are defined by the finite difference formulas that serve as the basis for the notations. The point of the notation isn't to explain, it's to simplify writing and reading. I agree it can be a bit obtuse, but if you had to write out a for loop to solve a math equation every time, it would take forever lol
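As a hedged sketch of the finite-difference idea (the function and names here are illustrative, not from the comment): the limit definition of a derivative translates almost directly into code.

```python
def derivative(f, x, h=1e-6):
    # Central finite difference: (f(x + h) - f(x - h)) / (2h)
    # approximates f'(x), mirroring the limit definition of the derivative.
    return (f(x + h) - f(x - h)) / (2 * h)

# d/dx of x^2 at x = 3 should be close to 6.
approx = derivative(lambda x: x * x, 3.0)
```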

[-] maegul@lemmy.ml 1 points 1 year ago

Well this is where the computing perspective comes in.

Programming culture has generally learnt over time that the ability to read code is important and that the speed/convenience of writing ought to be traded off, to some extent, for readability. Opinions will vary from programmer to programmer and paradigm/language etc. But the idea is still there, even for a system whose purpose is to run on a computer and work.

In the case of mathematical notation, how much is math read for the purposes of learning and understanding? Quite a lot, I'd say. So why not write it out as a for loop for a text/book/paper that is going to be read by many people, potentially many times?!

If mathematicians etc need a quick short hand, I think human history has shown that short hands are easily invented when needed and that we ought not worry about such a thing ... it will come when needed.

[-] Zeth0s@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

Actually, programs are much less readable than the corresponding math representation, even in an example as simple as a for loop. Code is known to quickly add cognitive complexity, while math language manages to keep complexity understandable.

Have you tried reading how a matrix-matrix multiplication is implemented with for loops? Compare it with the mathematical representation to see what I mean.

The success of Fortran, Mathematica, R, NumPy, pandas, and even functional programming comes from their being built to make programming closer to the simplicity of math
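To make the matrix-multiplication comparison concrete, a hedged sketch (illustrative names, not from the thread): the triple loop spells out C_ij = sum_k A_ik * B_kj, while the NumPy one-liner mirrors the math notation C = AB.

```python
import numpy as np

def matmul_loops(A, B):
    # C[i][j] = sum over k of A[i][k] * B[k][j]
    n, p, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for k in range(p):
                C[i][j] += A[i][k] * B[k][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_loops = matmul_loops(A, B)
C_numpy = np.array(A) @ np.array(B)  # one symbol, same result
```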

[-] maegul@lemmy.ml 1 points 1 year ago

Well, I think there's a danger here of conflating abstraction with mathematical notation. Code, whether Fortran, C, or NumPy, is capable of abstraction just as mathematics is. Abstraction can help bring complexity under control. But what happens when you need to understand that complexity because you haven't learnt it yet?

Now sure, writing a program that will actually work and perform well adds an extra cognitive load. But I'm talking more about procedural pseudocode written for the purpose of explaining to those who don't already understand.

[-] Zeth0s@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

Math is the language developed exactly for that, to be an unambiguous, standard way to represent extremely complex, abstract concepts.

In the example above, both the summation and the for loop are simply

a_1 + a_2 + ... + a_n

Math is the language to explain; programming languages are for implementing it in a way that can be executed by computers. In a real-case scenario it is more often

sum(x)

or

x.sum()

as a for loop is less readable (and often unoptimized).
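For concreteness, the three forms in this sub-thread (sigma notation, a for loop, and a built-in sum) compute the same thing; a minimal sketch with made-up data:

```python
a = [2, 4, 6, 8]  # a_1 ... a_n

# Sigma notation sum_{i=1}^{n} a_i, unrolled as a for loop:
total_loop = 0
for a_i in a:
    total_loop += a_i

# The same thing as the more readable one-liner:
total_builtin = sum(a)
```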

If someone doesn't know math he can do the same as those who don't know programming: learn it.

The learning barrier of math is actually lower than that of programming

[-] kogasa@programming.dev 4 points 1 year ago

Using for loops instead of sigma notation would be almost universally awful for readability.

[-] troyunrau@lemmy.ca 5 points 1 year ago

I agree. Mathematical notation is often terribly opaque. And sometimes outright broken. Why the hell is it sin²(x)? Any reasonable programmer will tell you that this syntax will only lead to trouble. ;)

[-] Zeth0s@lemmy.world 3 points 1 year ago

What's wrong with sin^2(x)?

[-] kogasa@programming.dev 4 points 1 year ago

Putting an exponent on a function symbol like that usually means a typical exponential/power, except when it's -1, in which case it's a functional inverse. sin^(-1)(x) is the functional inverse of sin(x), which is not the same as the reciprocal (sin(x))^(-1). Some people even use sin^(a)(x), where a is an integer, to denote functional composition, so sin^(2)(x) = sin(sin(x)).

Besides that pretty major issue, nothing.
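Restating the ambiguity compactly (this summary is mine, not part of the original comment):

```latex
\sin^{2}(x) = (\sin x)^{2}       % usual convention: a power of the value
\sin^{-1}(x) = \arcsin(x)        % functional inverse, not the reciprocal
\sin^{-1}(x) \neq (\sin x)^{-1}  % the reciprocal is a different object
\sin^{2}(x) = \sin(\sin(x))      % rarer convention: iterated composition
```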

[-] Falmarri@lemmy.world 17 points 1 year ago
[-] troyunrau@lemmy.ca 3 points 1 year ago

Haha, touché. I just used the xpost function on the original though. I hold myself blameless.

[-] shotgun_crab@lemmy.world 4 points 1 year ago* (last edited 1 year ago)

You can edit it I think (not sure if it works for crossposts)

[-] kamen@lemmy.world 16 points 1 year ago* (last edited 1 year ago)

Yeah, cool, except that the first time you encounter these (probably in high school) you'd be a minority if you somehow already know programming.

Edit: and if you somehow already know programming, chances are you've encountered some math in the process.

[-] beefcat@lemmy.world 8 points 1 year ago

I learned basic programming skills around the time I was taking algebra in middle school. This was in the '00s.

For me, code was a lot easier to understand and going forward I would write programs that implemented the concepts I was learning in math classes in order to better comprehend them (and make my homework easier). I demonstrated enough aptitude here that I was allowed to take two years of AP Computer Science in high school despite lacking the math prerequisites.

I know a lot of programmers who think they are "bad at math" but really, they struggle with mathematical notation. I think a big reason for this disconnect is that mathematical notation prioritizes density, while modern programming languages and styles prioritize readability.

These different priorities make sense, since math historically needed to be fast to write in a limited amount of space. Mathematicians use a lot of old Greek symbols, and single-letter variable identifiers. The learning curve and cognitive load associated with these features is high, but once mastered you can quickly express your complex idea on a single chalkboard.

In programming, we don't need to fit everything on a chalkboard. Modern IDEs make wrangling verbose identifiers trivial. The programming languages themselves make use of plain English words rather than arcane Greek letters. This results in code that, when well written, can often be somewhat understood even by lay people.

[-] someguy3@lemmy.world 10 points 1 year ago

Maybe it's the order that you learn it in. For me the left side is the easy to read and understand one.

[-] nodimetotie@lemmy.world 1 points 1 year ago

I am with you

[-] hark@lemmy.world 6 points 1 year ago

Single-letter constant/variable names are strongly discouraged in programming but standard in math.

[-] StarManta@lemmy.world 6 points 1 year ago

Math standard practices were created at a time when everyone was doing them by hand. Absolutely no one would write out “coefficient of gravity” or whatever 20 times by hand while trying to solve a physics equation.

Single letter variable names were common in early programming for basically the same reason, only with typing.

Ever since the proliferation of autocomplete and IntelliSense in programming IDEs, typing a four-word-long variable name has become a few key letters and then hitting tab. Since then, code readability has trumped the desire to type out fewer letters.

[-] kogasa@programming.dev 2 points 1 year ago

Complicated math generally contains a lot more explicit definitions of the variables involved, either in English or with previously established notation. Writing proofs is more about communicating the result than it is proving it. In that sense it is similar to programming with an emphasis on maintainability.

[-] beefcat@lemmy.world 0 points 1 year ago

Sure, the variables have explicit definitions somewhere, but it still requires you to go back and reference them every time you forget what y stood for.

With more verbose identifiers like in code, you don't need these reminders. The cognitive load is reduced, because you no longer need to hold a table in your head that correlates these random letters with their definitions.

[-] kogasa@programming.dev 1 points 1 year ago

I assure you the cognitive load would not be reduced. It would just be less readable.

[-] MossBear@lemmy.world 5 points 1 year ago

I mean Freya Holmer is a pretty great teacher, so not surprising. I learned vector math watching her videos.

[-] thalamus@lemmy.world 4 points 1 year ago

One of my math teachers explained it exactly like this. ‘For the people who know how to program: this is the same as using a for loop’.

[-] galilette@mander.xyz 4 points 1 year ago

Math is a language; code is instruction. The language of math is geared toward efficiency and viability for abstractions a layer higher (ad infinitum). Once you are familiar with the language, the symbols take on a life of their own, and their manipulation becomes almost mechanical, which reduces the cognitive cost for your mind to operate at the current level of abstraction, so you can focus your mental power on the next level of abstraction and try to figure out something novel. You can of course unpack the compact language of math into the plainer -- in a sense more "flat" -- form of code instructions, but the purpose is different: it's more about implementing ideas than creating the ideas in the first place.

[-] beefcat@lemmy.world 4 points 1 year ago

I love this!

I struggled with higher math in high school until I started learning how to code. I was lucky and had math teachers that encouraged me to learn this way.

I would love to see a full calculus course that teaches you in code before teaching you the proper notation.

[-] reflex@kbin.social 4 points 1 year ago

Did my undergrad in math and never learned what that capital pi-looking thing was. Sigmas all the tyme doe.

[-] AlataOrange@lemmy.world 5 points 1 year ago

It's honestly not that useful; I've only ever seen it in high-level statistics

[-] mohKohn@kbin.social 1 points 1 year ago

They come up in complex analysis because writing polynomials in terms of their roots is sometimes useful.

[-] kogasa@programming.dev 1 points 1 year ago

Convergence issues aside, you can get from a product to a sum by taking logarithms. This is often a feasible way to reason about them / prove results about them.
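A small sketch of that log identity, log of a product equals the sum of the logs, for positive terms (illustrative code, not from the comment):

```python
import math

xs = [1.5, 2.0, 4.0]  # positive terms

# Direct product: prod_i x_i
product = 1.0
for x in xs:
    product *= x

# Via logarithms: exp(sum_i log x_i) recovers the same value,
# and the sum is often easier to reason about.
via_logs = math.exp(sum(math.log(x) for x in xs))
```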

[-] Hazdaz@lemmy.world 3 points 1 year ago

One of the worst things is a teacher who knows his material so well that he can't dumb it down enough to explain it to someone who literally has never seen that notation ever before.

A teacher without empathy is a terrible, awful thing that can turn students off so fast.

I went to get my degree later in life, and I would butt heads all the time with one particular math teacher who, admittedly, was extremely intelligent, but the entire class was lost because he just would not break from his predetermined notes and lesson plans. Everyone else in that class was 18, 19, or 20 years old and too naive or timid to voice their concerns. I was considerably older and paying dearly for these classes, so you better believe I refused to just let issues slide. I'm sure some teachers would think I was a nightmare student, but I wasn't trying to be disruptive - I was simply trying to learn, and this guy was just bad at it, with 3/4 of the class eventually dropping out.

[-] troyunrau@lemmy.ca 3 points 1 year ago

Meta: I love this thread. It gives me hope that Lemmy has the critical mass required already. I can imagine this discussion taking place in r/math, and there being many times more comments, but the substantial points are all hit here. :)

[-] nodimetotie@lemmy.world 2 points 1 year ago

tbh, I am not sure what's more scary, the LHS or the RHS

this post was submitted on 18 Jul 2023
265 points (91.8% liked)
