I think the author makes a good point about understanding structure over symbol manipulation, but there's a slippery slope here that bothers me.<p>In practice, I find it much more productive to start with a computational solution - write the algorithm, make it work, understand the procedure. Then, if there's elegant mathematical structure hiding in there, it reveals itself naturally. You optimize where it matters.<p>The problem is math purists will look at this approach and dismiss it as "inelegant" or "brute force" thinking. But that's backwards. A closed-form solution you've memorized but don't deeply understand is worse than an iterative algorithm you've built from scratch and can reason about clearly.<p>Most real problems have perfectly good computational solutions. The computational perspective often forces you to think through edge cases, termination conditions, and the actual mechanics of what's happening - which builds genuine intuition. The "elegant" closed-form solution often obscures that structure.<p>I'm not against finding mathematical elegance. I'm against the cultural bias that treats computation as second-class thinking. Start with what works. Optimize when the structure becomes obvious. That's how you actually solve problems.
<p><pre><code> Mathematics is not the study of numbers, but the relationships between them
- Henri Poincaré
</code></pre>
I want to stress this because I think you have too rigid a definition of math. Your talk about optimization sounds odd to me as someone who starts with math first. Optimization is done with a profiler. Sure, I'll also use math to find that solution, but I don't start with optimization, nor do I optimize by big O.<p>Elegance is not first. First is rough. Solving by math sounds much like what you describe. I find my structures, put them together, and find the interactions. Elegance comes after cleaning things up. It's towards the end of the process, not the beginning. We don't divine math, just as you don't divine code. I'm just not sure how you get elegance from the get-go.<p>So I find it weird that you criticize a math-first approach, because your description of a math approach doesn't feel all that accurate to me.<p>Edit: I do also want to mention that there's a correspondence between math and code. They aren't completely isomorphic, because math can do a lot more and can be much more arbitrarily constructed, but the correspondence is key to understanding how these techniques are not so different.
Some people like Peter Norvig prefer top-down, hackers like me and you prefer bottom-up. Many problems can be solved either way. But for some problems, if you use the wrong approach, you're gonna have a bad time. See Ron Jeffries' attempt to solve sudoku.<p>The top-down (mathematical) approach can also fail, in cases where there's not an existing math solution, or when a perfectly spherical cow isn't an adequate representation of reality. See Minix vs Linux, or OSI vs TCP/IP.
The hacker’s mentality is like that of the painter who spends months on a portrait in order to produce a beautiful but imperfect likeness, marked with his own personal style, which few can replicate and people pay a lot for. The mathematical approach is to take a photo because someone figured out how to perfectly reproduce images on paper over a hundred years ago and I just want a picture, dammit, but the camera’s manual is in Lojban.<p>IMO, the mathematical approach is essentially always better for software; nearly every problem that the industry didn’t inflict upon itself was solved by some egghead last century. But there is a kind of joy in creating pointless difficulties at enormous cost in order to experience the satisfaction of overcoming them without recourse to others, I suppose.
Fair point about problem-fit - some problems do naturally lend themselves to one approach over the other.<p>But I think the Sudoku example is less about top-down vs bottom-up and more about dogmatic adherence to abstractions (OOP in that case). Jeffries wasn't just using a 'hacker' approach - he was forcing everything through an OOP lens that fundamentally didn't fit the problem structure.<p>But yes, same issue can happen with the 'mathematical' approach - forcing "elegant" closed-form thinking onto problems that are inherently messy or iterative.
I'd argue that everyone solves problems bottom-up. It's just that some people have done the problem before (or a variant of it), so they have already constructed a top-down schema for it.
I really enjoyed the book Mathematica by David Bessis, who writes about his creative process as a mathematician. He makes a case that formal math is usually the last step to refine/optimize an idea, not the starting point as is often assumed. His point is to push against the cultural idea that math == symbols. Sounds similar to some of what you're describing.
I really didn't like that book. Its basic premise was that we should separate the idea of mathematics from the formalities of mathematics, and that we should aim to imagine mathematical problems visually. The later chapters then consist of an elephant drawing that isn't true to scale and an account of why David Bessis thought it would be best to create an AI startup; that just put the final nail in the coffin for me. There's some historical note here and there, but that's it - it really could've been a blog post.<p>Every single YouTube video from tom7[0] or 3blue1brown[1] does way more to transmit the fascinations of mathematics.<p>[0]: <a href="https://www.youtube.com/@tom7" rel="nofollow">https://www.youtube.com/@tom7</a><p>[1]: <a href="https://www.youtube.com/@3blue1brown" rel="nofollow">https://www.youtube.com/@3blue1brown</a>
Indeed, this is a fantastic book.<p>I could relate his description of the mathematical experience to what I feel happening in my head/brain when I program.
Amazing book. I love how he brings math into something tacit and internal.
I have math papers in top journals, and that's exactly how I did math:<p>Just get a proof of the open problem, no matter how sketchy. Then iterate and refine.<p>But people love to reinvent the wheel without caring about abstractions, resulting in languages like Python being the de facto standard for machine learning
Now there's engineering and math. Engineers use math to solve problems, and when writing programs you usually tinker with your data until the right math tool pops into your mind (e.g. first look at your data, then conclude that a normal distribution is the way to think about it). Basically, one uses existing math tools. In math it's more about proving something new, building new tools, I guess.<p>Sidenote: I code fluid dynamics stuff (I'm trained in computer science, not at all in physics). It's funny to see how the math and physics deeply affect the way I code (and not the other way around). Math and physics laws feel inescapable, and my code usually has to be extremely accurate to handle those laws correctly. When debugging that code, thinking math/physics first is usually the way to go, as it lets you narrow down the (code) bug more quickly. And if all fails, then usually it's back to the math/physics drawing board :-)
I completely agree. Start with what works, however rough, understand it a bit deeper, then develop better solutions. Trial and error, brute force, or inelegant approaches feel more natural to a practitioner. I think this aligns with George Pólya's book How to Solve It: <a href="https://en.wikipedia.org/wiki/How_to_Solve_It" rel="nofollow">https://en.wikipedia.org/wiki/How_to_Solve_It</a>
Brute force is more productive and builds better intuition: once you realize the pattern, the elegance will come.
3Blue1Brown has a great video which frames this as a cultural problem that also exists in mathematics pedagogy:<p><a href="https://www.youtube.com/watch?v=ltLUadnCyi0" rel="nofollow">https://www.youtube.com/watch?v=ltLUadnCyi0</a><p>Personally, I find a mix of all three approaches (programming, pen and paper, and "pure" mathematical structural thought) to be best.
Math isn't about memorizing closed-form solutions, but analyzing the behavior of mathematical objects.<p>That said, I mostly agree with you, and I thought I'd share an anecdote where a math result came from a premature implementation.<p>I was working on maximizing the minimum value of a set of functions f_i that depend on variables X. I.e., solve max_X min_i f_i(X).<p>The f_i were each cubic, so F(X) = min_i f_i(X) was piecewise cubic. X was dimension 3xN, N arbitrarily large. This is intractable to solve: F being non-smooth (its derivatives are discontinuous), you can't just throw it at Newton's method or gradient descent. Non-differentiable optimization was out of the question due to cost.<p>To solve this, I'd implemented an optimizer that moved one variable x at a time, such that F(x) was now a 1d piecewise cubic function that I could globally maximize with analytical methods.<p>This was a simple algorithm where I intersected the graphs of the f_i to figure out where they're minimal, then maximized the whole thing analytically, section by section.<p>In debugging this, something jumped out: the coefficients corresponding to the second and third derivatives were always zero. What the hell was wrong with my implementation?? Did I compute the coefficients wrong?<p>After a lot of head scratching and going back and forth with the code, I went back to the scratchpad, looked at these functions more closely, and realized they're cubic in all the variables together, but <i>linear in any given variable</i>. This should have been obvious, as each was a determinant of a matrix whose columns or rows depended linearly on the variables. Noticing this would have been first-year math curriculum.<p>This changed things radically, as I could now recast my maxmin problem as a Linear Program, which has very efficient numerical solvers (e.g. Dantzig's simplex algorithm). These give you the <i>global</i> optimum to machine precision, and are very fast on small problems. As a bonus, I could actually move three variables at once, not just one, as those were separate rows of the matrix. Or I could even move N at once, as those were separate columns. This could beat all the differentiable-optimization approaches that people had been applying to regularizations of F, on all counts (quality of the extrema and speed).<p>The end result is what I'd consider one of the few things in my PhD thesis that wasn't busy work: an actual novel result that brings something useful to the table. Whether this has been adopted at all is a different matter, but I'm satisfied with my result, which, in the end, is mathematical in nature. It still baffles me that no one had stumbled on this simple property despite the compute cycles wasted on solving this problem, which coincidentally is often cited as one of the main reasons the overarching field is still not as popular as it could be.<p>From this episode, I deduced two things. Firstly, the right a priori mathematical insight can save a lot of time otherwise spent designing misfit algorithms, then implementing and debugging them. I don't recall exactly, but this took me about two months or so, as I tried different approaches. Secondly, the right mathematical insight can be easy to miss. I had been blinded by the fact that no one had solved this problem before, so I assumed it must have had a hard solution.
Something as trivial as this was not even imaginable to me.<p>Now I try to be a little more careful and not jump into code right away when meeting a novel problem, and at least consider if there isn't a way it can be recast to a simpler problem. Recasting things to simpler or known problems is basically the essence of mathematics, isn't it?
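<p>For the curious, here's a minimal sketch of the LP recast with SciPy - not my actual research code, and the coefficients and bounds below are invented for illustration:<p><pre><code> import numpy as np
 from scipy.optimize import linprog

 # Each f_i(x) = a_i . x + b_i is linear in the block of variables we move.
 a = np.array([[1.0, -2.0], [-1.0, 0.5], [0.0, 1.0]])
 b = np.array([0.0, 1.0, -0.5])
 n = a.shape[1]

 # Decision vector z = (x, t). Maximize t subject to f_i(x) >= t,
 # i.e. t - a_i . x <= b_i. linprog minimizes, so the objective is -t.
 c = np.zeros(n + 1)
 c[-1] = -1.0
 A_ub = np.hstack([-a, np.ones((len(b), 1))])
 bounds = [(-10.0, 10.0)] * n + [(None, None)]  # box bounds keep the LP bounded

 res = linprog(c, A_ub=A_ub, b_ub=b, bounds=bounds)
 print("argmax:", res.x[:n], "max-min value:", res.x[-1])
</code></pre>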
I agree with the thrust of the article but my conclusion is slightly different.<p>In my experience the issue is sometimes that Step 1 doesn't even take place in a clear cut way. A lot of what I see is:<p><pre><code> 1. Design algorithms and data structures
2. Implement and test them
</code></pre>
Or even:<p><pre><code> 1. Program algorithms and data structures
2. Implement and test them
</code></pre>
Or even:<p><pre><code> 1. Implement
2. Test
</code></pre>
Or even:<p><pre><code> 1. Test
2. Implement
</code></pre>
:-(<p>IMO, this last popular approach gets things completely backwards. It assumes there is no need to think about the problem beforehand, to identify it, to spend any amount of time thinking about what needs to happen on a computer for that problem to be solved... you just write down some observable behaviors and begin reactively trying to implement them. Huge waste of time.<p>The point also about "C-style languages being more appealing" is well taken. It's not so much about the language in particular. If you are able to sit down and clearly articulate what you're trying to do, understand the design tradeoffs, which algorithms and data structures are available, which need to be invented... you could do it in assembly if it was necessary, it's just a matter of how much time and energy you're willing to spend. The goal becomes clear and you just go there.<p>I have an extensive mathematical background and find this training invaluable. On the other hand, I rarely need to go so far as carefully putting down theorems and definitions to understand what I'm doing. Most of this happens subliminally somewhere in my mind during the design phase. But there's no doubt that without this training I'd be much worse at my job.
Reminds me of the attempt to TDD one's way to a sudoku solver. Agreed that it is a bit of a crazy path.<p>Not that Implement/Test can't work. As frustrating as it is, "just do something" works far better than many alternatives. In particular, with enough places doing it, somebody may succeed.
> Or even: 1. Test 2. Implement
> IMO, this last popular approach gets things completely backwards. It assumes there is no need to think about the problem beforehand<p>I think you misunderstand this approach.<p>The point of writing the tests <i>is</i> to think about the desired behaviour of the system/module you are implementing, before your mind gets lost in all the complexities that necessarily appear during the implementation.<p>When you write code and hit a wall, it's super easy to get hyper-focused on solving that one problem, and while doing so lose the big picture.<p>Writing tests first can be a way to avoid this, by thinking of the tests as a specification you think you should adhere to later, without having to worry about <i>how</i> you get there.<p>For some problems, this works really well. For others, it might not. Just don't dismiss the idea completely :)
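<p>A minimal sketch of the order of operations, with a made-up slugify example:<p><pre><code> import re

 # Written first: the test is the specification of the behaviour I want,
 # before I worry about how to get there.
 def test_slugify():
     assert slugify("Hello, World!") == "hello-world"
     assert slugify("  spaces  ") == "spaces"

 # Written second: the implementation the specification demands.
 def slugify(s: str) -> str:
     return re.sub(r"[^a-z0-9]+", "-", s.strip().lower()).strip("-")

 test_slugify()
</code></pre>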
I'm being a little rhetorically over the top. Of course, sometimes it's what you have to do.<p>In fact, right now I'm doing exactly Test/Implement because I don't know how else to solve the problem. But this is a last resort. Only because the first few attempts failed and I <i>must</i> solve this problem have I resorted to grinding out individual cases. The issue is that I have my back against the wall and have to solve a problem I don't understand. But as I progress, eventually I <i>will</i> understand the problem, and then my many cases are going to get dramatically simplified or even rewritten.<p>But all the tests I've created along the way will stay...
I would submit that once you attain a certain level of experience, it becomes IDEAL to begin with implementation, since a mathematical analysis may be either trivial or impossibly non-trivial... Of course, if you're dealing in exchange rates and risk management, understand the math!
Are people not reading the article, or are they so primed into thinking that math is a certain way that the author's words are missed?<p><pre><code> > The natural language which has been effectively used for thinking about computation, for thousands of years, is mathematics. Most people don’t think of math as free or flexible. They think of scary symbols and memorizing steps to regurgitate on tests. Others hear math and think category theory, lambda calculus, or other methods of formalizing computation itself, but these are hardly necessary for programming itself.
</code></pre>
I very much agree with the author here. It's also just a fact that this was the primary language for centuries. We didn't have programming languages in the time of Newton, but we did have computation.<p><pre><code> > It’s not that programming languages aren’t good enough yet. It’s that no formal language could be good at it. Our brains just don’t think that way. When problems get hard, we draw diagrams and discuss them with collaborators.
</code></pre>
This is math. It's not always about symbols and numbers. It's about the relationships. It's not about elegance, even if that's the end goal. Math always starts very messy. But the results you see are usually polished and cleaned up.<p>I think if you carefully read the author, many of you might be surprised that you're using math as your frame of thinking. The symbols and rigor can help, but mathematical thinking is all about abstraction. It is an incredibly creative process. But I think sometimes we're so afraid of abstraction that we just call it by different names. Everything we do in math or programming is abstract. It's not like the code is real. There are different levels of abstraction and different types of abstraction, but all these things have their uses and advantages in different situations.
I already regret reading this article. Don't get me wrong, it's well written, and I agree with most of it. But every time I read an article like this I get stuck in analysis paralysis for any code I need to write afterwards and that's just not very productive.<p>Here's hoping my recognising the issue will soften the blow this time! Mayhaps this comment might save someone else from a similar fate
Don't know how far along your career you are, but as a youngin', the occasional thought piece like this that introduces interesting new ideas and challenges me to reevaluate how I approach things has proven to be quite formative in retrospect
But many computer applications are models of systems real or imagined. Those systems are not mathematical models. That everything is an “algorithm” is the mantra of programmers that haven’t been exposed to different types of software.
Math is just a language. We use math in physics a lot, so I find it a weird claim to say that math doesn't apply to the real world.
Real world systems are chaos. Math is the most successful method for conceptualizing them into a model we can reason about - such as in a computer.
I wish there was a better explanation of the exact problem they were trying to solve. I couldn't understand it - if I had, I would have proposed my own solution and then compared it against the proposed thinking process to validate whether that could have worked better for me, but I can't be bothered to follow a thinking process in symbols like this without even knowing what we are solving for.<p>Was it about how to design a profitable algorithm? Was it about how to design the bot? Was it about understanding whether results from the bot were beneficial?<p>If it's that, I would just backtest the algorithm to see the profit changes on real historical data.<p>> Definition: We say that the merchant rate is favorable iff the earnings are non-negative for most sets of typical purchases and sales. r'(t) is favorable iff e(P, S) >= 0.<p>If I understand the definition correctly, I would say this is likely even wrong, because you could have an algorithm that is slightly profitable 90% of the time, but the other 10% of the time it loses everything.<p>A correct solution to me is to simulate large numbers of trades on data as realistic as you can possibly get, and then consider the overall sum of the results, not the ratio of positive vs negative trades.
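<p>To make the 90/10 point concrete, a toy simulation (all numbers invented):<p><pre><code> import random

 random.seed(0)
 # Toy strategy: +1 unit 90% of the time, -15 units the other 10%.
 trades = [1.0 if random.random() < 0.9 else -15.0 for _ in range(100_000)]

 win_rate = sum(t > 0 for t in trades) / len(trades)
 print(f"win rate: {win_rate:.1%}, total earnings: {sum(trades):,.0f}")
 # ~90% of trades are individually favorable, yet the expectation per
 # trade is 0.9*1 + 0.1*(-15) = -0.6, so the total is deeply negative.
</code></pre>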
I'm an engineer, and I think there are definitely some pain points in translating math to code.<p>I've written some nasty numerical integration code (in C, using for loops), for example. I'm not proud of it, but it solved my issue. I remember at the time thinking surely there must be a better way for computers to solve integrals.
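<p>Something like this is presumably the better way I was groping for - a Python sketch rather than my actual C, and assuming SciPy is acceptable:<p><pre><code> import math
 from scipy.integrate import quad

 def f(x):
     return math.exp(-x * x)

 # Hand-rolled fixed-step trapezoid rule: the for-loop approach.
 a, b, n = 0.0, 2.0, 10_000
 h = (b - a) / n
 trap = h * ((f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n)))

 # Library adaptive quadrature: picks its own steps and reports an
 # error estimate for free.
 val, err = quad(f, a, b)
 print(trap, val, err)
</code></pre>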
I struggled with this originally, and it took years for it to click. But when it did, I became both a much better programmer and mathematician for it.<p>I think what helps is to take the time to sit down and practice going back and forth. Remember, math and code are interchangeable. All the computer can do is math. Take some code and translate it to math; take some math and translate it to code. There are easy things to see, like how variables are variables, but do you always see what the loops represent? Sums and products are easy, but there are also permutations, and sometimes they're there due to the lack of an operator. Like how loops can be matrix multiplication, dot products, or even integration.<p>I highly suggest working with a language like C or Fortran to begin with, and code that's more obviously math. But then move into things that aren't so obvious. Databases are a great example. When you get somewhat comfortable, try code that isn't obviously math.<p>The reason I wouldn't suggest a language like Python is that it abstracts too much. While it's my primary language now, it's harder to make that translation, because you have to understand what's happening underneath or be working with a different mathematical system, and in my experience not many engineers (or many outside math majors) are familiar with abstract algebra and beyond, so these formulations are more challenging at first.<p>For motivation, the benefit is that you can switch modes when a problem is easier to solve in a different context. It happens much more than you'd think. So you end up speaking something like Spanglish, or some other mixture of languages. I also find it beneficial that I can formulate ideas when out and about, without a computer to type code. I also find that my code can often be cleaner and more flexible, as it's clearer to me what I'm doing. So it helps a lot with debugging too.<p>Side note: with computers, don't forget about statistics and things like Monte Carlo integration. We have GPUs these days, and that massive parallelism can often make slower algorithms faster :). A lot of computational code wasn't written for the modern massively parallel environment we have today. Just some food for thought. You might find some fun solutions, but also be careful of rabbit holes lol
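<p>A tiny example of the translation I mean, with made-up data:<p><pre><code> import numpy as np

 x = np.array([1.0, 2.0, 3.0])
 w = np.array([0.5, -1.0, 2.0])

 # Loop form: what the machine does, one multiply-add at a time.
 acc = 0.0
 for i in range(len(x)):
     acc += w[i] * x[i]

 # Math form: the same loop recognized as the inner product <w, x>.
 assert np.isclose(acc, w @ x)

 # And the side note: Monte Carlo integration of x^2 over [0, 1] by
 # averaging random samples - embarrassingly parallel, GPU-friendly.
 u = np.random.default_rng(0).uniform(0.0, 1.0, 1_000_000)
 print((u ** 2).mean())  # ~1/3
</code></pre>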
That's still a chaotic composition of thoughts, not driven by any identified structure or symmetry of the situation.<p>Why is a program needed? What constraints lead to the existence of that need? Why didn't human interactions need a program, or thinking in math? Why do computers use 0s and 1s? You need to start there and systematically derive the other concepts, which are tightly linked and have a purpose driven by the pre-existing context.
> Programming languages are implementation tools for instructing machines, not thinking tools for expressing ideas.<p>I completely disagree with that assumption.<p>Any function call that proceeds to capture logic - e.g. data from real-life systems, drones, or robots in logistics - will often proceed in a logic chain. Sometimes they use a DSL, be it in Rails, but also older DSLs such as the Sierra game logic.<p>If you have a good programming language, it is basically like "thinking" in that language too. You can also see this in languages such as C, and the creation of git. Now, I don't think C is a particularly great language for higher abstractions, but the assumption that "only math is valid and any other instruction to a machine is pointless" is simply flat-out wrong. Both are perfectly valid and fine; they just usually operate on a different level. My brain is more comfortable with Ruby than with C, for instance. I'd rather have languages be like Ruby AND fast than have to adjust down towards C or assembly.<p>Also, the author neglects that you can bootstrap in language xyz to see if a specific idea is feasible. That's what happened with many languages.
Effort was made to write this article.
Deep insight in several statements.
"Think in math, write in code" is the possibly worst programming paradigm for most tasks. Math notations, conventions and concepts usually operate under the principles of minimum description lenght. Good programming actively fights that in favor of extensibility, readability, and generally caters to human nature, not maximum density of notation.<p>If you want to put this to test, try formulating a React component with autocomplete as a "math problem". Good luck.<p>(I studied maths, if anyone is questioning where my beliefs come from, that's because I actually used to think in maths while programming for a long time.)
<p><pre><code> > Math notations, conventions and concepts usually operate under the principles of minimum description length.
</code></pre>
I think what you're describing is just the fact that math is typically written by hand on paper or a board. That optimization is not due to math; it is due to the communication medium.<p>If we look at older code, we'll actually see a similar thing. People were limited in line length, so the exact same thing happened. That's why there are still conventions like the 80-character text width, though now they serve as a style guide rather than a hard rule. It also helps that we can autocomplete variables.<p>Your criticism is misplaced.
The people I know who „think in math“ don’t think in the syntax of the written notation, including myself.
If you're adding computational/problem-breakdown/heuristic steps on top of (or instead of) mathematical concepts, then you're doing the opposite of what the author proposes.<p>Scientific consensus in math is Occam's Razor, or the principle of parsimony. In algebra, topology, logic and many other domains, this means that rather than having many computational steps (or a "simple mental model") to arrive at an answer, you introduce a concept that captures a class of problems and use that. Very beneficial for dealing with purely mathematical problems, an absolute disaster for quick problem solving IMO.
> then you're doing the opposite of what the author proposes<p>No, it’s exactly what the author is writing about. Just check his example, it’s pretty clear what he means by “thinking in math”<p>> Scientific consensus in math is Occam's Razor, or the principle of parsimony. In algebra, topology, logic and many other domains, this means that rather than having many computational steps (or a "simple mental model") to arrive at an answer, you introduce a concept that captures a class of problems and use that.<p>I don’t even know what you mean by this.
I have a love-hate relationship with Math.<p>I love the logical aspect and the visualization aspect like writing down a formula and then visualizing/imagining a graph of all possible curves which that formula represents given all possible values of x or z. You can visualize things that you cannot draw or even render on a computer.<p>I also enjoy visualizing powers and logarithms. Math doesn't have to be abstract. To me, it feels concrete.<p>My problem with math is all to do with syntax, syntax reuse in different contexts and even the language of how mathematicians describe problems seems ambiguous to me... IMO, the way engineers describe problems is clearer.<p>Sometimes I feel like those who are good at math are kind of biased towards certain assumptions. Their bias makes it easier for them to fill in gaps in mathematical language and symbolism... But I would question whether this bias, this approach to thinking is actually a universally good thing in the grand scheme of things. Wouldn't math benefit from more neurodiversity?<p>I remember at school, I struggled in maths at some points because I could see multiple interpretations of certain statements and as the teacher kept going deeper, I felt like I had to execute a tree search algorithm in my mind to figure out what was the most plausible interpretation of the lesson. I did much better at university because I was learning from books and so I could pause and research every time I encountered an ambiguous statement. I went from failing school math to getting distinction at university level maths.
what compels software people to write opinion pieces. like you don't see bakers, mechanics, dentists, accountants writing blog posts like this...<p>Edit: to everyone responding that there are trade mags - yes SWE has those too (they're called developer conferences). In both categories, someone has to <i>invite</i> you to speak. I'm asking what compels Joe Shmoe SWE to pontificate on things they haven't been asked by anyone to pontificate on.
You're getting a lot of flak for this, but I think it is a legitimate question to ask. I have many different hobbies, and have worked in different industries, but software development / programming is sort of unique in how much people discuss it online.<p>My takes are:<p>1) There are a lot of IT workers in the world, and they're all online natives. So naturally they will discuss ideas, problems, etc. online. It is simply a huge community, compared to other professions.<p>2) Programming specifically is for many both a hobby and a profession. So being passionate about it compels many people to discuss it, just like others will do about their own hobbies.<p>3) Software is a very fast-moving area, and very diverse, so you will get many different takes on the same problems.<p>4) Posting is cheap. The second you've learned about something, like static vs dynamic typing, you can voice your opinion. And the opinions can range from beginners to CS experts, both with legit takes on the topic.<p>5) It is incredibly easy to reach out to other developers, with the various platforms and aggregators. In some fields it is almost impossible to connect / interact with other professionals in your field, unless you can get past the gatekeepers.<p>And the list goes on.
What an insane statement. What compels anyone to write an opinion piece? They have an opinion and want to share it! Why in god's name should someone have to be invited to share their opinion, on <i>their own</i> website no less.
> you don't see bakers, mechanics, dentists, accountants writing things like this...<p>There are literally industry publications full of these.
Accountants certainly do. They've had trade magazines with opinion pieces since well before the internet.
Mathematicians certainly write volumes of opinion pieces. The article you are complaining about starts from the presumption that software could benefit from more mathematical thinking, even if that doesn't explain broader general trends.<p>(But I think it does apply more generally. We refer to it as Computer Science, it is often a branch of Mathematics both historically and today with some Universities still considering it a part of their Math department. Some of the industry's biggest role models/luminaries often considered themselves mathematicians first or second, such as Turing, Church, Dijkstra, Knuth, and more.)
> Computer Science<p>do we really have to retread this? unless you are employed by a university to perform research (or another research organization), you are not a computer scientist or a mathematician or anything else of that sort. no more so than an accountant is an economist or a carpenter is an architect.<p>> The article you are complaining about starts from the presumption that software<p>reread my comment - at no point did i complain about the article. i'm complaining that SWEs have overinflated senses of self which compel them to write such articles.
<p><pre><code> > what compels software people to write opinion pieces. like you don't see bakers, mechanics, dentists, accountants writing blog posts like this...
>> Computer Science
> do we really have to retread this? unless you are employed by a university to perform research (or another research organization), you are not a computer scientist or a mathematician or anything else of that sort. no more so than an accountant is an economist or a carpenter is an architect.
>> The article you are complaining about starts from the presumption that software
> reread my comment - at no point did i complain about the article. i'm complaining that SWEs have overinflated senses of self which compel them to write such articles.
</code></pre>
You're an incredibly tiring commenter, to the point that I already recognize your nickname. What compels YOU to be this way?<p>I wish there was a block button for the "overinflated senses of self".
I observe this compulsion a lot, and in my opinion it's almost always coming from resentment driving ego in an attempt to compensate for insecurity and self-loathing, which ultimately ends up misdirected towards others. It's almost always accidentally entertaining, but sadly ends up diminishing rather than elevating discourse.<p>To refute GP's point more broadly -- there is a lot in /applied/ computer science (which is what I think the harder aspects of software engineering really are) that was and is done by individuals in software just building in a vacuum, with open source holding tons of examples.<p>And to answer GP's somewhat rhetorical question more directly: none of those professions are paid to do open-ended knowledge work, so the analogy is extremely strained. You don't necessarily see them post on blogs (as opposed to LinkedIn/X, for example), but investors, management consultants, lawyers, traders, and corporate executives all write a ton of this kind of long-form, blog-post-flavored content all the time. And I think it comes from the same place -- they're paid to do open-ended knowledge work of some kind, and that leads people to write to reflect on what they think seems to work and what doesn't.<p>Some of it is interesting, some of it is pretty banal (for what it's worth, I don't really disagree that this blog post is uninteresting), but I find it odd to throw out the entire category even if a lot of it is noise.
what compels me to be which way? to call out obnoxious things in the people i'm surrounded by? dunno probably my natural revulsion to noxious things?
> do we really have to retread this? unless you are employed by a university to perform research (or another research organization), you are not a computer scientist or a mathematician or anything else of that sort. no more so than an accountant is an economist or a carpenter is an architect.<p>Doing math or science is the criterion for being a mathematician or scientist, not who employs you how.
Those who do not learn history are doomed to repeat it (poorly). The same goes for anyone doing software who entirely ignores Computer Science. You are missing core skills and reinventing well-known wheels when you could be busy building smarter things.<p>> no more so than an accountant is an economist or a carpenter is an architect<p>I know many accountants who would claim you can't be a good accountant without being an economist. Arguably that's most of the course load of an MBA in a nutshell. I don't know a carpenter who would claim to be an architect - carpentry usually happens <i>after</i> the architecture has been done - but I know plenty of carpenters who claim to be artists and/or artisans (depending on how you see the difference), who take pride in their craft and understand the aesthetic underpinnings.<p>> reread my comment - at no point did i complain about the article. i'm complaining that SWEs have overinflated senses of self which compel them to write such articles.<p>You chose which article to post your complaint on. The context of your comment is most directly complaining about this specific article. That's how HN works. If you didn't read the article and feel like just generically complaining about the "over-inflated senses of self" in the software industry, perhaps you should be reading some forum that isn't HN?
bakers and mechanics have not had their ego stroked by being overpaid for a decade.
>I'm asking what compels Joe Shmoe SWE to pontificate on things they haven't been asked by anyone to pontificate on.<p>The Internet is <i>absolutely full</i> of this. This is purely your own bias here, for any of the trades you mentioned try looking. You will find Videos, podcast and blogs within minutes.<p>People love talking about their work, no matter their trade. They love giving their opinions.
If they'd had opinion pages at the time, the inventors of nixtamalization would have and should have written something like this.
Bakers certainly write books and magazines[0] on baking, as well as interminable stories about their childhood. Mechanics: [1]. I could only find one obvious one for dentists: [2]. Somebody else did accountants in the thread. I think it's a human thing to want to share our opinions, whether or not they are well supported by evidence. I suspect software people write blogs because the tech is easier for them, given their day job.<p>[0] <a href="https://www.google.com/search?q=baking+magazine" rel="nofollow">https://www.google.com/search?q=baking+magazine</a>
[1] <a href="https://www.google.com/search?q=mechanics+magazines" rel="nofollow">https://www.google.com/search?q=mechanics+magazines</a>
[2] <a href="https://dentistry.co.uk/dentistry-magazine-january-2023-digital-edition/" rel="nofollow">https://dentistry.co.uk/dentistry-magazine-january-2023-digi...</a>
lol, your original comment is you pontificating
I mean, maybe if your background is mathematics this would make sense. But for a lot of us it isn't, we're more linguistically oriented and we certainly are not going to come up with some pure mathematical formula that describes a problem, but we might describe the problem and break it down into steps and then implement those steps.