25+ yr Java/JS dev
Linux novice - running Ubuntu (no windows/mac)

  • 0 Posts
  • 603 Comments
Joined 9 months ago
Cake day: October 14th, 2024





  • I have a real passion for software development. There are raw skills, like algorithms, that lots of other people are better at, but when it comes to API design, clean code, and enacting principles of good architecture and UX, I think I am uncommonly good. Most of the people I work with don’t care, and frankly I think caring and practice are all it takes, but somehow that’s not something most people in the industry are capable of.

    But I love making software that people like or at least hate less than what they had before.



  • It sounds like you are a much better developer than me, but to be fair I’ve had to teach myself everything using nothing but books and Google for thirty years. I’ve rarely had the luxury of working with someone who had the knowledge to mentor me, and never got a degree outside an AAS in electronics, so I’ve probably missed some critical skills along the way.

    In a lot of ways, the AI fills that role, because it’s better at answering questions than it is at writing code. Earlier today it was explaining to me how a DOM selector could return a stale element in some cases in a failing end-to-end test (roughly the scenario in the first sketch at the end of this comment). It took a few back-and-forths with some code examples before I really understood why the selectors might not be working.

    It also suggested some code changes that I had to push back on because, even though the code had errors, those errors weren’t what was causing the problem. While building an array of validators I had awaited each one, so they ran serially instead of in parallel under Promise.all() (the second sketch at the end of this comment shows the shape of that bug). So you definitely have to know what you’re doing to avoid having the AI waste your time (or at least more time than it takes to push back).

    I’m still trying to debug it, but without the AI, I’d be googling the fuck out of TypeScript syntax, JavaScript idiosyncrasies, and a whole testing framework I’ve never seen before.

    So…

    if the only real value that AI provides is “you don’t need to know the libraries you’re using”

    …returns false.
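
    Here’s a rough sketch of the stale-element scenario from above. The comment doesn’t name the test framework, so this assumes selenium-webdriver purely for illustration; the URL and selectors are made up.

    ```typescript
    // Hypothetical sketch: a previously located element can go stale after a re-render.
    // selenium-webdriver is an assumption; the original comment doesn't name the framework.
    import { Builder, By } from 'selenium-webdriver';

    async function staleElementDemo(): Promise<void> {
      const driver = await new Builder().forBrowser('chrome').build();
      try {
        await driver.get('https://example.test/search'); // made-up URL

        // Grab a concrete reference to whatever currently matches the selector.
        const firstResult = await driver.findElement(By.css('.result-row'));

        // Anything that makes the app re-render the list replaces that node in the DOM...
        await driver.findElement(By.css('#filter')).sendKeys('typescript');

        // ...so the old reference may now point at a detached node. The selector still
        // matches *something* on the page, but using the stale handle can throw a
        // StaleElementReferenceError.
        await firstResult.getText();
      } finally {
        await driver.quit();
      }
    }
    ```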
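
    And here’s a minimal sketch of the validator bug, with made-up names and timings: awaiting each validator while building the array means every one of them has already finished before Promise.all() ever sees the results.

    ```typescript
    // Minimal sketch of the bug described above; validator names and timings are made up.
    type Validator = (input: string) => Promise<boolean>;

    // A stand-in validator that takes ~100 ms, so the timing difference is visible.
    const slowValidator: Validator = (input) =>
      new Promise<boolean>((resolve) => setTimeout(() => resolve(input.length > 0), 100));

    const validators: Validator[] = [slowValidator, slowValidator, slowValidator];

    // Buggy: each await finishes before the next validator starts (~300 ms total),
    // so by the time Promise.all() runs there is no work left to parallelize.
    async function runSerially(input: string): Promise<boolean[]> {
      const results: boolean[] = [];
      for (const validate of validators) {
        results.push(await validate(input));
      }
      return Promise.all(results);
    }

    // Intended: kick off every validator first, then let Promise.all() await them
    // together so they overlap (~100 ms total).
    async function runInParallel(input: string): Promise<boolean[]> {
      const pending = validators.map((validate) => validate(input));
      return Promise.all(pending);
    }
    ```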


  • He’s 100% right, and only a little less professional than I think the situation deserved. He was a little too focused on the personal rather than on the commit and the wrongheadedness of the email itself. Anyone could submit a bad patch.

    Was similarly harsh invective sent to whoever approved the PR in the first place? I’d bet so.




  • “The guideline that I follow is that if a tool doesn’t enforce a rule before the code can be merged, then it’s not a rule.”

    Good guideline. One of the first things I did when starting a side project with a friend was to figure out how to run a code formatter triggered by git (a hook, I think) that reformatted the code into a common style; there’s a sketch of one way to do that at the end of this comment. If we didn’t like it, we could apply our own styling when we pulled.

    I don’t always love every line of automatic styling, even when I have free rein over the rules, but it saved so much time and effort in code reviews.
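
    For the curious, here’s a sketch of one common way to wire that up, assuming Prettier and a plain pre-commit hook; the original setup isn’t described above, so treat the details as illustrative. A git hook is just an executable file, so it’s written here as a ts-node script to stay in the same language as the other examples, though a short shell script is more typical.

    ```typescript
    #!/usr/bin/env ts-node
    // Hypothetical .git/hooks/pre-commit. Assumes ts-node and Prettier are installed;
    // in practice this is often a short shell script or a tool like husky + lint-staged.
    import { execSync } from 'child_process';

    // Find staged files that Prettier should handle.
    const staged = execSync('git diff --cached --name-only --diff-filter=ACM', { encoding: 'utf8' })
      .split('\n')
      .filter((file) => /\.(js|jsx|ts|tsx|json|css)$/.test(file));

    if (staged.length > 0) {
      // Reformat in place, then re-stage so the commit contains the formatted code.
      execSync(`npx prettier --write ${staged.join(' ')}`, { stdio: 'inherit' });
      execSync(`git add ${staged.join(' ')}`);
    }
    ```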









  • Yeah. I’m with you there. We don’t display the proper amount of anxiety, coming across as either too detached or too overdramatic, and suddenly they’re laser-focused on us.

    “Why did you google how long it takes a person to asphyxiate?”

    “I watched a movie where a guy holds his breath and got curious as to whether it was bullshit or not.”

    “Why is there a sword in your online cart?”

    “It was aspirational. Swords are expensive and I don’t know if I’ll get enjoyment commensurate to the cost.”

    “You like big words, don’t you? You think you’re pretty smart, eh? You think you’re smarter than me?”

    “W—well… I mean… I don’t have enough evid—”

    Nightstick to the face. “Stop resisting arrest!”


    My point was more about unreliable narration than the interaction between gut reactions and neurodivergence. That’s a legitimate concern. One hopes that the non-gut-reaction part of the process vindicates us.


  • Intuition matters; it’s part of how people make sense of things, and I’d expect investigators to use it to focus their attention. But when cops talk about ‘just knowing’ someone was guilty, that’s not a reliable narrative of how the case actually unfolded. It’s more about self-mythologizing: building a story where they zeroed in on the suspect through instinct alone. That kind of framing works well in interviews and promotion boards, but it oversimplifies what real investigation (ideally) looks like.

    There are, of course, counter-examples. But those are usually more the subject of documentaries about injustice in the justice system.