  • Also, having no work-life balance is different if you own a significant fraction of the company vs. if you’re on salary.

    Like, if Jensen Huang spends 12 hours over the weekend working on something for nVidia and increases the share price by 0.01% (with a $4.165 trillion market cap, that means it goes up $416 million), his personal net worth goes up by about $14.7m, because he owns roughly 3.5% of the company. Not bad for a little weekend work.

    Let’s assume that someone on salary makes something absurd like $1m per year and is paid five times their normal rate for overtime. Their 12 hours of weekend work nets them about $28k. That’s certainly nice, but it’s roughly 1/500th of what Huang gets. And your average engineer probably doesn’t get overtime at all; if they did, it would be closer to $3k, not $30k.
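
    To make the comparison concrete, here’s the arithmetic as a quick sketch (the ~3.5% stake and 2,080 work hours per year are assumptions that reproduce the figures above):

    ```cpp
    #include <cstdio>

    int main() {
        // Figures from the comment: $4.165T market cap, a 0.01% bump.
        // The ~3.5% ownership stake is an assumption that reproduces
        // the quoted $14.7m figure.
        const double market_cap = 4.165e12;
        const double bump       = 0.0001;  // 0.01%
        const double stake      = 0.035;   // ~3.5% of the company

        const double owner_gain = market_cap * bump * stake;

        // Salaried case: $1m/year over ~2080 work hours, paid 5x for overtime.
        const double hourly        = 1e6 / 2080.0;
        const double salaried_gain = hourly * 12 * 5;

        std::printf("owner:    $%.1fm\n", owner_gain / 1e6);           // ~$14.6m
        std::printf("salaried: $%.1fk\n", salaried_gain / 1e3);        // ~$28.8k
        std::printf("ratio:    %.0fx\n", owner_gain / salaried_gain);  // ~506x
    }
    ```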

    If someone who owns a business wants to have a bad work-life balance, that’s one thing. But, it should never be expected of anybody who’s just on salary.

  • “C++ compilers also warn you…”

    Ok, quick question here for people who work in C++ with other people (not personal projects). How many warnings does the code produce when it’s compiled?

    I wrote a little C++ decades ago, and since then I’ve worked alongside devs on C++ projects. I’ve never seen a codebase that didn’t produce hundreds, if not thousands, of lines of warnings when compiling.
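
    For what it’s worth, the usual fix is to make warnings impossible to ignore by turning them into errors. A minimal sketch of the kind of code -Wall -Wextra flags, plus the standard GCC/Clang invocations (warn.cpp is a made-up example):

    ```cpp
    // warn.cpp: a few constructs that -Wall -Wextra will flag.
    #include <cstdio>

    int f(int x) {
        int unused = 42;   // -Wunused-variable
        if (x = 1)         // -Wparentheses: assignment used as a condition
            std::puts("one");
    }                      // -Wreturn-type: control reaches end of non-void function

    // Typical invocations:
    //   g++ -c -Wall -Wextra warn.cpp           # report the warnings
    //   g++ -c -Wall -Wextra -Werror warn.cpp   # refuse to build if any appear
    ```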

  • I’ve always thought movies like Terminator where the AI becomes sentient and takes over are complete BS. We still don’t understand sentience, and we have no hope of making a sentient computer, at least in my lifetime.

    But, what’s scary is that you don’t even need a sentient AI. You just need people to hand the keys over to “spicy autocomplete”. Spicy autocomplete has been trained on the scripts of Terminator, 2001: A Space Odyssey, WarGames, etc. It has no desire to take over the world; it has no desires at all, because it’s not conscious. But it knows how to generate text as though it were playing a part in one of those movies. If you actually hook it up to a computer that can launch missiles, it’s perfectly able to pattern-match its inputs to the expected outputs from those movies and, as a result, play the role of the evil AI that takes over and wipes out humanity.

    I can’t even believe that people are willing to let LLMs hit the “commit” button for them. But someone being willing to give a bullshit generator admin access to a database? That’s just perfect.

  • I get the hesitation, the worry that comments can turn into lies, but if that happens it’s a sign you’re doing things wrong. It tends to happen to comments that are far from the relevant code, like the documentation at the top of a 100-line function: the function can change while the comment is no longer visible on the screen, so it’s easy to forget to fix the comment too.

    But test strings like that are designed to avoid exactly that problem. They sit right next to the test for a reason: whenever you change the test, the description is right there on the screen in front of you.
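
    As a sketch of what that looks like in practice, here’s a test whose descriptive string lives on the same line as the assertions it describes (assuming Catch2 as the framework; parse_port is a hypothetical function under test):

    ```cpp
    #include <catch2/catch_test_macros.hpp>
    #include <optional>
    #include <string>

    // Hypothetical function under test.
    std::optional<int> parse_port(const std::string& s) {
        try {
            int p = std::stoi(s);
            if (p < 1 || p > 65535) return std::nullopt;
            return p;
        } catch (...) {
            return std::nullopt;
        }
    }

    // The descriptive string is part of the test itself, so anyone editing
    // the assertions has the description on screen and can update it in the
    // same change.
    TEST_CASE("parse_port rejects values outside 1-65535", "[config]") {
        REQUIRE_FALSE(parse_port("0").has_value());
        REQUIRE_FALSE(parse_port("70000").has_value());
        REQUIRE(parse_port("8080").value() == 8080);
    }
    ```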

    Fundamentally, this is something that has to be addressed with code reviews. If someone can commit changes to a shared repository without anybody else seeing them, you’re going to get stuff like this. Once you have decent code reviews, you can simply reject a change that adds tests without documentation, the same way you’d reject a change that leaves a test’s documentation out of date.