ennemi [he/him]

  • 0 Posts
  • 17 Comments
Joined 2 years ago
Cake day: December 29th, 2022



  • I’d still recommend getting an AMD graphics card and generally prioritizing hardware with upstream drivers. As in, drivers that are included with the kernel itself. The experience is always better. Overall it’s still a good habit to look up how any hardware runs on Linux before buying it.

    Gaming in a Windows VM is possible, but it was a big ordeal when I did it. You have to make sure your CPU and motherboard support an IOMMU for PCI passthrough. It’s less of a problem nowadays, but there are still some pitfalls with PCIe lanes and whatnot. You need two video adapters, one for the host and one for the guest (because the host has no access to the passed-through GPU), and if you want to game on both Windows and Linux, that can be a pain in the ass. It goes on. I personally don’t recommend it. If you have to play trashy eSports titles that ship with built-in anti-cheat malware, then just use Windows for that.
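    To make the IOMMU requirement concrete, here is a minimal sketch (in Go, since that fits the rest of this feed) of the usual first sanity check before attempting passthrough: listing the IOMMU groups the kernel has created under sysfs. It assumes the standard Linux sysfs layout and that the IOMMU is enabled in firmware and on the kernel command line (intel_iommu=on / amd_iommu=on); on a machine without those, it just reports that passthrough won't work.

    ```go
    package main

    import (
    	"fmt"
    	"path/filepath"
    )

    func main() {
    	// Each PCI device the kernel can isolate appears under
    	// /sys/kernel/iommu_groups/<group>/devices/<pci-address>.
    	devices, _ := filepath.Glob("/sys/kernel/iommu_groups/*/devices/*")
    	if len(devices) == 0 {
    		fmt.Println("no IOMMU groups found - PCI passthrough will not work")
    		return
    	}
    	for _, dev := range devices {
    		// Walk two directories up to recover the group number.
    		group := filepath.Base(filepath.Dir(filepath.Dir(dev)))
    		fmt.Printf("IOMMU group %s: %s\n", group, filepath.Base(dev))
    	}
    }
    ```

    For passthrough to be clean, the GPU (and its audio function) should sit in an IOMMU group of its own; devices that share a group must all be passed through together.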


  • ennemi [he/him]@hexbear.net to Programmer Humor@programming.dev · Golang be like
    11 months ago

    You can, if you want, opt into warnings causing your build to fail. This is commonly done in larger projects. If your merge request builds with warnings, it does not get merged.

    In other words, it’s not a bad idea to want to flag unused variables and prevent them from ending up in source control. It’s a bad idea for the compiler to also pretend it’s a linter, and for this behaviour to be forced on, which ironically breaks the Unix philosophy principle of doing one thing and doing it well.

    Mind you, this is an extremely minor pain point, but frankly it’s like most Go design choices: the idea isn’t bad, but there exists a much better way to solve the problem.
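    For anyone unfamiliar with the behaviour being complained about, a minimal sketch (the function names are made up): in Go, an unused local variable is a hard compile error, not a warning, and the sanctioned escape hatch during development is assigning to the blank identifier.

    ```go
    package main

    import "fmt"

    func expensiveDebugStats() string { return "42 allocations" }

    func main() {
    	stats := expensiveDebugStats()
    	// Deleting the next line makes this program fail to build with
    	// "declared and not used: stats" - an error, not a warning.
    	// gcc/clang would merely warn here unless you opt in via -Werror.
    	_ = stats // blank-identifier assignment silences the error
    	fmt.Println("build succeeds")
    }
    ```

    The blank-identifier workaround tends to linger in committed code, which is exactly the kind of thing a separate, configurable linter would catch.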


  • ennemi [he/him]@hexbear.net to Programmer Humor@programming.dev · Golang be like
    11 months ago

    The language was designed to be as simple as possible, so as not to confuse the developers at Google. I know this sounds like something I made up in bad faith, but it’s really not.

    "The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt." – Rob Pike

    "It must be familiar, roughly C-like. Programmers working at Google are early in their careers and are most familiar with procedural languages, particularly from the C family. The need to get programmers productive quickly in a new language means that the language cannot be too radical. – Rob Pike

    The infamous if err != nil blocks are a consequence of building the language around tuples (as opposed to, say, sum types like in Rust) and treating errors as values like in C. Rob Pike attempts to explain why it’s not a big deal here.
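    For illustration, a minimal sketch of the pattern (parsePort is a hypothetical function, not from any real codebase): every fallible call returns a (value, error) pair, and the caller has to check err explicitly before touching the value, at every step.

    ```go
    package main

    import (
    	"fmt"
    	"strconv"
    )

    // parsePort returns (value, error): the error must be checked by the
    // caller, and %w wraps the underlying strconv error for context.
    func parsePort(s string) (int, error) {
    	n, err := strconv.Atoi(s)
    	if err != nil {
    		return 0, fmt.Errorf("invalid port %q: %w", s, err)
    	}
    	if n < 1 || n > 65535 {
    		return 0, fmt.Errorf("port %d out of range", n)
    	}
    	return n, nil
    }

    func main() {
    	for _, arg := range []string{"8080", "not-a-number"} {
    		p, err := parsePort(arg)
    		if err != nil { // the infamous block, repeated at every call site
    			fmt.Println("error:", err)
    			continue
    		}
    		fmt.Println("port:", p)
    	}
    }
    ```

    With sum types, the compiler forces you to handle the error case before you can reach the value at all; with Go’s tuples, nothing stops you from ignoring err and using a zero value.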