• helopigs@lemmy.world · 12 hours ago

    I think 10x is a reasonable long-term goal, given continued improvements in models, agentic systems, tooling, and proper use of them.

    It’s already close for some use cases; for example, understanding a new code base with the help of the Cursor agent is kind of insane.

    We’ve only had these tools for a few years, and I expect software development will be unrecognizable in ten more.

    • ZILtoid1991@lemmy.world · 10 hours ago

      It also depends on the use case. It can probably help you throw web pages together from scratch, but it falls apart once you need it to generate code for lesser-discussed things. Someone once tried to solve an OpenGL issue I had with ChatGPT: first it suggested I use SDL2 or GLFW instead, then it spat out barely working code that was the same as mine, and still wrong.
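
      For context, this is roughly the kind of generic GLFW window-and-context boilerplate a chatbot tends to reach for in situations like this. It is a hypothetical sketch in C, not the commenter’s code or the model’s actual output, and it shows why such a suggestion sidesteps whatever rendering bug is actually being debugged:

      ```c
      // Hypothetical sketch: generic "just use GLFW" boilerplate.
      // It opens a window and an OpenGL context, but says nothing about
      // the specific rendering problem being debugged.
      #include <GLFW/glfw3.h>
      #include <stdio.h>

      int main(void) {
          if (!glfwInit()) {
              fprintf(stderr, "glfwInit failed\n");
              return 1;
          }
          GLFWwindow *window = glfwCreateWindow(640, 480, "Demo", NULL, NULL);
          if (!window) {
              glfwTerminate();
              return 1;
          }
          glfwMakeContextCurrent(window);

          while (!glfwWindowShouldClose(window)) {
              glClear(GL_COLOR_BUFFER_BIT);   // clear the default framebuffer
              glfwSwapBuffers(window);
              glfwPollEvents();
          }

          glfwDestroyWindow(window);
          glfwTerminate();
          return 0;
      }
      ```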

      A lot of it instead (from what I’ve heard from industry connections) is that employees are being pushed so hard to use AI, under threat of firing, that they spend most of their tokens amusing themselves with stuff like rewriting the documentation in a pirate style or Old English. And at the very worst, they’re now working constant overtime, because people were fired, contracts were not extended, etc.

    • dependencyinjection@discuss.tchncs.de · 11 hours ago (edited)

      It’s made me a 10x developer.

      I’m someone who transitioned from Junior to Dev as we embraced LLMs. Our company saved so much time that we all got a pay rise, with a reduction in hours to boot.

      Sick of all this anti-LLM rhetoric when it’s a tool to aid you. People out here think we just ask ChatGPT and copy and paste, which isn’t the case at all.

      It helps you understand topics much quicker, can review code, read documentation, etc.

      My boss is the smartest person I’ve ever met in my life and has an insane CV in the dev and open-source world. If he’s happy to integrate it into our work, then I’m fine with it. After all, we run a highly successful business with many high-profile clients.

      Edit: love the downvotes that don’t explain themselves. Like I’m not earning more money for working fewer hours while productivity has increased. Feel like many of the LLM haters don’t even work in the bloody industry. 😂

      • burlemarx@lemmygrad.ml · 3 hours ago (edited)

        I am not anti-AI or anything like it, and I use AI on a daily basis. If you work in a domain with plenty of existing code or documentation, AI acts like a very efficient search tool. It does not replace traditional documentation or Stack Overflow, but it significantly cuts the time I spend searching for specific syntax, an example of how to use a library, or how to use a specific feature or parameter of a library. Occasionally it gives me bad advice as well, such as suggesting something that hurts performance or security, but then I can check the actual documentation and code to see the details.

        For code reviews, I think it’s only partially useful: sometimes it spits out something helpful, but most of the time it produces bad or irrelevant advice that ends up polluting the review screen for the actual human devs trying to review the code.

        However, even with all the gains, which are kind of a mixed bag, I think it’s very unlikely AI will increase speed 10-fold. At best it’s more like a 25% improvement, and only at specific points in the project lifecycle; most of the gains come when you are generating boilerplate code or adding non-business-specific functionality. Most of the time I had to maintain existing code, debug existing functionality, or fix security flaws, AI didn’t help me at all.