It’s incredible that people can feed up to one million tokens to LLMs and yet, most of the time, they still fail to take advantage of that enormous context window. No wonder people say that the output generated by LLMs is always crap… I mean, they’re not great, but they can manage to do a pretty good job, that is, only IF you teach them well… Beyond that, everyone has their own effort + time / results ratio.
"Engineers are finding out that writing, that long-shunned soft skill, is now key to their efforts. In Claude Code: Best Practices for Agentic Coding, one of the key steps is creating a CLAUDE.md file that contains instructions and guidelines on how to develop the project, like which commands to run. But that’s only the beginning. Folks now suggest maintaining elaborate context folders.
A context curator, in this sense, is a technical writer who is able to orchestrate and execute a content strategy around both human and AI needs, or even focused on AI alone. Context is so much better than content (a much-abused word that means little) because it’s tied to meaning. Context is situational, relevant, necessarily limited. AI needs context to shape its thoughts.
(…)
Tech writers become context writers when they put on the art gallery curator hat, eager to show visitors the way and help them understand what they’re seeing. It’s yet another hat, but that’s both the curse and the blessing of our craft: like bards in DnD, we’re the jacks of all trades that save the day (and the campaign)."
https://passo.uno/from-tech-writers-to-ai-context-curators/
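For illustration, a minimal CLAUDE.md along the lines the quoted article describes might look like the sketch below. The project commands, paths, and conventions here are entirely hypothetical, just to show the kind of content such a file tends to hold:

```markdown
# CLAUDE.md — working notes for the coding agent (hypothetical example)

## Commands
- Build: `npm run build`
- Test: `npm test` (run the full suite before proposing a commit)
- Lint: `npm run lint`

## Conventions
- TypeScript strict mode; avoid `any`.
- Every new feature needs a unit test under `tests/`.
- Never push directly to `main`; always open a pull request.
```

The point the article makes is that writing and maintaining this kind of file is a documentation task, not a coding task.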
#AI #GenerativeAI #LLMs #Chatbots #PromptEngineering #ContextWindows #TechnicalWriting #Programming #SoftwareDevelopment #DocsAsDevelopment

Why would you spend your time developing a new skill (technical writing for AI) which has an unknown return on investment, when you could use your time to develop your existing coding skills or learn a new language which has a known positive return?
@makeshiftreaper@lemmy.world: That’s a good approach when you’re still a student or you’re on a sabbatical. But when it comes to for-profit companies, that is totally unrealistic because projects have strict deadlines, managers impose certain productivity goals, etc., etc. That’s not a valid approach when you’re earning your living as a software developer.
I’m speaking as a professional in the software development industry. Why would I hire AI technical writers to output code instead of more developers? Why would I have my devs upskill on AI writing rather than refine/expand their existing knowledge? I understand writing code to a deadline.
I’m asking you to prove to me that the effort I put into this skillset will have a tangible impact, especially because I know that similar effort in other skills will yield tangible improvements.
@makeshiftreaper@lemmy.world: Why are large companies forcing software developers to use AI tools in their jobs?
To justify sunk costs. I work for those companies; they’re making us go into offices despite overwhelming proof it’s not improving output. I don’t really care what mandates they push down, we’ve all been lying to meet those for decades and we’ll continue to do so. I care about what I can do to tangibly improve the output of my teams, and there continues to be little evidence that AI will do so.
@makeshiftreaper@lemmy.world: Like in all situations and with all new technologies, using AI for intellectual tasks should always imply a calculation of the effort expended in relation to potential gains. To say that AI tools are always unhelpful is, to me, a blatant lie. People should research how to best leverage new technologies such as LLMs. To simply deny that there are advantages is, to me, too simplistic.
I’d argue you’re making the fallacy here. You’re asserting a change to an established practice. You’ve yet to give me evidence that your change is an improvement.
It’s not the defense’s job in an argument to disprove an assertion: anything asserted without evidence can be equally dismissed without it.
@makeshiftreaper@lemmy.world: I suspect that by now you’re trolling me. Millions of software developers use these tools every day. Some of them are forced to use them; some of them are not, and in fact quite enjoy using and LEARNING new things with them. But you can’t ignore that more and more people are using AI tools. That’s just a fact. You may prefer horse carriages instead of motor vehicles because they’re safer. But you can’t ignore the fact that people love to use cars to move themselves.
I was trying to politely get you to use logic to understand that AI is not some inherently better tool because it’s new and money is being spent on it so I’ll be blunt:
I have seen zero evidence that code output by AI justifies the multitude of costs that come with its implementation. You lose the opportunity to train junior devs, it fucks up testing, it hurts the quality of developers, it’s unnecessarily verbose, and it makes frequent type errors. I want you to provide solid evidence that AI outputs code at the same quality or better than a traditional developer before I will agree that learning AI skills is a benefit in the corporate environment.
This hilariously assumes writing all those LLM instructions and validating its code takes no time.
@SGforce@lemmy.ca That’s why companies need Technical Writers AKA Context Curators! :)
I can spend 4 months learning Perl, or I can spend 3 months attempting to get the bullshit machine to learn Perl and then have to learn it myself anyway in order to fix its inevitably fucked-up output. Pick your poison.