What do you think about Artificial Intelligence?

CRAP: 55 votes (87%)
NOT CRAP: 8 votes (13%)
Total votes: 63

Re: Thing: Artificial Intelligence

251
enframed wrote: Sat Jan 10, 2026 11:06 am https://www.theatlantic.com/technology/ ... ch/685552/

"On Tuesday, researchers at Stanford and Yale revealed something that AI companies would prefer to keep hidden. Four popular large language models—OpenAI's GPT, Anthropic's Claude, Google's Gemini, and xAI's Grok—have stored large portions of some of the books they've been trained on, and can reproduce long excerpts from those books.

In fact, when prompted strategically by researchers, Claude delivered the near-complete text of Harry Potter and the Sorcerer’s Stone, The Great Gatsby, 1984, and Frankenstein, in addition to thousands of words from books including The Hunger Games and The Catcher in the Rye. Varying amounts of these books were also reproduced by the other three models. Thirteen books were tested.

This phenomenon has been called “memorization,” and AI companies have long denied that it happens on a large scale."
I really wonder how long the traditional media megacorporations are going to tolerate this shit. Music companies are already trying to shut down AI music generators.

It’s astounding they’ve been allowing this, considering how nuts they went over Pirate Bay and Napster.
clocker bob may 30, 2006 wrote:I think the possibility of interbreeding between an earthly species and an extraterrestrial species is as believable as any other explanation for the existence of George W. Bush.

Re: Thing: Artificial Intelligence

255
https://www.youtube.com/watch?v=U8dcFhF0Dlk
Technological determinism absolves us from building systems that might be unideal for society.

Generative AI gives people the emotional permission to make music that they didn’t have before.

Music is no longer the domain of the reality privileged.

Futurism provided aesthetic cover for the fascists.

Music, Impatience, Aesthetics, Fun

Fundamental separation of the crafts between live music and recorded music.

Re: Thing: Artificial Intelligence

258
At work, we've been trialing AI in several different capacities since early 2025. This year, the company decided to push it hard, and now it's going to be tied to performance reviews. I don't know what that means for other positions in the company, but for engineering it means we have to stop writing code ourselves and start learning how to do agentic engineering. I've been coding less these days anyway and doing more system design, but now I've started to work primarily on prompting an agent to write code, versus the way I'd been doing it, which was using AI to review code, do some research, and write tests.

From my short experience with it, all I can say is: if business people think they're going to get rid of engineers and replace them with AI, they are setting their businesses up for failure. AI can write code... sure. So can a junior engineer. Code is just a means to an end. It still takes education and experience to build software. It takes people coming up with ideas, fleshing those ideas out, and then bringing them to life, and AI is just not good at coming up with ideas, for obvious reasons.

I am only slightly comfortable with this. I don't mind changing with the times, but this is the most significant change to software engineering that I've witnessed in my 22-year career. In the early days, there weren't as many tools or frameworks, or even as much reliance on googling for answers. Slowly, these things built out the infrastructure of the industry to help engineers get through common tasks and issues, and get to the heart of building software based on business needs and customer desires. That part has not changed.

What has changed is that the tool for writing code is now writing it for you. In some ways, this is very freeing, as you don't necessarily need to know a language to be proficient in it. Is that relevant, though? Oftentimes you're working somewhere that has one, two, maybe three major languages, and you already know one or all of them. So it's irrelevant in most use cases, I think. I also think using an agent for everything is a waste of resources, because some tasks are dead simple and it's obvious what needs to be done, whereas building a new feature requires a lot of lines of code and interconnecting systems. While we have a blanket demand to have AI do all the coding, I'm still pushing back to say that agents are not hammers and not everything is going to be a nail, especially since we're also supposed to be cutting costs.

I hate the talk of "no one is going to be writing code in 6-12 months". It often gets misconstrued as engineers being insignificant. If you want a job as a software engineer, maybe writing the code isn't going to be a requirement, but you'd better understand the code that is being written. There is no blindly accepting lines of code, pushing them up to a repo, and allowing them to be deployed. I don't care if your unit tests pass. Doing this is amateur hour, and anyone who thinks otherwise is clueless or suicidal.

Agentic engineering or vibe coding, whatever the fuck it's going to be called, is analogous to automated manufacturing, where maybe one person manages one or more robots to build something. It would be naive to think that AI won't cost engineers jobs, but it would be equally naive to think that it will remove the engineer completely. I guess we'll see. There are massive layoffs happening in the tech industry. It already felt like teams were bloated, to be honest. There's nothing wrong with being lean, as long as you're smart about it. At my work, we have a team of about 30 engineers, spanning devops, data, QA, and general software. I can't imagine any one of us being replaced by AI without a huge impact on the company's ability to create the software its business is built on. I'm writing less code, but I'm also more engaged in planning, speccing, and reviewing, since writing code is just one aspect of engineering.

Maybe I'm all wrong, but my gut tells me that AI is good at some things, but not everything. And if we allow AI to do every part of our job for us, instead of enhancing our abilities, then what is the point of the job? Humans still need fulfillment, and humans still need to drive problem solving, as AI literally cannot think outside the box. I also believe that as AI evolves, humans are evolving around it, learning to recognize it, and being turned off by it. I can't imagine AI replacing humans to make music, but I can imagine there being AI music generated for use in grocery stores.

I have so much more that I'm thinking about on the matter, with AI in general. The truth is, either I embrace it and continue working in software, or I go find another job. But even then, there's no guarantee the next job won't require AI use too. What I'm looking forward to is the inevitable bubble burst, and what happens after. Hopefully all the hype dies down and we take a more sane and humane approach to using AI. It's unfortunately here to stay.
