IMO it's analogous to adaptive cruise control on cars. It's very useful in certain specific circumstances, and genuinely helpful. But not yet necessary or transformative unless you live in LA/sit in traffic for a living.
People are claiming to use it to write code; I'm curious how sophisticated said code is. Sure, it might take out some of the grunt work, like an ultra-sophisticated find-and-replace, but you still have to review all the changes it makes and correct any mistakes, so the only thing it really saves is the typing. There's no way you could ask it to write code for a sophisticated architecture without extensive training on said architecture, and I'm not sure how you would even train it for that (can it parse design documents? Diagrams?)
The "Look, AI is replacing creative work first, the thing we thought was most immune to AI!" narrative annoys me. IMO it's just yet another factor revealing how little people value "creative" work. There's a reason relatively simplistic Marvel movies make the big bucks and the erudite starving author/artist is a meme. The market does not appreciate creativity for its own sake, and never has. No one cares about the reincarnation of William Shakespeare if he' s using all his talent to write blogspam. No one cares about Monet's ghost's DeviantArt anime titty drawings. People care about creativity because it's a requirement to produce something new and useful, that use can be pragmatic or symbolic, but if it's neither no one cares and the AI-generated equivalent is good enough.
You want to see AI-proof creativity (at least against currently available AI)? Look at any luxury automobile interior. Look at any sophisticated software/hardware architecture. Look at an aircraft carrier or any other item where there aren't millions of samples to train on. That's not to say AI couldn't contribute to the tools that make these things, but no one working on the above is losing their job to AI any time soon. It's just taking out some of the low-hanging creative fruit. Maybe it's the start of an all-consuming revolution, or maybe this is as far as it goes. Only time will tell, but I've seen enough false revolutions (self-driving cars, AI advertising, crypto) not to buy in until I see hard data of it doing something more than writing convincing YouTube intros and being used to cheat on high school essays. If the tools turn out to produce genuine value, then I'll learn how to use them to maximum effect at my job. Nothing to get wound up about either way.
Copilot is perfectly capable of understanding my extremely complex spaghetti code and giving accurate suggestions based on context spanning many files. It truly feels like you're doing pair programming with another human being.
tl;dr: in some instances I had to coach it pretty directly to get what I wanted. In other instances it was flawless. And to your point, some of the code it generated was both clearer and faster than the code I would have written for the same task.
Sure, but this kinda proves my point. It can potentially generate good code for simple, atomic problems. It can't write me a REST service that hooks into an existing web backend spread over multiple repos of proprietary code.
Any relationship not visible in the code seems to be beyond its ability to understand for the moment. I'll be impressed when I can point it at a server cluster and the associated dozen repos, give it some clues, and it can work out how the code interacts with that cluster's configuration and database hookups just by scanning the files/repos and the info I provide in text.
The problem for us devs is not that it will replace you completely (that day may be coming, but it's at least a decade away). The problem is that a dramatic increase in developer productivity may put pressure on developer job openings, salaries, etc. One hope is that as the price of developing software plummets, demand will increase accordingly so that we won't feel it that much. God knows there are still a ton of areas where software, or better software, would help out a lot but is cost prohibitive at the moment. Once we get into robotics, basically anything that humans do can be improved with software.
More like I'll invest time learning a tool relative to the potential payoff. Right now this tool would be of minimal utility. The vast majority of code I write isn't "make this isolated algorithm more efficient", it's "implement/integrate this new feature into the server cluster". Without a deep understanding of the software/server architecture and the ability to weigh the tradeoffs of different approaches, my job cannot be done.
This "if you wait to see results the opportunity will be gone!" mentality is for VCs and other people who's business models require them to be way out on the risk curve, who make a lot of bad calls, but lose relatively little when they fail. It was also partially a product of low interest rates. It is not applicable to most individuals/organizations.
I use it as an instant StackOverflow for the most part, to get around new libraries or libraries/languages I don't use that often. Also for generating custom bash one-liners or small scripts. It is priceless for this use case. Yeah, sometimes it is wrong, but in my experience less than 5% of the time. We're also lucky that for our purposes we can almost always validate the answer almost instantly and without incurring any cost.
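For a concrete sense of scale, this is the kind of throwaway script I mean (the directory name here is just a placeholder, not from any real project):

    from pathlib import Path

    # e.g. "rename every .jpeg under this tree to .jpg"
    for p in Path("photos").rglob("*.jpeg"):
        p.rename(p.with_suffix(".jpg"))

If it gets something like this wrong, you find out in seconds, which is why the occasional miss barely matters.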