
IT pros wary as Microsoft Copilot juggernaut gains steam – TechTarget


Microsoft Copilot is officially the major focus of the cloud vendor’s strategy going forward. But some of the end users who will be most affected by the generative AI boom remain cautious about its potential negative effects.

A week after Microsoft code collaboration subsidiary GitHub declared that it was “re-founded on Copilot,” the generative AI-based code creator and chat virtual assistant, Microsoft made more than 100 updates injecting Copilot across its product lines at its Ignite conference.

These updates included a new Azure AI Studio for building generative AI applications, Microsoft Fabric for data analytics, Microsoft 365 Copilot for business productivity, and Copilot for Azure for cloud infrastructure management. Microsoft also added dedicated GPU workload profiles, vector database add-ins, and other features meant to make it easier to host generative AI apps on its Azure Container Apps serverless infrastructure. It also added a new Kubernetes AI toolchain operator that automates large language model (LLM) deployment on Azure Kubernetes Service.

“Our vision is pretty straightforward: We are the Copilot company,” Microsoft CEO Satya Nadella said during his opening keynote at Microsoft Ignite. “We believe in a future where there will be a Copilot for everyone and everything you do.”

While resistance to the Microsoft Copilot steamroller might seem futile, some IT pros expressed trepidation this week about what it might mean for job security among developers, especially open source contributors.

Few can deny that Copilot brings productivity gains to enterprise IT pros, especially developers, who have had access to GitHub Copilot code generation since 2022.

“I have heard directly from a lot of developers in my communities about how useful they find Copilot,” said Rob Zazueta, a freelance technical consultant in Concord, Calif. “It does seem to help with a lot of boilerplate-type stuff — writing test cases, stubbing out simple code for API calls and such. From the standpoint of it being a partner for developers in their work, I’m hearing a lot of positive things from the folks I know.”
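As a rough illustration of the boilerplate Zazueta describes, the sort of API-call stub and accompanying test that a tool like Copilot typically fills in might look something like the following Python sketch. The endpoint, function and field names here are invented for illustration and aren't taken from any real service or from Copilot's actual output.

    # A hypothetical sketch of Copilot-style boilerplate: a thin wrapper around
    # a REST API call plus a unit test that mocks the HTTP request. All names
    # and the endpoint are invented for illustration.
    import unittest
    from unittest.mock import Mock, patch

    import requests


    def get_user(user_id: int, base_url: str = "https://api.example.com") -> dict:
        """Fetch a user record from a (hypothetical) REST endpoint."""
        response = requests.get(f"{base_url}/users/{user_id}", timeout=10)
        response.raise_for_status()
        return response.json()


    class GetUserTest(unittest.TestCase):
        @patch("requests.get")
        def test_returns_parsed_json(self, mock_get):
            mock_response = Mock(status_code=200)
            mock_response.json.return_value = {"id": 42, "name": "Ada"}
            mock_get.return_value = mock_response
            self.assertEqual(get_user(42)["name"], "Ada")


    if __name__ == "__main__":
        unittest.main()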

Enterprises will also be keen to reduce development time and deliver new features more efficiently, Zazueta said.

“That’s all cool,” he said. “But I think the long-term impact will be to further diminish the role of the developer within an organization. Rather than hiring dozens of engineers, some may be able to get by with smaller teams, relying on AI to fill in the gaps. That’s great for investor profit, but it’s detrimental to everyone else.”

Photo: Satya Nadella presents on Microsoft Copilot during his opening keynote at this week's Microsoft Ignite conference, where he called Microsoft "the Copilot company."

‘Training their replacements’

Job security doomsday isn't in the near-term forecast, given that the underlying LLMs can produce results that are inaccurate or biased, or that introduce security risks. But LLMs have also developed rapidly in the year since OpenAI's ChatGPT became generally available. Companies, including OpenAI and Microsoft, have started offering customers broad legal indemnification against future copyright lawsuits arising from their use of generative AI.

Still, while it will take time for generative AI to fully mature and legal concerns to be settled, that time is not too far off, said Andy Thurai, an analyst at Constellation Research.

“I would be very scared if I were a programmer,” he said. “This elevates the junior programmers to the senior level and senior programmers to the expert level. Your experience may not matter much if you don’t know how to prompt engineer properly.”

OpenAI has also launched a feature letting enterprises create their own GPTs, quelling some concerns about proprietary code being used to train models outside the companies that own it. But even if proprietary code isn't used to train public LLMs and companies are protected from copyright claims, individual Copilots and GPTs will be trained within organizations by the developers using them, Zazueta said.

“Developers who use Copilot are training their replacements,” he said. “I’m not at all in favor of that.”

Some industry experts expect that developers ultimately won't be phased out but will instead evolve into prompt engineers. Zazueta said he's skeptical of that concept as well.

“There’s an argument to be made that natural language may eventually replace standard programming languages, [but] that’s a pretty far abstraction from the actual code that is run by the machine,” he said. “Human language doesn’t map 1:1 with binary code. In my experience, highly abstracted languages like PHP or JavaScript can lead to poor application development and poor programming experiences. … I can’t imagine code written by an AI without some human intervention could possibly be competitively performant.”

One GitHub Enterprise customer who might consider the new GitHub Copilot Enterprise, launched in beta this month, is less skeptical of the concept than of its current quality of execution.

“An abstraction layer can be really helpful,” said Kyler Middleton, senior principal software engineer at healthcare tech company Veradigm. “I sure love having an operating system and not needing to program registers to run my computer. However, we’re at the very early stages of this abstraction, where there are lots and lots of rough edges under the silk tablecloth.”

Productivity gains vs. qualms about open source ethics

Copilot’s productivity gains are difficult to resist, Middleton said, despite its $39 per person per month price tag.

“It is incredibly, amazingly useful in knowing the syntax of programming language loops and maps and case statements and obscure logic stuff that I’d normally have to go Google. That part is already pretty magical,” Middleton said. “It’s hard to exactly quantify it, but I think it’s improved my programming speed by maybe 30% to 40%. … Given how expensive developers are, that’s pretty amazing for $39 per month.”
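For a sense of the everyday syntax Middleton is referring to, a small, hypothetical Python sketch of the kind of loop, dictionary and case-statement boilerplate a developer might otherwise have to look up could read as follows; the example is illustrative only and not drawn from her code or Copilot's output.

    # A hypothetical sketch of routine syntax a code assistant can fill in:
    # a match/case statement (Python 3.10+), a dictionary and a loop.
    def describe_status(code: int) -> str:
        match code:
            case 200 | 204:
                return "success"
            case 404:
                return "not found"
            case _ if 500 <= code < 600:
                return "server error"
            case _:
                return "unknown"


    counts: dict[str, int] = {}
    for status in [200, 404, 200, 503]:
        label = describe_status(status)
        counts[label] = counts.get(label, 0) + 1

    print(counts)  # {'success': 2, 'not found': 1, 'server error': 1}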

However, Middleton is also committed to the open source community. She isn’t entirely comfortable with the way generative AI models were developed using open source code in ways its contributors might not have intended.

“Closed source libraries aren’t being propagated back to the AI’s intelligence — at least as far as Microsoft says. It’s hard to confirm that. But broadly, every open source project is absolutely scraped for convention and content,” Middleton said. “I occasionally run into this in a hilarious way. I’ll start writing a comment, and it’ll prompt me to finish the line with references to particular people or projects I’ve never heard of. They’re certainly ‘borrowing’ the hard work of folks who made their smarts available for free and charging money for it, which isn’t a very popular business strategy.”

The next year will yield AI winners and losers

Generative AI services such as Microsoft Copilot for Azure will also disrupt the commercial IT market, directly competing against AIOps and observability vendors. Its cost analysis features will compete against FinOps players, Thurai said.

“Particularly interesting is the option that the customers can analyze their observability data using Copilot for Azure, which can [be very helpful in not only] optimizing cloud applications but also diagnosing and potentially avoiding incidents,” he said.


Still, this is also not without its risks, Thurai added.

“This being the first version to be released, it still needs to prove its accuracy and worthiness,” he said of Microsoft Copilot for Azure. “If a recommendation is wrong, it could cost a whole lot more, as this becomes the code base for effectively managing the app and infrastructure code, which can be dangerous.”

Despite users’ doubts, it’s clear generative AI will remain a hot trend in IT.

“I think there is going to be a first-movers advantage here,” Middleton said. “The first group to get it right and available is going to cash in massively. Maybe GitHub already has done that, but I don’t think they have. I think they’re going to crack it in the next year, or someone else will first.”

Many vendors, including cloud providers Google and AWS, have been busily developing their own Copilot counterparts. But it’s Microsoft and GitHub’s market to lose at this point, according to Thurai.

“Overall, while other vendors are fighting for LLM creation and model traction, Microsoft has decided to move into operationalizing LLMs and AI, leaving others to scramble once again to catch up with them,” he said.

Beth Pariseau, senior news writer at TechTarget, is an award-winning veteran of IT journalism. She can be reached at [email protected] or on Twitter @PariseauTT.


