August 4, 2023

Why Generative AI Won't Replace Developers: 5 Solid Reasons

Generative AI is steadily becoming a reliable assistant for professionals across various fields, but it certainly won't replace them in the near future

Generative AI has been the main topic of discussion in the tech industry and many other sectors for more than a year. Some marvel at what these technologies can produce; others worry about the jobs that might disappear because of artificial intelligence.

Generative AI was also the main topic of discussion at the Collision 2023 conference, where we managed to chat with representatives of companies and startups from various industries. They either marveled at the explosive development of artificial intelligence or shared their concerns about it.

ChatGPT, in particular, is attracting special attention today. This AI-based chatbot was trained on a wide range of data from the internet, including public code from GitHub, which even allows it to write applications.

If software code can be partially written with ChatGPT, why do we need developers at all? It's a logical question, but one usually asked by people who aren't directly involved in the tech industry. This article is intended for them and should be an eye-opener.

1. Experience and Innovative Solutions are Important in Development

ChatGPT can write a snippet of simple program code, but not a large full-fledged application

Developers can indeed delegate simple routine tasks to artificial intelligence. With tools like ChatGPT from OpenAI, Copilot from GitHub, and others, it is possible to generate code snippets or look up the methods needed for working with strings, arrays, objects, and so on.
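As an illustration, here is the kind of routine helper a developer might ask ChatGPT or Copilot to generate rather than write by hand. (This snippet is a hypothetical example of AI-assisted boilerplate, not output taken from any particular tool.)

```python
def dedupe(items):
    """Remove duplicates from a list while preserving first-seen order."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

print(dedupe([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

Snippets like this are exactly where generative AI shines: the task is small, well-specified, and solved thousands of times before in public code.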

However, writing code is not a developer's most important task. On real projects, experience and creative thinking often matter more, and artificial intelligence certainly won't replace those. Building software requires understanding the architecture and knowing the technologies involved.

A perfect example to understand the situation is Tesla's autopilot. It handles highway driving very well. The electric car confidently switches between lanes and safely overtakes other vehicles. Its performance in such scenarios rarely raises any questions or doubts.

But on winding mountain roads, the autopilot often makes mistakes and behaves erratically. When it encounters an unusual situation, its learned behavioral patterns no longer work. The same happens in software development, which often demands innovative solutions.

2. Generative AI Can't Handle Unusual Situations

Artificial intelligence has long been used in many industries, but so far it hasn't replaced any professionals. On the contrary, it has become a tool that helps perform a variety of tasks faster and better. It's not something to fear, but something to use actively in professional work.

As early as the end of the last century, artificial intelligence formed the basis of autopilot systems in passenger airplanes. Even then, an airliner could fly between cities largely on its own, handling every stage: taxiing to the runway, takeoff, en-route navigation, and landing.

Pilots do actively use the autopilot system, yet they engage it during landing in only about 1% of cases. This is because professionals don't want to lose their manual-control skills, which they will badly need at a critical moment if the automation suddenly fails.

And failures in airplane automation do happen periodically. Sensors can transmit incorrect data, and the autopilot can act incorrectly on it; at that point, the pilot must take control. The same situation applies to software development.

3. The Capabilities of Artificial Intelligence are Limited

ChatGPT agrees that it can't do everything

Artificial intelligence is trained on real data that already exists. Put simply, for a neural network to learn to recognize cats, it must be shown something like a million pictures of them; only then, given a new image, can it tell a cat from a dog.
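The idea can be shown with a deliberately tiny sketch: a 1-nearest-neighbour "classifier" over made-up (weight in kg, ear length in cm) feature vectors. All data, features, and labels here are invented for illustration; real models differ enormously in scale, but the limitation is the same, as a model only generalizes from what its training data covers.

```python
# Toy training set: hypothetical (weight_kg, ear_length_cm) examples.
TRAINING = [
    ((4.0, 7.5), "cat"),
    ((3.5, 8.0), "cat"),
    ((25.0, 10.0), "dog"),
    ((30.0, 12.0), "dog"),
]

def classify(features):
    """Label a new animal by its closest training example (1-nearest-neighbour)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(TRAINING, key=lambda pair: dist(pair[0], features))[1]

# An animal close to the training data is labelled correctly...
print(classify((4.2, 7.8)))  # cat
# ...but a 5 kg toy-breed dog, unlike anything in the training set,
# sits nearest the cat examples and is mislabelled.
print(classify((5.0, 9.0)))  # cat
```

The second call is the point: faced with an input its training data never covered, the model confidently gives a wrong answer, which mirrors the autopilot on an unfamiliar mountain road.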

Modern neural networks like ChatGPT are trained on huge amounts of data from open sources, including GitHub, Stack Overflow, and more. Yet even after this, the most modern generative AI can essentially only reuse solutions that someone else has already produced.

Considering that even ChatGPT, the most talked-about model, has a knowledge cutoff of September 2021, it becomes clear that this particular neural network doesn't know everything. A project may well require entirely different, more effective solutions.

Moreover, to get generative AI to produce even relatively simple software, you have to feed it a very detailed description of every nuance. It often turns out to be easier for a developer to write the code than to explain the task to a neural network.

4. Responsibility Can't Be Assigned to Generative AI

The question of the safety of using artificial intelligence was raised several times at the Collision 2023 conference. Adam Selipsky, CEO at Amazon Web Services (AWS), rightly emphasized this point during his speech on the Centre Stage.

There's nothing surprising about this, as generative AI isn't always safe. In March 2023, for example, ChatGPT suffered a major leak of paid subscribers' confidential information: payment card details of about 1.2% of paying users were effectively exposed.

The cause was a bug in redis-py, an open-source Redis client library for Python. OpenAI moved to contain the problem as quickly as possible, but some ChatGPT users were still affected. It is not the worst imaginable incident, but it did happen.

Artificial intelligence can confidently produce text that is stylistically indistinguishable from human writing yet complete nonsense in substance. And you can't hold it accountable for that: a neural network can't take responsibility the way a developer or any other professional can.

5. It's More Profitable to Train Developers Than a Neural Network

ChatGPT is trying to come up with an interview task that not every developer can solve

Generative AI won't replace even a junior developer with a couple of months of experience on a real project. Yes, a neural network often solves interview tasks faster and better than a new hire would, but that is no reason to claim it can take their jobs.

Today, neural networks receive millions of requests per minute, and each one costs a few cents to process. Those costs add up to staggering sums, so it's not surprising that the current ChatGPT model caps the number of responses per time period, even for paid subscribers.

The next stage in the development of generative AI will be reducing its cost of use, not expanding the quality and range of its capabilities. That is why companies like Focus21 will keep hiring employees and raising their skill level to create even more interesting and useful software.

As for generative AI itself, over the next few years such technologies will likely be used even more actively across various fields. With their help, developers and other professionals will be able to work faster, better, and more profitably for the business. But neural networks will definitely not be able to replace employees.

Explore AI-Powered Solutions with Focus21 Today

Author

Mykola (Nick) Hrytsaienko

Tags

Artificial Intelligence
