OpenAI's ChatGPT can help novice users write code. Here are the best prompts to use, according to experts.

Dec 22, 2023, 18:24 IST
Business Insider
OpenAI's ChatGPT has caused a stir in the tech community. Mairo Cinquetti/Getty Images
  • OpenAI's ChatGPT can help users write lines of code.
  • The bot is making it easier to develop software and putting programmers on edge.

OpenAI's ChatGPT has caused quite a stir in the tech community.

The AI-powered chatbot's ability to write impressive lines of code has freaked out programmers and caught the attention of some tech CEOs. Some companies have already begun incorporating the technology into everyday workflows.

Although generative AI is lowering the barriers for coding and making it easier to develop software across the board, some users are still struggling to get the desired results.

Business Insider spoke to several computer science experts and tech workers who've created software with ChatGPT to get their advice for generating better code.

1. Break down your prompts and keep them simple

When it comes to generative AI, it's all about prompting.


AI-powered chatbots respond to commands written in plain English. Much like instructing a person, it's better to be clear and concise when explaining what you want to ChatGPT.

Ammaar Reshi, who described himself as a novice coder who's used the bot to create apps and video games, said going overboard with information in prompts could skew the results.

"I like to think of GPT as someone who is half-listening to you," he told BI. If you ask the bot to do everything at once, for example, to create an entire video game, it's likely going to make quite a few mistakes, Reshi said.

He said: "I would recommend explaining what your project is to GPT first. Approach it step by step and build those blocks with GPT, asking it how it would do things."

"I found it makes far fewer mistakes when you separate it all out because it's no longer trying to keep everything in its head," he added.


Neil Ernst, an associate professor of computer science at the University of Victoria, said the tech was a great tool to help those who lacked training get started.

Ernst, whose students sometimes use GPT to help them with assignments, said adding technical detail to prompts or feeding the bot example code can help improve results.

"It will recognize key terms and then students refine it by talking a bit more about what particular changes they would like it to make," he said.
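The step-by-step approach Reshi and Ernst describe can be sketched in code. Below is a minimal illustration of building a chat conversation one small request at a time, in the message format used by chat-style APIs such as OpenAI's; the project, prompts, and helper name are hypothetical, and the actual API call is shown only as a comment.

```python
# Sketch: compose a chat history one small prompt at a time, instead of
# asking for everything in a single giant prompt. The prompts below are
# illustrative examples, not prescribed wording.

def build_conversation(project_brief, steps):
    """Start with the project description, then add one small request
    per turn, so the model never has to keep the whole project "in its
    head" at once."""
    messages = [{"role": "user", "content": project_brief}]
    for step in steps:
        messages.append({"role": "user", "content": step})
        # In real use, you would call the API here and append the reply
        # so later steps build on earlier answers, e.g.:
        # reply = client.chat.completions.create(model=..., messages=messages)
        # messages.append({"role": "assistant", "content": reply.choices[0].message.content})
    return messages

convo = build_conversation(
    "I'm building a Snake game in Python with pygame.",
    [
        "First, outline the main components before writing any code.",
        "Write just the game-loop skeleton, with no snake logic yet.",
        "Now add the snake movement logic to the loop.",
    ],
)
```

Because each turn stays in the message list, every follow-up request carries the context of the steps before it, which is what lets the model "build those blocks" incrementally.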

2. Assign ChatGPT a role

Giving ChatGPT a specific persona can help it understand what a user wants.

Jason Gulya, an AI council chair at Berkeley College who teaches clients how to use ChatGPT, previously told BI that giving the bot a role, such as a specific job title, will yield better results.


When it comes to coding, Reshi suggested users start by assigning ChatGPT the role of a "world-class programmer."

"I think telling it to assume that personality works really well," he said. "Because otherwise, it's just using generic programming knowledge. By adding that specificity, you're more likely to get the best results that it's trained on."
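In API terms, a persona like Reshi's "world-class programmer" is typically placed in the system message, ahead of any coding request. A minimal sketch, with an illustrative user prompt and a commented-out API call:

```python
# Sketch: assign ChatGPT a persona via the system message. The persona
# string follows Reshi's suggestion; the user prompt is illustrative.

def with_persona(persona, user_prompt):
    """Prepend a system message carrying the persona, so every reply
    in the conversation is shaped by that role."""
    return [
        {"role": "system", "content": f"You are a {persona}."},
        {"role": "user", "content": user_prompt},
    ]

messages = with_persona(
    "world-class programmer",
    "Review this Python function for bugs and suggest improvements.",
)
# These messages would then be passed to the chat API, e.g.:
# client.chat.completions.create(model="gpt-4o", messages=messages)
```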

3. Brush up on the basics

No matter how specific users keep their prompts, language can still be ambiguous.

This means users still need some understanding of how to steer the chatbot and what results to expect, according to Emery Berger, a professor at the University of Massachusetts Amherst.

Berger, who's used GPT to create several AI programs, said the chatbot would also likely struggle with higher-level or specific industry tasks, meaning users must start refining the results themselves at some point.


GPT-produced solutions also tend to lack nuance, Ernst said.

Ernst added that the results were most successful when a person knew what they wanted it to do and had a good idea of how they thought the software should be designed.

"What we seem to be seeing is that it spits out a good solution but a lot of the nuance about why you want a solution in a particular way is lost," he said. "You can get a little bit more by asking it to improve on certain aspects of the code, but I think it pretty quickly runs into roadblocks where it just can't improve any further."
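The kind of targeted follow-up Ernst describes, asking the bot to improve specific aspects of its output rather than "make it better", can be sketched as a simple prompt template. The aspects, draft code, and helper name below are all illustrative.

```python
# Sketch: ask for one concrete improvement at a time, naming the
# specific quality you want changed. Aspect names are examples only.

def refinement_prompt(aspect, code):
    """Build a follow-up prompt that targets a single aspect of the
    generated code while asking that its behaviour stay the same."""
    return (
        f"Improve the following code's {aspect} only; "
        "keep the behaviour identical.\n\n" + code
    )

draft = "def add(a,b): return a+b"
prompts = [
    refinement_prompt(aspect, draft)
    for aspect in ("error handling", "readability", "test coverage")
]
```

As Ernst notes, this kind of refinement tends to yield diminishing returns after a few rounds, so it works best for polishing a solution the user already understands.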
