Why do I think prompt engineering is overrated?


Post by Reddi2 »

The example of DALL-E 3: fewer words, more quality
With one of the newest features in ChatGPT, DALL-E 3 impressively demonstrates that sophisticated models do not necessarily need complex queries or prompts to deliver high-quality results. With just a few words as a prompt, DALL-E 3 can create quite good AI images. You do not need a prompt full of parameters, as in Midjourney; the AI (in this case GPT-4) adapts your command accordingly.
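For illustration, here is a minimal sketch of such a "few words" request, assuming the official OpenAI Python SDK (v1.x); the prompt text and image size are only example values, not a recommendation:

```python
# Minimal sketch: generating an image from a very short prompt.
# Assumes the `openai` Python SDK (v1.x) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# A few plain words instead of a long, parameter-laden prompt;
# the model itself expands the instruction into a detailed image description.
result = client.images.generate(
    model="dall-e-3",
    prompt="a cozy cabin in the snow at dusk",
    n=1,
    size="1024x1024",
)

print(result.data[0].url)  # link to the generated image
```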

[Images: Will prompting soon become obsolete? My opinion on the hype]
The evolution of AI: fewer “engineered” prompts
As AI models develop, they become more intuitive in their interactions. The need for overly detailed or engineered prompts will decrease over time, which means that the ability to formulate complex requests may not be as critical as it once was. Examples include Microsoft's Copilot and Google Duet. Firmly integrated into the Office applications, you "just" have to say what you want. In the future, such systems will also have enough data about you and your intent to insert it into the prompt automatically.

The Smartest Intern in the World
Instead of trying to create the perfect technical prompt, it might be more helpful to think of the AI as a smart intern. What would you tell it to get the information or results you want? The intern doesn't know you or your goals, so how would you brief them?
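As a sketch of this "briefing" style, again assuming the OpenAI Python SDK; the brief text and model name are only examples:

```python
# Minimal sketch: "briefing the intern" instead of engineering a prompt.
# Assumes the `openai` Python SDK (v1.x); the brief and model name are examples.
from openai import OpenAI

client = OpenAI()

# The brief reads like instructions to a new colleague: who you are,
# what you need, and in what form. No special syntax or parameters.
brief = (
    "You are helping our small marketing team. "
    "Write three subject lines for our October newsletter about the new pricing plan. "
    "Keep them under 60 characters and avoid exclamation marks."
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": brief}],
)

print(response.choices[0].message.content)
```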

Prompts are often just data sources
As token context windows grow, more and more content can be processed by AI systems. In my opinion, this will also change how we think about prompts. Here is an example:

You can of course add your previous successes and campaign evaluations to a prompt, but the amount is limited. Now imagine that instead of 32,000 tokens (currently the largest GPT-4 model), you had perhaps 1,000,000 or 5,000,000 tokens available. Then you could simply attach the evaluations of the last five years, or solve this entirely via a connection (via API).
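A minimal sketch of that idea, assuming the OpenAI Python SDK; `load_campaign_reports` is a hypothetical helper standing in for your own reporting API or database export, and the model name is only an example:

```python
# Minimal sketch: treating the prompt as a data source.
# Assumes the `openai` Python SDK (v1.x); `load_campaign_reports` is hypothetical.
from openai import OpenAI

client = OpenAI()

def load_campaign_reports(years: int) -> str:
    """Hypothetical helper: fetch the last `years` of campaign evaluations as text."""
    # In practice this would call your analytics API or read exported reports.
    return "2019: ...\n2020: ...\n2021: ...\n2022: ...\n2023: ..."

# With a large enough context window, the raw evaluations simply become
# part of the prompt instead of being summarised down to a few lines.
history = load_campaign_reports(years=5)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a marketing analyst."},
        {
            "role": "user",
            "content": f"Here are our campaign evaluations:\n{history}\n\n"
                       "Which patterns should we repeat next year?",
        },
    ],
)

print(response.choices[0].message.content)
```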