When the Wright brothers first took flight in 1903, few could have pictured that just six decades later, there’d be a man on the moon. It took human beings millions of years to get a plane off the ground for a mere 59 seconds, but in 1969, those same human beings landed on a surface 238,900 miles away in outer space.

Technology is strange like that. The more it advances, the more it accelerates. It’s hard to believe ChatGPT entered the public discourse just two months ago. It feels much longer, given the shockwaves it has sent everywhere. School teachers have had to act quickly, as students use the program to write essays. Even a company as powerful as Google sees its potential, issuing a “code red” over what the system could do to the tech industry.

When new products come to market, it’s natural for the incumbents to cry out about how dangerous the new kid on the block could be. Chatbots are nothing new. In the past couple of years, though, AI has made enough of an impact to force people to consider that maybe this particular acceleration of technology is happening a bit too fast. Look at the craze around DALL·E 2 and its ability to create realistic images and art from a text description.

Science fiction books and movies have long foretold not just a disruption but a destruction of our future at the hands of artificial intelligence. That’s roughly the plot of The Terminator, The Matrix, and countless others. And even if AI doesn’t bring something that bleak, the assumption that robots will take our jobs is another trope that resurfaces every time the public embraces a brand-new tool like ChatGPT. But is this time different? Does the fact that OpenAI, the company behind the model, is privately held mean bad things for the whole tech industry? Do UX designers need to worry?

An image generated by DALL·E 2

DALL·E 2 and ChatGPT work by recognizing human patterns. Feed ChatGPT “twinkle twinkle,” and it will scour the patterns of language it has learned to predict what best comes next in the sequence. These are tools of mimicry, not of creativity. Will that eventually come? Probably, but it’s something AI has not yet figured out.
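As a rough illustration of what “predicting what comes next” means, here is a toy next-word predictor built from bigram counts. This is a deliberately simplified sketch, nothing like the transformer behind ChatGPT, but it captures the same idea of continuing a sequence from observed patterns; the corpus and function names here are invented for the example.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which word follows which across a toy corpus."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            follows[current][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the most frequently observed follower of `word`, if any."""
    candidates = follows.get(word.lower())
    return candidates.most_common(1)[0][0] if candidates else None

corpus = [
    "twinkle twinkle little star",
    "how I wonder what you are",
    "up above the world so high",
    "like a diamond in the sky",
    "twinkle twinkle little star",
]
model = train_bigrams(corpus)
print(predict_next(model, "little"))  # "star" — the only word the model has ever seen after "little"
```

The point of the sketch is the limitation: the model can only ever replay what it has seen. It has no idea what a star *is*, which is exactly the mimicry-versus-creativity gap described above.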

So why shouldn’t the UX designer, an expert in patterns of human behavior, worry about this mimicker? The answer lies in what user experience has been doing all along.

Prompt engineering, a concept from Natural Language Processing (NLP), is the art of writing text prompts that get an AI system to generate the output you want. Basically, it’s knowing how to ask AI the right questions to receive the right responses. That is essentially what UX designers have been doing all along to improve products. Look at UX research, where one of the best ways to improve a product is feedback from an actual user. Sure, there are fears that robots will take everyone’s job one day, but the best way to prepare is to remember that someone will need to know how to prompt these robots. User experience has had a head start.
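For a concrete sense of what prompt engineering looks like in practice, here is a minimal sketch of a prompt template — the kind of structured “right question” described above. The function and field names are invented for illustration, and no particular AI API is assumed; the code only assembles the text you would send to a chat model.

```python
def build_prompt(role, task, constraints=None, examples=None):
    """Assemble a structured prompt from parts, the way a UX researcher
    structures a discussion guide before a user interview."""
    parts = [f"You are {role}.", f"Task: {task}"]
    if constraints:
        parts.append("Constraints: " + "; ".join(constraints))
    if examples:
        parts.append("Examples:")
        parts.extend(f"- {ex}" for ex in examples)
    return "\n".join(parts)

# A vague ask versus an engineered one:
vague = "Tell me about our checkout page."
engineered = build_prompt(
    role="a UX researcher summarizing usability test notes",
    task="List the top 3 issues users hit in the checkout flow below.",
    constraints=["one sentence per issue", "quote the user remark that supports it"],
)
print(engineered)
```

The engineered version bakes in a persona, a scoped task, and output constraints — the same discipline a designer applies when writing an unbiased interview question instead of “do you like our product?”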

Design isn’t the only place where this is beneficial. The prospect of AI works wonders for Search Engine Optimization (SEO) as well. While Google treats AI-generated content as a violation of its guidelines (hey, we already spoke about why Google has been a bit sour on the new tool), ChatGPT may be a pretty great way to generate relevant keywords or a rough outline. Artificial intelligence is not something to replace human eloquence, but rather to enhance it, like a well-done bibliography. It can be the next step in research, much as Google search results replaced knowing how to use the Dewey Decimal System at the library.

Will the UX designer’s job be safe forever? Who knows. At the speed artificial intelligence and machine learning have been moving, it’s hard to comprehend what tech will bring in the next five years alone. The idea is not to lie down and accept defeat, but to grow with the times and understand that the designer’s best asset is their ability to understand a product and who is using it, then take feedback to make it better. Just like ChatGPT, we as designers take input to produce an output.

So what’s the difference? I absolutely hate how overdone the term “empathy” has become in the UX world, but this is one of those scenarios where it actually makes sense to mention it. DALL·E 2 and ChatGPT can mimic human patterns, but here in 2023, those programs cannot understand the “why” behind a customer’s motives or a stakeholder’s goals. And for this brief period in time, that’s enough of a head start for me.
