ChatGPT, AlphaCode, and AI art: what do they really mean for humanity?

Together in Our Diversity
3 min read · Dec 16, 2022


An illustration generated by DALL·E 2

Artificial intelligence has long been a favorite of science fiction. Imaginary worlds ruled by AI in film, literature, and art make for fun hypotheticals; today, however, that fiction is beginning to become reality.

Art generated through machine learning has been all the buzz this year, and ChatGPT by OpenAI took the world by storm in November. Easy access to these new innovations has amplified their hype, but it has also raised questions about the impact of AI on real people and industries.

OpenAI, while far from the only player in the field, leads many of the current advances in AI. It was established in 2015 as a non-profit, with Elon Musk among its founders.

OpenAI’s stated purpose is to “ensure that artificial general intelligence benefits all of humanity.” A statement from its 2015 founding emphasizes OpenAI’s dedication to non-profit research because of “how much it (AI) could damage society if built or used incorrectly.”

However, the organization transitioned to a “capped” for-profit in 2019, capping investor returns at 100x their investment. This transition coincided with a $1 billion USD investment from Microsoft, casting some doubt on the altruism professed by members like CEO Sam Altman.

Beyond these conflicts of interest, it’s important to look at the actual impact of AI technologies. OpenAI is also the creator of DALL·E 2, one of the most popular AI art generation systems, alongside competitors like Stable Diffusion and Midjourney.

These AI systems can create incredibly detailed and accurate illustrations from natural language descriptions. They can even create designs in particular styles, such as digital art or photorealism. All three are trained with machine learning on huge datasets, which often contain copyrighted art from real artists without permission.
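For the curious, here is roughly what generating an image with DALL·E 2 looks like in code. This is a minimal sketch, assuming the openai Python package (in its pre-1.0 interface, current as of this writing) and an API key stored in an environment variable; it is not an official example.

```python
# Minimal sketch: requesting an image from DALL·E 2 via the OpenAI API.
# Assumes the `openai` Python package (pre-1.0 interface) and an API key
# set in the OPENAI_API_KEY environment variable.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Image.create(
    prompt="a watercolor painting of a lighthouse at dawn",
    n=1,                # number of images to generate
    size="1024x1024",   # supported sizes: 256x256, 512x512, 1024x1024
)

# The API returns a temporary URL for each generated image.
print(response["data"][0]["url"])
```

The prompt is plain English; the model handles composition, lighting, and style on its own.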

LAION-5B, the web-scraped dataset used to train Stable Diffusion, compiles billions of images without regard for their copyright status. AI systems learn from datasets like these and can emulate an individual artist’s style. If AI can produce unlimited art that’s impossible to distinguish from human-made works, the value of human artists will likely decrease.

These datasets exist in a legal gray area. Copyright law still lags far behind the Internet, so it likely won’t catch up to AI until the damage has already been done.

Artificial intelligence is also on the rise within the software engineering industry. Tools like GitHub Copilot automate portions of code production, but they can’t manage the full architecture of something like a web browser.

AlphaCode, DeepMind’s code generation system, is trained on competitive programming datasets. It’s great at highly specific, well-defined tasks, but it can’t figure out what those tasks should be. At least for the foreseeable future, AI won’t beat out engineers; rather, engineers who know how to use AI effectively will beat out engineers who don’t.
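To make “highly specific tasks” concrete, here is a hand-written sketch of the kind of prompt-and-completion these tools are good at. The prompt is the comment; the function is an illustrative completion written for this article, not actual Copilot or AlphaCode output.

```python
# Illustrative only: a small, well-specified task of the sort that
# code-generation tools handle well when given a precise prompt.

# Prompt: "return the longest common prefix of a list of strings"
def longest_common_prefix(strings: list[str]) -> str:
    if not strings:
        return ""
    prefix = strings[0]
    for s in strings[1:]:
        while not s.startswith(prefix):
            prefix = prefix[:-1]  # shrink the candidate prefix
            if not prefix:
                return ""
    return prefix


print(longest_common_prefix(["flower", "flow", "flight"]))  # prints "fl"
```

Deciding that this function is what the product needs, and how it fits into the rest of the codebase, is still the engineer’s job.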

There is, however, a flaw in the idea that jobs automated by AI will simply be replaced by new jobs in new markets. Since the Industrial Revolution automated manual labor, “knowledge workers” have been in huge demand. If AI automates knowledge work, there’s not much left to come after that.

Will AI take over the world? Recent advances don’t give us the answer.
