In the heart of Silicon Valley, thousands attended the recent developers’ conference in San Francisco. While many featured speakers provided the attraction, perhaps none was more popular than Sam Altman, OpenAI’s CEO. In fact, 40,000 viewers tuned in online to hear what he had to say about OpenAI’s new releases. Altman didn’t disappoint, describing several up-and-coming offerings at the company. But perhaps the most intriguing, and maybe even disturbing, involved the launch of the first AI personal assistants. Called GPTs for the ones developed at OpenAI, these autonomous AI assistant bots could soon change everything we do. In fact, there are soon likely to be dozens of apps for GPTs readily available to everyone.
Thus far, the world has experienced platforms like ChatGPT and DALL-E, both launched barely a year earlier. Today, ChatGPT users alone total about 100 million per week and rising. The speed with which AI has been adopted is nothing short of tremendous. With a few simple commands, AI platform users receive vast amounts of information within seconds. Or they can create a unique image or song by leveraging AI’s capacities to their creative advantage. Amazing as these uses are, they will hardly compare to the capabilities of AI personal assistants. Not only will information be more readily accessible, but various apps for GPTs will handle many of our personal tasks. This may indeed make our lives simpler and more efficient. But it may also come with real risks if not pursued with caution.
Introducing AI Personal Assistants
It might be hard to fathom how AI personal assistants will work. After all, current GPT-4 platforms typically take inputs and produce “thought-like” outputs. But according to OpenAI, coming apps for GPTs will be able to do much more. The initial versions of these AI personal assistants will do things like respond to emails, schedule meetings, or even post on various media platforms. As silicon-based extensions of ourselves, these GPTs will perform a variety of tasks for us. The potential uses of AI chatbots in this capacity are enormous. From travel bookings and gift purchases to negotiating a sale or pay raise, AI personal assistants have many potential upsides.
According to Sam Altman, the initial apps for GPTs developed by OpenAI will provide these types of services. Additional uses might involve AI-led instruction in teaching math or in designing stickers. But to encourage more advanced development of AI personal assistants, OpenAI is providing developers with new supports. These include an “Assistants API” that developers can use to create apps for GPTs. OpenAI will also be launching a GPT store where creators can sell their apps for GPTs. Based on user volumes, these creators will be able to share in the revenues along with OpenAI. Altman envisions each person having multiple apps for GPTs in the near future. Each of these will not only handle specific tasks for us but also communicate among themselves.
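For developers, defining one of these assistants amounts to describing a model, some instructions, and optional tools. As a rough sketch (the field names mirror OpenAI’s documented Assistants API at launch, but the model string, the example values, and the helper function are illustrative assumptions, not a definitive implementation):

```python
# Sketch of what an assistant definition might look like. Field names mirror
# OpenAI's documented Assistants API at launch; the model string, example
# values, and the is_valid_definition() helper are illustrative assumptions.
assistant_definition = {
    "model": "gpt-4-1106-preview",  # GPT-4 Turbo model announced at the event
    "name": "Email Triage Assistant",  # hypothetical assistant
    "instructions": "Draft polite replies to routine emails; flag anything urgent.",
    "tools": [{"type": "code_interpreter"}],  # optional built-in tool
}

def is_valid_definition(defn: dict) -> bool:
    """Local sanity check: a model is required; instructions and tools
    are optional but typical."""
    return isinstance(defn.get("model"), str) and bool(defn["model"])

print(is_valid_definition(assistant_definition))
```

In practice such a definition would be sent to OpenAI’s servers with an API key; the point here is simply that an “assistant” is declared as data, which is what makes a marketplace of interchangeable GPTs plausible.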
The Potential Dark Side of GPTs
Having AI personal assistants to help us with a variety of daily activities sounds quite appealing, and the opportunities are endless. Searching for a job, including resume and cover letter writing, could be delegated, as could grocery shopping and more. New apps for GPTs will soon be able to pull data from private files to better know their hosts. Notably, user permissions would be required, but the more personal info is accessible, the better the decision-making. This is particularly attractive to large enterprises that dream of using AI assistants to perform a number of routine activities. With new apps for GPTs, customer service and other company tasks will no longer exist as we currently know them.
Understanding this, there are real risks of job displacement with new apps for GPTs. One could argue that AI personal assistants will free up more time for more advanced pursuits, which is true. But for some, such opportunities may not exist. Beyond these concerns, there are additional threats related to malicious use of these new apps for GPTs. AI personal assistants could expand the reach of such activities just as they extend the activities of other users. And of course, with personal information uploaded into these AI bots, risks involving privacy and security have to be considered. OpenAI’s solution thus far is a gradual, iterative deployment strategy. By releasing small, incremental improvements in AI at a fast pace, there is a chance to adapt and address issues as they arise. Whether this will be enough remains to be seen.
Other Big Changes at OpenAI
Without question, the big news for OpenAI involved the support it will be providing for AI personal assistants. In addition to developing apps for GPTs itself, OpenAI is supporting all developers in this regard. Its new GPT store will provide a marketplace for these developers’ creations. At the same time, OpenAI is releasing an Assistants API with image capabilities to help promote complex apps for GPTs. OpenAI is also providing a copyright shield for developers and for enterprise clients choosing to work with OpenAI. And most importantly, OpenAI announced GPT-4 Turbo, which has larger text-processing capacities and is three times cheaper for input tokens. This offers many developers of AI personal assistants great opportunities to excel.
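To put the “three times cheaper” claim in concrete terms, here is a back-of-the-envelope comparison. The per-1K-token input rates below are those announced at the event and may have changed since; the function itself is just simple arithmetic:

```python
# Back-of-the-envelope input-token cost comparison. Prices are the per-1K-token
# input rates announced at the conference (GPT-4 vs. GPT-4 Turbo) and are
# illustrative; they may have changed since.
GPT4_INPUT_PER_1K = 0.03        # USD per 1K input tokens, GPT-4
GPT4_TURBO_INPUT_PER_1K = 0.01  # USD per 1K input tokens, GPT-4 Turbo

def input_cost(tokens: int, price_per_1k: float) -> float:
    """Cost in USD for a given number of input tokens."""
    return tokens / 1000 * price_per_1k

tokens = 100_000  # e.g., a long document handed to an assistant
print(f"GPT-4:       ${input_cost(tokens, GPT4_INPUT_PER_1K):.2f}")
print(f"GPT-4 Turbo: ${input_cost(tokens, GPT4_TURBO_INPUT_PER_1K):.2f}")
```

For an assistant that continually re-reads a user’s files and message history, input tokens dominate, so a 3x price drop materially changes what kinds of always-on assistants are economical to build.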
The main takeaway from OpenAI’s announcements was that AI personal assistants are here. Though they offer rudimentary task assistance at present, this novice period isn’t likely to last very long. Paying ChatGPT clients and enterprise customers already have access to these new apps for GPTs. And based on the supports being provided, dozens more will soon be hitting the GPT store. If you thought AI advances in the last year were something, you’d better hold on. For AI personal assistants, things are just getting started. By this time next year, risks or not, it’s likely the AI landscape will look incredibly different.