In the few weeks since this year's AIM conference in early May, I have been reflecting on AI, the dominant theme of the show. Like many people, I have been tinkering with generative AI since late 2022. Some ideas from the sessions at AIM have changed how I think about this technology, as I will describe in this article.
I have previously expressed the view that many in our industry may be approaching AI the wrong way. The crucial question is: what distinguishes the wrong approach from the right one? I believe I may have stumbled on an answer.
Think About What's Under the Hood.
During one of the roundtable sessions at the event, Ellen Thompson, the co-founder of Respage, shared her insights on the development of AI within our industry and beyond. Ellen is one of our industry's true thought leaders on AI, and during this session, she reminded us that AI is fundamentally a data analysis tool.
This perspective is undoubtedly true: the immense power of AI lies in its ability to process an enormous amount of data rapidly. In the case of generative AI tools like ChatGPT, the input and output data happen to be text. The trouble is, most of us don't seem to be thinking about it that way.
In this year's 20 for 20 report, I highlighted how multifamily operators tend to think of AI applications in terms of automating tasks they are already familiar with. As I argued, using AI to take tasks off people's desks misses the point of such a powerful technology. It is better to reimagine entire activities or even roles on a foundation of AI and refocus people on tasks at which humans excel. That means we need to be good at identifying activities that are a natural fit for AI. To make that call, we must look to the underlying data.
Hit and Miss Use Cases.
When we reframe potential applications of AI as "data analysis" rather than "task automation," it becomes clearer which tasks are a fit for AI. I previously touched on the AIM keynote session about ChatGPT, which presented many examples that were a mixed bag in terms of usefulness.
A couple of contrasting examples illustrate the point. The first used ChatGPT to generate a business plan for a specific department in a company. The second example used ChatGPT to analyze apartment reviews and identify positive and negative features of communities in a given market. Using that information, it created web content for an individual community that emphasized its most differentiating attributes.
The results from the second example were vastly more impressive than the first. The business plan looked fine, but it is hard to see how useful it could be. Anyone who has created business plans understands that the value lies in decision-making and prioritization. As Eisenhower famously said, "Plans are useless, but planning is indispensable." Treating a business plan as an exercise in organizing text on a page renders the task pointless.
The website example, on the other hand, demonstrates a high-value activity that humans could not have accomplished without AI. It represents a smart application of data analysis: harnessing a previously unmanageable source of data to create higher-quality outputs (that happen to take the form of text).
The business plan example looks wrongheaded when viewed from a data analysis perspective. There's little point in analyzing the words other companies used in their business plans as their businesses all have different priorities and opportunities. Garbage in, garbage out.
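For readers who like to see the mechanics, here is a rough sketch of what the review-analysis workflow might look like in code. It illustrates the pattern rather than the presenter's actual implementation: the OpenAI Python SDK, the model name, the prompts, and the reviews file are all my assumptions.

```python
# Sketch of the review-analysis example: assumes the OpenAI Python SDK (openai>=1.0)
# and an OPENAI_API_KEY in the environment. Model name, prompts, and reviews.txt
# are illustrative assumptions, not the presenter's actual implementation.
from openai import OpenAI

client = OpenAI()

def summarize_market_reviews(reviews: list[str]) -> str:
    """Extract the recurring positive and negative themes from raw apartment reviews."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; any capable chat model would do
        messages=[
            {"role": "system",
             "content": "You analyze apartment reviews and list the recurring "
                        "positive and negative themes for the market."},
            {"role": "user", "content": "\n\n".join(reviews)},
        ],
    )
    return response.choices[0].message.content

def draft_community_copy(themes: str, community_facts: str) -> str:
    """Turn the market themes plus one community's facts into differentiated web copy."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Write apartment community web copy that emphasizes the "
                        "attributes reviewers in this market care about most."},
            {"role": "user",
             "content": f"Market themes:\n{themes}\n\nThis community:\n{community_facts}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    reviews = open("reviews.txt").read().split("\n---\n")  # hypothetical input file
    themes = summarize_market_reviews(reviews)
    print(draft_community_copy(themes, "Renovated units, rooftop deck, near transit"))
```

The point of the two-step structure is that the data analysis (finding the themes reviewers care about) happens first, and the text generation is merely the form the output takes.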
The Penny Drops on Generative AI.
It seems that people who are getting good at using generative AI are doing so because they are figuring out what kinds of questions to ask and how to ask them. Writing—a topic near and dear to my heart—is a process, some of which lends itself to AI. The trick is figuring out which parts. I am finding that a good question to ask yourself is, "Which parts of this process look like data analysis?"
For example, when we ask ChatGPT to write an article, we are essentially asking it to analyze existing data (other people's work) and replicate it in the form of an article. The best you can hope for is a re-arrangement of other people's ideas, which rules out thought leadership and places a low ceiling on how good your content can be.
For ChatGPT to enhance productivity and quality, we must think harder about what data we need it to analyze. Crafting a rough draft, like the one I used to generate this article, is a task that humans excel at but is challenging for AI to replicate. Taking that draft as source data, analyzing it and turning it into copy that aligns with a specific writing style turns out to be a great fit for AI. (If you want more details, come and find me at NAA next week!)
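To make that concrete, here is a minimal sketch of the draft-to-copy step, again using the OpenAI Python SDK. The model name, style notes, and draft file are illustrative assumptions rather than my exact setup.

```python
# Sketch of the draft-to-copy workflow: the human-written rough draft is the source
# data, and the model's job is restyling, not idea generation. Assumes openai>=1.0
# and OPENAI_API_KEY; model name, style notes, and draft.txt are assumptions.
from openai import OpenAI

client = OpenAI()

STYLE_NOTES = (
    "Conversational, first-person industry-blog voice. "
    "Short paragraphs, concrete examples, no jargon."
)

def polish_draft(rough_draft: str) -> str:
    """Restyle the draft into finished copy while preserving every idea it contains."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model
        messages=[
            {"role": "system",
             "content": "Rewrite the user's rough draft into finished copy. "
                        "Preserve every idea and add nothing new. "
                        f"Style: {STYLE_NOTES}"},
            {"role": "user", "content": rough_draft},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(polish_draft(open("draft.txt").read()))  # hypothetical draft file
```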
In recent weeks, the penny has dropped for me on how to use generative AI. Thanks again to Ellen Thompson, Mike Whaling, and other industry thought leaders who are helping us make sense of this transformative technology.