Wednesday, December 11, 2024

Payments 3.0: How to Manage Generative AI

Juliet Capulet sent me an e-mail recently to offer me a marketing list for a conference that has nothing to do with payments.

I’m no gentleman of Verona, but I definitely recognize the Capulet name. However, I don’t think Juliet is real. The e-mail’s stilted wording, the mismatch of the product to my work, and the fact that I can’t find any results on LinkedIn or the Web when searching for her e-mail make me think “Juliet” is a product of generative artificial intelligence.

Generative AI is a subset of AI that learns to create new content based on the data the system has been trained on. It is supposed to get smarter as new data is provided and as human handlers refine the output.

ChatGPT brought AI into the mainstream and showed the possibilities and risks of the new tool. Juliet’s e-mail is an early example of how excited businesses are about its opportunities.

For example, having AI create and execute e-mail campaigns might save time and money. But if the messages don’t look as though they come from a real person and instead read like a compilation of buzz phrases, they won’t generate leads. While e-mail is cheap, a spammer reputation is expensive.

AI has been used for customer-service chatbots, fraud detection, portfolio optimization, and a variety of other applications. Generative AI adds new layers because it can take in queries and return original answers from raw data rather than a predefined set of possibilities.

Take customer service. Generative AI is different from non-generative AI because it can create new, human-sounding answers to questions, whereas older AI models use existing content and rules to describe, predict, or recommend something.

Even so, that doesn’t mean it is the most effective tool. Indeed, Meg Porter, the executive vice president for Enterprise Transformation at Ubiquity, a customer-service company, says the technology’s real advantage could lie behind the scenes.

“Generative AI has some exciting potential to make chatbots even smarter, but here’s the thing. No matter how good technology gets, having a way for customers to talk to a real person can make their lives easier, and that’s a big deal for keeping them happy and a returning customer,” Porter said.

Instead, she said Generative AI might bring better insights after customer-service calls are over by analyzing good and bad customer interactions and identifying patterns. “This data-driven approach is way more efficient than slogging through reviews by hand, and it can uncover new trends in a massive dataset,” she said.
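As a rough illustration only, the kind of post-call pattern finding Porter describes could start with something as simple as tallying recurring themes across transcripts. The sketch below assumes call transcripts are already available as plain text and uses an illustrative keyword list rather than any particular vendor’s model:

```python
# Hypothetical sketch: tally recurring complaint themes across call transcripts.
# Assumes transcripts are plain-text strings; the keyword list is illustrative only.
from collections import Counter

COMPLAINT_KEYWORDS = ["declined", "fee", "frozen", "dispute", "wait time"]

def tag_transcript(transcript: str) -> list[str]:
    """Return the complaint themes mentioned in a single transcript."""
    text = transcript.lower()
    return [kw for kw in COMPLAINT_KEYWORDS if kw in text]

def summarize(transcripts: list[str]) -> Counter:
    """Count how often each theme appears across all transcripts."""
    counts = Counter()
    for t in transcripts:
        counts.update(tag_transcript(t))
    return counts

if __name__ == "__main__":
    calls = [
        "My card was declined at the store and the wait time was awful.",
        "Why was I charged a fee to check my balance?",
        "My account is frozen and I want to file a dispute.",
    ]
    print(summarize(calls))
```

A generative model could then sit on top of a summary like this to draft narrative reports, but the underlying pattern finding is a data problem first.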

Finally, regulations will affect the way payment providers can use AI. At the IPA’s Compliance Boot Camp in Chicago in September, Eli Rosenberg, a partner at Baird Holm LLP, noted that companies should not expect that using AI will change the regulatory regimes that govern their products and services. “Regulations are technology-agnostic; they apply regardless of what you are using,” he said.

While properly trained AI may be able to do tasks like analyzing disclosures or helping to draft first versions of documents, it cannot be left to do these things on its own. Generative AI creates false references and facts so often that the phenomenon has its own term: hallucinations. Expert human oversight is necessary to avoid costly problems.

Also, companies need rules to ensure customers and employees do not enter sensitive data into AI tools in ways that others can access; otherwise, they may inadvertently compromise personal and corporate data.
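As a minimal sketch of what such a rule might look like in practice, a simple pre-submission check could flag obviously sensitive payment data before text is pasted into an external AI tool. The patterns below are assumptions for illustration, not a complete data-loss-prevention policy:

```python
# Hypothetical sketch: flag text that appears to contain card or Social Security
# numbers before it is sent to an external AI tool. Patterns are illustrative only.
import re

PATTERNS = {
    "possible card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "possible SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def sensitive_data_findings(text: str) -> list[str]:
    """Return labels for any sensitive-looking patterns found in the text."""
    return [label for label, pattern in PATTERNS.items() if pattern.search(text)]

prompt = "Customer 123-45-6789 says card 4111 1111 1111 1111 was declined."
findings = sensitive_data_findings(prompt)
if findings:
    print("Blocked: remove", ", ".join(findings), "before submitting.")
else:
    print("OK to submit.")
```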

Generative AI offers opportunities to increase the capacity of organizations. But financial-service providers need to make sure they don’t end up in a star-crossed love affair with new tech.

—Ben Jackson bjackson@ipa.org
