How AI could save one industry in the USA about $135 billion per year.

One industry that is slowly being replaced by AI is customer service. Most current chatbots don't use an LLM, or at best a very primitive one, but a locally integrated GPT or Llama engine that gets regular updates from a local business knowledge base would save a ton of money. There are about 3 million customer service jobs in the United States alone; at roughly $45k a year each, that presents an opportunity of about $135 billion a year in potential savings. That is employee salary alone; counting the whole breadth of expenses (payroll taxes, health premiums, etc.), total savings are estimated around $200 billion.

One company alone, Charter Communications, has about 100,000 employees. Estimating 20,000 of those as customer service employees at an average salary of about $45k a year, we're looking at savings of roughly $1 billion. Fine-tuning an LLM to work specifically for the company shouldn't cost more than $10 million a year, plus the cost of GPUs if they host it locally, or the rental cost from a cloud service, to keep everything up to date and functioning properly. Eventually the LLM would connect to a voice model that is indistinguishable from a real live person, even able to make small talk with the caller if the caller chooses to engage. That saves Charter about a billion dollars a year, and the AI will be more capable than many of the current employees working there. I ran some numbers through ChatGPT: at about 3 cents per interaction (the software's own estimate), costs would be about $17 million a year. But I found that current ChatGPT pricing is about one fifth of one penny per 1,000 tokens, so cut those costs to roughly 1/15th and you're at about $2 million a year to replace what is essentially a $1 billion cost to the company.
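The back-of-envelope math above can be written out explicitly. All of the inputs below are the post's own estimates (headcount, salary, the $17M-at-3-cents figure, and the fifth-of-a-penny token price); the tokens-per-interaction figure is an assumed average, not a measured one.

```python
# Back-of-envelope comparison: human agents vs. LLM-based customer service.
# Every constant here is an estimate from the post, not verified pricing.

AGENTS = 20_000                  # estimated customer service headcount at Charter
SALARY = 45_000                  # average annual salary in USD
labor_cost = AGENTS * SALARY     # -> $900M/year, roughly the "~$1 billion" figure

# $17M/year at 3 cents per interaction implies this many interactions:
INTERACTIONS_PER_YEAR = 17_000_000 / 0.03

PRICE_PER_1K_TOKENS = 0.002      # "one fifth of one penny" per 1,000 tokens
TOKENS_PER_INTERACTION = 1_000   # assumed average conversation length

llm_cost = INTERACTIONS_PER_YEAR * (TOKENS_PER_INTERACTION / 1_000) * PRICE_PER_1K_TOKENS

print(f"labor: ${labor_cost / 1e6:.0f}M/year")
print(f"LLM:   ${llm_cost / 1e6:.2f}M/year")
print(f"ratio: {labor_cost / llm_cost:.0f}x cheaper")
```

At these assumptions the token-priced cost lands nearer $1.1M than $2M a year, but either way it is hundreds of times cheaper than the labor it replaces.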

The great thing is that even though it eliminates jobs, customer service can be very taxing with difficult customers (I did it for 5 years), and AI can handle that with ease. Some company is very soon going to ship a multi-dialect text-to-speech model that simulates voice very well; until then, this can replace all chat agents globally as well. All of it would likely be powered by Nvidia chips.

I could see most companies going with a cloud solution at first and slowly integrating it into their service, and then, once the cost savings are proven, moving to a locally hosted version with as many racks of B100 or B200 chips as it takes to meet demand. Most large corporations will probably shift fully from human agents to AI over the next 5 years. They could even use their best customer service agents as training data (if they even need to train; there will probably be off-the-shelf customer service models available for licensing).

Programs like Chat with RTX already show how simple it is for a single person to query their own data from PDF and text files. By pointing it at a folder on your computer, you can already run your own local version of a customer service employee (at least for knowledge questions). The biggest hurdle is getting it to interface with existing software and perform actions on accounts.
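The core idea behind a tool like Chat with RTX is retrieval: find the snippet of the local knowledge base most relevant to the question, then hand it to the LLM as context. Here is a toy sketch of just the retrieval step using simple keyword overlap; the knowledge-base entries are made up, and real systems use vector embeddings rather than word counts.

```python
# Toy retrieval step for a local knowledge-base Q&A bot.
# Illustrative only: real tools like Chat with RTX use embedding search,
# and the snippets below are invented examples, not real policy text.
import re
from collections import Counter

knowledge_base = [
    "To reset your modem, unplug it for 30 seconds and plug it back in.",
    "Billing cycles start on the 1st of each month; autopay drafts on the 5th.",
    "Internet plans range from 300 Mbps to 1 Gbps depending on your area.",
]

def tokenize(text: str) -> Counter:
    """Lowercase word counts, used as a crude bag-of-words representation."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def best_snippet(question: str) -> str:
    """Return the snippet sharing the most word occurrences with the question."""
    q = tokenize(question)
    return max(knowledge_base, key=lambda s: sum((tokenize(s) & q).values()))

print(best_snippet("How do I reset my modem?"))
```

The winning snippet would then be prepended to the LLM prompt as context, which is how a local model answers company-specific knowledge questions it was never trained on.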

submitted by /u/RainbowUnicorns
