Every week I get asked some version of the same question. What should an LPTV station be doing with artificial intelligence? The honest answer is that it depends on the station, but there is a general framework that I think most operators will find useful. I want to spend this first article in the series laying out that framework, because a lot of the noise around AI in broadcasting is either breathless overpromise or reflexive dismissal, and neither is particularly helpful.
Let me start with what I actually believe. AI tools are already useful in an LPTV operation today, but their usefulness is uneven, their risks are real, and the stations that will get the most out of them are the ones that approach the technology with clear eyes about both the value and the risk.
Start with what AI is actually good at right now
Current AI tools, the large language models and related systems most of us interact with every day, have matured rapidly in the last few years. They are genuinely useful for certain categories of work and genuinely unreliable for others. Getting the categorization right is the whole game.
AI is good at first drafts. A first draft of a press release, a first draft of a local news script based on verified facts you provide, a first draft of a sales email, a first draft of a weekly newsletter, a first draft of social media copy. In every one of these cases, a human still needs to review, edit, and approve the final product. But the human is starting from a partial result rather than a blank page, and that can be a significant time saver.
AI is good at summarization. If you have a long FCC filing, a board meeting transcript, a stack of audience research, or a lengthy email thread, AI can produce a reasonable summary that gets you to the key points faster than reading the whole thing yourself. Again, you have to verify the summary, because AI sometimes misses important nuances or gets details wrong. But as a starting point for understanding, it is useful.
AI is good at translation and reformatting. Converting a document from one format to another. Reformatting bullet points into flowing prose, or vice versa. Translating content from English to Spanish or another language, though you still need a human speaker of the target language to verify the result before publication.
AI is good at answering general questions and helping you think through problems. Not by giving you the right answer, because it doesn’t always know the right answer. But by being a conversational partner who can help you articulate what you are thinking, stress-test your logic, and offer alternative framings. Used this way, AI is a thinking tool, not an answer tool.
Be clear-eyed about what AI is not good at
AI is not reliable for original reporting. If you ask it to tell you what is happening in your local market, it will confidently make things up. Dates, names, events, quotes. This is not a hypothetical risk. It is a well-documented failure mode, usually called hallucination, and it is why responsible newsrooms using AI keep human editorial judgment at the center of the process.
AI is not reliable for anything requiring verification of current facts. Regulatory filings. Legal questions. Contractual language. Technical specifications. It can be useful as a starting point, but the output needs to be verified by someone who actually knows the subject.
AI is not a substitute for human relationships. The sales call, the advertiser conversation, the community meeting, the job interview. These are relational interactions where the point is the human connection, not the content of the exchange. AI can help you prepare for these, but it cannot conduct them on your behalf without losing what makes them valuable.
AI is not a replacement for judgment. It can help you think through a decision. It cannot make the decision for you in any situation where the context, stakes, and human factors matter. Pretending otherwise is how stations end up publishing embarrassing errors and making bad calls.
The right way to think about AI in an LPTV operation
AI is a productivity layer, not a replacement for expertise. It helps the people who work at your station do more in less time, if they use it well. It does not replace the need for those people.
That framing has a few implications. First, AI adoption at your station should be driven by your specific operational needs, not by what everyone else is doing. If you have a bottleneck where a lot of time goes into first drafts, summarization, or reformatting, there is likely a useful AI application. If your bottlenecks are somewhere else, the right investment may be elsewhere too.
Second, AI adoption requires training. Your team needs to know what these tools are good at, what they are not good at, and how to verify their outputs. Without that training, you will either get limited value from the tools or, worse, you will get embarrassing failures. Investing a modest amount of time in actually teaching your staff how to use AI well pays back quickly.
Third, AI adoption requires guardrails. You need to decide what AI is and is not allowed to be used for at your station. Can it be used to draft news copy? Under what conditions? Can it be used to generate images? Under what disclosure? Can it be used to produce scripts read by on-air talent? These are policy decisions that every station needs to make, and the time to make them is before something goes wrong.
A practical starting approach
For most LPTV stations, I would recommend starting with two or three specific use cases where AI can save real time with low risk. Newsletter drafts. Social media copy. Meeting summaries. Sales email drafts. Initial research on a topic. Transcription of interview audio for internal use. Pick the uses that match your actual workflow, try them for thirty days, and see what actually saves time versus what just feels modern.
Resist the temptation to adopt AI everywhere at once. That is how stations get burned. Start small, measure actual productivity gains, address problems as they come up, and expand from there. The stations that are getting real value from AI today almost all started this way. The ones that tried to transform everything at once usually had to roll things back.
The rest of this series
Over the next nine articles, I am going to get specific. We will cover traffic and scheduling, news writing, ad sales, playout automation, closed captioning, audience analysis, social and web, the risks, and how to build a roadmap for your station. Each article will include practical guidance you can actually use, not just general enthusiasm about the technology.
AI is not a silver bullet. It is a tool, and a powerful one, and like every tool it rewards the operators who learn how to use it well. LPTV has always been an industry of small operators figuring out how to do more with less. AI, used thoughtfully, is another way to do exactly that. Let’s dig into the specifics.