The Role of AI in Journalism: Insights from HackerNoon’s CEO

Journalism is going through a rough stretch, with public trust at an all-time low. Adding to the challenge, artificial intelligence (AI) is making waves in newsrooms, forcing publishers to decide how much they want to rely on it. Should they let AI create content and headlines? How much should they tell readers about AI’s involvement? And will AI-driven news end up overshadowing human-written journalism?

To explore these questions, Growthy.web reached out to David Smooke, the founder and CEO of HackerNoon, a technology-focused publication with over 45,000 contributors and 4 million monthly readers. Smooke shared how HackerNoon is using AI and offered his thoughts on how journalists and news outlets should approach AI.

Key Takeaways

  • Trust in journalism is decreasing, and AI could make things worse.
  • News publishers are figuring out how to integrate AI in their operations.
  • David Smooke, CEO of HackerNoon, believes in being transparent about AI’s role in journalism.
  • AI content presents both risks and benefits for traditional journalism.
  • The New York Times’ lawsuit against OpenAI highlights the ongoing tension between AI and content creators.

Q&A with David Smooke

Q: What role do you think AI should play in the newsroom?

A: I see myself more as a writer and product manager than a journalist. HackerNoon publishes all kinds of content, including op-eds, tutorials, interviews, and some journalism. We’re developing a community-driven content management system where AI can help in various ways—like generating new ideas, fixing grammar, or finding relevant stories.

In our text editor, we’ve integrated a custom ChatGPT layer for rewrites, various image generation models, and AI tools that generate summaries tailored for different distribution channels. We also use AI to translate stories into different languages and create audio versions of blog posts, making content more accessible.

As a news consumer, I believe journalists should use the most advanced tools to research their stories, but they should never fully trust AI. Verification is crucial.
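To make the workflow Smooke describes more concrete, here is a minimal sketch of how a publisher might generate channel-specific summaries by wrapping a general-purpose LLM API. The OpenAI Python client, model name, and channel prompts below are assumptions for illustration, not HackerNoon's actual implementation.

```python
# Minimal sketch of channel-tailored AI summaries, assuming the OpenAI
# Python client; model choice and prompts are illustrative, not HackerNoon's.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CHANNEL_PROMPTS = {
    "newsletter": "Summarize this story in 3 sentences for an email newsletter.",
    "social": "Summarize this story as a single post under 280 characters.",
    "homepage": "Write a one-line teaser that makes readers want to click.",
}

def summarize_for_channel(story_text: str, channel: str) -> str:
    """Ask the model for a summary tailored to one distribution channel."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": CHANNEL_PROMPTS[channel]},
            {"role": "user", "content": story_text},
        ],
    )
    return response.choices[0].message.content.strip()

# Example: generate every channel's summary for one draft.
# summaries = {ch: summarize_for_channel(draft, ch) for ch in CHANNEL_PROMPTS}
```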


Q: What level of AI use is acceptable, and how transparent should publishers be with readers?

A: It’s not okay to pass off AI-generated content as human-made. Platforms should clearly indicate where and how AI was involved. For instance, we use emoji credibility indicators to show readers if AI contributed to a story. Readers need to trust that the author is who the site claims they are.
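For readers curious what that kind of disclosure might look like in practice, here is a hypothetical sketch of per-story metadata mapped to reader-facing labels. The field names, tags, and emoji are invented for illustration and are not HackerNoon's actual schema.

```python
# Hypothetical sketch of per-story AI-disclosure metadata rendered as
# emoji credibility indicators; the tags and labels are illustrative only.
AI_CONTRIBUTIONS = {
    "ai_image": "🤖🖼️ Image generated with AI",
    "ai_rewrite": "🤖✍️ Text edited with AI assistance",
    "ai_translation": "🤖🌐 Translated with AI",
}

def disclosure_line(story: dict) -> str:
    """Build the reader-facing disclosure string for one story."""
    labels = [AI_CONTRIBUTIONS[tag] for tag in story.get("ai_tags", [])]
    return " · ".join(labels) if labels else "✅ Written and edited by humans"

# Example:
# print(disclosure_line({"ai_tags": ["ai_image", "ai_translation"]}))
```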


Q: Do you see AI-generated news as a threat to human-written journalism?

A: AI-generated content can have side effects. Deepfakes, for example, get billions of views on social media, and while platforms are getting better at detecting them, they’re easier than ever to create. When a site like TUAW relaunched with AI-generated content attributed to real humans, it didn’t go over well. Readers trust content more when they believe a real person is behind it.

Some financial websites use AI to quickly generate headlines, which is essential for investors who need timely information. This trend started long before the recent AI boom, but it raises questions about speed versus human input.


Q: Is there a risk that AI-generated content will draw attention away from human-made journalism?

A: Yes, there’s a risk that AI could divert attention from publishers to search engines. If an AI-generated search result answers a question that would have brought a reader to a specific page yesterday, that’s a lost visitor. However, publishers can also benefit from powerful AI tools that help them retain quality traffic.


Q: What do you think of The New York Times’ lawsuit against OpenAI and the broader issue of AI models being trained on human-generated content?

A: I expect that in the future, both government and private sectors will regulate how online content is used for AI training. While this lawsuit may not set a precedent, it highlights the need for proper compensation when AI companies use content created by others. Plagiarism is a clear issue, and both OpenAI and The New York Times know this. I expect OpenAI will end up paying more, but this case alone won’t define the future of content licensing for AI.


Q: What are your thoughts on AI news tools like Perplexity AI, which offer news summaries with citations?

A: Curating is valuable, and AI can sometimes do it as well as humans, especially with clear guidelines. AI is definitely changing how we search for and consume information online. Google’s use of AI in search results shows that generative AI is becoming a key part of the search market’s future.


Q: How can news publishers use AI to enhance their operations?

A: At HackerNoon, we use AI in several ways. For instance, AI suggests headlines based on the draft of a story and the performance of past stories. While humans still write better headlines most of the time, it’s helpful to have AI-generated options. We also use AI to categorize large amounts of data, like when we organized 50,000 technology tags into 22 categories—it made more sense to have AI do that than human editors.
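As a rough sketch of how a batch tag-categorization pass like the one Smooke mentions could work, an editor might ask a model to map each tag to one of a fixed list of categories and return strict JSON. The category list, model, prompt, and batching below are assumptions; the interview does not describe HackerNoon's exact pipeline.

```python
# Illustrative sketch: batch-classify free-form tags into fixed categories
# with an LLM. Categories, model, and prompt are assumptions, not HackerNoon's.
import json
from openai import OpenAI

client = OpenAI()

# A shortened category list for the sketch; a real pass might use 22 categories.
CATEGORIES = ["programming", "ai", "security", "web3", "startups", "gaming"]

def categorize_tags(tags: list[str]) -> dict[str, str]:
    """Map each tag to one category, asking the model for a JSON object."""
    prompt = (
        "Assign each of the following tags to exactly one category from this list: "
        + ", ".join(CATEGORIES)
        + ". Reply with a JSON object mapping each tag to its category.\n"
        + "Tags: " + ", ".join(tags)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": prompt}],
    )
    return json.loads(response.choices[0].message.content)

# Example: run over tens of thousands of tags in batches of a few hundred.
# mapping = {}
# for i in range(0, len(all_tags), 200):
#     mapping.update(categorize_tags(all_tags[i:i + 200]))
```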


Q: Do you think it will become harder for human-written content to rank well against AI-generated content in the future?

A: AI content isn’t a source in itself; sources will always need to be cited and linked. And as AI becomes more integrated into search results, expect AI assistants to become a standard part of the search experience.


Q: What advice would you give to writers and journalists entering the industry who are worried about the future of news publishing?

A: Don’t be afraid to compete with AI. Even as the internet floods with AI-generated content—whether it’s bad, mediocre, or even remarkable—great human storytellers will continue to stand out. The demand for authentic human stories is as high as ever.

If you have stories to tell, there are more ways than ever to reach readers around the world.


Q: Any final thoughts?

A: As humans, we crave human stories. Originality wins!