MediaMorph Edition 94 - by HANA News
Liquid news - when the article is no longer the product
Was this newsletter forwarded to you? Sign up here
The written-by-a-human bit
Robert Thomson, the global chief executive of News Corp, may have caused his newsrooms to choke on their bagels when he told a Morgan Stanley tech conference in San Francisco that “We’re essentially an input company.”
“The great threat in the age of AI is going to be to what you might call output companies. We’re an input in the way that semiconductors are an input, in the way that datacentres are an input, in the way that energy is an input. You look at breaking news, you look at unique real estate information.”
These thoughts preceded an announcement that News Corp has signed an AI content licensing deal with Meta worth up to US$50m a year.
Thomson also reiterated his “woo or sue” approach: he welcomes licensing deals with AI companies but will sue those that take the publisher’s content illegally.
Along with News Corp’s existing OpenAI deal, reportedly worth US$250m over five years, this represents a healthy new line item on the company’s P&L.
But what exactly is an “AI input company”?
To me, it suggests that, with the right compensation, news organisations will willingly fuel the AI answer engines for real-time ingest and, hopefully, citations.
The tacit admission is that the article is no longer the inevitable destination, that news has become liquid and can be repurposed as a chatbot answer, a live briefing, a personalised audio summary, a smart alert, a timeline, or even an infographic.
The article still exists, but it is no longer the only, or even the main, event. It becomes a source of truth that bots can visit to find verified information, context, interviews, chronology, evidence and human judgment.
As answer engines replace search engines, readers may never “arrive” at an article in the traditional sense. They will encounter a publisher’s reporting as a sentence in an answer, or a cited paragraph in a summary somewhere inside an assistant.
The act of news consumption becomes less about navigating to a known author and more about pulling information from a responsive layer of metadata.
For purists like me, a well-written article is a brief visit into the author's mind, with the arc of discovery, a destination, and the small thrill of a completed journey.
Relying on answer engines feels like going to the Louvre and not getting past the gift shop. It’s all there in the gift shop, but you will never see the real thing.
Of course, once the article has been liquidised, it can live on in many other formats - and that should be seen as an opportunity.
But time will tell whether the traditional article remains a container of truth, a stand-alone artefact, or a bucket of text that sits in a dark basement, waiting for hungry bots.
Tool of the week: Gamma - still our go-to for presentations with excellent AI features for image creation and layouts. Their UI is a thing of beauty.
Mark Riley, CEO Mathison AI
AI and Journalism
This week’s best articles, as chosen by our editors
News Corp CEO Robert Thomson warns AI companies scraping without paying: ‘We’re coming for you’ (Press Gazette, March 4, 2026). News Corp CEO Robert Thomson has warned AI companies against using the publisher's content without compensation, stating that the publisher will pursue legal action and form licensing agreements with major firms like Meta and OpenAI. He emphasises News Corp's role as a crucial content provider for AI tools, arguing that their unique, real-time information is invaluable compared to the existing data patterns relied upon by AI systems.

Claude rewards niche journalism in AI answers, study finds (The Media Copilot, March 12, 2026). Muck Rack's research shows that Claude, a generative AI model, prefers citing niche and mid-tier media outlets, academic journals, and industry publications over major news sources, while ChatGPT leans more towards recent journalism. Additionally, there's a minimal overlap between journalists pitched by PR teams and those cited by AI models, emphasising the distinct citation preferences of different AI systems.
AI systems use Canadian journalism but seldom cite media sources: report. A recent study reveals that AI systems heavily depend on Canadian journalism for content while failing to adequately credit or compensate original sources. This raises ethical concerns about the use of journalistic work, highlighting the need for better recognition and remuneration for media organisations in the digital landscape.

Perplexity claims News Corp tried to ‘entrap’ chatbot to make copyright case (Press Gazette, March 3, 2026). Perplexity has accused Dow Jones and the New York Post of attempting to provoke its AI chatbot into reproducing their articles verbatim for copyright claims, while the publishers claim Perplexity is unfairly profiting from their content. In a legal dispute, Perplexity argues that the publishers waived confidentiality by submitting queries on its platform, and it seeks disclosure of the publishers' extensive inquiries to demonstrate entrapment.

The intersection of artificial intelligence and journalism (Miami, March 16, 2026). As AI increasingly permeates journalism, major news outlets like The Associated Press and Bloomberg are leveraging the technology to enhance reporting, while experts emphasize the importance of training for future journalists to navigate this evolving landscape. An upcoming event titled "AI in Communication" at the University of Miami will explore these developments and their implications for accountability and transparency in news reporting.
AI and the Future of News 2026. Join us for a thought-provoking one-day conference on how AI is transforming journalism, featuring experts from the University of Oxford and beyond. Attend engaging panel discussions and lightning talks covering everything from AI's role in investigations to its societal implications—available online for those who can't attend in person!
How Journalists Can Make AI Work for Them (CJR). Journalists are voicing concerns about AI's impact on their industry, citing worries over job security, misinformation, and ethical use in reporting. As AI tools become more prevalent in newsrooms, the conversation highlights both the opportunities for innovation and the challenges of maintaining journalistic integrity.

Using AI in journalism (Media Helping Media). Media Helping Media has embraced generative AI since February 2025, expanding its resources from 150 to 405 offerings for journalists and educators while emphasizing guidelines that prioritize journalistic integrity and audience trust. By utilizing AI tools like Google Gemini and OpenAI’s ChatGPT, MHM enhances its content creation processes while adhering to principles of human oversight, rigorous verification, and transparency with audiences.

New AI tools that are genuinely useful to business journalists (The Reynolds Center, March 11, 2026). The recent SABEW panel highlighted the transformative impact of AI tools like Google's NotebookLM in journalism, showcasing how journalists leverage these technologies for efficient research and creative brainstorming, while also addressing crucial concerns about accuracy and source protection. As generative AI continues to evolve, experts emphasize the importance of thoughtful usage and thorough verification to maintain journalistic integrity.
Webinar: Detecting AI-Generated Content – Updated Tools and Techniques (GIJN, March 26, 2026). As AI-generated content becomes increasingly sophisticated, journalists face new challenges in verifying information and combating misinformation. To maintain integrity and trust, they must adapt by leveraging advanced verification techniques, collaborating with tech experts, and exploring blockchain technology for enhanced credibility.

Europe House hosts discussion on the use of artificial intelligence in newsrooms and media ethics (EEAS). Europe House recently hosted an insightful event on "Journalism and Technology," featuring Croatian journalist Ivan Fischer, who emphasized the role of AI in enhancing newsroom efficiency while underscoring the need for accuracy and ethical standards. The discussions highlighted both the opportunities and challenges AI presents in journalism, including concerns about accountability, bias, and disinformation, reinforcing the importance of upholding democratic values in the evolving media landscape.

From scripts to sermons: is AI going to be writing everything soon? | Margaret Sullivan (The Guardian, March 10, 2026). Pope Leo XIV warns against using artificial intelligence to deliver homilies, emphasising the irreplaceable nature of genuine faith, while AI's integration across industries sparks heated debates about its implications for job security and accountability. From military negotiations to Hollywood writers' concerns and journalism's evolving landscape, the dual nature of AI as both a valuable tool and a potential risk underscores the need for careful oversight and ethical considerations in its application.

How AI Became a Popular Tool for News Agencies (Onrec). AI is revolutionising journalism by streamlining research, data analysis, and real-time news monitoring across the U.S., Europe, and Ukraine, while enhancing efficiency without replacing the critical roles of human reporters. This collaboration fosters deeper storytelling and maintains public trust amidst the challenges of modern news coverage.
AI and Academic Publishing
This week’s best articles, as chosen by our editors
How AI use in scholarly publishing threatens research integrity, lessens trust, and invites misinformation. Since 2023, the use of AI tools in scholarly writing has surged, aiding researchers in enhancing quality, clarity, and analysis, but this trend raises critical concerns about authorship, originality, and the integrity of academic work. As the academic community navigates these challenges, discussions on guidelines for AI's role in publishing are becoming increasingly vital.

AI is inventing academic articles – and scholars are citi... (The Observer, March 10, 2026). The rise of AI-generated "scholarly slop" is compromising the integrity of academic publishing, as seen in Ben Williamson's encounter with a fake paper that was widely cited despite containing fabricated references. With increasing submissions and a "publish or perish" culture exacerbating the issue, the academic community faces significant challenges in maintaining rigorous standards amidst a surge of misinformation.

"If AI is writing the work and AI is reading the work, do we even need to be there at all?" Education workers reveal a growing crisis on campus and off (Blood in the Machine, March 13, 2026). The rapid integration of generative AI in education has sparked significant challenges, including academic integrity crises, job losses for educators, and concerns over diminished critical thinking skills among students. As institutions increasingly adopt AI tools, many educators are voicing their fears about the erosion of genuine learning experiences and the impact on their roles in fostering meaningful connections with students.

Academic forum explores future of publishing industry at London Book Fair (China Daily, March 13, 2026). At the "Beyond Borders" forum during the London Book Fair, industry leaders discussed the transformative impact of AI on publishing, emphasizing the shift from traditional content providers to knowledge service providers. Key insights included the importance of balancing innovation with copyright protection, enhancing accessibility through digital platforms, and fostering a love for reading through community partnerships and engaging programs.
From pandemic radars to "analog" living: JMIR Publications explores the frontiers of AI safety and digital wellness. JMIR Publications has released three pivotal articles exploring the intersection of technology and society, including a machine learning model predicting avian influenza threats, the perils of using flawed datasets in research, and the potential mental health benefits of analog experiences for youth. The News and Perspectives section invites contributions from patient advocates and experts on digital health topics, emphasizing the blend of academic rigor with accessible journalism.
Why trust in publishing matters more than ever (OUP). Sophie Goldsworthy highlights the essential role of publishers in combating misinformation by providing rigorous and authoritative content, which fosters informed discourse and critical thinking. She advocates for transparency, peer review, and ethical standards to build trust in academic literature, ultimately enriching both the academic community and public understanding.
This newsletter was partly curated and summarised by AI agents, which can make mistakes. Check all important information. For any issues or inaccuracies, please notify us here.
View our AI Ethics Policy


