
Redefining Responsible AI: Creativity, Culture and Data Sovereignty

The following blog post is a summary of our podcast discussion, created with the assistance of AI.
In this special episode of Deliver Next, we sit down with Steven Renata, CEO of Kiwa, winner of the HSO People’s Choice Award at the Aotearoa AI Summit.
This episode explores:
• How Kiwa built a global dubbing platform
• Their approach to AI that enhances creativity
• How CultureQ redefines data sovereignty
• Lessons from culturally aligned, community-led AI
• The future of language and digital preservation
• How leaders adopt AI responsibly and build trust
A People’s Choice Winner with Global Impact
Kiwa’s recognition in the HSO People’s Choice category reflects something bigger than a single product or project. It reflects a business that has earned deep trust from creatives, educators, technologists and Indigenous communities.
For more than 20 years, Kiwa has quietly played a major role in how the world localises film, TV and gaming content. Their VoiceQ platform has become the go-to tool for dubbing studios worldwide, supporting productions that reach millions of viewers.
Their win at the Summit highlights how innovation grounded in purpose can scale globally without compromising the communities behind it.
AI That Enhances Creativity, Not Replaces It
One of the qualities that made Kiwa stand out as an award winner was their deliberate approach to AI adoption.
Rather than letting AI override artistic performance or cultural nuance, Kiwa uses it exactly where it adds value:
• Scene and mouth detection
• Timing analysis
• On-screen/off-screen speech analysis
• Automated transcription
These are repetitive tasks that slow production. Automating them gives creative experts more space to do the work only they can do; the sketch below illustrates one such task.
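To make the idea concrete, here is a minimal, hypothetical sketch of automated dialogue transcription using the open-source openai-whisper library. It illustrates the general technique only and is not a description of how VoiceQ actually works.

```python
# Hypothetical example: rough transcription of a dialogue track with timing data,
# the kind of repetitive preparation step a dubbing workflow automates.
# Uses the open-source openai-whisper library; not Kiwa's implementation.
import whisper

# Load a small general-purpose speech-to-text model.
model = whisper.load_model("base")

# Transcribe the track; word_timestamps=True yields rough timing information
# that could later be aligned with scene cuts and on-screen speech.
result = model.transcribe("dialogue_track.wav", word_timestamps=True)

# Print each segment with its start and end time.
for segment in result["segments"]:
    print(f"{segment['start']:6.2f}s  {segment['end']:6.2f}s  {segment['text'].strip()}")
```

In a real pipeline this output would feed timing analysis and script preparation rather than being printed, leaving the performance itself entirely to the artists.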
Steven highlights that the goal is simple:
Help artists deliver better performances, unlock more opportunities and reduce friction across the workflow.
This balanced model is one we at HSO see as having clear parallels for enterprises deploying AI responsibly.
CultureQ: Protecting Knowledge with Sovereignty and Respect
Another reason Kiwa earned the People’s Choice Award is their groundbreaking work on CultureQ.
Around the world, Indigenous languages and cultural knowledge are disappearing faster than they can be preserved. Many stories exist only in handwritten notes, old recordings or the memories of elders.
CultureQ offers a solution that resonates deeply:
• A secure platform designed around data sovereignty
• Support for audio, video, documents and oral histories
• Automated transcription and metadata
• A private AI that uses only community-approved content
• Guardrails shaped by tikanga and cultural protocols
Steven describes CultureQ’s AI as a kaitiaki — a guardian — trained to behave with the respect and contextual awareness that elders expect.
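To illustrate the "community-approved content only" principle in the abstract, here is a minimal, hypothetical sketch of a retrieval gate that only ever passes approved records to a model. It is a generic pattern for illustration, not CultureQ's actual design.

```python
# Hypothetical sketch of a "community-approved content only" gate.
# Records reach the AI's context only if community reviewers have approved them;
# this is a generic illustration, not CultureQ's implementation.
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    text: str
    approved_by_community: bool  # set by community reviewers, never by the system

def approved_context(records: list[Record]) -> list[Record]:
    """Return only the records the community has approved for AI use."""
    return [r for r in records if r.approved_by_community]

archive = [
    Record("Oral history: harvest waiata", "transcribed audio...", approved_by_community=True),
    Record("Private whānau recording", "restricted content", approved_by_community=False),
]

# Only the approved record would ever be passed to the model as grounding context.
for record in approved_context(archive):
    print(record.title)  # -> Oral history: harvest waiata
```

In practice the hard work sits in the governance around that flag: who grants approval, how it is revoked and how the boundary is audited, which is exactly where guardrails shaped by tikanga and cultural protocols come in.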
This philosophy is a beacon in the AI landscape and was a major reason Kiwa’s work captured the hearts of voters.
What Enterprise Leaders Can Learn from Kiwa
While CultureQ was built with Indigenous communities, its principles apply directly to CIOs, CTOs and transformation leaders:
• Trust must be earned through education, transparency and patience
• Governance matters as much as the model itself
• Data sovereignty is not optional for sensitive content
• AI needs clear boundaries to maintain integrity
• The value of AI is determined by its impact on people, not its novelty
Kiwa’s approach offers a blueprint for responsible AI adoption in any organisation dealing with identity, safety, risk or community expectations.
A Vision Focused on the Next Generation
Steven closes the episode with a perspective that embodies why Kiwa won the People’s Choice Award.
The true success of AI is measured not by efficiency but by its effect on the next generation.
Will it give them confidence, opportunity and a stronger connection to their culture and identity?
Kiwa is building technology that preserves the past, empowers the present and strengthens the future. Their work ensures young people see themselves reflected in the digital world and have the tools to express their creativity and heritage.
That is leadership worth celebrating.
Conclusion
Kiwa’s win of the HSO People’s Choice Award is more than recognition. It is a signal that New Zealand innovation, rooted in culture and community, can influence global technology and set new ethical standards.
Their work with VoiceQ and CultureQ shows how AI can be both powerful and principled when designed with purpose, respect and collaboration.
For enterprise and public sector leaders navigating the complexities of AI, Kiwa offers a clear lesson:
What is good for Indigenous communities is good for the world.
Ready to explore responsible AI for your organisation?
Kiwa’s story shows how the right foundations, governance practices and cultural intelligence can turn AI into a catalyst for trust, creativity and long-term value.
If you are looking to design or accelerate your own AI strategy, our team at HSO can help you move from ambition to measurable outcomes.
Connect with our experts by completing the form below and we will be in touch to discuss your goals and challenges.