By: Adi Agrawal and Dmitri Mirakyan, co-founders of Creed
In our modern, digital age, the need for a strong relationship with God has only become more important, and technology can be a tool to help guide people, especially young people, in their pursuit of the Lord.
We created Creed with the best intentions in mind: helping people who need a reminder of their values, beliefs, and purpose to keep going. Creed leverages AI chatbots to blend technology with faith-inspired reflection, giving users, including teens, a safe space to explore their inner lives and receive encouragement grounded in spiritual wisdom. The chatbots don’t just stop at the screen — they nudge people toward deeper connection in the real world, pointing them back to their churches, small groups, and Bible studies.
But today, technology that dares to honor faith is at risk. Assembly Bill 1064, authored by Assemblymember Rebecca Bauer-Kahan, could unintentionally sweep away platforms like Creed in the name of protecting young users. The bill says, in part:
“An operator shall not make a companion chatbot available to a child unless the companion chatbot is not foreseeably capable of any of the following…Prioritizing validation of the user’s beliefs, preferences, or desires over factual accuracy or the child’s safety.” It also prohibits chatbots that can provide “mental health therapy,” a term the statute neglects to define.
On its face, the statute seems well-intentioned. Of course we want technology to safeguard children, to prevent misinformation, and to ensure that apps are not manipulating vulnerable users. But the phrasing is so broad and vague that it risks eliminating an entire category of supportive, faith-based tools.
Consider what this means in practice. If a teenager uses Creed to process the grief of losing a loved one, the app might encourage them to lean on prayer, or remind them that scripture teaches they are not alone. Lawmakers may not interpret this as "fact" but as belief. And yet, for billions of people around the world, belief is precisely what makes life bearable in hard times. Under AB 1064, simply validating that belief could be construed as "prioritizing" it over factual accuracy, and therefore banned. Similarly, the undefined "mental health therapy" language of AB 1064 might prevent teens from using tools like Creed to seek faith-based emotional support, which is precisely the kind of support people have always sought through religion.
This is a dangerous precedent. We risk reducing "safety" to a sterile exchange of data, cutting off young people from the kinds of spiritual encouragement that families, communities, and entire cultures have relied on for generations. Technology should not be a substitute for human connection, but it can be a bridge. Faith-based apps like Creed do not seek to replace parents, pastors, or mentors; they put faith-based encouragement in someone's pocket when they need it most.
The real question isn’t whether children should be shielded from harmful technology—they should. The real question is whether lawmakers will recognize the difference between exploitation and support. If AB 1064 passes as written, it will lump together predatory tools with those designed to nurture, encourage, and strengthen.
Faith has always been a resource for resilience. Our digital tools should be allowed to reflect that reality. Lawmakers must find a balance that protects children without stripping away the very resources that have sustained humanity for thousands of years. Because sometimes, what a child—or an adult—needs most isn’t just a fact. It’s faith.
About the Authors:
Adi Agrawal and Dmitri Mirakyan are the co-founders of Creed, which is building the first AI rooted in Christian values. Creed's first product is an embodied AI companion for young believers to grow closer to God and to their faith community.