Why we chose Amazon Bedrock to power GAIA, our AI for publishers

GAIA, the Glide Artificial Intelligence Assistant, makes creating content easier and quicker for writers and publishers. Behind it sit multiple Large Language Models, all enabled by Amazon's Bedrock service. Why did we choose Bedrock, and what advantages does it give?

by Rich Fairbairn
Published: 16:03, 24 October 2023
Last updated: 16:25, 24 October 2023
GPP creates GAIA by building on Amazon Bedrock

We rarely give much of a glimpse behind the curtain at Glide, because the ins and outs of a technical decision are irrelevant compared to the "What does it do for me?" that publishers actually want to know.

We typically make two to three dozen platform updates a year, and if we dug into the technical thinking behind every single one and did a big article on it, well - we'd rightly be accused of being a dull company. Or a dull-er one, some might say.

However, in this case, we think it's useful to give some insight into why we chose Amazon Bedrock to underpin GAIA, the new Glide Artificial Intelligence Assistant generative AI tools available to users within GPP, and what the effect of that choice will be for the publishing and media companies that rely on content to drive their businesses.

I'll avoid overly technical terms and abbreviations and, since it does get asked, this article has not been written or assisted by AI - not that doing so is inherently a bad thing, as we will outline below.

Generative AI and publishers: friend or foe?

No one in publishing and media will be unaware of the rapid impact of generative AI tools and services since OpenAI's ChatGPT (backed by Microsoft) sprang onto the scene last year, followed quickly by Google's Bard. More have appeared since, but those two grab the headlines in part because they hurled Google and Microsoft onto a collision course one more time.

We quickly built an in-platform POC to see how they could work for publishers. That's one of the strengths of GPP: the ability to integrate external tools easily, and to its credit ChatGPT made testing fairly straightforward. What was apparent was how useful this AI could be for our industry - and how careful publishers would need to be in using it.
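
For the curious, the sketch below shows the general shape of that kind of POC: sending article text to a hosted chat model and getting an editorial suggestion back. It uses the public OpenAI chat completions endpoint with a hypothetical standfirst prompt, purely as an illustration of the testing approach - it is not the actual GPP integration code.

```python
# Minimal POC sketch: send an article body to a hosted chat model and ask for
# a suggested standfirst. Illustrative only - not the GPP integration itself.
import os
import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def suggest_standfirst(article_text: str) -> str:
    """Ask a chat model for a one-sentence standfirst for an article."""
    response = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [
                {"role": "system",
                 "content": "You are an editorial assistant for a news publisher."},
                {"role": "user",
                 "content": f"Write a one-sentence standfirst for this article:\n\n{article_text}"},
            ],
            "temperature": 0.3,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"].strip()
```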

If you are in publishing, you will be aware that ChatGPT and Bard are already somewhat controversial figures. Many publishers see them as an enemy and are considering legal options should their copyrighted content have been used to enrich the databases which give the AIs their power.

Meanwhile, many news and content sites are also blocking AI crawlers from looking at their content in the here and now, fearing they will simply plunder the insight or data which gives it value to audiences. It's inconceivable that Google would let ChatGPT analyse and reproduce its precious search algorithms to perhaps feed back into Microsoft's Bing, so why should publishers feel any differently about their content being fed into big AIs already worth billions?

One egregious example of apparent copyright theft by Bard, experienced by tech publisher Tom's Hardware, encapsulated much of the industry's viewpoint that these services were little more than pickpockets with a degree.

Elsewhere there were countless examples of GenAIs fictionalising and hallucinating the facts of just about anything, and as good as they seem, you got the feeling they couldn't be trusted - something our Head of Content Intelligence Rob Corbidge has written about here.

And beyond our industry, larger entities were also undecided. Italy temporarily banned ChatGPT within weeks of its launch, and lawmakers around the world are scrambling to get a handle on AI more generally. Lots of change is likely to come to the AI landscape and how it is used.

From our perspective, you can see it's a losing game to try to convince the industry that if it wants to use this incredible new technology, it will have to do so by climbing into bed with companies it might be in legal dispute with - especially if it is also worried about its data disappearing into unknown AI training models to boot.

As a backdrop to this there were other concerns: service reliability and support, pricing, in-use strengths and weaknesses, data privacy and contractual transparency, small-print surprises, and so on - things that are easy to forget amid all the handwaving to grab market awareness.

And while ChatGPT and Bard may have bet the house on being the only AIs in town, more are springing up by the day, and professional users in particular will expect and need choice. Flexibility is key, because some will survive and some will not - after all, dozens of social networks withered away before the social network took hold.

AI done safely

What we at GPP wanted was the ability to let users easily choose from the increasing proliferation of Large Language and Foundation Models, so they were neither trapped into a deal with only one, nor forced to sign off multiple contracts and multiple strands of development to use more than one.

That's what Amazon Bedrock gives us: lots of AI in one place, reliably and safely.

It introduces the idea of AI as a Service, and gives users the ability to choose and test different LLMs - and testing will let them see that there are enormous variations in results between models. It also opens up the possibility of building their own walled-garden custom models.

We saw this flexibility first-hand while on the Amazon Bedrock beta programme: new LLMs became accessible in the options pool available to test, while others noticeably improved and were reduced in cost to boot.

It’s a very fast-moving world, and building GAIA on top of Bedrock allows us to do the legwork for users so they get the benefits without getting embroiled in tech or being forced to reset their development cycle every time a new LLM appears.
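
To make that concrete, here is a minimal sketch of the kind of call that sits behind a Bedrock-based assistant, assuming the standard boto3 Bedrock runtime client. Switching models is essentially a change of model ID plus a provider-specific request body; the model IDs and payload shapes below are examples from around the time of writing and will change, and this is an illustration rather than GAIA's actual code.

```python
# Sketch: one Bedrock runtime client, several foundation models behind one call.
# Model IDs and request/response shapes are provider-specific examples.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def rewrite_headline(headline: str, model_id: str = "anthropic.claude-v2") -> str:
    """Ask the chosen foundation model to suggest a sharper headline."""
    if model_id.startswith("anthropic."):
        body = {
            "prompt": f"\n\nHuman: Suggest a sharper headline for: {headline}\n\nAssistant:",
            "max_tokens_to_sample": 100,
        }
    elif model_id.startswith("amazon.titan"):
        body = {
            "inputText": f"Suggest a sharper headline for: {headline}",
            "textGenerationConfig": {"maxTokenCount": 100},
        }
    else:
        raise ValueError(f"No request mapping defined for model {model_id}")

    response = bedrock.invoke_model(modelId=model_id, body=json.dumps(body))
    payload = json.loads(response["body"].read())

    # Response shapes also differ by provider.
    if model_id.startswith("anthropic."):
        return payload["completion"].strip()
    return payload["results"][0]["outputText"].strip()
```

Hiding exactly that per-provider divergence is the point: users pick a model and go, and when a new model arrives the mapping is done once rather than every customer rebuilding their own integration.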

Elsewhere, more practical matters are also addressed in an efficient way typical of AWS, and give us, as customers, massive reassurance both for ourselves and on behalf of end users: is customer data shielded from training third-party models, how is it supported, how is it charged, what do the contracts say, GDPR and HIPAA compliance, and so on. The non-glamorous things that are vital to businesses.

And before you say “Well, you are probably contracted to use Bedrock…”, that’s not the case. Although we use AWS extensively, are part of the AWS Partner Network, and make GPP available via the AWS Marketplace, we are not prevented from using any other technologies or platforms as we see fit.

Our overriding concern is finding a solution that our customers can rely on and be confident in using, that's future-proofed and accommodates changes in the marketplace of AIs, and thinks about the boring things like metadata. For example, we are working with the news metadata authority IPTC on what should be baked into images as a minimum, and on what should go beyond what is currently required.

We are really excited that this is where we already are with generative AI in GPP. What's more exciting, though, is seeing what impact and benefits it has for our customers, and what they will do with it.