
How switching to a composable DXP will affect security


Protecting sensitive information from cyberattacks is a top priority for any business, and the effectiveness of your cybersecurity measures largely depends on your tech stack. Going composable brings a number of benefits, and a key one is that composable DXPs can offer better security than monolithic solutions. Read on to learn:

  • How going composable can improve your organization’s cybersecurity 
  • What you need to know to make your composable tech stack as secure as possible

What is composable architecture?

Composable architecture breaks down the large and complex functions found in monolithic solutions into smaller, more manageable pieces. An API acts as the go-between for these smaller pieces, allowing them to communicate and transfer information more efficiently. In a composable CMS, the front-end and back-end layers are decoupled, so changes can be made to the front end independent of back-end functions. 
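To make the decoupling concrete, here is a minimal sketch of a front end pulling content over a delivery API. The endpoint, header and response shape are hypothetical stand-ins for whatever contract your CMS actually exposes:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// Entry is a simplified stand-in for whatever content model your CMS returns.
type Entry struct {
	Title string `json:"title"`
	Body  string `json:"body"`
}

func main() {
	// The front end only knows the API contract, not the CMS internals,
	// so either side can change independently.
	req, _ := http.NewRequest("GET", "https://cms.example.com/v3/entries/home", nil)
	req.Header.Set("access_token", "<delivery-token>") // hypothetical auth header

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var entry Entry
	if err := json.NewDecoder(resp.Body).Decode(&entry); err != nil {
		panic(err)
	}
	fmt.Println(entry.Title)
}
```

Because the front end depends only on this API contract, the back end can be swapped or upgraded without touching the rendering layer.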

Moving to a composable DXP brings a variety of benefits, including reduced IT costs, more streamlined processes and functions, easier updates and, when properly implemented, better security.

What are the biggest threats businesses face?

Cyberattacks have always posed a risk to businesses, but that threat has grown in the past decade. Businesses have beefed up their cybersecurity measures when it comes to some of the more common threats like phishing and malware; unfortunately, hackers have responded by developing more sophisticated cyberattacks that are harder to spot — and more difficult to guard against. 

Today, businesses face a slew of cybersecurity threats. Ransomware attacks hold entire networks hostage. Endpoint attacks are on the rise, thanks to the shift toward remote work and, in turn, the number of off-site Internet of Things (IoT) devices connected to business systems. Supply chain attacks exploit security weaknesses in third-party vendors or providers to gain access to their partners’ systems. And even though we are better trained to spot phishing attempts and avoid malware, these strategies still work often enough that hackers continue to use them. 

Composable DXPs provide the flexibility to employ cutting-edge cybersecurity measures to protect against cyberattacks and data breaches.

The security benefits of going composable

A strong cybersecurity strategy is especially important with composable DXPs. As noted above, a composable approach offers the ability to break the large, single-suite functions of monolithic platforms into smaller components. This allows for more customization options, as organizations can pick and choose the specific programs and functions they need to deliver a top-tier digital experience. But each individual piece has its own security requirements and vulnerabilities, and your cybersecurity strategy needs to account for all these differences so there are no holes to exploit. 

When moving to a composable DXP, a key first step is to define your security needs and identify the security tech stack that best meets those needs. This will serve as the foundation of your cybersecurity framework, and all the functionality that follows needs to fit within it. The benefit is that it makes it much easier to identify and isolate any vulnerabilities in your security. With monolithic systems, spotting security risks or finding the source of a breach means combing through the entire system. With a composable DXP, it’s much faster and easier to go through each individual function and make the necessary adjustments to secure your system. 

How to properly secure your composable tech stack

Breaking out functions into individual components with a composable DXP solution creates more endpoints that can be vulnerable to cyberattacks. But even though there are more potential points of access, there are also more ways to secure your systems. 

API management platforms make it easy to track API usage and integrate up-to-date security protocols like OAuth and OpenID Connect. These controls let you decide who can access and use critical applications and data stored in cloud services, and authentication processes that verify user identities help you catch security threats before a breach occurs.
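As an illustration of how such authentication might look in practice, here is a hedged sketch of a Go middleware that validates bearer tokens against an OAuth 2.0 token introspection endpoint (RFC 7662); the endpoint URL and route are invented for the example:

```go
package main

import (
	"encoding/json"
	"net/http"
	"net/url"
	"strings"
)

// introspect asks the authorization server whether a token is still active,
// following the OAuth 2.0 token introspection pattern (RFC 7662).
// The endpoint URL here is a hypothetical stand-in.
func introspect(token string) (bool, error) {
	resp, err := http.PostForm("https://auth.example.com/oauth/introspect",
		url.Values{"token": {token}})
	if err != nil {
		return false, err
	}
	defer resp.Body.Close()
	var out struct {
		Active bool `json:"active"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return false, err
	}
	return out.Active, nil
}

// requireAuth wraps any handler so only requests with an active token get through.
func requireAuth(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		token := strings.TrimPrefix(r.Header.Get("Authorization"), "Bearer ")
		ok, err := introspect(token)
		if err != nil || !ok {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.Handle("/entries", requireAuth(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("protected content"))
	})))
	http.ListenAndServe(":8080", mux)
}
```

In production you would also cache introspection results briefly so you are not paying a network round trip on every request.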

To secure your composable DXP, these functions are essential:

  • End-to-end encryption
  • Access controls 
  • Authentication, including encryption keys, two-factor authentication (2FA) and securing IoT devices
  • Data protection
  • Detailed monitoring

Implementing these functions and tailoring them to the unique needs of your composable DXP helps ensure that the sensitive data in your platform is protected from cyberattacks.
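To ground one of these building blocks, here is a minimal sketch of authenticated encryption with AES-256-GCM using Go's standard library; in a real deployment the key would come from a key management service rather than being generated inline:

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
	"io"
)

// encrypt seals plaintext with AES-256-GCM, which provides both
// confidentiality and integrity for sensitive data.
func encrypt(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // key must be 32 bytes for AES-256
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	// Prepend the nonce so the decrypting side can recover it.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

func main() {
	key := make([]byte, 32)
	io.ReadFull(rand.Reader, key) // in practice, keys come from a KMS, not inline
	sealed, err := encrypt(key, []byte("customer record"))
	if err != nil {
		panic(err)
	}
	fmt.Printf("%x\n", sealed)
}
```

Decryption reverses the steps: split off the nonce, then call Open with the same key.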

Data security in your composable DXP

When it comes to brand interactions, today’s consumer expects a personalized experience, but in order to create a robust customer journey, you need to gather data about your customers. Consumers are willing to provide that data if it means a better digital experience — but they also expect that their sensitive information will be safe in your hands. 

The financial cost of a data breach can be massive, but it’s nothing compared to the damage your organization’s reputation will suffer if your customer data is exposed due to a security breach. Fortunately, your composable DXP strategies can help provide better data security. 

With a monolithic system, if your critical infrastructure is breached, all your customer data is exposed. A composable DXP allows you to create modular data pipelines that connect to each individual component and the relevant data, rather than a single large block that contains all your data, as is the case with legacy systems. With composable, you can scale up or down and implement or remove components based on your security needs. And if a data breach does occur in one component, the scope of the data exposure is usually limited.

Securely meeting consumer demands

The customer experience is delivered across different parts of your composable DXP, from your headless CMS to your marketing stack — and it all needs to be supported by a robust cybersecurity strategy that meets or exceeds industry standards.

Cybersecurity threats come in all shapes and sizes, and cyberattacks can come from anywhere. To combat those threats and protect your system, your cybersecurity strategy needs to address all the potential risks. Your technology also needs to be flexible and adaptable in order to guard against new threats as they arise. Going composable allows you to build your tech stack to match your security strategy, and vice versa.

It’s important to remember that ensuring a safe and secure experience goes beyond adding security protocols to your tech stack. Rather, it’s about deploying the right technologies and data protection programs and practices for the unique needs of your organization. 

Learn more

Learn more about composable architecture in our blog post, “Why composable architecture is the future of digital experience.”

Schedule a free demo to learn how Contentstack can help you create a secure composable DXP solution that best suits your organization’s needs.


You may find interesting

Learn how to drive business forward and build better customer experiences.

3 ways tech and business teams can help each other through a transformation

Not all great tech transformations are brought to the table by a developer, engineer, CTO or other technical person. We see great projects kick off because someone in marketing, sales or customer success raises the flag for change. Sometimes business people can see opportunities that aren’t as plain to the tech teams.

That’s what happened when Booking.com decided to transition off its old systems to a headless CMS. Juliette Olah, senior manager of Editorial, realized that her teams had produced thousands of pieces of content over the years — but the capabilities of their current technology significantly limited the value that content produced in their local markets and the possibilities for the future.

Listening to her “People Changing Enterprises” episodes, I admired the way Juliette united Booking.com’s product and editorial teams from the beginning to pull off their transformation. Business and tech can either be each other’s biggest advocates or frustrating roadblocks. To avoid the latter, here are three examples of how tech and business teams can support each other throughout a transformation.

Thoroughly debrief at the onset

At the beginning of every project, we encourage organizations to sit down with their cross-functional teams and level set. Business and tech have their own KPIs and goals to achieve. In a project that bridges the two, there should be a frank discussion, ending in clear, written requirements of process and goals for both sides.

Once Juliette realized a tech transformation was the answer to the editorial team’s needs, she became the living bridge between the editorial and product teams. Sitting down with tech stakeholders, she talked through what she called a “comprehensive 360 view of the benefits to the technical side of the platform”:

  • The editorial team’s strategy and the justification behind the new technology
  • Real-life examples of what execution would look like
  • The business value a central headless CMS would bring to each local market
  • Opportunities they were currently missing out on because of their current tool

Because she had clearly done her homework and demonstrated the need on both sides, the product team was eager to get started.

Partner up to find and test new tools

Finding and testing new tools is an easy way for business and tech teams to partner effectively. When the new CMS was in place, the teams at Booking.com partnered to try out the new tool to make sure it worked for both sides.

At Contentstack, once a new solution is initially developed, we pull in our business partners for user acceptance testing. They can test, catch bugs, or point out which workflows are trickier than anticipated — rather than the tech teams doing it all themselves.

Additionally, when you’re on the hunt for new tools, tag a business partner in for their opinion. Coming from a different mindset, they might raise questions or point out benefits you didn’t consider.

Work together to phase out what isn’t needed

A transition to composable is the perfect time to evaluate which tools you’re bringing into the new environment — and which ones you should retire. This is another area where tech and business teams should work together.

A few years ago, we bought an analytics tool for the organization to use on their reports. It was low-cost and met some of our needs upfront, so we decided to take a chance on it. Six months down the road, we were spending a huge amount of time trying to force the tool to work. When a business person came to me and admitted the tool wasn’t helping their team meet their objectives, we decided to look into something else.

On a composable project, it’s not always clear on the back end whether a tool is working for teams as they need. That’s where our business partners come in. It’s a partnership.

Babe Ruth once said, "The way a team plays as a whole determines its success. You may have the greatest bunch of individual stars in the world, but if they don’t play together, the club won’t be worth a dime."

It’s easy for business and tech teams to work in silos, but working together produces more value. Especially in a tech transformation, the two teams are different sides of one coin. Find ways to bridge the gap and you’ll see much more value in your resulting tech stack.

From legacy systems to microservices: Transforming auth architecture

Contentstack receives billions of API requests daily, and every request must be validated as a legitimate Contentstack identity. It is common industry practice to achieve this with some sort of “identity token” on every request. Now imagine juggling multiple types of identity tokens: session tokens, OAuth tokens, delivery tokens, management tokens and so on. Securing billions of API requests a day under those conditions is a real challenge. We decided to address it by spinning up a new team that handles the complex problems of user authentication and authorization in a token-agnostic platform.

Our transition journey

Contentstack started as an API-first headless CMS platform that allowed content managers to create and manage content while simultaneously and independently enabling developers to use Contentstack's delivery API to pull that content and render it to create websites and applications. This means that Contentstack’s traffic increases proportionately to the traffic received by our customers' websites and applications.

With increased traffic and usage, we catered to various new use cases by developing new features. These features were powered by a set of microservices, each serving a particular feature domain and needing support for processing multiple identity tokens with roles and permissions attached. The whole system had become quite complex, and performing auth had become a real challenge. This prompted us to redesign our auth architecture into a token-agnostic, low-latency platform.

Read on to learn more about this journey and how we have been able to:

  • Transition from a monolith to a low-latency, microservices-based auth (authentication plus authorization) and rate-limiting architecture
  • Set up centralized authentication for multiple microservices (of any domain) that are part of the same Kubernetes cluster
  • Set up decentralized, self-serviced, policy-based authorization for internal services and teams

In short: a growing feature set meant more domain microservices, which meant more complexity in performing auth.

Monolithic auth architecture

Monolithic architectures can be difficult to maintain, scale and deploy. In a monolithic architecture, user authentication and authorization are typically tightly coupled with the application code, making it difficult to implement and maintain robust security measures.
Monolithic architectures often rely on a single authentication and authorization mechanism for the entire application, which can limit the system’s flexibility to accommodate different types of users or access levels.

[Figure: Performing auth in a typical monolithic architecture]

In monolithic architectures, auth involves the following steps:

  • Users present their credentials at the client to generate a session token, or use an existing identity token to generate other identity tokens.
  • Users then use the generated identity token to perform a business operation by making a request to the application server.
  • Once a request reaches the application server, the authentication middleware authenticates the token and forwards the request to the business module.
  • The business module performs the business operation based on the authorization rules applied to the user identity.

Problems with monolithic auth architecture:

  • Authentication and authorization logic is mixed with the business logic.
  • Changing how an identity performs an operation on a resource requires changing the associated auth logic.
  • Each domain implements its own authorization logic, leading to inconsistent implementations.
  • Because authorization logic is deeply nested in business logic, we lack visibility into the authorization rules applied to a resource.
  • Shipping new authorization logic requires a fresh deployment of the application image.
  • New microservices require knowledge of the various identity tokens and the resource authorization rules to be applied.
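To illustrate that coupling, here is a hypothetical sketch (not Contentstack's actual code) of a monolithic handler where token validation, role checks and business logic all live together:

```go
package main

import (
	"errors"
	"net/http"
)

type User struct{ Roles map[string]bool }

func (u User) HasRole(role string) bool { return u.Roles[role] }

// validateSessionToken stands in for a real token check against a session store.
func validateSessionToken(token string) (User, error) {
	if token == "" {
		return User{}, errors.New("invalid token")
	}
	return User{Roles: map[string]bool{"editor": true}}, nil
}

// updateEntry interleaves authentication, authorization and business logic,
// which is exactly the coupling that makes monoliths hard to evolve.
func updateEntry(w http.ResponseWriter, r *http.Request) {
	user, err := validateSessionToken(r.Header.Get("Authorization")) // authentication
	if err != nil {
		http.Error(w, "unauthorized", http.StatusUnauthorized)
		return
	}
	if !user.HasRole("editor") { // authorization, re-implemented by every domain
		http.Error(w, "forbidden", http.StatusForbidden)
		return
	}
	// Only now does the actual business operation begin.
	w.Write([]byte("entry updated"))
}

func main() {
	http.HandleFunc("/entries/update", updateEntry)
	http.ListenAndServe(":8080", nil)
}
```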
Microservices auth architecture

Microservices offer a more flexible, modular approach that allows for easier maintenance, scalability and deployment. With microservices, each service can be developed, deployed and scaled independently, allowing for faster time to market, improved fault tolerance and better alignment with modern development practices. Additionally, microservices make more efficient use of resources and better support diverse technology stacks.

Authentication

Why centralized authentication?

Centralized authentication is a security model in which authentication is managed by a central authority, such as a server or service, rather than being distributed across multiple systems or applications. There are several reasons why centralized authentication is commonly used and considered advantageous, including increased security, simplified management, improved user experience and lower costs. While there are drawbacks, such as the risk of a single point of failure and the complexity of managing the central authority, the benefits often outweigh the risks.

[Figure: Centralized authentication and rate limiting at the edge of the service mesh]

The centralized authentication process involves the following steps:

  • Any incoming request to the Kubernetes cluster first lands at the Istio ingress gateway.
  • The request, containing the identity token, is proxied to a central authentication gRPC service with the help of Envoy's external authorization filter.
  • The central authentication service queries Redis with the identity token and the metadata associated with the request.
  • Redis responds with the identity associated with the token and the current rate-limit count based on the request metadata.
  • The central authentication service responds to Istio with one of the following: an authenticated response with the user context attached to the request as request headers, an unauthenticated response, or a rate-limit-exceeded response.
  • An authenticated request, now carrying the user context, is forwarded to the upstream service.

Advantages over the monolithic architecture:

  • Newer microservices are easy to onboard to the central authentication service using label-based Istio injection.
  • All requests are authenticated and rate-limited at the edge of the service mesh, ensuring that every request entering the cluster is rate-limited and authenticated.
  • The request forwarded to the upstream microservice carries the user identity context in its request headers, which can then be used to apply authorization rules.
  • Centralized authentication eliminates the problem of upstream microservices each performing their own mutations on the token's identity.
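At its core, the central authentication service boils down to a token lookup plus a rate-limit counter in Redis. Here is a hedged sketch of that core using the go-redis client; the key formats and the limit are invented for illustration:

```go
package main

import (
	"context"
	"errors"
	"fmt"
	"time"

	"github.com/redis/go-redis/v9"
)

var rdb = redis.NewClient(&redis.Options{Addr: "redis:6379"})

// authenticate resolves an identity token and enforces a simple
// fixed-window rate limit, mirroring the two Redis round trips
// described in the steps above.
func authenticate(ctx context.Context, token, clientIP string) (string, error) {
	identity, err := rdb.Get(ctx, "token:"+token).Result()
	if err == redis.Nil {
		return "", errors.New("unauthenticated")
	} else if err != nil {
		return "", err
	}

	key := "rate:" + clientIP
	count, err := rdb.Incr(ctx, key).Result()
	if err != nil {
		return "", err
	}
	if count == 1 {
		rdb.Expire(ctx, key, time.Minute) // start a new one-minute window
	}
	if count > 1000 {
		return "", errors.New("rate limit exceeded")
	}
	// The caller attaches this identity to the request headers
	// before forwarding to the upstream microservice.
	return identity, nil
}

func main() {
	id, err := authenticate(context.Background(), "abc123", "10.0.0.1")
	fmt.Println(id, err)
}
```

A fixed window is the simplest scheme; a sliding window or token bucket would smooth out bursts at window boundaries.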
Authorization

Centralized authorization

We first tried a model where, along with authentication and rate limiting, the central service also took on authorization. The service would identify the incoming request’s identity from the token and apply rate limiting based on the request metadata. Once the user identity was known, authorization rules could be applied to it, performing the entire auth process at the edge of the service mesh.

The problems with this model:

  • It could only perform basic authorization at the edge based on the request metadata provided, such as validating organizations, stacks and so on. It could not perform fine-grained authorization, such as determining which content types the logged-in user had access to.
  • For RBAC, each domain has its own roles and permissions; authorizing such requests requires knowledge of the upstream domain, which pushes domain-specific logic into a centrally managed, domain-agnostic platform.
  • With every new domain microservice, this again leads to a lack of visibility into the authorization rules applied to a resource.

Distributed authorization with a central authorization service

We then tried a model that distributed authorization to the upstream microservices, with each upstream microservice calling a central authorization service. The authorization service had access to all the roles and permissions of the different domains and could return appropriate authorization results. Authorization could now be performed from the upstream service’s business module via a network request over Kubernetes cluster networking, avoiding a call over the internet.

The problems with this model:

  • The central authorization service becomes a single point of failure.
  • Any change in the API contract defined by the central authorization service requires all upstream services to abide by it, making it complex to ship these changes independently.
  • Performing authorization adds a network hop, increasing latency.

Distributed authorization with the sidecar pattern

Learning from the disadvantages above, we wanted a model in which authorization was distributed, latency was low and shipping authorization logic was an independent activity.

Architecture

The architecture involves the following components:

  • Auth sidecar
  • Central policy service
  • Auth SDK

[Figure: Architecture for authorizing an authenticated request with the sidecar pattern]

Auth sidecar

The auth sidecar is a gRPC service injected alongside the microservice’s application container in the same Kubernetes pod. Here is how this architecture tackles the problems described earlier:

  • Single point of failure: The auth sidecar runs with the application container in the same pod, so any failure is limited to the current pod. Restarting the pod gives us a fresh set of application and auth sidecar containers.
  • Independent delivery: Since the auth sidecar container ships along with the application container, the application service can decide which version of the sidecar image to use, making delivery of newer versions of the authorization sidecar independent.
  • Low latency: There is no network hop involved in making a gRPC call to the auth sidecar running in the same pod, so the application gets the authorization result with very low latency (a few milliseconds).
  • Updating authorization logic: The auth sidecar periodically downloads fresh policy bundles; whenever the policy bundle coming from the central policy service changes, the auth sidecar updates its local policy cache with the new bundle. Updating authorization logic therefore does not require a fresh deployment or restart of the application container.

Components involved in the auth sidecar

[Figure: Responsibilities of the components involved in the authorization sidecar]

  • Aggregator: Fetches authorization-related data for the current identity based on the metadata provided by the application service in the gRPC call, then aggregates it for evaluation against the authorization policy.
  • OPA engine: We use OPA (Open Policy Agent) to periodically download fresh policies and evaluate the policy path mentioned in the gRPC call against the aggregated data.

Central policy service

The central policy service is a repository of policy bundles (*.rego files) that are independently managed by the domain microservices. The maintainers of the domain microservices create these policies for the various resources that need authorization. Since these policies contain only rules, they greatly increase the visibility of the authorization rules applied to a particular resource.
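Because the bundles are plain Rego, the sidecar's evaluation step can be approximated with OPA's Go API. The policy and input below are toy examples, not actual Contentstack policies:

```go
package main

import (
	"context"
	"fmt"

	"github.com/open-policy-agent/opa/rego"
)

// A toy policy of the kind a domain team might publish to the
// central policy service. Real bundles would be far richer.
const module = `
package entries

default allow = false

allow {
	input.identity.role == "editor"
	input.action == "update"
}
`

func main() {
	ctx := context.Background()
	query, err := rego.New(
		rego.Query("data.entries.allow"),
		rego.Module("entries.rego", module),
	).PrepareForEval(ctx)
	if err != nil {
		panic(err)
	}

	// The aggregator would assemble this input from the request
	// headers set by the central authentication service.
	input := map[string]interface{}{
		"identity": map[string]interface{}{"role": "editor"},
		"action":   "update",
	}
	results, err := query.Eval(ctx, rego.EvalInput(input))
	if err != nil {
		panic(err)
	}
	fmt.Println("allowed:", results.Allowed())
}
```

Swapping in a freshly downloaded bundle only means re-preparing the query, which is why policy changes need no application redeploy.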
Auth SDK

The auth-sdk is an internal library we developed to help the developers of upstream microservices communicate easily with the different auth components. It can:

  • Extract the user identity and other useful information attached to the request headers by the central authentication service
  • Discover the various auth components and streamline communication with them
  • Expose helper methods for performing auth-related activity on behalf of the application service

[Figure: Tracing the request lifecycle in our redesigned auth architecture]

Conclusion

Microservices-based architectures can help address some of the challenges of monolithic architectures by separating user authentication and authorization into individual services that can be developed, deployed and maintained independently. This approach can provide greater flexibility, scalability and security for user authentication and authorization.

However, transitioning to a microservices-based architecture comes with challenges of its own, such as increased complexity and the need for more advanced DevOps practices. Proper planning, implementation and ongoing maintenance are crucial to a successful transition.

How to make the most of your composable DXP with ChatGPT

ChatGPT has gotten plenty of attention in recent months. The chatbot, developed by OpenAI, recently made headlines for (among other things) passing law exams, a Wharton business management exam and parts of the US Medical Licensing Exam, all without any external input. ChatGPT reached 1 million users within five days of release; by comparison, it took Spotify five months to reach that threshold.

For digital brands, the arrival of ChatGPT is a game-changer — especially when it comes to content creation and content management. Read on to learn how to use ChatGPT with a composable DXP.

What is ChatGPT?

ChatGPT is a natural language processing tool powered by AI technology. Using reams of text from the open internet, the program was trained to predict the next word in a text string, over and over. Over the course of trillions of predictions, GPT learned not just how to predict the next word, but also how to use its training to answer questions and complete tasks assigned to it. As you can imagine, this presents a lot of opportunities for brands to enhance their digital experience platforms.

How can ChatGPT integrate with a composable CMS?

Although ChatGPT was not designed specifically to work with content management systems, there are ways to integrate it into your CMS and leverage its capabilities to improve the digital experience.

Chatbot engine

With most chatbots, user communication is on rails. Standard chatbots only recognize specific prompts, and depending on how the chatbot is designed, those prompts might not cover everything users need. ChatGPT is able to have real, natural conversations with users for a more robust and personalized experience.

Using ChatGPT as the conversational engine for the chatbot, the CMS can supply relevant content — such as articles, FAQs, product information and more — as context, which ChatGPT can then use to answer user queries. ChatGPT also enhances the chatbot's ability to understand and respond to natural language queries, so users can ask questions however they want and still get the answer they’re looking for. The benefit for your team is less time spent designing the chatbot to respond to every possible user query (and every possible variation of those queries).

ChatGPT can also connect to your CMS via a custom API integration. When users interact with your CMS, the web service can run the user’s query through ChatGPT, and ChatGPT’s response can then be returned to the web service to be displayed to users.

Whether you use ChatGPT to power a chatbot or integrate it with your CMS, users benefit from a more engaging and personalized experience — and your customer support team benefits from the automation of some of its more time-intensive tasks.
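As a rough sketch of that custom API integration, a back-end service could relay the user's query to OpenAI's chat completions endpoint and hand the reply back to the web service. The request and response shapes follow OpenAI's public HTTP API; everything around them is illustrative:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatResponse struct {
	Choices []struct {
		Message message `json:"message"`
	} `json:"choices"`
}

// ask forwards a user's query to the chat completions API and
// returns the first reply, ready to hand back to the web service.
func ask(query string) (string, error) {
	body, _ := json.Marshal(chatRequest{
		Model:    "gpt-3.5-turbo",
		Messages: []message{{Role: "user", Content: query}},
	})
	req, err := http.NewRequest("POST",
		"https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var out chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	if len(out.Choices) == 0 {
		return "", fmt.Errorf("no choices returned")
	}
	return out.Choices[0].Message.Content, nil
}

func main() {
	reply, err := ask("What integrations does the DXP support?")
	if err != nil {
		panic(err)
	}
	fmt.Println(reply)
}
```

The same pattern works whether the caller is a chatbot front end or an internal content tool.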
Marketing automation and personalization

Good marketing is informative. Great marketing is informative and has a personal touch. Generic or broad automated marketing is unlikely to win new customers; instead, your marketing needs to address each user’s unique preferences in order to win their business. Unfortunately, creating highly specific content that speaks to each individual user is incredibly time-consuming. ChatGPT can take information about your customers’ interests, preferences and behaviors and use it to create more personalized and targeted marketing content in a fraction of the time. In other words, brands don’t have to choose between reach and personalization — ChatGPT can provide both.

Coding suggestions for front-end and back-end development

Reviewing code is a necessary part of updating and enhancing your composable digital experience platform, but it can be a major time and resource drain for your DevOps team. Since ChatGPT is a language model, it can read and understand code written in a variety of programming languages — which can save you time and stress as you prepare to roll out new features or functions on your platform.

Strictly speaking, ChatGPT is not able to “review” code the way a developer would, because it doesn’t understand the context of the code the way a human developer does. Still, ChatGPT can answer specific programming questions for developers, review code syntax and point out potential logical flaws in the code, which can save time on debugging.

Content creation and testing

Effective content helps attract new users to your site and delivers a quality experience that encourages repeat visits. But producing high-quality content requires a lot of time, effort and resources. ChatGPT can handle many of the tasks that might otherwise require you to hire a team of writers or content specialists.

ChatGPT’s role in content creation is entirely up to you and your needs. You can use ChatGPT to generate content ideas by providing prompts and suggestions related to your business and letting it suggest topics that will resonate with your target audience. ChatGPT can also be used to test existing content: simply provide the model with existing pieces of content, and it will offer suggestions to make the content more engaging, more informative or more SEO-friendly. On the other end of the spectrum, ChatGPT can create outlines or even entire pieces of content from scratch. With a bit of guidance in the form of prompts or content topics, it can write full articles and blog posts, which can then be edited and customized to match your brand’s voice and messaging.

It’s important to note, however, that while ChatGPT can drastically reduce the time spent on content creation, there is still value in the human element. Users may eventually learn to spot AI-generated content, and your SEO can suffer if your content bears too many AI fingerprints. Used effectively, ChatGPT helps reduce the time your team spends on content creation and the tedious tasks that go with it.
How Contentstack AI Assistant leverages the power of ChatGPT

Content creation is a crucial part of building a connection with your audience, and ChatGPT is poised to transform the content creation and publishing experience. The new Contentstack AI Assistant harnesses the power of ChatGPT to help teams create brand-specific content in seconds. By adding ChatGPT to your entry editor through in-line UI extensions, the Contentstack AI Assistant makes it easy for content teams to leverage ChatGPT’s capabilities however they choose.

Want to experiment with content creation? Quickly build outlines for content writers? Create summaries, metadata tags and headlines so editors and publishers can move on to other tasks? Translate content for global users? All this (and much more) is possible with a single prompt to the Contentstack AI Assistant. The possibilities for your content are endless. With the Contentstack AI Assistant, it’s never been easier to tap into those possibilities to take your content to a whole new level and get the most from your composable DXP.

Learn more

Watch this episode of "Contentstack LIVE” with Contentstack VP of Product Conor Egan to learn more about the power of generative AI.

Schedule a free demo to see how Contentstack can help you leverage the power of automation to launch digital experiences at the speed of your imagination.
