Leveraging Edge Computing to Simplify and Scale

Published: September 14, 2022

Reading time: 8 mins


Key Takeaways

⇨ Moving workloads to the edge can help deliver higher performance at the point of transaction

⇨ Using containers in conjunction with moving workloads to the edge can limit exposure while allowing for easier management and updating of deployments

⇨ Containerization in conjunction with edge processing can distribute critical processes and improve performance

Over the past decade, there has been a significant shift in the infrastructure supporting many organizations' business activities. Organizations are moving most of their operations from on-premise environments to the cloud. There are several reasons for this move, including cost, ease of management, infrastructure scalability and flexibility, and reduced administrative overhead. But while there are benefits to moving to cloud-based infrastructure, there are also challenges that stem from a more complex environment and increased transaction volume. These challenges need to be resolved if systems are to maintain performance and usability. One means of addressing them is by leveraging edge computing.

Transaction Volume and Complexity Are Increasing

To gain insight into how organizations are addressing these challenges, SAPinsider sat down with Eric Christian and Matt Thoman from Vertex. Christian is the Principal Architect of Edge Solutions at Vertex, and Thoman is a Product Manager and Retail Solution Owner. Being a tax solutions company, Vertex works with some of the largest organizations in the world to incorporate a tax compliance strategy into their systems, including tax calculation at the point of sale. That point of sale could be an online purchase or any of hundreds or thousands of retail locations in different cities, states, or countries. Processing these transactions in real time without disruption is critical for both customer satisfaction and system efficiency.

For many companies, having a single location where these calculations are performed works fairly well. But Christian says he is seeing an increasing trend toward organizations having a much larger volume of transactions to process, which can stress a centralized model. Thoman sees this beyond just the areas in which Vertex operates. “Companies need to do things from a technology and user experience perspective in real-time,” Thoman stated. “Historically that was done locally with on-premise software. But the shift to the cloud has created a gap with key components that, from a performance perspective, were sometimes better suited to on-premise solutions.”

Consider a retailer running an e-commerce website. The retailer may host the site in multiple cloud locations to reduce latency, some in the US and others spread across Europe and Asia, distributing the load to provide an end-user experience that is resilient to latency in internet traffic. But while the internet is resilient and traffic will be re-routed around problems, that re-routing may increase latency between those locations and the central server. A high volume of transactions coming from many locations can also make effective scaling difficult. Both of these concerns can have a significant impact when performance is critical.

Moving Processing to Edge Computing

According to Christian, many organizations are addressing this challenge by employing containerization and moving workloads to the edge. This allows organizations to distribute the key components of their software architecture, and to do so using standards that simply did not exist before.

“Containerization is a beautiful thing because, when you build the container, it includes everything that you need to perform a given function,” said Christian. “That includes the operating system, an application server, and even a database. Everything that is needed for the specified process. And that function is going to run identically on every computer that it runs. So not only is it easy to distribute, it is also guaranteed to run the same.”
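
As a minimal sketch of what that scriptable distribution can look like, the following Python uses the Docker SDK for Python (installed with pip install docker) to pull a pinned container image and start it on any host running a Docker daemon. The registry and image name are hypothetical placeholders, and this is not Vertex's tooling; the point is that the same two calls produce an identical running service everywhere.

    import docker

    # Connect to the local Docker daemon using environment defaults.
    client = docker.from_env()

    # Pull a pinned image version so every host runs exactly the same bits.
    # The registry and image name below are hypothetical placeholders.
    client.images.pull("registry.example.com/tax-engine", tag="1.4.2")

    # Start the container; the operating system, application server, and
    # database it needs are all packaged inside the image itself.
    container = client.containers.run(
        "registry.example.com/tax-engine:1.4.2",
        detach=True,
        ports={"8080/tcp": 8080},
        restart_policy={"Name": "on-failure", "MaximumRetryCount": 3},
    )
    print(f"started {container.short_id}")

Because the image is pinned to a specific version, running this script on a laptop, a store server, or a cloud instance yields the same behavior, which is the guarantee Christian describes.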

Thoman sees a similar thing happening with the customers he works with. They are shifting workloads to the cloud or have already done so. But all of these organizations are talking about the problems that the move has created and are looking to enable edge computing to manage some of their key processes. Thoman also emphasizes that edge computing is about taking real-time processes and running them locally while keeping them connected to the central cloud environment.

Well-run edge-based systems push data back to the cloud so that it can be leveraged centrally using powerful data analytics tools that can process massive quantities of data. For an organization running the same process in 3,000 different locations, aggregating the data from all of those locations into a central repository provides the opportunity for significant insight. That insight can then be used to adjust a configuration that is distributed back to the edge, making the overall process more efficient.
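
A sketch of that aggregation pattern, assuming a hypothetical central collector API and store identifier, might look like the following Python running on each edge node: transactions are recorded locally and pushed upstream in batches, and a batch is only discarded once the upload succeeds.

    import json
    import urllib.request

    # Hypothetical central collector endpoint; in practice this would be
    # an authenticated API in the organization's cloud environment.
    CENTRAL_URL = "https://collector.example.com/api/transactions"
    BATCH_SIZE = 500

    buffer = []  # transactions recorded locally at this edge node

    def record(transaction: dict) -> None:
        """Store a completed transaction locally; flush when the batch is full."""
        buffer.append(transaction)
        if len(buffer) >= BATCH_SIZE:
            flush()

    def flush() -> None:
        """Push the buffered batch to the central repository for analytics."""
        if not buffer:
            return
        body = json.dumps({"site": "store-0042", "transactions": buffer}).encode()
        request = urllib.request.Request(
            CENTRAL_URL, data=body, headers={"Content-Type": "application/json"}
        )
        try:
            urllib.request.urlopen(request, timeout=10)
            buffer.clear()  # drop the batch only after a successful upload
        except OSError:
            pass  # network issue: keep the batch and retry on the next flush

Because a failed upload keeps the batch, the edge node continues transacting through a network outage and catches up on the next flush, staying local in real time while remaining connected to the central cloud.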

Vertex's own evolution reflects this sort of change. The company started with a data extract that could be placed into point-of-sale (POS) systems, because real-time offline tax calculation has always been part of retail, even before technology solutions existed that could provide it. The process began with a data file connected to the POS system and developed into a locally hosted tax engine that was much more accurate than the data file, though that engine was disconnected from the cloud. Newer solutions, like Vertex O Series Edge, run the engine locally but support critical connections to the cloud. This automates getting new tax content to the edge while centralizing transaction data for analytics and compliance. Aggregating the data was a key feature needed to support the end-to-end tax needs of clients, and it was created by leveraging containerization as an extension of a centralized cloud model.

Edge-Based Solutions Help Secure Data

Moving real-time data processing to the edge, using containers or another methodology, can ensure that organizations perform critical processes without concern about latency. But the other factor in the equation is data. To truly leverage the data they are collecting, organizations need to move it back to a centralized location where analytics tools can be used to best advantage. Using an edge-based solution can also help an organization resolve some of its data security issues.

Thoman provided an example of a situation where an organization isn't using an edge-based solution. In that example, calls made from one data center to another cross firewalls twice: once for the request and once for the response. But if an organization leverages an edge-based solution that no longer needs to make a request back to a central server, traffic through the firewall is reduced. In addition, edge systems provide a high degree of control over where transaction data is centralized. Organizations may need to host data in a private cloud instance for security reasons, or host it regionally to maintain regulatory compliance. Edge computing provides the flexibility to meet these, and many other, needs.
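
That residency control can be as simple as pinning each edge site's outbound data to a region-specific collector. A minimal sketch, assuming hypothetical regional endpoints, might look like this:

    # Hypothetical mapping from regulatory region to the collector that
    # is allowed to centralize that region's transaction data.
    REGIONAL_COLLECTORS = {
        "us": "https://collector.us.example.com/api/transactions",
        "eu": "https://collector.eu.example.com/api/transactions",
        "apac": "https://collector.apac.example.com/api/transactions",
    }

    def collector_for(site_region: str) -> str:
        """Return the only endpoint an edge node in this region may push to."""
        return REGIONAL_COLLECTORS[site_region]

    # An edge node in a German store would centralize its data only within the EU.
    assert collector_for("eu").startswith("https://collector.eu.")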

Christian agrees. “The best way to secure your data and ensure that it doesn’t get in the wrong hands is to ensure that it never crosses the firewall. That’s where containerization and edge computing really help.”

Thoman also emphasized how a containerized solution can provide even more security. “Rather than distributing an entire on-premise solution in multiple locations, edge allows you to only distribute the components that are needed. There might be no user interface or no login endpoint for the containers that are distributed. It’s only the calculation request and the response, which leaves fewer vulnerabilities in the components that are moved to the edge.”
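
To make Thoman's point concrete, here is a deliberately minimal sketch of an edge calculation service using only Python's standard library. It exposes a single calculation endpoint and nothing else: no user interface, no login page, no administrative routes. The tax arithmetic is a placeholder, not how a real tax engine works.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class CalcHandler(BaseHTTPRequestHandler):
        """A calculation request in, a response out; no other surface area."""

        def do_POST(self):
            if self.path != "/calculate":
                self.send_error(404)  # nothing else exists to probe
                return
            length = int(self.headers.get("Content-Length", 0))
            request = json.loads(self.rfile.read(length))
            # Placeholder arithmetic; a real engine applies jurisdiction rules.
            result = {"tax": round(request["amount"] * request["rate"], 2)}
            body = json.dumps(result).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), CalcHandler).serve_forever()

Every route except the calculation request returns a 404, illustrating how stripping a container down to one function leaves fewer vulnerabilities at the edge.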

Containers Provide Simplification and Reliability

Before the advent of cloud-based solutions, when an organization needed to configure something in a remote location, it would have to ship media containing the appropriate files and hope that someone at the other end could perform the deployment. If not, a valuable IT resource would spend hours on the phone talking someone through the process. Today that can all be done from a central location.

In addition, the same tools that manage updates in a containerized deployment also manage scalability. Containers are designed to be copied and started extremely quickly, so a container orchestration tool can be set to respond to increased volume by automatically making a copy of the container and running the two in parallel.
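
The sketch below shows that scaling loop in its most naive form, using the Docker SDK for Python. In practice an orchestrator such as Kubernetes handles this with built-in autoscaling policies; the image name, label, load signal, and threshold here are all hypothetical.

    import time
    import docker

    client = docker.from_env()
    IMAGE = "registry.example.com/tax-engine:1.4.2"  # hypothetical image
    LABEL = "app=tax-engine"

    def get_request_rate() -> float:
        """Stub for a load metric; a real system would query monitoring."""
        return 0.0

    def scale_to(target: int) -> None:
        """Start or stop identical container copies to match the target count."""
        replicas = client.containers.list(
            filters={"label": LABEL, "status": "running"}
        )
        for _ in range(target - len(replicas)):
            client.containers.run(IMAGE, detach=True, labels={"app": "tax-engine"})
        for container in replicas[target:]:
            container.stop()

    while True:
        # Run two copies in parallel when volume is high, one otherwise.
        scale_to(2 if get_request_rate() > 100 else 1)
        time.sleep(30)

Because every copy is built from the same image, scaling out is just starting another identical container, which is why orchestration tools can do it automatically.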

What Does This Mean for SAPinsiders?

Many organizations across multiple market sectors face challenges in ensuring the performance of applications and processes while limiting the impact of growing transaction volume and complexity. Edge computing can help, but what should organizations do to implement it successfully?

  • Evaluate whether your current infrastructure is configured to meet changing customer and market expectations. One of the biggest drivers of change in organizations today is the need to update systems to support growing and changing customer expectations. As new sales channels are adopted and integrated into existing business channels, there is an increasing need to combine the performance and stability of local calculations with a centralized configuration. Leveraging containerized and edge solutions is one of the most direct ways to achieve these performance and stability goals.
  • Focus on limiting the exposure of data and systems by reducing the number of times data crosses integration points. Another benefit of edge-based systems is that they can limit the number of times data requests pass back and forth between remote locations and central instances. While many organizations focus on protecting data that is already in their systems, it is becoming vital to expand that focus to data traveling across networks and integrating with other systems. This means limiting both user interface gateways and traffic entering and leaving private networks. Limiting this movement will improve the overall security of your data.
  • Leverage containers to distribute critical processes and workloads to improve performance while limiting exposure. Containers running at the edge make critical processes available to remote systems, reducing latency while providing real-time computing. In addition, software in a container can be limited to only the components essential to a given task, eliminating components that may reduce performance or expose an access port that could be compromised. Using containers also allows updates to be performed across the enterprise quickly, without manual configuration steps.


About Vertex

Vertex, Inc. is a leading global provider of indirect tax software and solutions. The company’s mission is to deliver the most trusted tax technology enabling businesses to transact, comply, and grow with confidence. Vertex provides cloud-based and on-premise solutions that can be tailored to specific industries for major lines of indirect tax, including sales and consumer use, value added, and payroll. Headquartered in North America, and with offices in South America and Europe, Vertex employs over 1,200 professionals and serves companies across the globe.
