5 serverless security concerns to watch out for in 2018
The serverless approach to application architectures is fundamentally changing how we build and manage software. It should come as no surprise that with serverless, security matters are also different from what we’re used to seeing in traditional server-based applications.
Moving to a cloud-hosted application backend eliminates a number of security concerns. One significant advantage serverless offers security-wise is turning compromised servers and OS-level vulnerabilities into something you no longer have to worry about. Additionally, working with AWS Lambda or Google Cloud Platform reduces the negative consequences of DoS attacks.
On the flip side, serverless security has its own risk factors to deal with. The serverless approach doesn’t introduce new security concerns, but it does make some matters more troublesome. It’s these security matters that this post will focus on. We’ll also briefly explore which issues are equally relevant to server-based and serverless applications.
5 security concerns inherent to serverless applications
There are five global concerns that make you redefine your approach to security with serverless:
- FaaS architectures lack a clearly-shaped security perimeter.
- Data communication between functions and services is more extensive with FaaS.
- FaaS architectures are prone to function bloat and stale dependencies.
- Serverless vendors are more exposed to attacks.
- Multitenancy introduces concerns of its own.
1. Each serverless function needs a security perimeter of its own
By its very nature, serverless computing pushes you towards granular application architectures. FaaS breaks the application backend down into short-lived stateless functions, allowing for unparalleled flexibility.
This flexibility is easy to appreciate if you are a developer. For a security specialist, though, it has a tricky side effect. Shedding the monolithic structure means that your serverless application will also lack a clearly-shaped security perimeter.
With serverless, each and every FaaS function needs a security perimeter of its own. This has several important implications:
- Serverless places more responsibility on developers and test automation engineers to precisely control what each function can and cannot do.
- The serverless approach expands the scope of unit testing to include security. Also, TDD looks like the strategy of choice for a serverless architecture.
- If you’re working with AWS, it’s preferable to use IAM roles to manage what data is accessible to a particular function, user, or service. If you use the Serverless framework, you’ll find that, by default, all functions share the same IAM role. Changing this and fine-tuning permissions per function might prove a wiser approach.
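To make the last point concrete, here is a minimal sketch of per-function IAM roles in a `serverless.yml`, assuming the community `serverless-iam-roles-per-function` plugin is installed; the service, function, and table names are purely illustrative:

```yaml
service: orders-api          # hypothetical service name

provider:
  name: aws
  runtime: python3.9

plugins:
  - serverless-iam-roles-per-function

functions:
  createOrder:
    handler: handlers.create_order
    # This function may write to one table and nothing else.
    iamRoleStatements:
      - Effect: Allow
        Action:
          - dynamodb:PutItem
        Resource: arn:aws:dynamodb:*:*:table/orders
  listOrders:
    handler: handlers.list_orders
    # Read-only access; no write permissions granted.
    iamRoleStatements:
      - Effect: Allow
        Action:
          - dynamodb:Query
        Resource: arn:aws:dynamodb:*:*:table/orders
```

The idea is simply least privilege at the function level: if `listOrders` is ever compromised, the attacker still holds no write permissions.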
2. More data in transit — more risks
Working with diversified stateless functions has yet another side effect that impacts data security. In serverless applications, functions exchange data a lot more often compared to what you get with a traditional architecture. There’s also more data shared with third-party services, which further increases the amount of data in transit. As a result, going serverless dramatically increases your reliance on strong data encryption.
The use of HTTPS and key management systems becomes a must-follow practice. In addition, the rule of thumb with serverless security is to mistrust all input by default, even input coming from another FaaS function. The serverless approach also requires stricter constraints on the input and output of calls made via the API gateway.
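The "mistrust all input" rule can be sketched as a small validation layer in a Lambda handler. This is an illustrative example, not code from the article: the field names, `validate`, and `handler` are all assumed for the sake of the sketch.

```python
import json

# Expected shape of the payload; anything else is rejected.
# These fields are hypothetical, chosen only to illustrate the pattern.
REQUIRED_FIELDS = {"order_id": str, "quantity": int}

def validate(payload):
    """Reject any payload that does not match the expected shape,
    even if it arrives from another function in the same application."""
    if not isinstance(payload, dict):
        raise ValueError("payload must be a JSON object")
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            raise ValueError("missing field: " + field)
        if not isinstance(payload[field], expected_type):
            raise ValueError("bad type for " + field)
    if payload["quantity"] <= 0:
        raise ValueError("quantity must be positive")
    # Return only the expected fields, silently dropping anything extra.
    return {k: payload[k] for k in REQUIRED_FIELDS}

def handler(event, context):
    try:
        clean = validate(json.loads(event.get("body") or "{}"))
    except ValueError as exc:
        return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}
    return {"statusCode": 200, "body": json.dumps(clean)}
```

Note that the validator whitelists fields rather than blacklisting bad ones, so unexpected data never propagates downstream.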
3. Serverless may encourage function bloat, impeding security monitoring
FaaS simplifies the deployment of new code, which is a big reason why companies go serverless in the first place. Deploying a single serverless function barely costs you anything significant in terms of money and time. The same goes for execution — especially if we’re talking about low-volume functions.
While all of the above is great for productivity, keeping the deployment barrier low may complicate security monitoring. Simplified deployment creates an incentive to deploy more and remove less. If you let things pile up, you’ll quickly lose track of who uses which function, which makes it difficult to safely remove unused functions.
Given that the serverless ecosystem still lacks good security monitoring solutions, stale dependencies in unused functions can become a huge issue. Dodging this issue comes down to diligence and frequent refactoring. Essentially, it’s a matter of adhering to programming best practices, so everything depends on how good your developers are.
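One diligence habit is to periodically flag functions that haven’t been invoked for a long time, so they can be reviewed and removed. The sketch below is a hypothetical helper: in practice the last-invocation timestamps would come from CloudWatch invocation metrics, but here they are plain datetimes for clarity, and the threshold is an assumed value.

```python
from datetime import datetime, timedelta

# Assumed review threshold; tune to your own deployment cadence.
STALE_AFTER = timedelta(days=90)

def find_stale(last_invoked, now):
    """Return function names whose last invocation is older than the
    threshold, or that were never invoked at all (timestamp is None)."""
    stale = []
    for name, ts in last_invoked.items():
        if ts is None or now - ts > STALE_AFTER:
            stale.append(name)
    return sorted(stale)
```

For example, feeding it a map of function names to last-invocation times returns the candidates for removal, which a team can then cross-check against actual usage before deleting anything.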
4. Larger surface area for cyber attacks
Any serverless application has to rely on third-party security implementations. Most of these implementations are far more reliable than what a single security team can achieve. That said, serverless platforms are targeted by far more attackers than ordinary applications.
Knowing this, it’s important to keep in mind that adding new third-party implementations to your serverless architecture inevitably increases its surface area for cyber attacks. No system is 100% impenetrable, and attackers can succeed at times, even though this seems improbable with giants like AWS Lambda or Google Cloud Functions.
5. Multitenancy concerns
The term “multitenancy” describes how serverless vendors optimize their infrastructure to serve hundreds of clients. From a vendor’s standpoint, a single customer is, effectively, a tenant sharing server space with other tenants. This means running functions from different tenants both on the same hardware and within the same hosting application.
Naturally, ensuring that sharing a common space doesn’t create security problems for tenants is a top priority for any serverless vendor. Still, low-probability issues are easy to imagine. In his definitive article about serverless architectures, Mike Roberts theorizes that, due to possible hosting application errors, one tenant may potentially access other tenants’ data. Again, this is a low-probability scenario, but low-probability scenarios are exactly what software security is concerned with.
What security concerns do serverless and server-based applications share?
In addition to raising the concerns mentioned above, serverless shares some problems with traditional architectures. Namely, the issues outlined below are equally relevant for serverless and non-serverless architectures:
- Unauthorised access to databases. From an attacker’s standpoint, there’s no difference between accessing the database of a server-based application and that of a serverless one. Consequently, the same security precautions work for both product development paradigms. Strong encryption and a solid DB access policy should provide enough protection. Another must-follow practice is to never expose crucial data to the open Internet.
- Dependencies. Application dependencies are difficult to monitor no matter what type of architecture you’re using. Luckily, there are automated tools that help you streamline this process. Using platforms like Snyk should work equally well for server-based and serverless applications.
- Bad code is still bad code, and failing to follow best practices will leave your application vulnerable whether it runs on FaaS or on your own dedicated server. The only way to guard your product against things like SQL injection and cross-site scripting is to work with developers who stay on top of security-related best practices. This brings us to one additional point.
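The SQL injection point above boils down to one habit: never splice user input into SQL text. Here is a minimal sketch using `sqlite3` from the Python standard library; the table, column, and function names are illustrative.

```python
import sqlite3

def find_user(conn, username):
    # The "?" placeholder lets the driver escape the value; user input
    # never becomes part of the SQL statement itself.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

# A classic injection payload is treated as a literal string and simply
# matches no row, instead of rewriting the query.
assert find_user(conn, "alice") == (1, "alice")
assert find_user(conn, "' OR '1'='1") is None
```

The same parameterized-query discipline applies regardless of whether the code runs inside a FaaS function or on a dedicated server.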
Where to find skillful serverless developers?
Being able to protect your serverless application against attacks largely comes down to how good your developer team is. If finding great developers has been a challenge, you’re in the right place!
AgileEngine is a team of software development professionals who have been building awesome software for over a decade. Our serverless expertise includes products developed for both startups and well-established companies, and our portfolio includes quite a few world-known brands. Whether your product uses serverless computing or revolves around your own server infrastructure, we can take it to new heights.