Serverless computing, more accurately termed Functions as a Service (FaaS), is a compelling model. Developers no longer need to be concerned with provisioning and managing compute instances and can instead stay focused on code logic.
Most enterprise customers are only dipping their toes into the technology, just as they are with microservices and containers. The question we are often asked at CTP is whether they can go beyond a toe-dipping exercise and apply it to a “real” application.
It is worth noting that Microsoft moved Azure Functions to general availability in December 2016, and Google Cloud Functions moved from private alpha to public beta, as announced at Google Cloud Next this year. However, AWS Lambda has been in general availability since May 2015, so this article may be more relevant to AWS users.
Certainly, in the startup world, there is clear evidence of its use. The idea of running solutions on a shoestring budget is attractive for companies that are just at the stage of proving out their concepts, and the serverless model can cost a fifth of a more traditional cloud architecture. In addition, speed is always critical to startups, and the simplified serverless model allows teams to develop solutions quickly without the overhead of infrastructure architecture and coding.
Five Issues To Consider
For the enterprise, the serverless model is just starting to get some attention. Below are some of the issues that are currently under scrutiny around the serverless computing model. These are not showstoppers, but rather things to consider as you architect serverless solutions:
The same security best practices you would be expected to apply in server-based deployments apply to serverless functions as well. Some ideas to consider:
- Identity management is still a first line of defense.
- One interesting pattern that has emerged is a function wrapper that passes the trigger event input to a security analyzer (e.g., Alert Logic) and only proceeds to the main function once a content-OK result is returned.
- API gateways can serve as a protective front end to your endpoints.
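The wrapper pattern above can be sketched as a decorator that submits the incoming event to an analyzer before invoking the business logic. Note that `analyze_payload` below is a hypothetical stand-in for a call to whatever security service you integrate (it is not a real Alert Logic client), and its "toy rule" is purely illustrative:

```python
import json
from functools import wraps

def analyze_payload(event):
    """Hypothetical stand-in for a call to an external security analyzer.
    A real implementation would submit the event to the analyzer's API
    and act on its verdict."""
    body = json.dumps(event)
    # Toy rule: flag anything that looks like a script-injection attempt.
    return "<script>" not in body.lower()

def security_checked(handler):
    """Decorator: only run the wrapped Lambda handler if the event passes."""
    @wraps(handler)
    def wrapper(event, context):
        if not analyze_payload(event):
            return {"statusCode": 403, "body": "rejected by security analyzer"}
        return handler(event, context)
    return wrapper

@security_checked
def handler(event, context):
    # Main business logic runs only after the analyzer returns content OK.
    return {"statusCode": 200, "body": "processed"}
```

The decorator keeps the security check out of the business logic, so the same wrapper can be reused across all of your functions.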
Monitoring has been cited as a tricky part of the serverless equation, but AWS, for example, uses Amazon CloudWatch as the de facto mechanism for monitoring AWS Lambda functions, which makes it a seamless addition to the rest of your standard AWS monitoring.
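One practical detail worth knowing: anything a Lambda function prints to stdout lands in CloudWatch Logs automatically, so emitting structured (JSON) log lines gives you queryable telemetry with no extra infrastructure. A minimal sketch (the metric name and fields are illustrative):

```python
import json
import time

def log_metric(name, value, **fields):
    """Emit a structured log line. On Lambda, stdout is captured by
    CloudWatch Logs, where JSON fields can be filtered and queried."""
    print(json.dumps({"metric": name, "value": value, "ts": time.time(), **fields}))

def handler(event, context):
    start = time.time()
    result = {"statusCode": 200, "body": "ok"}  # business logic goes here
    log_metric("handler_latency_ms", (time.time() - start) * 1000.0,
               function="example")
    return result
```

Structured lines like these are far easier to aggregate and alarm on than free-form print statements.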
Depending on the language you choose for your serverless functions, you may find issues with the version supported by the service. For instance, earlier this year AWS Lambda supported Node.js v4.3.2 while the recommended version was v6.10.0; at the same time, AWS Elastic Beanstalk supported v6.9.1. Both now support the recommended v6.10. So even maintaining consistency across services within a single cloud provider can be a challenge, and you should make sure you are clear on current versions and any issues that may arise as a result.
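One cheap guard against runtime drift is to check the interpreter version at cold start, so a mismatch shows up as an explicit warning in your logs rather than a subtle behavior change. A Python sketch (the pinned version is illustrative; the same idea applies to checking `process.version` in Node.js):

```python
import sys
import warnings

EXPECTED_MAJOR_MINOR = (3, 8)  # the version you tested against (illustrative)

def check_runtime(expected=EXPECTED_MAJOR_MINOR):
    """Return True if the interpreter matches the pinned major.minor
    version; warn rather than fail on drift, so the function keeps
    serving traffic while you investigate."""
    actual = sys.version_info[:2]
    if actual != expected:
        warnings.warn(f"runtime drift: expected {expected}, running {actual}")
        return False
    return True
```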
The various cloud providers have different limitations built into their serverless offerings, including constraints such as maximum execution duration, maximum request and response payload sizes, maximum temp disk space, and concurrent executions. Note that some of these limits can be raised by request to the provider.
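Payload-size limits in particular are straightforward to design around by batching work below the cap. A generic sketch, with the 256 KB figure used purely for illustration (check your provider's current quotas):

```python
import json

MAX_PAYLOAD_BYTES = 256 * 1024  # illustrative cap; verify your provider's quotas

def chunk_records(records, max_bytes=MAX_PAYLOAD_BYTES):
    """Group records into batches whose serialized JSON size stays under
    max_bytes, so each batch fits in a single invocation payload."""
    batches, current, size = [], [], 2  # 2 bytes for the enclosing "[]"
    for rec in records:
        rec_size = len(json.dumps(rec)) + 1  # +1 for the separating comma
        if current and size + rec_size > max_bytes:
            batches.append(current)
            current, size = [], 2
        current.append(rec)
        size += rec_size
    if current:
        batches.append(current)
    return batches
```

Each batch can then be dispatched as its own invocation, which also sidesteps execution-duration limits by keeping units of work small.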
Real World Examples
Bustle.com is a news, entertainment, lifestyle and fashion website with over 50M monthly readers. Bustle was struggling with scaling issues and infrastructure management overhead. They moved to a serverless architecture, halved the operations team and experienced over 80% overall cost savings.
Moonmail provides an email marketing platform for the eCommerce community. They replaced their existing EC2 and Ruby on Rails environment with a complete serverless solution and achieved virtually infinite scalability. They leveraged the Serverless Framework and were able to implement in only 2 weeks. Results like this really speak to the ultimate goal of innovation at speed.
Abstract.AI (Abstract), based in Los Angeles, develops software and services. They partnered with Brainitch, another Los Angeles-based firm that offers personalized, 1-to-1 marketing to artists on Facebook Messenger and related platforms, to create a bot for Laidback Luke’s (an electronic music artist based in the Netherlands) birthday bash. Their initial legacy architecture was not handling the spikes in traffic. They too leveraged the Serverless Framework, redeployed the new architecture in 2 weeks and achieved a 95% reduction in costs while eliminating scalability issues.
Other Use Cases For Serverless
There are many use cases for utilizing serverless today. Let’s take a look at a few.
One of the first we implemented at CTP was a serverless build/deploy pipeline that bypassed the traditional Jenkins server for CI/CD. We utilized Atlassian’s new Bitbucket Pipeline service for the build, writing the build artifacts to AWS S3. This triggered an AWS Lambda function that handled the deploy (see Figure 1.). A second iteration leveraged Terraform scripts called right from Bitbucket Pipelines, streamlining the process further. Given the importance of the build/deploy process on productivity and speed, we will continue to experiment and optimize approaches that eliminate any unnecessary infrastructure management and cost.
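The S3-triggered deploy step can be sketched as a handler that pulls bucket and key out of the standard S3 event notification and hands them to a deploy routine. Here `deploy_artifact` is a stub; a real one would fetch the artifact and push it to its target environment:

```python
def deploy_artifact(bucket, key):
    """Stub for the actual deploy step (fetch the build artifact, push it
    to the target environment). Replace with your deployment logic."""
    return f"deployed s3://{bucket}/{key}"

def handler(event, context):
    """Fires on an S3 ObjectCreated notification when the build writes an
    artifact; the event shape below is AWS's standard S3 event format."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        results.append(deploy_artifact(bucket, key))
    return {"deployed": results}
```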
Data processing pipelines are another case where serverless shines. Figure 2 shows an example from Werner Vogels' blog. Data pipelines can be assembled from various cloud-native components, including ingestion (e.g., AWS Kinesis), processing (e.g., AWS Lambda), analytics (e.g., AWS EMR) and storage (e.g., AWS S3, DynamoDB or Redshift). Serverless processing components are great for ETL functions and can easily be chained together to avoid potential execution duration limitations.
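A Lambda function consuming a Kinesis stream receives records base64-encoded; a minimal ETL step decodes each record, transforms it, and hands it to the next stage. The `store` function here is a placeholder for a write to S3, DynamoDB or Redshift, and the "transform" is deliberately trivial:

```python
import base64
import json

def store(item):
    """Placeholder for the storage stage (an S3/DynamoDB/Redshift write)."""
    return item

def handler(event, context):
    """ETL step for a Kinesis-triggered Lambda: records arrive
    base64-encoded under Records[].kinesis.data in AWS's standard
    Kinesis event shape."""
    out = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        payload["processed"] = True  # the transform step, trivially
        out.append(store(payload))
    return {"count": len(out)}
```

Chaining stages like this, with each function doing one small piece of work, is what keeps individual invocations well under the execution-duration limits mentioned earlier.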
The microservices architecture pattern is another area where serverless functions are gaining momentum. Figure 3 shows a simplified cloud-native architecture for a microservice function. A more traditional architecture might include EC2 instances in an auto-scale group behind an ELB in the application layer, with AWS RDS for persistence. But leveraging AWS Lambda and DynamoDB is an option that can provide scalability at lower cost.
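The Lambda-plus-DynamoDB pattern amounts to a handler that routes API Gateway proxy requests to table operations. A sketch with the storage call abstracted so the routing logic is testable locally (a real handler would pass in a table from `boto3.resource("dynamodb")`; a plain dict stands in for it here):

```python
import json

def handler(event, context, table=None):
    """Route an API Gateway proxy event to a key-value store. `table`
    stands in for a DynamoDB table; a dict works for local testing."""
    table = table if table is not None else {}
    method = event["httpMethod"]
    item_id = (event.get("pathParameters") or {}).get("id")
    if method == "GET":
        item = table.get(item_id)
        if item is None:
            return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
        return {"statusCode": 200, "body": json.dumps(item)}
    if method == "PUT":
        table[item_id] = json.loads(event["body"])
        return {"statusCode": 200, "body": event["body"]}
    return {"statusCode": 405, "body": json.dumps({"error": "method not allowed"})}
```

Injecting the table as a parameter keeps the routing unit-testable without any AWS credentials, which matters more in serverless architectures where there is no local server to spin up.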
Summing It Up
Serverless is relatively new, but it is gaining an enthusiastic following at a rapid pace. As more people experiment, they are finding it easier to deliver scalable solutions while keeping costs more directly tied to usage. That is an extremely attractive combination.