Building and Scaling IoT Projects Quickly with Serverless Architecture


So you’ve launched a new IoT product, perhaps using the IoT framework provided by AWS, Azure, or another major cloud provider, and your devices can now send data to and receive data from the cloud. Now, how do you process that data to get valuable insights, such as device health telemetry or user behavior tracking? There are a number of ways to set up data processing infrastructure in the cloud, each trading off control against complexity. Serverless architecture is ultimately a software design principle that lets you build, scale, and run services without managing the underlying infrastructure, and MistyWest is excited about how this “serverless” pattern can enable teams to rapidly build and scale cloud solutions.

To help you understand how applicable this is to IoT product solutions, we’re providing the following overview of the different architecture patterns and when you should consider going serverless for your project.


Using a Virtual Machine

The old-school way of setting up a cloud pipeline, and the recommended way if you want more control, is to spin up a virtual machine (VM) in the cloud to run your processing code. Azure Virtual Machines, AWS EC2, and GCP Compute Engine are common options. You get a virtual computer that runs code much like your personal computer does. The limitation of this route is that you will rapidly run out of processing power in a single VM, especially if you’re handling data from thousands of IoT devices.

To get more processing power, you can add more VMs and divide the processing work between many computers. Tools like Kubernetes and Docker Swarm let you orchestrate processing workloads across many machines, and cloud providers offer managed services such as Amazon Elastic Kubernetes Service and Google Kubernetes Engine to run those workloads in the cloud. Services like AWS Elastic Beanstalk or Azure App Service automate the setup and scaling of common web development frameworks like Django, Rails, and Node, and are a great starting point if you want to manage this yourself.

Setting up and configuring orchestration tools, however, can be complex, requiring time and expertise that don’t directly deliver value to your customers. If you want to prototype quickly and you know your solution will need to scale, serverless may be the way to go.
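To make the trade-off concrete before moving on, the VM route typically means writing, deploying, and operating a long-lived worker process yourself. The sketch below is a minimal illustration, assuming device telemetry lands on an AWS SQS queue; the queue URL and message fields are hypothetical placeholders.

    import json
    import boto3

    # A long-lived worker you would deploy, monitor, and scale yourself on EC2
    # or another VM. The queue URL below is a hypothetical placeholder.
    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/device-telemetry"


    def process(message: dict) -> None:
        """Placeholder for your real per-message processing logic."""
        print(f"device {message.get('device_id')}: {message.get('temperature')} C")


    while True:
        # Long-poll for up to 10 messages, waiting up to 20 seconds per call.
        response = sqs.receive_message(
            QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
        )
        for msg in response.get("Messages", []):
            process(json.loads(msg["Body"]))
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

Everything around this loop, including the instance it runs on, how many copies run, and what happens when one crashes, is your responsibility; that operational load is exactly what the serverless pattern described next takes off your plate.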

What Is Serverless Architecture?

Serverless architecture allows you to completely offload managing servers to the cloud providers while you focus directly on your application code. One of the more common subsets of this architecture is Functions-as-a-Service (FaaS). But serverless goes much further than FaaS: from databases and queue systems to event processing services, each cloud provider offers a wide variety of managed services to meet your needs.
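To make the FaaS idea concrete, here is a minimal sketch of a serverless function handling an IoT telemetry message. It assumes an AWS Lambda-style Python handler, and the payload fields (device_id, temperature) are hypothetical placeholders; your provider and event format will differ.

    import json

    def handler(event, context):
        """AWS Lambda-style entry point for one IoT telemetry message.

        The payload shape (device_id, temperature) is purely illustrative;
        a real deployment would match whatever your devices publish.
        """
        payload = json.loads(event["body"]) if "body" in event else event

        device_id = payload.get("device_id", "unknown")
        temperature = payload.get("temperature")

        # A trivial "insight": flag readings outside a safe operating range.
        alert = temperature is not None and not (0 <= temperature <= 70)

        return {
            "statusCode": 200,
            "body": json.dumps({"device_id": device_id, "alert": alert}),
        }

The platform decides when and where this function runs, and how many copies run in parallel; your code only has to handle one event at a time.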

Figure 1 – Multiple Manually-Configured Servers

Benefits of Going Serverless

  1. Usage-Based Billing: Serverless platforms tend to charge based on how often your functions run and for how long, so you only pay for the compute time you actually use. This keeps costs low during development while giving you an architecture that automatically scales up at launch.

  2. Fast Response: Serverless functions also tend to respond quickly to spikes in demand, because the platform automatically scales up the compute power available to run them and then scales back down when the load subsides. This makes efficient use of resources, deploying compute power only when it is needed.

  3. Language Options: There is good support for a variety of programming languages, so you can very likely build your serverless functions in your language of choice. For example, AWS Lambda natively supports Java, Go, PowerShell, Node.js, C#, Python, and Ruby, and provides a Runtime API that allows the use of other languages. Azure Functions supports C#, JavaScript, F#, Java, PowerShell, Python, and TypeScript.

  4. Bug Prevention: Serverless functions are designed to be stateless, so any state must live in external storage rather than on a particular host. This can simplify reasoning about the system and prevent some of the complex bugs that come with state management.

  5. Data Pipeline: With your IoT framework, you can set up automated, event-driven data pipeline triggers and database storage. Hook in a visualization framework or build your own internal dashboard on top, and you can monitor incoming data almost immediately (see the pipeline sketch after this list).

  6. Pay for Less: If you have a VM spun up, you’re paying for it whether you’re using the full extent of its resources or it’s just sitting idle. This isn’t ideal if your IoT device sends small packets of data infrequently, and even with thousands of devices constantly transmitting, there will be a lot of idle time between packets, which is quite inefficient. A serverless framework lets you pay only for what you use and handles the scaling for you without any configuration change, whether you have one device or a million. Additionally, if you’re using Azure, the platform can predict invocations (an approach originally proposed in the paper Serverless in the Wild) to dramatically reduce your cold starts.
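Here is the pipeline sketch referenced in item 5: a minimal, hypothetical example assuming an AWS setup in which an AWS IoT Core rule invokes a Lambda function that stores each telemetry message in a DynamoDB table. The table name DeviceTelemetry and the message fields are placeholders, not a prescribed schema.

    import time
    from decimal import Decimal

    import boto3

    # Hypothetical table with device_id as partition key and timestamp as sort key.
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("DeviceTelemetry")


    def handler(event, context):
        """Store one telemetry message so dashboards can query it later.

        An AWS IoT rule would be configured to invoke this function with the
        device's published JSON payload as the event.
        """
        item = {
            "device_id": str(event.get("device_id", "unknown")),
            "timestamp": int(time.time() * 1000),
        }
        # boto3's DynamoDB resource expects Decimal rather than float values.
        for field in ("temperature", "battery"):
            if event.get(field) is not None:
                item[field] = Decimal(str(event[field]))

        table.put_item(Item=item)
        return {"stored": True, "device_id": item["device_id"]}

Point a dashboard or visualization tool at the same table and you have an end-to-end, event-driven pipeline with no servers to manage.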

Figure 2 – Functions Running on a Serverless Platform

Downsides of Going Serverless

  1. Less Control: One downside of a serverless platform is that you get less fine-grained control over the instance running your code, since you don’t control the configuration of the underlying hardware. Serverless environments also cap processing power, memory, and execution time (AWS Lambda functions, for example, are currently limited to 15 minutes per invocation), whereas a VM can be vertically scaled quite heavily.

  2. Cold Start Latency: The other disadvantage is that because the platform handles scaling up the available compute power (i.e. spinning up more workers to run functions) during demand spikes, you have less control over the latency users see while a new worker starts. This can cause a long delay on the first request, known as “cold start latency,” which can be a concern in highly latency-sensitive applications, though it isn’t generally an issue; one common mitigation is sketched below.
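A common mitigation for cold starts is to do expensive one-time setup (SDK clients, connections, configuration) outside the function handler, so it runs once per worker rather than on every invocation. The sketch below assumes the same hypothetical DeviceTelemetry table as above, with device_id as partition key and timestamp as sort key. Managed options such as AWS Lambda’s provisioned concurrency, or the predictive scaling Azure applies as noted earlier, go further by keeping workers warm ahead of demand.

    import boto3
    from boto3.dynamodb.conditions import Key

    # Created once per worker at cold start; warm invocations reuse these
    # objects, so repeated requests skip the expensive client setup.
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("DeviceTelemetry")  # hypothetical table name


    def handler(event, context):
        """Return the most recent readings for one device."""
        response = table.query(
            KeyConditionExpression=Key("device_id").eq(event["device_id"]),
            ScanIndexForward=False,  # newest items first
            Limit=10,
        )
        return {"device_id": event["device_id"], "count": response["Count"]}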

Elastic Architecture

There are a number of cost comparisons of serverless architecture available on the web; we found the analyses from Serverless Transformation on Medium and The Burning Monk to be very helpful. Serverless architecture is highly applicable to IoT solutions and growing in popularity. With billions of IoT devices in use in the world today, having an elastic architecture is critical for getting to production quickly. Building with a serverless architecture will let you prototype quickly, fail fast, and beat your competition in the long run; just keep an eye on the under-the-hood limits so you get the most bang for your buck.
