Amazon Bedrock vs SageMaker JumpStart

Artificial Intelligence & MLOps
Data Modernization & Analytics
Video

Learn about the differences in how Amazon Bedrock and SageMaker JumpStart help you work with foundation models for Generative AI use cases on AWS.


AWS offers a few different approaches to running foundation models that can help you create Generative AI solutions.

Beginning at the lowest level, but with the highest DevOps tax, is EC2. With EC2 you have access to a wide variety of instance types and capabilities, including accelerators like Inf2 instances, but you're also stuck managing the minutiae of the instances themselves. Most companies prefer to have their developers leverage an orchestration layer like Amazon SageMaker, which removes much of the undifferentiated heavy lifting in ML workloads. SageMaker still requires a developer to select the underlying instances, but they no longer have to maintain them at a basic level, so the DevOps tax is lower. Finally, there's a new entry in the AI inference service ecosystem: Amazon Bedrock. Bedrock provides a simple InvokeModel API that allows developers to access foundation models using familiar SDKs and HTTP APIs. With Bedrock there's zero DevOps tax, and it's trivial to use.

Amazon SageMaker JumpStart

You can trade DevOps tax for managed compute with Amazon SageMaker by leveraging SageMaker JumpStart, a model hub that provides access to foundation models. SageMaker JumpStart allows you to provision any model, including foundation models like Falcon-40B, Llama 2, or really any of the models available on Hugging Face, and it facilitates their deployment onto SageMaker compute instances. SageMaker manages the underlying compute and provides an HTTP endpoint that you can invoke from your code. It's also possible to fine-tune the models within SageMaker and add examples relevant to specific industries.
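As a rough sketch of what calling such an endpoint looks like from application code (the payload shape here follows the common Hugging Face text-generation container format, and the helper names and endpoint name are illustrative assumptions, not part of JumpStart itself):

```python
import json


def build_payload(prompt: str, max_new_tokens: int = 128) -> str:
    """Build a request body for a Hugging Face text-generation endpoint.

    The "inputs"/"parameters" shape is common to the Hugging Face LLM
    containers that JumpStart deploys; exact parameter names vary by model.
    """
    return json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens, "temperature": 0.2},
    })


def invoke_jumpstart_endpoint(endpoint_name: str, prompt: str) -> str:
    """Call a SageMaker endpoint that JumpStart has already deployed.

    Requires AWS credentials; endpoint_name is whatever SageMaker assigned
    (or you chose) at deploy time.
    """
    import boto3  # deferred import so the payload helper runs without AWS deps

    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=build_payload(prompt),
    )
    # Response schemas vary by model, so return the raw JSON body.
    return response["Body"].read().decode("utf-8")
```

You still own the endpoint's lifecycle here: the instance type you chose at deploy time is billed whether or not requests are flowing.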

Amazon Bedrock

If you’d rather not worry about the underlying compute at all, look no further than Amazon Bedrock. Bedrock provides a simple API to invoke a foundation model, and it is metered on the number of input and output tokens rather than on the underlying compute. Each model in Bedrock has a different per-token cost, enabling you to optimize throughput and utilization in the way that fits each use case. We like to think of it as serverless foundation model inference.
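A minimal sketch of calling InvokeModel through boto3 (the model ID shown and the helper names are illustrative assumptions; each model family on Bedrock defines its own request and response schema):

```python
import json


def build_claude_body(prompt: str, max_tokens: int = 256) -> str:
    """Request body for an Anthropic Claude model on Bedrock.

    This follows the Anthropic Messages format; other model families
    (Titan, Llama, etc.) expect different body schemas.
    """
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })


def invoke_bedrock(prompt: str,
                   model_id: str = "anthropic.claude-3-haiku-20240307-v1:0") -> str:
    """Call Bedrock's InvokeModel API.

    Requires AWS credentials and model access enabled in your account/region.
    """
    import boto3  # deferred import so the body helper runs without AWS deps

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(modelId=model_id,
                                   body=build_claude_body(prompt))
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```

Note there is no endpoint to deploy or tear down: you pay per token on each call, which is what makes the "serverless inference" framing apt.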

So what does all this mean for you?

When ranking each solution on ease of use, Bedrock comes out on top, SageMaker sits in the middle, and EC2 remains reliable but cumbersome. How customers take advantage of each solution will depend on the teams, the access patterns, and the use cases involved. Different customers may use varying mixes of some or all of these services. That AWS provides this flexibility is one of the tremendous advantages of working with a larger cloud provider.


Next Steps

We hope this provides you with a quick overview of Amazon SageMaker and Bedrock and how they can fit into your AI initiatives on AWS.

Are you exploring ways to take advantage of Analytical or Generative AI in your organization? Partnered with AWS, Caylent's data engineers have been implementing AI solutions extensively and are also helping businesses develop AI strategies that will generate real ROI. For some examples, take a look at our Generative AI offerings.



Randall Hunt

Randall Hunt, VP of Cloud Strategy and Innovation at Caylent, is a technology leader, investor, and hands-on-keyboard coder based in Los Angeles, CA. Previously, Randall led software and developer relations teams at Facebook, SpaceX, AWS, MongoDB, and NASA. Randall spends most of his time listening to customers, building demos, writing blog posts, and mentoring junior engineers. Python and C++ are his favorite programming languages, but he begrudgingly admits that JavaScript rules the world. Outside of work, Randall loves to read science fiction, advise startups, travel, and ski.
