For a cloud service there are two ways users interact with your product: through the user interface, and through the API. It's important to get both of those right, so that using your product is both intuitive and easy. Many books and blog posts have been written about API design and what makes a good API. I don't claim to be an expert on that topic, so I'll leave that exercise up to the reader. Instead, what I want to talk about is how we build our APIs at Polar Signals and why that works for us. Like any technology choice, this isn't a one-size-fits-all solution and isn't meant to be prescriptive.
Define the API
To define our APIs we start with Protocol Buffers (protobufs). Protobufs allow us to define structured data and couple it with API definitions. Protobufs have become widely used in our industry, and as a result there is a wide array of plugins and tooling that makes them incredibly powerful for developing APIs quickly.
One of the plugins that we use for our protobufs is grpc-gateway. grpc-gateway allows us to define RESTful endpoints alongside our gRPC endpoints, so that we can support both protocols while only having to implement the API once.
You can view these definitions in our open source continuous profiling project, Parca. Below is a snippet of the QueryService in Parca. It defines a set of APIs for querying and retrieving profiles from the database, with a grpc-gateway definition alongside each RPC definition.
// QueryService is the service that provides APIs to retrieve and inspect profiles
service QueryService {
  // QueryRange performs a profile query over a time range
  rpc QueryRange(QueryRangeRequest) returns (QueryRangeResponse) {
    option (google.api.http) = {
      get: "/profiles/query_range"
    };
  }

  // Query performs a profile query
  rpc Query(QueryRequest) returns (QueryResponse) {
    option (google.api.http) = {
      get: "/profiles/query"
    };
  }

  // Series is unimplemented
  rpc Series(SeriesRequest) returns (SeriesResponse) {
    option (google.api.http) = {
      get: "/profiles/series"
    };
  }

  // Labels returns the set of label names against a given matching string and time frame
  rpc Labels(LabelsRequest) returns (LabelsResponse) {
    option (google.api.http) = {
      get: "/profiles/labels"
    };
  }

  // Values returns the set of values that match a given label and time frame
  rpc Values(ValuesRequest) returns (ValuesResponse) {
    option (google.api.http) = {
      get: "/profiles/labels/{label_name}/values"
    };
  }
}
Generation
At Polar Signals, once we have a proto definition, we use Buf to generate code for us. We love using Buf because it's extremely fast at compiling protobufs and generating code, and it's also able to handle all of our external protobuf dependencies. It even allows us to publish our proto definitions to its own registry, so that others in the community can consume these definitions and build other tools that interact with these APIs. The published definitions for Parca can be found here.
Below is our buf.yaml definition in Parca, which declares our external dependencies. We no longer have to write bash scripts that download and vendor these files; Buf does that for us. Buf also handles linting of our protobuf files so that we have consistent definitions across our codebase.
version: v1
name: buf.build/parca-dev/parca
deps:
  - buf.build/googleapis/googleapis
  - buf.build/grpc-ecosystem/grpc-gateway
lint:
  use:
    - DEFAULT
    - COMMENTS
Buf generates a lot of code for us from a simple buf.gen.yaml file. Below is that file, where you can see we define generation of Go code, gRPC code, JavaScript clients, TypeScript clients, the grpc-gateway Go code, and even an OpenAPI definition.
version: v1
managed:
  enabled: true
  go_package_prefix:
    default: github.com/parca-dev/parca/gen/proto/go
    except:
      - buf.build/googleapis/googleapis
plugins:
  - name: go
    out: gen/proto/go
    opt: paths=source_relative
  - name: go-grpc
    out: gen/proto/go
    opt:
      - paths=source_relative
      - require_unimplemented_servers=false
  - name: js
    out: ui/packages/shared/client/src
    opt:
      - import_style=commonjs,binary
  - name: ts
    path: ./node_modules/.bin/protoc-gen-ts
    out: ui/packages/shared/client/src
    opt:
      - service=grpc-web
  - name: grpc-gateway
    out: gen/proto/go
    opt:
      - paths=source_relative
      - generate_unbound_methods=true
  - name: openapiv2
    out: gen/proto/swagger
Implementation
Once we've generated all our code, we only need to implement the API once; once it's implemented, we simply start a gRPC server and register our implementation with it. Our server has a handler function that serves both our gRPC and our grpc-gateway endpoints on the same port. To tell the two apart, we check the ProtoMajor version of the request and its Content-Type header.
And since we've generated JavaScript and TypeScript clients, when we go to implement a web user interface on top of our API, we only have to import those clients to interact with the server; none of that handling is written by hand. Since our APIs can be very data heavy, we generate our TypeScript clients using grpc-web so that the payloads can be transferred in the proto wire format.
Protobufs give us a single API definition from which we can generate a large amount of code and use it across many parts of our codebase. That lets us focus on the important functionality of our product and leverage all the work the community has already done to make building APIs easy.