Hi, I'm Andrew
A developer exploring Canada


Using Dataloader.KV to call APIs lazily

2019/11/04 - Code

Dataloader is a library for loading data in batches. It integrates very well with Absinthe, a GraphQL library for Elixir.

My background with Absinthe & Dataloader

I’ve been working a bit with GraphQL (Absinthe/Elixir) microservices in the past few months. For the most part, those services have been able to exist in their realm and not need to query outside of it. However, from time to time, there’s a need to incorporate a field - an association - within a schema which calls APIs outside of that realm. Sometimes those APIs are friendly to batch queries, and sometimes they’re not. Either way, I’ve found that Dataloader.KV can be used to effectively manage batching requests to those services.

When I first came across the problem of calling other APIs efficiently in Absinthe, I only knew of Dataloader.KV. I read a bit of the documentation and asked around the community for more information. Oddly, there seemed to be few resources on getting it going in a simple case. I hope this post helps other people jump-start their use of it in the future.

Getting started

Let’s skip ahead to the result first. I find myself making use of Dataloader.KV in two ways.

Pretend we have a User with an association, posts, hosted on an external service we'll call postal:

# 1. With the `dataloader/1` helper from Absinthe.Resolution.Helpers
object :user do
  field :posts, list(:post), resolve: dataloader(:postal)
end

# 2. Manually, with `Dataloader.load/4` and `on_load/2`
field :posts, list(:post) do
  resolve fn parent, args, %{context: %{loader: loader}} ->
    loader
    |> Dataloader.load(:postal, :posts, parent)
    |> on_load(fn loader ->
      loader
      |> Dataloader.get(:postal, :posts, parent)
      |> do_something_with_the_result()
    end)
  end
end

For those using Dataloader already, both of those usages should look familiar.

The question now is how to make it work… First, we’re going to create a module that will house the code for calling the external API.

We will need to define a function to give to Dataloader.KV that takes the association and the parent records and resolves the data; I've arbitrarily named it dataloader_postal_loader. I factor the actual resolving of data out of this function, and let it manage only the incoming options.

Note: one important detail for getting this right is that you will receive the parent records as a MapSet, and Dataloader expects you to return a map of those records to their results. The keys of that map must be exactly the records as they came in. If the records are Ecto structs and you need to load some associations before you can resolve the external data, hold onto the original key alongside your record with loaded associations, so you can return the original in the result map.
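As a sketch of that key-preservation pattern (the `:profile` association and the `fetch_external_data/1` helper here are purely hypothetical), it might look like:

```elixir
# `users` arrives exactly as Dataloader passed it in (a MapSet).
users
|> Enum.map(fn original_user ->
  # Preload what we need, but keep the untouched struct around...
  {original_user, Repo.preload(original_user, [:profile])}
end)
|> Map.new(fn {original_user, preloaded_user} ->
  # ...because the result map must be keyed by the original struct,
  # not the preloaded copy (they won't compare equal).
  {original_user, {:ok, fetch_external_data(preloaded_user)}}
end)
```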

defmodule UserAppWeb.Sources.Postal do
  @spec dataloader_postal_loader() :: ({:posts, keyword} | :posts, MapSet.t(User.t) -> map)
  def dataloader_postal_loader do
    fn
      # Signature for use with `dataloader` helper function
      {:posts, _opts}, users ->
        load_posts_for_users(users)
        
      # Signature for use with manual dataloading (without opts)
      :posts, users ->
        load_posts_for_users(users)
    end
  end
  
  # Example - just call an API which accepts bulk arguments
  @spec load_posts_for_users(MapSet.t(User.t)) :: map
  defp load_posts_for_users(users) do
    user_ids =
      users
      |> Stream.map(& &1.id)
      |> Enum.join(",")

    # Don't hard pattern match {:ok, response} in the real world.
    # You could also use Flow to concurrently make external calls for non-bulk APIs.
    {:ok, %HTTPoison.Response{body: body}} =
      HTTPoison.get("http://example.com/posts?user_ids=#{user_ids}")

    # Perhaps returns a JSON map of user_id -> [string];
    # Jason is assumed here for decoding, and JSON object keys are
    # strings, so we look up by the stringified id.
    result = Jason.decode!(body)

    # Wrapping each result in an ok tuple means `nil` is returned for the field
    # if no result was found for the user.
    users
    |> Stream.map(fn original_user ->
      {original_user, {:ok, Map.get(result, to_string(original_user.id))}}
    end)
    |> Map.new()
  end
end

Finally, you will need to add a Dataloader source to make use of this functionality, probably somewhere like your main schema.ex:

def context(ctx) do
  postal_loader =
    UserAppWeb.Sources.Postal.dataloader_postal_loader()
    |> Dataloader.KV.new()

  loader =
    Dataloader.new()
    |> Dataloader.add_source(..., ...) # Your other sources
    |> Dataloader.add_source(:postal, postal_loader)

  Map.put(ctx, :loader, loader)
end

You should now be able to make GraphQL queries which lazily call the external API.
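For example, assuming a `users` root query field exists in the schema, a query like this will batch all of the users' `posts` lookups into a single call to the external service:

```graphql
query {
  users {
    id
    posts
  }
}
```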

Further considerations

The above example covers only the very simple case; there are still some issues with managing errors from your external APIs. Instead of returning {:ok, value} or {:ok, nil} for every result, you can in fact return an error tuple. However, you will need to handle this specially in your field resolver. I found that I couldn't get the ok or error tuple out without first changing the dataloader get_policy setting to :tuples (see the dataloader options). This looks a bit like:

field :posts, list(:post) do
  resolve fn parent, args, %{context: %{loader: loader}} ->
    loader
    |> Dataloader.load(:postal, :posts, parent)
    |> on_load(fn loader ->
      loader
      |> Map.put(:options, Keyword.put(loader.options, :get_policy, :tuples)) # So that we can get errors out
      |> Dataloader.get(:postal, :posts, parent)
      |> case do
        {:ok, {:ok, _value} = success} ->
          success
        {:ok, nil} ->
          {:ok, nil}
        {:error, _error} = error ->
          error
      end
    end)
  end
end

If you have any better ways of managing that issue, I’d love to hear them - please get in touch!

- Andrew

Better Phoenix APIs - featuring Ecto, Absinthe & GraphQL - Part 1

2018/10/12 - Code

At Abletech, I've been using Elixir full time for almost 9 months. During that time I have authored or been involved in more than 10 separate codebases, mostly using Phoenix to serve JSON APIs. I have come away from those codebases with a few opinions around how best to manage data. Full disclosure: I'm not an expert, and much of what I've done has been driven through conversations with talented developers in the Elixir community.

Part 1 doesn't get into the code specifics of using Absinthe or GraphQL. Stay tuned for that.

Associations, scoping, consistency and boilerplate

One of the best ways of getting an application going quickly with Phoenix and Ecto is to use the built-in phx generators. These are useful for basic CRUD, and will usually work well while you've only got a few schemas involved in your application.

However, complexity increases very quickly when you add authentication, multiple associations and scoping - in particular, when different scenarios require different associations to be preloaded and scoped.

My first low-tech solution for this is based on the recommended usage of dataloader, a package I'll get into more later.

Let’s take a look at some code and how it might progress naturally in a simple application. Consider listing users and their posts for a simple blog.

  1. The generated boilerplate

@spec list_users() :: list(User.t)
def list_users do
  Repo.all(User)
end

This works perfectly for a simple concept.

  2. The associated schema

If we add in posts, we can simply load them for each user by preloading like so.

@spec list_users() :: list(User.t)
def list_users do
  User
  |> Repo.all()
  |> Repo.preload([:posts])
end

However, chances are we don’t always want to preload the posts each time we load users, so we add a second function or clause;

@spec list_users() :: list(User.t)
def list_users do
  Repo.all(User)
end

@spec list_users_with_posts() :: list(User.t)
def list_users_with_posts do
  User
  |> Repo.all()
  |> Repo.preload([:posts])
end

This works, but is not sustainable if we start adding other associations.

  3. The preload list

That brings us to adding a list with the associations we want to preload.

def list_users(preloads \\ []) when is_list(preloads) do
  User
  |> Repo.all()
  |> Repo.preload(preloads)
end

Which works well until you need to do a search for users’ names. Perhaps in this scenario, you don’t want to preload associations - or only preload a subset of them.

  4. Keyword opts

First, a totally naive implementation:

def list_users(opts \\ []) do
  users = User
  users = if Keyword.has_key?(opts, :name) do
    name = opts[:name]

    if is_nil(name) do
      from u in users, where: is_nil(u.name)
    else
      from u in users, where: u.name == ^name
    end
  else
    users
  end
  
  users = if Keyword.has_key?(opts, :preloads) do
    preloads = opts[:preloads]
    from u in users, preload: ^preloads
  else
    users
  end
  
  Repo.all(users)
end

This is obviously not a good functional approach to the problem. Since we’re always building on the queryable depending on the next opt, we can simplify this using Enum.reduce:

def list_users(opts) do
  users = Enum.reduce(opts, User, fn
    {:name, nil}, users ->
      from u in users, where: is_nil(u.name)
    {:name, name}, users when is_binary(name) ->
      from u in users, where: u.name == ^name
    {:preloads, preloads}, users ->
      from u in users, preload: ^preloads
  end)
  
  Repo.all(users)
end

Bam! This is much more elegant and takes a functional approach. It isn't something that came to me intuitively - it's a tip I received from @dpehrson on the Elixir Slack, and it has helped pave the way to how I currently handle Ecto queries. More on that later.

At some point, it’s likely we want to apply these filters and options to a query where we want only one result. Something like a get_user function. Let’s extract out all that option handling and let list_users and get_user handle their real responsibilities.

  5. query/2

Introducing the query/2 function. It's totally derived from how dataloader is used, and it did not make sense to me at first. However, its value became clear after going through this process once or twice.

@spec query(Ecto.Queryable.t, keyword) :: Ecto.Queryable.t
def query(queryable, opts \\ [])
def query(User, opts) do
  Enum.reduce(opts, User, fn
    {:name, nil}, users ->
      from u in users, where: is_nil(u.name)
    {:name, name}, users when is_binary(name) ->
      from u in users, where: u.name == ^name
    {:preloads, preloads}, users ->
      from u in users, preload: ^preloads
  end)
end

@spec list_users(keyword) :: {:ok, list(User.t)}
def list_users(opts \\ []) do
  users =
    User
    |> query(opts)
    |> Repo.all()
    
  {:ok, users}
end

@spec get_user(keyword) :: {:ok, User.t} | {:error, {:not_found, {:user, keyword}}}
def get_user(opts \\ []) do
  user =
    User
    |> query(opts)
    |> Repo.one()
    
  case user do
    %User{} = user ->
      {:ok, user}
    nil ->
      {:error, {:not_found, {:user, opts}}}
  end
end

Now we've factored the option handling out so that we can easily apply arbitrary filtering, pagination, scoping or otherwise in a single location. I have found this approach considerably more flexible and easier to maintain, and it's worth the small amount of extra work as soon as even one association is introduced. Not only that, it works just fine with multiple schemas handled in the same context, by pattern matching on the queryable argument in query/2.
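As a sketch of that multi-schema pattern (the Post schema and its :title filter here are purely illustrative), a second query/2 clause can live alongside the first, and the pattern match dispatches each queryable to its own option handling:

```elixir
def query(Post, opts) do
  Enum.reduce(opts, Post, fn
    {:title, title}, posts when is_binary(title) ->
      from p in posts, where: p.title == ^title
    {:preloads, preloads}, posts ->
      from p in posts, preload: ^preloads
  end)
end
```

A list_posts/1 or get_post/1 function can then pipe through query/2 in exactly the same way as the user functions above.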

In part 2, we’ll introduce authorisation to the equation, and start discussing how Absinthe can help us solve some issues.

- Andrew

A day in the life of an Abletech developer.

2018/02/22 - Code

This article was originally published on Medium.

Working remotely as a developer is a situation many people would love to find themselves in. For the last 6 months, I’ve been lucky enough to do so with Abletech.

My base location of choice has been Calgary, Canada. While Wellington has been enjoying a unique, spectacularly warm and windless summer, I’ve been enjoying freezing rain, ice, snow and temperatures plummeting to -32ºC. Sounds shocking — but it’s not so bad.

A day of winter

The day usually starts in one of two ways.

  1. My alarm goes off and I wake up like any normal person
  2. An extremely loud whirring noise anytime between 4am and 6.30am, caused by two people clearing the paths around our apartment building with snow blowers (imagine leaf blower, but louder).

While the former is a much nicer way to wake up, the latter means only one thing; snow. I get out of bed and peek out the blinds in our living room. It’s easy to tell just how much snow there is by looking at the balcony railing. Some days it looks like someone sprinkled ground ice delicately on top — and some days it looks like an extra white partition!

My next stop is a bowl of cereal in front of the weather forecast for the day. The weather here is very changeable, and if you don’t dress right for the day, you can find yourself in a serious spot. Temperatures can rise from -20ºC to positive degrees within a few hours if the wind blows the right way — or vice versa if we get some cool arctic air blow through. You can read more about how I dress for winter here.

Once I’m ready to go, I take a short walk to a bus stop downtown and wait in a semi-enclosed shelter, usually arriving around 8am. Depending on the time of year, it’s either starting to get light… or totally pitch black. There’s something a little odd about walking to work at 8am when it’s pitch black — but those days don’t stay around too long thankfully.

The bus takes 20 minutes or so to get into Kensington, a nice little suburb just outside of downtown. I jump out of the bus — sometimes onto a clear footpath, sometimes into snow deeper than my boots. The paths outside of downtown are often not cleared early in the morning after a good dump of snow, and if they are, there’s a high chance you’re walking on ice, not on pavement. I head inside to Assembly CS where I rent a space, and take my boots off. I grab a coffee (to start the day and to defrost!), chat with a couple of developers around me, sit down, and start catching up on what happened the night before.

Remote working

When I agreed with Abletech to work remotely, I was a little nervous about how it might go — there are endless numbers of articles online about the challenges teams face, and the solutions they come up with to tackle them. I was expecting it to be hard. However, what I found in reality wasn’t as difficult as others had made it out to be. I put this down to 3 ideas which have so far helped me stay productive and sane:

1. Communicate often, and communicate early.

When you’re not able to talk to someone face to face, it’s important to just be in touch. I like to communicate any problems I anticipate happening as soon as possible so they can be resolved before they happen, or even while I’m asleep 💤

2. Try and keep the pipeline full

Truly running out of things to do should be a rarity. Working in another timezone, in another part of the world, can make getting extra tasks or details a challenge. The best way to combat this is to take from #1 and try to stock up on work before running out. Working in a team that operates scrum or kanban boards with stories which anyone can pick up is a huge help.

3. Iterate

In true agile fashion, I believe the most important point is to evaluate what is and is not working, and then improve. Weekly catch ups provide an opportunity to address anything necessary and make changes quickly.

Lunch

My lunch of choice used to be a bottle of Soylent. Soylent is a meal of pretty much everything you need — however — Canada has decided it disagrees about what a meal constitutes, and therefore I can no longer sit inside the warm, cosy office at lunch time… I must venture out into the wildlands. Boots on, jacket on, gloves, beanie (toque?!), sunglasses — check! Often how far I go for lunch is proportional to the temperature. The lower the temperature, the less I’m willing to walk 😎. Do you want to be out grabbing lunch in -25ºC? No. For the warmer days, Kensington is relatively good on choice. There are fast foods, restaurants, cafes, a supermarket. Anything is within reach depending on the weather.

End of the working day

The sun sets around 4.30pm. By the time I leave, around then or 5pm, it’s dark. Start dark, end dark. Thankfully the in-between is almost always sunny: Calgary is the sunniest city in Canada!

Getting home is another bus and train (the train is faster on the way home). Peak hour on the C-Train in the downtown core is like walking into a container of sardines most days. I walk a short distance, and I’m home again.

After hours

Coming from cities in New Zealand which don’t see snow, I absolutely love the amount of winter activities around Calgary. Skiing is a 20 minute drive away, ice skating a 5 minute walk to the plaza, a toboggan-worthy hill only 10 minutes away. There’s plenty to do in the snow — even taking a walk alongside the river is totally different to the summer. While the river is running, it’s hidden by a thick layer of snow-capped ice. The snow around the parks shows the prints of the wildlife; snowshoe hares, squirrels. They’re even cuter in real life than you’d think.

Calgary has been a great city in our time so far. It’s clean, easy to get around and has some incredible natural beauty around it. In saying that, it just does not and will not beat Wellington on food and coffee. I look forward to the day I get back to catch up with the team over a coffee from Flight Hangar, Nasi Goreng from Little Penang, or Char Kuey Teow from Cinta.

- Andrew