Currency Conversions

I am trying to understand the best practices for handling currency conversion rates:

  • How do you handle currency conversions?
  • For reporting, do you lock the currency conversion rates by day/month/quarter?
  • If the rates are not locked and timestamped, do you always use the current market rate and apply it to all the metrics, i.e. even the historical records?
  • Do you lock the currency rates after the Finance team books the numbers for the month?

I intend to create the following columns:

  • metric in local currency
  • currency conversion rate (at that point in time)
  • metric converted to corporate currency

I want to know how people in this community are handling it.

At a fintech I worked for, we used ECB rates to set daily exchange rates (not just in BI, but in the production environment as well).

Our corporate currency was the Euro, so that made sense. I suppose you can use the ECB rates for any currency it offers, and there is probably a similar source from other central banks for other corporate currencies. The ECB exposes its currency rates in an XML format* as well if you want to write the ETL yourself, but there is also a Python library somewhere that you can invoke to get them without hassle.

Thus, you have a daily rate. Of course you also need to have a logic that decides which date to use for any particular row of your data.

*For the XML: go to a particular currency's page and you'll find a link in the bottom-right corner, “Alternative data format, XML (SDMX-ML) data file”

@Bijan I can start storing currency rates per day; however, I am not sure which rates should be used for month-end, quarter-end, or year-end reporting.

When you report facts as of the end of a period, I would use the rate from the last date of that period.

In my opinion, the rate from the last date of the period should be used for month-end reporting.

For quarter- and year-end reporting, all the metrics should be derived/aggregated from the month-end metrics.

With this approach, currency fluctuations won't drastically impact quarter- and year-end reporting.
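As a sketch of that aggregation, assuming a hypothetical `monthly_metrics` model (not from the thread) whose values are already converted at each month's month-end rate, the quarter-end figure is then just a sum of month-end figures:

```sql
-- Hypothetical monthly_metrics model: one row per month, with values
-- already converted at that month's month-end rate
select
    date_trunc('quarter', month_end_date) as quarter,
    sum(revenue_corporate) as revenue_corporate
from {{ ref('monthly_metrics') }}
group by 1
```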

Which exchange rate do you use? Day/month/quarter?

  • It depends. If these reports are being used by your finance team, ultimately it is up to the finance team to decide which exchange rate to use, and for the data team to implement it. That's because currency conversions have a real impact on their reports! For example, in Australia, it's a regulatory requirement that accountants use the month-end exchange rate for financial reporting.
  • If your finance team truly doesn’t have strong opinions on this, my preference is to use a daily exchange rate.

How do you model this?

Source data

We generate a table of exchange rates in our data warehouse. Currently, it’s a (sub-optimal) process of building a Google Sheet that uses the googlefinance function to grab exchange rates, and loading the Google Sheet into our warehouse.

Ideally, I’d like to use a Singer tap on Stitch for this, but there currently isn’t a reliable tap to do this.

Build a model of exchange rates

I would build a model with one record per currency, per day, with the fields:

  • date
  • source_currency: ISO currency code, e.g. AUD, GBP, USD
  • target_currency: ISO currency code, e.g. AUD, GBP, USD
  • rate: exchange rate to convert 1 unit of the source currency to the target currency

Any business logic on which exchange rate to use should be included in this model.
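For example, if your finance team wants month-end rates, that logic could live in this model. A minimal sketch, assuming your warehouse has a `last_day` function (availability and syntax vary by warehouse):

```sql
select
    date,
    source_currency,
    target_currency,
    rate,
    -- flag the last day of each month so downstream models can
    -- pick month-end rates without re-deriving this logic
    date = last_day(date) as is_month_end_rate
from {{ ref('exchange_rates') }}
```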

Perform the conversion

I would convert prices at the lowest granularity possible (i.e. on the orders table, rather than a rollup of daily_revenue).
For each model with multiple currencies, I tend to store:

currency # the ISO code for the currency, e.g. AUD, GBP, USD
price_usd # or whatever the target currency is

This is what the SQL would look like to convert to USD. Note that I’ve done it in a way that can be extended if you wish to convert to other currencies too.

```sql
with orders as (
    select * from {{ ref('orders') }}
),

exchange_rates as (
    select * from {{ ref('exchange_rates') }}
)

select
    orders.*,
    orders.price as price_local,
    orders.price * usd_exchange_rates.rate as price_usd

from orders

left join exchange_rates as usd_exchange_rates
    on usd_exchange_rates.target_currency = 'USD'
    and date_trunc('day', orders.ordered_at) = usd_exchange_rates.date
    and orders.currency = usd_exchange_rates.source_currency
```

Things to be mindful of:

  • The timezone for conversion - it’s up to your business to decide the logic here
  • Accidentally using the inverse rate (I’ve done this more times than I care to admit)
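On the inverse-rate mistake: if your model stores both directions of each currency pair, a quick sanity check is that a rate multiplied by its inverse should be roughly 1. A sketch:

```sql
-- Any rows returned here suggest an inverted (or stale) rate
select a.date, a.source_currency, a.target_currency
from {{ ref('exchange_rates') }} as a
join {{ ref('exchange_rates') }} as b
    on b.source_currency = a.target_currency
    and b.target_currency = a.source_currency
    and b.date = a.date
where abs(a.rate * b.rate - 1) > 0.01
```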

How do you handle currencies in your BI tool?

One thing to be really conscious of is that you want to prevent your users from trying to aggregate the price_local field. It makes no sense to average an order of $20USD and another order of $15GBP!

To do this, I recommend ensuring that currency is added to any BI queries that reference the price_local field – on Looker it might look like adding currency as a required field to any of your measures that reference price_local.

On other BI tools where users can query the underlying data model without the guard rails of LookML, the only way to truly prevent this from happening is to remove the price_local field and instead create a column for each currency, filling the columns with NULL when the record is not in that currency, e.g.:

```sql
case when currency = 'USD' then price_local end as price_local_usd,
case when currency = 'GBP' then price_local end as price_local_gbp,
case when currency = 'AUD' then price_local end as price_local_aud,
```

^ This approach isn’t very sustainable as your company grows though.

Advanced things you can do in Looker:

  • If converting to multiple target currencies, you can do some trickery with using parameters that are added as an always_filter, so users can switch their reports between USD, GBP etc, without having to choose different measures. Kind of related is this article I wrote a while back on a similar pattern for timezone conversion.
  • You can use custom formatting to get the right currency symbol for a price field, as described in this article

Thank you so much @claire. This is very helpful, I will review it with the finance team and ask for their opinion.


@claire I am attempting to join similar data to your example, but there are cases where the system we are pulling rates from (Netsuite / Xignite) does not have a rate for every date. Is there a way to make that join grab the latest rate if an exchange rate doesn't exist for that specific date?

Hey @ryanonymous :wave:

There are two options here:

:one: Create a valid_to column in your exchange_rates model, and then use an inequality join in orders.
To create the valid_to column, you need to check the next record, which you can do by using the lead function.
Depending on your SQL dialect, your exchange rate model might look like:

```sql
select
    source_currency,
    target_currency,
    rate,
    date as valid_from,

    -- use a window function to find the next record's date, and
    -- coalesce with today's date so the most recent record has a valid_to
    coalesce(
        lead(date) over (
            partition by source_currency, target_currency
            order by date
        ),
        current_date
    ) as valid_to

from exchange_rates
```

Then in your orders model, use something like:

```sql
select
    orders.*,
    orders.price * usd_exchange_rates.rate as price_usd

from orders

left join exchange_rates as usd_exchange_rates
    on usd_exchange_rates.target_currency = 'USD'
    and orders.currency = usd_exchange_rates.source_currency
    -- the order happens on or after valid_from
    and orders.ordered_at >= usd_exchange_rates.valid_from
    -- and before valid_to
    and orders.ordered_at < usd_exchange_rates.valid_to
```

:two: Use an “all days” table to get your exchange_rates model to have one record per day.
We call this building a “date spine”, and you can see an article about it over here :point_right: Finding active days for a subscription/user/account (date spining)
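A sketch of that second option, assuming the dbt_utils package is installed (for its `date_spine` macro, with a placeholder start date) and a warehouse whose window functions support `ignore nulls`:

```sql
with all_days as (
    -- one row per day between the start date and today
    {{ dbt_utils.date_spine(
        datepart="day",
        start_date="cast('2018-01-01' as date)",
        end_date="current_date"
    ) }}
),

rates as (
    select * from {{ ref('exchange_rates') }}
),

currency_pairs as (
    select distinct source_currency, target_currency from rates
)

select
    all_days.date_day as date,
    currency_pairs.source_currency,
    currency_pairs.target_currency,
    -- carry the most recent known rate forward over any gaps
    last_value(rates.rate ignore nulls) over (
        partition by currency_pairs.source_currency,
                     currency_pairs.target_currency
        order by all_days.date_day
    ) as rate
from all_days
cross join currency_pairs
left join rates
    on rates.date = all_days.date_day
    and rates.source_currency = currency_pairs.source_currency
    and rates.target_currency = currency_pairs.target_currency
```

This yields one rate per currency pair per day, so the simple equality join on date works again.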