Uptime monitor in Elixir & Phoenix: Data models
Data models, contexts and modular design are all part of an Elixir/Phoenix application's architecture. Learn how to build and handle them with the third part of our tutorial!
Table of Contents
- The requirements
- Contexts and modular design
- The Schema module
- Data models
- Ecto migrations
- A word of conclusion
Welcome to the third part of the article series about creating an uptime monitor in Elixir. Today we will build the data models in our application. It is going to be a bit of a challenge, because the task touches on application architecture.
New to the Elixir/Phoenix uptime monitor tutorial? Start from the project setup
The requirements
Going back to the first article: we want our application to be a tool where you enter your website URL and it gathers uptime and response time data. The uptime will be shown with a given accuracy, depending on the chosen timespan. The same goes for the response time.
We can predict the models our application will need:
- user & session related models, which we created in the previous article,
- website model, which will store some basic information about the website,
- measurement model, which will hold a single measurement of a website.
The above list leaves us with just two basic models. Seems easy, huh? :-)
Contexts and modular design
What are contexts in Elixir?
Contexts in Elixir are modules that expose and group related functionality. Any time you call a standard library function, like Enum.map/2, you are calling into a context: the Enum module groups all the functions for working with enumerables.
What is modular design?
Modular design is a design pattern where you split the whole system into smaller, independent parts, each grouping some functionality. These parts are called modules. In the perfect scenario, they can be independently replaced and exchanged between systems. Modular design is a broad concept, related not only to programming but to engineering in general.
Given that, contexts can be a projection of modular design in our code. They allow us to split our application into smaller parts. Our task is to group related functionalities of our application into proper contexts, so they follow the modular design rules. That’s why we are going to put our models in two contexts: Sites and Analytics.
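To make the idea concrete, here is a minimal sketch of a context in plain Elixir, independent of Phoenix and Ecto. The module and function names are hypothetical; in the real application the context will delegate to Ecto schemas and the Repo instead of working on plain maps.

```elixir
# A hypothetical, self-contained sketch of a context module.
defmodule Sites do
  @moduledoc "Groups all website-related functionality behind one API."

  # Build a new website entry from user input.
  def new_website(user_id, url) do
    %{user_id: user_id, url: url}
  end

  # Return only the websites belonging to the given user.
  def list_websites(websites, user_id) do
    Enum.filter(websites, fn site -> site.user_id == user_id end)
  end
end
```

Callers only ever interact with the Sites module and never need to know how websites are stored, which is exactly what lets us swap the storage layer later without touching the rest of the system.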
The Schema module
One of the most important programming principles says: don't repeat yourself. That is why we are going to create an EMeter.Schema module in the apps/e_meter/lib/e_meter folder with a __using__ macro inside, which will cover the imports and module attributes used in all our schemas.
defmodule EMeter.Schema do
  defmacro __using__(_) do
    quote do
      use Ecto.Schema
      import Ecto.Changeset

      @primary_key {:id, :binary_id, autogenerate: true}
      @foreign_key_type :binary_id
    end
  end
end
We used two module attributes:
- @primary_key, which configures the schema primary key. It expects a {name, type, options} tuple. We are going to use binary IDs by default - that is why it is configured this way.
- @foreign_key_type, which configures the default foreign key type used by belongs_to associations.
Data models
As we said before, contexts are extremely important. This is why, before going further into the code, we need to build a proper structure for our models. For now, our e_meter app has one context: Accounts, which was automatically generated and covered in the previous article.
Our mission is to create the Sites and Analytics contexts in a way that is consistent with the Accounts context. That is why we should add sites.ex and analytics.ex to our e_meter/lib/e_meter folder; those will hold the general context functions. In the same place, let's also create sites and analytics folders, which will hold our models.
With contexts defined this way, we can start coding. As we determined before, we are going to create a Website model in the Sites context first. To do so, add the website.ex file in our sites folder.
defmodule EMeter.Sites.Website do
  use EMeter.Schema

  schema "websites" do
    timestamps()
  end
end
First of all, we added use EMeter.Schema at the top of our module. It makes use of our previously defined schema module with all the utilities defined there. Then we used the Ecto schema macro to define the mapping of database fields.
In the next step, we need to think of what exact data we will store in our model.
defmodule EMeter.Sites.Website do
  use EMeter.Schema

  @required_fields ~w(user_id url)a
  @optional_fields ~w()a

  schema "websites" do
    field :url, :string

    belongs_to :user, EMeter.Accounts.User

    timestamps()
  end
end
For our Website we need its URL to check the response times. We should also define that it belongs to a :user. belongs_to defines a one-to-one or many-to-one association with another schema. We will use the default belongs_to settings, which means it will use :user_id as the foreign key.
There are also @required_fields and @optional_fields defined above the schema. These module attributes will be used in the changeset/2 function. They are defined with sigils, which are Elixir's tool for working with textual representations. A sigil always starts with a tilde followed by a character that identifies it: ~w means a word list, and the a modifier after the closing delimiter turns it into a list of atoms.
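A quick check of what these sigils expand to (plain Elixir, no Ecto needed):

```elixir
# ~w(...)a splits the words and converts each one into an atom,
# so the attribute is just a regular list of atoms.
required_fields = ~w(user_id url)a
optional_fields = ~w()a

# Concatenating them yields the full list of castable fields,
# exactly what changeset/2 will pass to cast/3 later on.
castable = required_fields ++ optional_fields
```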
You can also notice timestamps() at the end of the schema. It is a macro that automatically generates the :inserted_at and :updated_at fields, a small convenience shipped with Ecto.Schema.
Last but not least, let's create a changeset/2 function, which will cast the input and validate it.
defmodule EMeter.Sites.Website do
  use EMeter.Schema

  @required_fields ~w(user_id url)a
  @optional_fields ~w()a

  schema "websites" do
    field :url, :string

    belongs_to :user, EMeter.Accounts.User

    timestamps()
  end

  def changeset(website, attrs \\ %{}) do
    website
    |> cast(attrs, @required_fields ++ @optional_fields)
    |> validate_required(@required_fields)
    |> unique_constraint(:user, name: :unique_website)
  end
end
That is where our module attributes defining the required and optional fields come into play. The function takes the existing Website, casts the given attributes onto it, and validates that all required fields are present. Finally, unique_constraint checks whether the given URL already exists for this user, preventing duplicate user/URL pairs; it relies on a unique index that we will create later in the migration.
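To make the flow concrete, here is a pure-Elixir sketch of what cast and validate_required do conceptually. This is an illustration only, operating on plain maps; it is not Ecto's actual implementation, and the module name is hypothetical.

```elixir
# Hypothetical, simplified stand-ins for Ecto's cast/validate_required.
defmodule ChangesetSketch do
  # "Cast" keeps only the permitted fields from the raw attributes.
  def cast(attrs, permitted) do
    Map.take(attrs, permitted)
  end

  # "Validate required" collects every required field that is missing.
  def validate_required(changes, required) do
    missing = Enum.reject(required, &Map.has_key?(changes, &1))
    if missing == [], do: {:ok, changes}, else: {:error, missing}
  end
end
```

Feeding it attributes without a user_id would return {:error, [:user_id]}, which mirrors how a real Ecto changeset would become invalid with an error on that field.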
Given that, let’s create our second model: Measurement. It belongs to the Analytics context.
defmodule EMeter.Analytics.Measurement do
  use EMeter.Schema

  @required_fields ~w(response_time status_code website_id)a
  @optional_fields ~w()a

  schema "measurements" do
    field :response_time, :integer
    field :status_code, :string

    belongs_to :website, EMeter.Sites.Website

    timestamps()
  end

  def changeset(measurement, attrs \\ %{}) do
    measurement
    |> cast(attrs, @required_fields ++ @optional_fields)
    |> validate_required(@required_fields)
  end
end
We are using the same macros and attributes to define it. It has two fields - :response_time and :status_code and belongs to the :website. All of its fields are required.
Ecto migrations
With the data models implemented in our code, we also need the database to reflect these changes. That is why we need to create migrations, which will set up our database tables. Ecto.Migration defines macros that help you operate on the database structure. To make use of it, we have to navigate to our postgres application.
cd apps/postgres
To create a migration we are going to use the mix ecto.gen.migration command.
mix ecto.gen.migration add_basic_models
It should generate a new migration module with a change/0 function.
defmodule Postgres.Repo.Migrations.AddBasicModels do
  use Ecto.Migration

  def change do
  end
end
To add a new table we will use the create/2 macro together with table/2, which takes the table name and a list of options.
defmodule Postgres.Repo.Migrations.AddBasicModels do
  use Ecto.Migration

  def change do
    create table(:websites, primary_key: false) do
    end
  end
end
We name our table websites and tell Ecto not to generate the primary key automatically. That is because we want to add the primary key manually as a binary id field, to match our EMeter.Schema module. The last step is to add the fields to be created in the table.
defmodule Postgres.Repo.Migrations.AddBasicModels do
  use Ecto.Migration

  def change do
    create table(:websites, primary_key: false) do
      add :id, :uuid, primary_key: true
      add :url, :string, null: false
      add :user_id, references(:users, type: :binary_id, on_delete: :delete_all)

      timestamps()
    end
  end
end
We create the primary key field, :id, with type :uuid. The second field is :url, as in the schema. Then we add the :user_id field, which references the users table. In the reference we also need to define its type, :binary_id, because it differs from the default. We also specify what should happen after a user is deleted with the on_delete: :delete_all option: deleting a user will cascade and remove all of their websites from our application. The last thing in the migration is timestamps(), which creates the :inserted_at and :updated_at fields.
In the end, we want to create a unique index on user_id and url, so a user cannot add the same URL twice.
defmodule Postgres.Repo.Migrations.AddBasicModels do
  use Ecto.Migration

  def change do
    create table(:websites, primary_key: false) do
      add :id, :uuid, primary_key: true
      add :url, :string, null: false
      add :user_id, references(:users, type: :binary_id, on_delete: :delete_all)

      timestamps()
    end

    create unique_index(:websites, [:url, :user_id], name: :unique_website)
  end
end
Now we do the same for the measurements table and we end up with the finished migration. Note that a measurement references the websites table, not users.
defmodule Postgres.Repo.Migrations.AddBasicModels do
  use Ecto.Migration

  def change do
    create table(:websites, primary_key: false) do
      add :id, :uuid, primary_key: true
      add :url, :string, null: false
      add :user_id, references(:users, type: :binary_id, on_delete: :delete_all)

      timestamps()
    end

    create table(:measurements, primary_key: false) do
      add :id, :uuid, primary_key: true
      add :status_code, :string, null: false
      add :response_time, :integer, null: false
      add :website_id, references(:websites, type: :binary_id, on_delete: :delete_all)

      timestamps()
    end

    create unique_index(:websites, [:url, :user_id], name: :unique_website)
  end
end
The last step is to run the migration. Go back to the root folder and run the mix ecto.migrate command. You should see a message in your console confirming the migration, along with a list of all the changes.
A word of conclusion
In this article, we learned the basic concepts of model creation in Elixir. This part might seem trivial to some, but modular design and splitting functionality between the right modules is one of the foundations of clean code. Therefore, it's worth repeating as often as possible.
The original plan for the next article was to work on data gathering. However, I think we will work a bit with controllers and routing instead, which will let us build some basic functionality on top of our models. Until then!