Yes, TDD slows you down


I recently had the opportunity to reflect on testing practices, and in particular on the different layers of testing: unit testing, integration testing, and acceptance testing.

The test pyramid suggests that each level has an associated cost. That cost then translates into the pace at which you can make changes to your system.

Quite frequently, early-stage organizations tend to focus less on unit tests and more on acceptance testing. This is often the case in the early days of a new product: requirements fluctuate heavily, and you might feel that unit tests will slow you down when it comes to making radical changes to the implementation.

By not investing in unit testing, though, you are giving up on the opportunity to do TDD. Yes, you will still be able to write acceptance tests before writing any code. But such coarse-grained tests will not constrain the way you structure your code, your components, and how they communicate with each other.

Your acceptance tests will guarantee that your system meets the requirements, now.

TDD is not just about testing

TDD is a software development methodology that strictly dictates the order you do things in. And it’s not dictating that order for the sake of it. Writing tests first and letting that drive the functional code you write after is going to impact the way you design and architect the system.

This is going to have numerous benefits: your code will be modular, loosely coupled, and highly cohesive, as well as clearly documented and easy to extend.

Unit testing is one aspect of TDD. It helps you at a fundamental level of your system: the implementation level, which ultimately determines how agile you will be at making changes in the future.

If you develop a system without a TDD approach you won’t necessarily have a system without tests. But you will most likely have a system that is hard to extend and hard to understand.

This is the reason why I’m more and more convinced that you can’t really be that liberal about your testing layers. If you’re building new functionality and decide it will be tested solely through acceptance tests, you’re most likely going to miss out on the added benefits of TDD at your service layer.

When you mix TDD with non-TDD you’re not just compromising on the tests at certain layers of your test stack. You’re compromising on the architecture of your system.

Do you really need to go faster?

Of course you do! Who thinks that going slower is better?

The catch is that the question should usually be rephrased as:

Do you really need to go faster now, and go slower later?

I think this way of phrasing it helps you make the right decision. As I said at the beginning of this post, going faster now might actually be the right decision. TDD might be what slows you down now, and you might not be able to afford slowing down right now. But my advice is to keep asking that question as you keep making compromises.

When you defer the adoption of TDD you’re not just skipping unit tests. You’re developing code unconstrained, increasing the opportunities for high coupling and low cohesion. This inevitably makes you slower as you go. And it will keep making you slower until you incur such a high cost of change that implementing even the most trivial feature will seem like an insurmountable challenge. Not to mention the impact on your team’s morale, and on the confidence that the rest of the organization has in your team.

Be conscious

As I said at the beginning of this post, giving up on unit tests might actually be the right thing to do for you, now. What’s important to realize, though, is that giving up on TDD at a certain layer doesn’t just mean giving up on tests. It means giving up on a framework that guides you through specific principles. If you are not going to do TDD, do you have an alternative set of principles to stick to? Do you have an alternative way of structuring your system effectively, to help you scale? Are you going to have the confidence to make changes? Are new team members going to be able to onboard easily and make changes without requiring additional contextual knowledge?


If you enjoyed this post, chances are you’ll enjoy what I post on Twitter. Thank you for getting to the end of this article.

Testing LiveData on Android


Testing LiveData represents an interesting challenge: the very peculiarities that make it convenient for your Android app also get in the way of writing straightforward unit tests.

I’ve recently started building an Android app to stay motivated on my journey to learn Kotlin. My most recent experience has been with Architecture Components, and this brief blog post will focus on unit testing your DAO when using LiveData.

What is LiveData?

LiveData is a lifecycle-aware, observable data holder that helps you react to changes in your data source. In my case, I’m using it in combination with Room to make sure my app reacts to new data becoming available in my database.

Our simple DAO

For this blog post let’s just pretend we have a very simple DAO that looks like the following:
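
The snippet below is a minimal sketch of what such a DAO could look like; the Post entity is hypothetical and its exact fields do not matter for this post.

import androidx.lifecycle.LiveData
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.Insert
import androidx.room.PrimaryKey
import androidx.room.Query

// A hypothetical entity backing the "posts" table.
@Entity(tableName = "posts")
data class Post(
    @PrimaryKey(autoGenerate = true) val id: Long = 0,
    val title: String
)

@Dao
interface PostDao {
    // Room wraps the query result in LiveData, so observers are
    // notified whenever the posts table changes.
    @Query("SELECT * FROM posts")
    fun getAll(): LiveData<List<Post>>

    @Insert
    fun insert(post: Post)
}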

This is just a DAO that helps you fetch posts.

As you can see, the return type of the function is not a plain List<Post>: it is wrapped in a LiveData instance. This is great because we can get hold of our list of posts once and then observe it for changes and react to them.

Let’s test it

The Android Developer documentation has a neat example on how to unit test your DAO:
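
Adapted to our PostDao, a first attempt along those lines might look like the sketch below; AppDatabase is a hypothetical Room database class exposing postDao().

import android.content.Context
import androidx.room.Room
import androidx.test.core.app.ApplicationProvider
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class PostDaoTest {

    private lateinit var db: AppDatabase
    private lateinit var postDao: PostDao

    @Before
    fun createDb() {
        val context = ApplicationProvider.getApplicationContext<Context>()
        // An in-memory database is enough to exercise the DAO.
        db = Room.inMemoryDatabaseBuilder(context, AppDatabase::class.java)
            .allowMainThreadQueries()
            .build()
        postDao = db.postDao()
    }

    @After
    fun closeDb() {
        db.close()
    }

    @Test
    fun insertAndGetPosts() {
        postDao.insert(Post(title = "Hello, Room"))
        // Naive assertion: the LiveData value has not been populated yet,
        // so this fails even though the row is in the database.
        assertEquals(1, postDao.getAll().value?.size)
    }
}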

This simple test aims to verify that the data is exposed correctly and that, once a post is added to the database, it is reflected by the getAll() invocation.

Unfortunately, by the time we assert on it, the value of the LiveData instance will not yet be populated, and the test will fail. This is because LiveData uses a lifecycle-aware, asynchronous mechanism to populate the underlying data and expects an observer to be registered in order to be informed about data changes.

Observe your data

LiveData offers a convenient observe method that allows for observing the data as it changes. We can use it to register an observer that will assert on the expected value.

The observe method has the following signature:

void observe(LifecycleOwner owner, Observer<T> observer)

It expects an observer instance and the owner of its lifecycle. In our case, we’re only interested in keeping the observer around just long enough to assert on the changed data. We don’t want the same assertion to be evaluated every time the data changes.

Own your lifecycle

What we can do, then, is build an observer instance that owns its own lifecycle. After handling the onChanged callback we will mark the observer’s lifecycle as destroyed and let the framework do the rest.

Let’s see what the observer code looks like:
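
Here is a sketch of such an observer, built on top of androidx.lifecycle’s LifecycleRegistry; the class name and shape are mine, not a library API.

import androidx.lifecycle.Lifecycle
import androidx.lifecycle.LifecycleOwner
import androidx.lifecycle.LifecycleRegistry
import androidx.lifecycle.Observer

// An observer that is also its own LifecycleOwner: it starts out resumed,
// so LiveData delivers values to it, and it destroys its own lifecycle
// right after the first onChanged call.
class OneTimeObserver<T>(private val handler: (T) -> Unit) : Observer<T>, LifecycleOwner {

    private val lifecycleRegistry = LifecycleRegistry(this)

    init {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_RESUME)
    }

    override fun getLifecycle(): Lifecycle = lifecycleRegistry

    override fun onChanged(t: T) {
        handler(t)
        // Moving to ON_DESTROY makes LiveData remove this observer for us.
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
    }
}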

This observer implementation accepts a lambda that will be executed as part of the onChanged callback. As soon as the handler is complete, the observer will move its own lifecycle to ON_DESTROY, which will trigger its removal from the LiveData instance.

We can then create an extension on LiveData to leverage this kind of observer:

fun <T> LiveData<T>.observeOnce(onChangeHandler: (T) -> Unit) { 
    val observer = OneTimeObserver(handler = onChangeHandler) 
    observe(observer, observer)
}

Let’s test it again
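
Sketching it under the same assumptions as before (the in-memory database created in the @Before method, plus the observeOnce extension above), the revised test might look like this; InstantTaskExecutorRule comes from the androidx.arch.core:core-testing artifact.

import androidx.arch.core.executor.testing.InstantTaskExecutorRule
import org.junit.Assert.assertEquals
import org.junit.Rule
import org.junit.Test

class PostDaoTest {

    // Swaps the Architecture Components background executor for a
    // synchronous one, so LiveData updates happen immediately.
    @get:Rule
    val instantTaskExecutorRule = InstantTaskExecutorRule()

    // Database setup and teardown omitted: same as in the earlier sketch.
    private lateinit var postDao: PostDao

    @Test
    fun insertAndGetPosts() {
        // The table starts out empty.
        postDao.getAll().observeOnce {
            assertEquals(0, it.size)
        }

        postDao.insert(Post(title = "Hello, Room"))

        // After the insert, getAll() reflects the new row.
        postDao.getAll().observeOnce {
            assertEquals(1, it.size)
        }
    }
}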

A couple of things to notice this time.

First of all, we’re taking advantage of an InstantTaskExecutorRule. This is a helpful utility rule that takes care of swapping the background asynchronous task executor with a synchronous one. This is vital to be able to test our code deterministically. (Check this out if you want to know more about JUnit rules.)

In addition to that, we’re now leveraging the LiveData extension that we have written to write our assertions:

postDao.getAll().observeOnce {
    assertEquals(0, it.size)
}

We have just asserted in a much more compact and expressive way by leaving all the details inside our observer implementation. We are now asserting on the LiveData instance deterministically, which makes this kind of test easier to read and write.

Conclusion

I hope this post helps you write tests for your DAO more effectively. This is one of my earliest experiences with Kotlin and Android, so please feel free to comment with better approaches.

Follow me on Twitter for more!

Cover photo by Irvan Smith on Unsplash

Email development with React and Webpack

Email development is often an overlooked practice due to the peculiarities and constraints that most email clients impose.

If you want to deliver a delightful user experience, you have to keep your product’s design consistent across all the media it is consumed through.

An email is one of the ways your product might be consumed. It is therefore important that you craft your emails following the same design principles you would apply when developing your product in other contexts.

The email world, though, has its own peculiarities, and the constraints the emails have to be built within often lead to design compromises.

My personal experience with email development

A project I was recently involved in had delivered a new look and feel for the product, and it required an email to be sent out announcing it.

We wanted our email to take advantage of the new designs too and, ideally, also of the tools and components we had already built for delivering the new UI.

For this reason, we decided to build an email development pipeline that would help us achieve this goal.

In this blog post I’m going to focus on the use of React and webpack to build HTML templates to be sent as emails.

Even though I mentioned React, the key aspect of this project is really how webpack has been configured to come up with a small email development environment. I’m pretty sure the same configuration can be easily adapted to work with other frameworks.

The constraints

This project had to allow us to re-use React components in our email development. We also wanted to structure it so that it would allow for more than one email template to reuse our existing components library.

Finally, we wanted to be able to produce, as output, a single, standalone HTML file.

Let’s get started

I created and pushed an example project on GitHub. Check it out if you’re impatient 🙂

The project structure

We’re going to structure the project in a way that allows multiple email templates to be crafted and hosted in the same repository.

.
├── output/
├── package.json
├── README.md
├── src/
│   ├── components/
│   │   └── SectionOutline/
│   │       ├── index.js
│   │       └── index.scss
│   ├── index.js
│   └── templates/
│       └── HelloWorld/
│           ├── index.js
│           ├── index.scss
│           └── index.test.js
├── webpack.config.js
└── yarn.lock
  • The templates folder will contain all the email templates that will be built, pre-rendered, and published into the output folder (a minimal HelloWorld template is sketched right after this list)
  • The components folder will contain all the reusable ReactJS components you want to reuse across templates
  • The output folder contains the resulting HTML output of the template you chose to build
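
For illustration, a template is just a React component exported as the default from its folder. The sketch below is a hypothetical src/templates/HelloWorld/index.js, not necessarily the content of the example repository; it only assumes the Box, Item, and Span components that react-html-email provides.

import React from 'react';
import { Box, Item, Span } from 'react-html-email';

// Any component exported as default from src/templates/<Name>/index.js
// can be selected at build time through the webpack configuration below.
const HelloWorld = () => (
  <Box width="100%" align="center">
    <Item align="center">
      <Span fontSize={24}>Hello, world!</Span>
    </Item>
  </Box>
);

export default HelloWorld;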

webpack.config.js

The webpack configuration file is probably the most important bit in this project. The build configuration will take care of picking the right template to build by injecting all the information it needs and pre-rendering it to HTML.

Let’s start with the header of our webpack.config.js.

const path = require('path');
const webpack = require('webpack');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const PrerenderSPAPlugin = require('prerender-spa-plugin');
const HTMLInlineCSSWebpackPlugin = require('html-inline-css-webpack-plugin').default;

As you can see, the configuration is importing a few useful plugins.

In particular, PrerenderSPAPlugin, will take care of pre-rendering the whole template and generate a static HTML out of it. This is achieved using puppeteer behind the scenes.

Another important bit here is HTMLInlineCSSWebpackPlugin, which will help us convert our CSS into an internal <style> node within the generated HTML. This is particularly useful for emails, where support for external stylesheets is limited.

A dynamic entry

We want to be able to compile a single template out of all the ones available in the templates folder. In order to do this, we will create a function that returns the configuration object to be exported from the webpack configuration.

const config = (env) => {
  return ({
    mode: 'production',
    entry: {
      // this will be filled in dynamically
    },
    output: {
      filename: '[name].js',
      path: path.join(__dirname, 'output'),
    },
    module: {
      rules: [
        {
          test: /\.(scss|css)$/,
          use: [
            'css-loader',
            'sass-loader',
          ],
        },
        {
          test: /\.js$/,
          exclude: /node_modules/,
          use: {
            loader: 'babel-loader',
          },
        },
      ],
    },
    plugins: [
      new PrerenderSPAPlugin({
        staticDir: path.join(__dirname, 'output'),
        indexPath: path.join(__dirname, 'output', `${env.entry}.html`),
        routes: ['/'],
        postProcess: context => Object.assign(context, { outputPath: path.join(__dirname, 'output', `${env.entry}.html`) }),
      }),
      new HtmlWebpackPlugin({
        filename: `${env.entry}.html`,
        chunks: [env.entry],
      }),
      new HTMLInlineCSSWebpackPlugin(),

      // the DefinePlugin helps us defining a const
      // variable that will be 'visible' by the JS code
      // we are rendering at runtime
      new webpack.DefinePlugin({
        "EMAIL_TEMPLATE": JSON.stringify(env.entry),
      }),
    ],
  });
};

As you can see, the mandatory entry field has not been set yet. We will handle it later, and we will require it to be passed in on the command line.

The entry will be used to load the right template by passing its value to the PrerenderSPAPlugin. It will also be used to tell HtmlWebpackPlugin how to name the result file.

Finally, we export the configuration the way webpack expects it to be exported:

module.exports = (env) => {
  const entry = {};
  entry[env.entry] = './src/index.js';
  const cfg = config(env);
  cfg.entry = entry;
  return (cfg);
};

Whatever entry we specify, we will always associate it with our entrypoint: index.js.

The index.js file is responsible for loading the template and embedding it into our email layout.

Check out the full webpack.config.js for more information.

A dynamic template

This is the content of the index.js file:

import React, { PureComponent } from 'react';
import ReactDOM from 'react-dom';
import { Box, Item } from 'react-html-email';

/*
 * The EmailContainer will be the body of your Email HTML body.
 * It will receive the right template to load and inject from Webpack
 * and will attempt to load it here and include it in the DOM.
 * 
 * This class expects the EMAIL_TEMPLATE const to be defined.
 */
class EmailContainer extends PureComponent {
  render() {
    // EMAIL_TEMPLATE is defined by the webpack configuration and enables us
    // to include the right template at compile time
    const Template = require(`./templates/${EMAIL_TEMPLATE}`).default;
    return (
      <Box width="600px" height="100%" bgcolor='#f3f3f3' align='center'>
        <Item align='center' valign='top'>
          <Template />
        </Item>
      </Box>
    );
  }
}

ReactDOM.render(<EmailContainer />, document.body);

This is where the magic happens: every template will be embedded into this code, which produces the final HTML output for the email. As you can see, I am importing a couple of components from the convenient react-html-email project, which provides a few useful building blocks for the email world.

The Template object is dynamically loaded from the EMAIL_TEMPLATE const string that’s expected to be defined when this code is executed. We’re able to do this because we’re using the webpack DefinePlugin:

// the DefinePlugin helps us defining a const
// variable that will be 'visible' by the JS code
// we are rendering at runtime
new webpack.DefinePlugin({
  "EMAIL_TEMPLATE": JSON.stringify(env.entry),
}),

The plugin will take care of setting the const to the name of the email template we are interested in rendering, in our case the HelloWorld template.

Run it

yarn webpack --env.entry=HelloWorld

The resulting HTML will be stored in the output folder.

[Screenshot: the HelloWorld template rendered in a browser window]

I know. Not the most beautiful email, but I’ll leave the design to you. 🙂

I hope you enjoyed the post. Let me know if you have any feedback. Don’t forget to check out the full project on GitHub.