Pasha Craydon

Recent Posts

  • Jenkinsfile with Slack notifications

    How do you set up a simple Jenkins pipeline that runs Node commands for your JavaScript project and sends notifications of the results to your Slack channels?

    Creating a Jenkinsfile is a great way to do this. You gain a single source of truth for your pipeline since it can be checked into source control, which is great for code review and for iterating on the pipeline.

    Here is an example for running Node commands. It uses the Timestamper plugin, which adds timestamps to the console output. Inside this wrapper we use a Node.js environment to check out the project and then run npm commands during the Build stage. You can add additional stages for testing, deployment, cleanup and so on, but this is intentionally left simple and clean.

    These stages are wrapped in a try statement so that build failures can be passed to a notifyBuild handler that deals with sending Slack notifications.

    #!groovy
    
    timestamps {
      node('nodejs') {
        currentBuild.result = 'SUCCESS'
    
        try {
          stage('Checkout') {
            checkout scm
          }
    
          stage('Build') {
            env.NODE_ENV = 'test'
    
        print "Environment will be: ${env.NODE_ENV}"
    
            sh 'node -v'
            sh 'npm prune'
            sh 'npm cache verify'
            sh 'npm install'
            sh 'npm run build'
          }
        }
        catch(err) {
          currentBuild.result = 'FAILURE'
          throw err
        }
        finally {
          // success or failure, always send notifications
          notifyBuild(currentBuild)
        }
    
      } 
    }

    The notifyBuild handler looks at the previous build to figure out whether it failed so that we can tailor our messages to be helpful. We don’t want a notification every single time a build passes, so we return early in that case.

    If a build previously failed but now passes, we send a notification that the build “returned to passing”. If a build previously failed and it fails again, we send a notification that the build is “still failing”, with the notification color changing from danger to warning.

    PROJECT = 'my-project-on-github-repo'
    
    // Send messages to these Slack channels
    SLACK_CHANNELS = ['#my-favorite-slack-channel']
    
    def notifyBuild(currentBuild) {
      println 'Evaluating build notifications...'
      def channels = SLACK_CHANNELS
    
      notificationPrefix = "<${BUILD_URL}/console/|${PROJECT} ${BRANCH_NAME} ${currentBuild.displayName}>"
      lastBuildResult = currentBuild.rawBuild.getPreviousBuild()?.getResult()
    
      def color = ''
      def message = ''
    
      println "Build is going from '${lastBuildResult}' to '${currentBuild.result}'"
    
      if (currentBuild.result == 'SUCCESS') {
        color = 'good'
        if (lastBuildResult &&
            lastBuildResult != hudson.model.Result.SUCCESS &&
            lastBuildResult != hudson.model.Result.ABORTED) {
          message = "${notificationPrefix} returned to passing"
        } else {
          message = "${notificationPrefix} passed"
          // Unnecessary to notify of every build that passes
          return false
        }
      } else if (currentBuild.result == 'FAILURE') {
        if (!lastBuildResult ||
            (lastBuildResult == hudson.model.Result.SUCCESS ||
             lastBuildResult == hudson.model.Result.ABORTED)) {
          color = 'danger'
          message = "${notificationPrefix} failed"
        } else {
          color = 'warning'
          message = "${notificationPrefix} still failing"
        }
      }
    
      println "Sending '${color}': '${message}' to ${channels}"
    
      if (color != '' && message != '') {
        // Send notifications
        for (channel in channels) {
          slackSend (channel: channel, color: color, message: message)
        }
      }
    }

    You can set this up to work with GitHub by adding a Slack Payload URL under the Webhooks sidebar menu in the repository’s settings in the GitHub admin. It may also be necessary to add non-human users under the Collaborators & teams setting and give them administration permissions so Jenkins can do its thing.

    In Jenkins, I set up a new job, pointed it at the GitHub project repository and set the Build Configuration Mode to “by Jenkinsfile”.

  • Mount Shasta in a day

    This article was also published on Medium.

    Tree branches carved long scratches into the paint of my Honda Civic. Driving close to the shoulder was my best bet to avoid the trenches carved out by more capable vehicles that mine would surely bottom out on. I was 6 miles down a rough dirt road that was getting worse and causing me some anxiety due to the low clearance of my vehicle. In the distance rose Mount Shasta, all 14,179 feet of it.

    The remaining members of our group, two girls, had just dropped out of the trip due to adverse conditions on the mountain. A few winter storms had blown through the past couple of weeks and left snow on the mountain which could make climbing it more dangerous. Rangers had warned the climbing season was probably over.

    It would just be me and the trip leader, Marco. I was confident enough in his previous experience with Shasta and his enthusiasm for continuing the trip that I decided not to back out. Besides, winter storms were expected for the following week so this was definitely the last opportunity of the year I would have to climb the mountain.

    I arrived at Clear Creek trailhead and settled in for the night in the back of my car. My alarm woke me at 4 am the following morning. Marco had arrived in the night and was already gearing up. We left our vehicles two hours later. An hour too late, Marco grumbled.

    We marched for three miles through fir trees with our headlamps on in the dark. The sun began to rise and light up the landscape. It was beautiful.

    Beyond the tree line we came upon a spring. The water came bubbling straight out of the ground. This would be the last stop for water, Marco said and I topped off my two water bottles, 3 liters in total. The water was cold and clean.

    The landscape was beginning to get desolate, inhabited only by rock, ice and small trees. The trail was about to get much more difficult, Marco warned as he covered his face with a balaclava. Soon enough the trail went straight up through scree with fast elevation gain.

    We ran into two other hikers, a young couple. They thought that they would summit by 2 pm or so and be back down the mountain before sunset. Marco corrected them: at best they would summit by sunset and be descending in the dark. That was our goal, and we were prepared for it with headlamps, but I think it scared them, and they eventually turned around rather than face hiking in the dark.

    Marco would point out landmarks in the distance, small peaks or big boulders and cautioned me that while they appeared close, they were still many hours away.

    The trail gained a steeper angle and the scree became tougher. Trudging up it was a little like doing the StairMaster on a difficult setting, and it would last for many hours. For every three steps you moved up, you would slide down one. It was fairly exhausting and would really test us, particularly as the altitude began to hit.

    Our first destination was Red Rock, a craggy field of large boulders that would mark the end of this strenuous slog through scree and the beginning of our scramble to the top. The altitude started to hit me closer to Red Rock. I felt a little light headed, like I had been given some drug at the dentist.

    I was familiar with altitude sickness and I knew how dangerous it can be. I had turned around after camping at 12,000 feet on Mount Whitney the previous weekend, after experiencing nausea and what I determined were the beginning symptoms of altitude sickness. Combined with the strenuous huffing and puffing and my light headedness, this had me feeling some anxiety about pushing for the summit. We decided we would make it to Red Rock and reassess how we felt and whether we wanted to continue.

    It must have been about 3pm when we made it to Red Rock. The landscape was much more rocky and strewn with large boulders. We stopped to eat. I ate a couple tortillas with salami and mayonnaise and mustard. I also had a 16 oz Rockstar energy drink which immediately boosted my energy and lifted my enthusiasm for pushing for the summit. We were both feeling good so we decided to push on.

    We entered a landscape with big boulders that required some class 2 scrambling. This was a welcome change as I found it much easier than the strenuous trudge through scree. The landscape started to level out when we could finally see the highest point on Shasta. Marco cautioned that while it looked close, we were still a few hours away from it.

    It was about 5 pm when we reached the final plateau closest to the summit. I was immediately hit by a wave of sensations. First, there was a very pungent sulfur smell in the air from volcanic springs coming out of the mountain, a reminder that this is still an active volcanic area. Second, the air must have been much thinner, because the altitude had a much stronger effect on me. I felt more light headed, and I had a brief moment of anxiety as I stopped and considered turning back. I decided I felt well enough to push to the summit, but I continued very slowly.

    I felt exhilarated when I reached the summit. The view was a knockout. Shasta cast a huge shadow to the south that looked like a giant pyramid. There is a book on the summit that you can sign. Marco and I both added our signatures.

    The altitude was really doing a number on me and I felt the urge to descend. We had left distinct footprints in patches of snow on the way up, and they paid off in helping us find our way down quickly. The last rays of the sun disappeared by the time we made it back to Red Rock and we switched on our headlamps. The effects of the altitude started to diminish here and I felt much better.

    The night sky was clear as we hiked down the mountain, and the moon was out, but it was not bright enough for us to turn off our headlamps. It took us another 7 hours to make it back to our vehicles. We followed the main trail, a series of switchbacks down the scree, though we would lose it now and then and spend some time looking for it. Coming down through the scree was much easier than going up, and while my knees ached, I was riding a high of exhilaration from having reached the summit.

    We considered glissading, the act of sliding down a steep slope of snow while using your ice axe for support. But the snow-covered area was quite steep and frozen, and we worried that we would not be able to self-arrest, the technique of using your ice axe to brake while sliding or falling, and might injure ourselves or worse.

    We had many good conversations on the trip. We both work in software, so we talked about software engineering and the future of software development, and Marco told me stories about his many trips up Mount Shasta and Mount Rainier.

    We returned to our vehicles at 1 am the following morning. The trip totaled about 7,800 ft of elevation gain and 16 miles round trip over 19 or so hours. I slept in my vehicle and left Shasta when I woke up, stopping off at Black Bear Diner for a greasy breakfast. Marco stayed and rested, then did the whole mountain again the following day by himself. What a beast.

  • Decorators and React Components

    Decorators are a syntax for calling higher order functions. Typically, a decorator is just a function which takes another function and extends the latter's behavior. We can use decorators to reduce boilerplate by easily adding reusable behaviors to React components.
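    Before touching React, the core idea can be sketched in plain JavaScript: a higher order function that wraps another function and extends its behavior. The names withLogging and add below are hypothetical, for illustration only.

    ```javascript
    // A minimal sketch of the idea behind a decorator: a higher-order
    // function that takes another function and returns an extended version.
    function withLogging(fn) {
      return function (...args) {
        // Added behavior: log every call before delegating.
        console.log(`calling ${fn.name} with`, args);
        return fn(...args);
      };
    }

    function add(a, b) {
      return a + b;
    }

    const loggedAdd = withLogging(add);
    console.log(loggedAdd(2, 3)); // logs the call, then 5
    ```

    Class decorators apply the same wrapping idea to classes, which is how the React examples below work.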

    Decorators are an experimental syntax, often referred to as an ES7 feature, so you need a Babel plugin such as babel-plugin-transform-decorators-legacy to transpile them.

    The following is a very simple example that outputs the decorated component's props to the console, so we can get an idea of how this works.

      import React, { Component } from 'react';
    
      function debugDecorator() {
        return DecoratedClass => class extends Component {
          render() {
            console.log(this.props);
            return <DecoratedClass {...this.props} />;
          }
        };
      }
    
      @debugDecorator()
      export default class Car extends Component {
        render() {
          return (
            <div>
              <h2>This is a car. Zoom.</h2>
            </div>
          );
        }
      }

    Every instance of <Car /> will now output its props to the console.

    Arguments

    Taking this example further, we can pass arguments to the decorator. In this example, the decorator accepts a callback function; we pass in the function warn, which simply outputs the component's props via console.warn.

      import React, { Component } from 'react';
    
      function debugDecorator(callback) {
        return DecoratedClass => class extends Component {
          render() {
            callback(this.props);
            return <DecoratedClass {...this.props} />;
          }
        };
      }
    
      const warn = (props) => console.warn(props);
    
      @debugDecorator(warn)
      export default class Car extends Component {
        render() {
          return (
            <div>
              <h2>This is a car. Zoom.</h2>
            </div>
          );
        }
      }
  • Let's talk about webpack

    Confused by webpack? You’re not alone. I often field webpack questions at work, so I put together a short guide here.

    Webpack Basics

    Webpack is a module bundler for JavaScript. It can handle things like code splitting, hot reloading, ES6 to ES5 transpilation and much more. Webpack has a bit of a learning curve and can be difficult to set up at first, so let’s start with some basics.

    Entry

    Entry points are the JavaScript files that webpack will bundle for you. The syntax is:

      const config = {
        entry: {
          main: './path/to/my/entry/file.js'
        }
      };

    This configuration will produce a JavaScript file named main.js, bundled by webpack from the file at ./path/to/my/entry/file.js.

    Output

    You need to tell webpack where to put your bundled files. The output setting lets you set the directory where webpack should write the bundles and how they should be named. A simple example is:

      const path = require('path');
    
      module.exports = {
        entry: {
          main: './path/to/my/entry/file.js'
        },
        output: {
          path: path.resolve(__dirname, 'dist'),
          filename: '[name].bundle.js'
        }
      };

    In this example, the bundled file will show up in the folder named dist in your project with the filename main.bundle.js.

    See additional configuration for the output setting here.

    Loaders

    Webpack only understands JavaScript out of the box. Loaders help webpack transform files that it does not understand so they can be added to your bundles; this includes CSS, HTML, Sass files, JSON, etc.

      const path = require('path');
    
      module.exports = {
        entry: {
          main: './path/to/my/entry/file.js'
        },
        output: {
          path: path.resolve(__dirname, 'dist'),
          filename: '[name].bundle.js'
        },
        module: {
          rules: [
            {
              test: /\.jsx?$/,
              use: [
                {
                  loader: 'babel-loader'
                }
              ]
            }
          ]
        }
      };

    In this example we have added the module setting, with a loader added to the rules array. This loader, babel-loader, looks for files with the extension .js or .jsx (the /\.jsx?$/ test matches both) and transforms JavaScript written in ES6 syntax into ES5.

    See additional configuration options for loaders here.

    Plugins

    Webpack can be customized with plugins. There are many webpack plugins available on npm.

      const path = require('path');
      const BundleAnalyzerPlugin = require('webpack-bundle-analyzer').BundleAnalyzerPlugin;
    
      module.exports = {
        entry: {
          main: './path/to/my/entry/file.js'
        },
        output: {
          path: path.resolve(__dirname, 'dist'),
          filename: '[name].bundle.js'
        },
        module: {
          rules: [
            {
              test: /\.jsx?$/,
              use: [
                {
                  loader: 'babel-loader'
                }
              ]
            }
          ]
        },
        plugins: [
          new BundleAnalyzerPlugin({
            analyzerMode: 'static',
            reportFilename: 'bundle_analysis.html',
            openAnalyzer: false,
            generateStatsFile: true,
            statsFilename: 'stats.json'
          })
        ]
      };

    In the above example a new plugins setting has been added. The array includes a plugin called BundleAnalyzerPlugin, available on npm. Plugins require instantiation with new and often accept an options object.

    Handling CSS with PostCSS

    PostCSS is a CSS-processing tool, used with webpack via postcss-loader. Its many plugins let you do a wide array of things with CSS, including inlining images, linting, transpiling future CSS syntax, supporting variables and mixins, CSS modules and more.

    Setup

    A great way to set up PostCSS is to use ExtractTextPlugin to pull the CSS out of the JavaScript bundles into a separate file. This has a few advantages: you can cache your CSS separately from your JavaScript, compile your JavaScript and CSS separately, and improve runtime performance since the CSS is requested in parallel with the JavaScript bundles.

    You can wire up ExtractTextPlugin in the module rules:

      const postCssPlugins = [
        require('postcss-import'),
        require('postcss-url'),
        require('postcss-filter-gradient'),
        require('postcss-cssnext'),
        require('postcss-extend')
      ];
    
      ...
    
      module: {
        rules: [
          {
            test: /\.css$/,
            use: ExtractTextPlugin.extract({
              fallback: 'style-loader',
              use: [
                {
                  loader: 'css-loader',
                  options: {
                    modules: true
                  }
                },
                {
                  loader: 'postcss-loader',
                  options: {
                    plugins: () => postCssPlugins
                  }
                }
              ]
            })
          }
        ]
      }

    A few things are going on here. First, this looks for files with the extension .css and applies ExtractTextPlugin to them. Second, the plugin takes an options object. options.use specifies the loaders required for converting the resource to a CSS-exporting module. This example uses css-loader, which enables CSS modules and importing CSS files into your JavaScript modules. postcss-loader can be used on its own, but it is recommended to use it together with css-loader. With postcss-loader we can include a plethora of PostCSS plugins, as seen above with the postCssPlugins array.

    CSS Modules

    Set the modules option flag in css-loader to enable CSS modules. CSS modules allow you to import CSS files from your JS modules; the imported CSS file shows up as a JavaScript object. Example:

      .card {
        position: relative;
        display: inline-flex;
        vertical-align: top;
        width: 100%;
        background: #FFF;
      }
    
      .border {
        border: 1px solid red;
        display: inline-flex;
        flex-direction: column;
        justify-content: flex-start;
        width: 100%;
      }

    Pretend the above is in a file called card.css. You can import this file into your JavaScript module:

      import styles from './card.css';

    styles will contain a key-value object of the class names in the CSS file. You can add these styles to your React components like so:
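    To make that shape concrete, here is a sketch of roughly what styles resolves to at runtime. The hashed names are made up; your build will generate different ones.

    ```javascript
    // Roughly what `import styles from './card.css'` resolves to under
    // CSS modules: a plain object mapping each class name in the file to
    // a generated, locally-scoped identifier. The hashes are illustrative.
    const styles = {
      card: '_23_aKvs-b8bW2Vg3fwHozO-card',
      border: '_23_aKvs-b8bW2Vg3fwHozO-border'
    };

    console.log(styles.border); // the locally-scoped class name
    ```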

      import React, { Component } from 'react';
      import styles from './card.css';
    
      export default class Card extends Component {
        render() {
          return (
            <div className={styles.border}>
              <form className={styles.card}>
              </form>
            </div>
          );
        }
      }

    These CSS class names are compiled with hash values, so instead of seeing something like <div class="border"> in the HTML, you will see something like <div class="_23_aKvs-b8bW2Vg3fwHozO-border">. Though the way these hash values appear can be customized, the point is that these styles are local to the component, so no outside CSS styles can override them. This lets you keep your CSS very clean and avoid the many pitfalls of deeply nested CSS.

    You can still write global CSS in these files; the syntax is:

      :global(.card) {
        position: relative;
        display: inline-flex;
        vertical-align: top;
        width: 100%;
        background: #FFF;
      }

    How to handle assets like images and font files

    At some point you may want to include images, SVG, JSON and maybe even font files in your project. There are a few loaders that can handle this: file-loader for images and fonts, raw-loader for SVG and text files, and json-loader for JSON.

      module: {
        rules: [
          {
            test: /\.(jpg|png|ttf|otf)$/,
            loader: 'file-loader'
          },
          {
            test: /\.json$/,
            use: 'json-loader'
          },
          {
            test: /\.svg$/,
            loader: 'raw-loader'
          }
        ]
      },

    In this example, we add three new loaders to handle our image, font, SVG and JSON files.

    Dealing with external libraries

    Sometimes you may want to exclude certain packages from the webpack bundle but still include them as dependencies in your modules. This may be because the package doesn’t play well with webpack, throws errors you can’t resolve when bundling or maybe you just want to retrieve it from a CDN.

    You can use the externals setting for this. These packages will be retrieved at runtime.

      externals: {
        lodash: 'lodash'
      }

    In this example, require('lodash') is external and is resolved at runtime from the global variable lodash. You will want to include a script tag in your HTML for these libraries. Here is an example that pulls lodash from a CDN:

      <script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.4/lodash.min.js"></script>

    Improve bundling performance

    Webpack is pretty opaque, so it can be difficult to figure out why your bundles are too big or why bundling takes forever. webpack-bundle-analyzer steps in to fill this void. This webpack plugin makes it very clear which modules take up the most size, what is really inside each module, and so on. The following is an example configuration of this plugin.

      const BundleAnalyzerPlugin = require('webpack-bundle-analyzer').BundleAnalyzerPlugin;
    
      ...
    
      plugins: [
        new BundleAnalyzerPlugin({
          analyzerMode: 'static',
          reportFilename: 'bundle_analysis.html',
          openAnalyzer: false,
          generateStatsFile: true,
          statsFilename: 'stats.json'
        })
      ]

    Using this configuration you can run the command webpack --profile --json > stats.json. This will generate a stats.json file in your project’s root folder. At the bottom of this file you will find a direct link to the file bundle_analysis.html; opening this link in a web browser shows graphs of the analyzed modules.

    (Screenshot: bundle analyzer report)

    There are more options, including running an interactive server; check out the package here.

    Bundle common dependencies

    A nice way to improve performance is to bundle dependencies (or code which doesn’t change often) that are common to your whole project (such as React, Redux, etc.) into a specific file that you can cache. Webpack’s CommonsChunkPlugin is great for this.

      plugins: [
        new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.js'),
      ]

    The above example will bundle common libraries into a file called vendor.js. If, after analyzing your bundles, you find that this file is very heavy, you can take the example further and exclude specific packages.

      const EXCLUDED_FROM_VENDOR = [
        'language-tags',
        'language-subtag-registry',
        'd3',
        'moment',
        'moment-timezone'
      ]
    
      ...
    
      new webpack.optimize.CommonsChunkPlugin({
        name: 'vendor',
        minChunks(module) {
            return module.context
              && module.context.indexOf('node_modules') !== -1
              && !(new RegExp(EXCLUDED_FROM_VENDOR.join("|")).test(module.context));
        }
      });

    The above example includes an option in the CommonsChunkPlugin that tells it to bundle all files from node_modules except the packages that are listed in the EXCLUDED_FROM_VENDOR array.

  • How to keep React tests maintainable

    Keeping front-end tests maintainable is important. When adding features or refactoring code, you can inadvertently break more tests than you intend and sink much more time into your project than necessary. Here are some steps I try to follow to keep React tests helpful and easy to maintain.

    1. Keep tests isolated

    Tests should not care about or rely on what is outside of the component or logic being tested. Mock interactions with external services or packages to abstract away the real implementation. Code that is refactored in packages should (almost) never break any tests in apps which consume them.
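    As a sketch of that idea in plain JavaScript (the names fetchUser and getUserName are hypothetical), a unit under test can receive its external dependency as an argument, and the test can substitute a stub so it never touches the real service:

    ```javascript
    // Hedged sketch of isolating a unit from an external service.
    // `fetchUser` stands in for a real API client.
    function getUserName(fetchUser, id) {
      const user = fetchUser(id);
      return user ? user.name : 'unknown';
    }

    // In a test, swap the real service for a stub that records its calls,
    // so the test never hits the network:
    const calls = [];
    const fakeFetchUser = (id) => {
      calls.push(id);
      return { name: 'Ada' };
    };

    console.log(getUserName(fakeFetchUser, 42)); // 'Ada'
    console.log(calls); // [42]
    ```

    The same pattern is what jest.fn() and module mocks give you inside a real test suite.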

    Think about where tests belong. For example, say you are writing a function that is passed as a callback to a component imported from a separate package. You can test the function itself, and you can test that you pass the function as a prop into the component. You should not test how your function performs inside the component, since that exercises the component’s logic, which should be tested in its own package.

    How to test code that is in a separate package

    It is common to use functions that are invoked as callbacks in components which are imported from separate packages. You can test the function and you can test that you pass the function as a prop to the component. The following is an example:

      import React from 'react';
      import { shallow } from 'enzyme';
      import { MyComponent } from 'separate-package';
    
      function myAnalyticsHandler() {
        ga('send', 'event', 'category', 'click', 'label');
      }
    
      test('should include `myAnalyticsHandler` as a prop for on click events.', () => {
        const wrapper = shallow(
          <div>
            <MyComponent onClickHandler={myAnalyticsHandler} />
          </div>
        );
    
        expect(wrapper.find(MyComponent).props().onClickHandler).toEqual(myAnalyticsHandler);
      });
    
      test('myAnalyticsHandler() should call `ga` with expected events.', () => {
        window.ga = jest.fn();
        myAnalyticsHandler();
        expect(window.ga).toBeCalledWith('send', 'event', 'category', 'click', 'label');
      });

    The point here is that these tests never dig any further than the root level into a component that was imported from a separate package. Refactoring any code inside MyComponent will not indirectly break these tests unless MyComponent loses its onClickHandler callback prop, in which case one of these tests should rightly fail.

    In this case, you would not write a test to ensure that MyComponent.onClickHandler correctly invokes your callback. That test belongs within the separate-package package close to the MyComponent code because that is where its business logic resides.

    Only test code that you wrote

    Consider whether you are testing code that you wrote versus code that is in the library you are using. For example, if you write a connected component that is hooked up to Redux, you do not need to test the connected part. You can export a version of the component that is not connected just for testing, along with the connected version for your app, e.g.:

      export default connect(
        (state, ownProps) => {
          return {
            ...state
          };
        }
      )(MyComponent);
    
      export { MyComponent as PureMyComponent };

    In your test file, import the pure component and write tests for it.

      test('should render', () => {
        const wrapper = shallow(<PureMyComponent />);
        expect(wrapper.find('div').exists()).toBe(true);
      });

    2. Keep tests specific

    When a test breaks, there should be a very logical and specific failure message. Describe tests from the perspective of a developer rather than a user. For example, instead of describing a test as revealing something based on a click event, describe the specific thing the click handler should do, such as changing a state attribute.
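    For instance, rather than asserting that “the menu is revealed on click”, test the specific state change the handler makes. A minimal sketch, with a hypothetical toggleMenu handler extracted from a component:

    ```javascript
    // Describe the specific state change the handler makes, not the
    // visual effect. `toggleMenu` is a hypothetical handler, shown here
    // as a pure function for illustration.
    function toggleMenu(state) {
      return { ...state, isOpen: !state.isOpen };
    }

    // Vague description:    "should reveal the menu on click"
    // Specific description: "toggleMenu() should set state.isOpen to true"
    const next = toggleMenu({ isOpen: false });
    console.log(next.isOpen); // true
    ```

    When the specific version fails, the message points straight at the broken state transition instead of a vague UI symptom.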

    3. Prefer shallow over mounted tests

    Enzyme includes two rendering methods, shallow and mount, each with its own API for asserting on and traversing React components.

    shallow rendering will not render child components; they show up only as their root component name. Here is an example of a component rendered using shallow in enzyme:

      <div>
        <h1>Test title</h1>
        <MyComponent uri= onClick={Function}>
          <span className="myClass" />
        </MyComponent>
      </div>

    You can assert that <MyComponent /> exists, that it is passed certain props and that it has certain children, but you can’t test the insides of MyComponent since it is not rendered. Shallow rendering is useful for ensuring your test does not indirectly assert on the behavior of child components. This is important for keeping your test code isolated and should be the preferred method of testing.

    Enzyme’s mount() method will fully render all the nodes of a component and its child components (like HTML rendered on a page). It is a bit of a code smell because you are much more likely to be testing logic that is not native to your component and may be breaking the isolation rule. Keep an eye on this in your own tests and in code review.

    I have no hard rule against using mount(). Writing tests with mount() is still appropriate in many situations, such as when child components play an integral role in their parents.

    Summary

    In short, keep tests maintainable by making sure you keep your tests isolated and free from any dependencies. Don’t overcomplicate your tests by testing logic you did not write. Keep your tests specific by describing them from the perspective of a developer. Avoid testing the logic of child components by using shallow methods to traverse your components.