2024

The Braytech project pays my rent and evolves daily. No matter how often I try to update the project item in my portfolio, it always falls out of currency. It's difficult to even justify spending time updating it when you could instead be using that time to grow the project that pays the bills! It's very cyclical.

Functions for 2023

In reply to Built-in functions

Immutability

  • Array.prototype.toSpliced()
  • Array.prototype.toSorted()
  • Array.prototype.toReversed()
  • Array.prototype.with()

The above methods look like existing methods we've had for decades but with one key difference: they return new copies of arrays. This has various implications, especially if you're using a framework like React.

Previously, to avoid mutating an original, you'd first make a copy of the array using Array.from(), slice(), or the spread operator. Now you can do it on the fly.
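
Here's a minimal sketch of the difference, assuming nothing beyond the new methods themselves:

const scores = [3, 1, 2];

// previously: copy first (spread, slice(), or Array.from()), then mutate the copy
const sortedCopy = [...scores].sort();

// now: one step, and the original is untouched
const sorted = scores.toSorted();

console.log(sorted); // [1, 2, 3]
console.log(scores); // [3, 1, 2]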

with() is a cool new trick that lets you alter an item at a specific index, returning a new copy of the entire array without mutating the original.

const fruits = ['🍐', '🍑', '🍒', '🍍', '🍇', '🍌'];

console.log(
  fruits.with(2, '🍉')
);
// ['🍐', '🍑', '🍉', '🍍', '🍇', '🍌']

Coming Soon: Object.groupBy()

The Object.groupBy() static method groups the elements of a given iterable according to the string values returned by a provided callback function. The returned object has separate properties for each group, containing arrays with the elements in the group.

Object.groupBy() - JavaScript | MDN

Whatever you say, MDN technical writers! Basically, you can group an array without using a library like Lodash and without writing a complex reduce() function.

const sports = [
  { name: 'Rugby Union', contact: 'full' },
  { name: 'Tennis', contact: 'noncontact' },
  { name: 'American Football', contact: 'full' },
  { name: 'Wrestling', contact: 'full' },
  { name: 'AFL', contact: 'semi' },
  { name: 'Sailing', contact: 'noncontact' },
];

// using a reduce() function
console.log(
  sports.reduce((groups, sport) => {
    if (groups[sport.contact] !== undefined) {
      groups[sport.contact].push(sport);
    } else {
      groups[sport.contact] = [sport];
    }

    return groups;
  }, {})
);

// using the new Object.groupBy() method
console.log(
  Object.groupBy(sports, (sport) => sport.contact)
);

// both functions return...
const output = {
  "full": [
    { "name": "Rugby Union", "contact": "full" },
    { "name": "American Football", "contact": "full" },
    { "name": "Wrestling", "contact": "full" }
  ],
  "noncontact": [
    { "name": "Tennis", "contact": "noncontact" },
    { "name": "Sailing", "contact": "noncontact" }
  ],
  "semi": [
    { "name": "AFL", "contact": "semi" }
  ]
}
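
One subtle difference worth noting: the object Object.groupBy() returns actually has a null prototype, so it doesn't inherit methods like hasOwnProperty() ― something to keep in mind if you go poking at the result.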

Multi-Process Life

Today, I learnt how to have multiple scripts messaging each other in a Node.js environment. It's very similar in principle to modern additions to the web standard, such as web workers.

// File: parent.js

import { fork } from 'child_process';

const child = fork('child.js');

child
  .on('message', function (message) {
    console.log('Message from child:', message);
  })
  .on('exit', function (code, signal) {
    console.log(`Child process exited with code ${code} and signal ${signal}`);
  })
  .on('error', function (error) {
    console.log(error);
  });

child.send('Message from parent');

// File: child.js

process.on('message', function (message) {
  // echo the message back to the parent; process.send() takes a single
  // message value (the second parameter is reserved for handles), so
  // build the string rather than passing two arguments
  process.send(`Child received: ${message}`);
});
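
One gotcha: the open IPC channel keeps both processes alive after the exchange. A minimal way to wrap up cleanly (my addition, not part of the original example) is to disconnect once the parent has heard back:

// File: parent.js (continued)
// a second 'message' listener that closes the IPC channel,
// allowing both processes to exit once the conversation is over
child.on('message', function () {
  child.disconnect();
});

Running node parent.js should then print something like "Message from child: Child received: Message from parent" before both processes exit.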

All of my projects live in DigitalOcean Droplets, somewhere in some kind of internets cloud thing. Deploying to these Droplets has meant a lot of SFTP action... Until GitHub Actions showed me a better way.

I had a less-than-positive (bad) experience with Netlify, which led to them erroneously charging my money card, and their support was bad. It really burnt me.

In protest, I built my own private Netlify clone, but running bash scripts from Node is just not that fun.

Enter GitHub Actions: you can create highly customisable workflows with various triggers that perform various actions, which you may create yourself or import from other GitHub committers.

  1. The following action fires on git push to the master branch
  2. It spins up Ubuntu
  3. Checks out the latest commit
  4. Installs Node.js
  5. Runs npm install
  6. Runs npm run-script build
  7. Transfers the built React app to the specified Droplet (or server of your nomination) to be served by NGINX

name: Deploy create-react-app to DigitalOcean Droplet with SCP

on:
  push:
    branches: [ master ]

jobs:
  deploy:
    name: Deploy

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'recursive'

    - name: Setup Node.js
      uses: actions/setup-node@v3
      with:
        node-version: '16.x'

    - name: Install dependencies
      run: npm install

    - name: Build
      run: npm run-script build
      env: 
        CI: false

    - name: Deploy
      uses: appleboy/scp-action@master
      with:
        host: ${{ secrets.DROPLET_HOST }}
        username: ${{ secrets.DROPLET_USERNAME }}
        password: ${{ secrets.DROPLET_PASSWORD }}
        source: 'build/'
        target: '${{ secrets.DEPLOY_TARGET }}'
        strip_components: 1

I recently added Suspense, a feature of React, to one of my projects. I learnt the following:

  1. It's awesome and works effortlessly
  2. You need error boundaries
  3. It works great with react-router but you do need to mind how you configure it
    • Error boundaries will make the sub-routes inaccessible when an error is thrown. Instead of wrapping multiple routes, it's better to wrap components on a route-by-route basis
    • If you do use an error boundary component as a child of a router <Switch /> component, you may find that the switch's behaviour no longer works as expected

So how do I use it all exactly? Well, that's easy, I created my own little higher-order component. Florals? For spring? Groundbreaking.

In place of a regular <Route /> component, I use this <SuspenseRoute />. Pass all the usual props along and they get handed down with some extra magic thrown into the mix!

import React, { Suspense } from 'react';
import { Route } from 'react-router-dom';

// ErrorBoundary and SuspenseLoading are my own components
const SuspenseRoute = ({ component: Component, ...rest }) => (
  <Route {...rest} render={route => (
    <ErrorBoundary>
      <Suspense fallback={<SuspenseLoading />}>
        <Component {...route} />
      </Suspense>
    </ErrorBoundary>
  )} />
);
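
And here's roughly how it gets used alongside React.lazy, which is what makes Suspense useful for code splitting in the first place (Inventory and the import path are made-up names for illustration):

// a hypothetical lazily-loaded view - the chunk only downloads when the
// route first renders, and Suspense shows the fallback in the meantime
const Inventory = React.lazy(() => import('./views/Inventory'));

<SuspenseRoute path='/inventory' component={Inventory} />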

With 2020 vision

Look, it's been a time. A lot of the things have done happened. Today, I'd like to focus on a single event.

So, I use a headless content management system called Directus which I host from a DigitalOcean Droplet.

I decided that I wanted to update it to the latest version, but I didn't realise there'd be so many breaking changes (they weren't communicated well). Needless to say, I ended up in a position where I needed to remove some directories and their contents.

Now, rm -rf is not the bad guy, but I forgot to back up all of the uploaded files associated with the hosted web sites. I deleted everything.

This isn't the first time it's happened. Last time, I had no backups to lean on as I hadn't enabled them.

To my delight, I had opted to enable backups for my primary Droplet after the last disaster, for a small fee. Backups are made weekly and retained for a few weeks before being deleted. These backups can be instantly converted to snapshots, and with a snapshot, one can spin up an additional Droplet based upon it.

Okay, cool, I have the files, but they're on the wrong server. As a programmer, I have a professional web search certification, which enabled me to divine the solution (Googled it): scp -r /path/to/my/files user@droplet:/path/on/remote/droplet.

scp is a Linux utility, whose name apparently stands for secure copy. It's based on SSH and was a delight to use.

So in summary, I really appreciate the Recycle Bin ― it's an important aspect of my workflow.

"Post your setups" is an activity where people post an image of where they get whatever it is they're doing done. It's popular amongst gaming communities. Seems to be becoming a thing for programmers, too.

It was allegedly popular on Twitter.

This is my desk.

Built-in functions

In 2019, JavaScript has some amazing and readily accessible built-in functions. Learning React has been one thing, but adding various built-in JavaScript functions to my repertoire has been―arguably―even more rewarding.

Functions such as Array.map, Array.filter, and Array.reduce can make code so much more straightforward. Of all the tools I've equipped, I've found Array.reduce to be the most confounding. I'd heard of accumulation before, but I'd never heard of an "accumulator" (thanks, MDN Documentation). Of the lot, it's probably the most powerful.

Below is a mostly real example of a situation where I needed to build a quick set of item filter links based on an array of objects returned from an API.

const products = props.productsFromAnAPI;             // array of objects with a 'type' property

const productTypes = [{ type: 'all' }]                // add an artificial product item to the start of the
  .concat(products)                                   // products array, as real data won't have an 'all' type
  .reduce((accumulator, currentValue) => {
    if (!accumulator.includes(currentValue.type)) {   // check the previously accumulated items and only add
      return [                                        // to the array if it's not yet included
        ...accumulator,
        currentValue.type                             // ⚠️ performance tip: this example returns a new array
      ];                                              // with every iteration - for large arrays, consider accumulator.push()
    } else {
      return accumulator;                             // item already included, return the array as is
    }
  }, [])                                              // initialise with an empty array
  .map((type, t) => {
    return (
      <li key={t}>
        <NavLink to={`/store/filter/${type}`}>
          {type}
        </NavLink>
      </li>
    );
  });

One Thousand Voices is a piece of highly sought-after equipment found through successful completion of one of Destiny's co-operative and most challenging activities, the Last Wish raid.

I've been having some fun learning about and experimenting with 3DS Max, materials (sans textures), and rendering.

The final incarnation of my effort sporting a deliciously soft depth of field with all of the bokeh.

Calculating depth of field for this kind of render requires an exhaustive number of CPU cycles, but it's totally worth it, especially if you aren't a surface modeller, don't have a high-poly model, don't know what you're doing, or aren't adding textures.

It's super forgiving for when you have a lot of missing details or some crazy edges.

Depth of field effects can help take a render from "eh" to "yeah!"

Bungie's Destiny API includes an endpoint for 3D model data as it's used by the official Destiny 2 Companion app. Unfortunately, it's not disseminated in an easy-to-consume format such as glTF. It's filled with proprietary shenanigans, and it's left to the third-party developer to discover how they can wield it.

A starting point, courtesy of the beautiful minds at Bungie via the third party developer API.

I haven't built many React-based projects yet, especially projects where size and performance become significant concerns... Until Braytech. It's in the Archive ― check it out!

Lots of components, lots of HOCs, lots of asynchronous activity.

Braytech is one of my test beds, one of the projects I lean on for learning React and new principles. So there are occasions where I run into code I haven't looked at since I first started the project and, as we all know, past self is dumb. 🤪

I've refined Braytech a lot over the past few months. I've re-designed and re-written a lot of code, usually for the better. As I've done so, I've found myself increasingly considering the performance of these components and the better ways to approach solving problems. The usual self-improvement stuff, you could say.

Passing down unnecessary props

Despite having understood the effects of spreading props for a long time, I've only recently considered how spreading unnecessary props and state can adversely affect performance by triggering needless reconciliations.

Accordingly, over the past few weeks I've spent a chunk of time searching the whole project for instances where I've unnecessarily spread all of the Redux state.
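
As a rough illustration (component and prop names invented for the example):

// Don't do this: every slice of state reaches the component,
// so unrelated updates trigger needless reconciliations
<ProfileHeader {...this.props} />

// Do do this: pass only what the component actually needs
<ProfileHeader name={this.props.member.name} emblem={this.props.member.emblem} />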

Object literals defined in render

Usually, I stick to writing CSS the old-fashioned way with a stylesheet, but there have been instances where I've passed an object literal as a prop to a component or in JSX.

The case to look out for is where the object is defined in a component's render function. The problem is that each time the render function executes (often), a new object with a new unique reference is created.

When it comes time for React to perform reconciliation between the DOM and virtual DOM, it will perform a shallow comparison and interpret them as being unique (different).

// Don't do this
<Notification priority='high' style={{ backgroundColor: 'red', fontWeight: '900' }} />

// Do do this
<Notification priority='high' style={notificationStyle} />
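
For completeness, the stable reference just needs to live outside render, e.g. at module scope (notificationStyle being the invented name from the example above):

// defined once, outside the component - the reference never changes
// between renders, so the shallow comparison passes
const notificationStyle = { backgroundColor: 'red', fontWeight: '900' };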

Anonymous functions

Similarly to object literals defined in render, anonymous functions defined in render will also cause unnecessary reconciliations.

// Don't do this
<Button text='Reload' onClick={() => {
    setTimeout(() => {
      window.location.reload();
    }, 50);
  }}
/>

// Do do this
<Button text='Reload' onClick={this.reloadPagePlease} />
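
And reloadPagePlease itself might be defined like this ― a sketch assuming a class component (ReloadPrompt is a made-up name), where a class property arrow function keeps this bound and the reference stable across renders:

class ReloadPrompt extends React.Component {
  // defined once per instance, not once per render
  reloadPagePlease = () => {
    setTimeout(() => {
      window.location.reload();
    }, 50);
  };

  render() {
    return <Button text='Reload' onClick={this.reloadPagePlease} />;
  }
}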

npm-check-updates

I found an awesome little package for dealing with an outdated package.json.

It checks every dependency for its latest version and tells you what it will do if you tell it to go ahead and upgrade everything.

Why is this so awesome? The alternative is to manually check each dependency and update the package.json yourself.

PS G:\thomchap.com.au> npx npm-check-updates
Checking G:\thomchap.com.au\package.json
[====================] 10/10 100%

 lodash            ^4.17.11  β†’  ^4.17.15 
 react              ^16.8.6  β†’   ^16.9.0 
 react-dom          ^16.8.6  β†’   ^16.9.0 
 react-markdown      ^4.0.8  β†’    ^4.1.0 
 react-moment        ^0.8.4  β†’    ^0.9.2 
 react-router-dom    ^4.3.1  β†’    ^5.0.1 
 react-scripts        2.0.5  β†’     3.1.1 
 three             ^0.102.1  β†’  ^0.107.0 

Run npx npm-check-updates -u to upgrade package.json