Blog

Blogging on programming and life in general.

  • My Work from Home Setup

    4 min read

    It'll soon be coming up to a year of working from home full-time due to the pandemic, so I thought I'd write a post about my current setup as it has evolved over the months. What started as a bare, empty room with just a desk and chair has now become a fitting place to ensure maximum productivity and comfort.

    I believe investing in a good home office setup is what can make working from home that little bit easier. Not everyone will be fortunate enough to have a single room dedicated to an office space, or to afford all the niceties you've seen other bloggers write about or showcase on Instagram.

    The most important part of any office is investing in a good desk and chair. Everything else is secondary. I can't stress how important this is. Working on something like a dining table can get uncomfortable very easily and this can be a big distraction in itself. Start small with the basics and over time work your way up, making improvements when you can. This is the approach I've taken.

    In general, working from home over long periods can be a real chore and a good setup will help you stay healthier and focussed whilst working. Interestingly enough, The Atlantic wrote an article detailing why so many people are now experiencing medical problems after making the switch to working from home. A combination of long working hours, fewer breaks, stress and isolation is having a negative impact on all of us.

    Desk

    I'm quite particular about desks and prefer ones that are a little industrial looking and made from real materials. None of that MDF or veneered manufactured stuff. I went for a desk made from Indian reclaimed mango wood, constructed on a sturdy steel frame. It certainly adds a bit of character to the office.

    I've been told I should have opted for a standing desk for further health benefits, but I'm doing just fine as both my desk and chair are at the right height for my posture.

    Chair

    I went for an Ikea Alefjall office chair that provides great support in a relatively small form factor. The seat and backrest are height adjustable. You also get support for your thighs and back through its depth adjustment along with tilt capability.

    Monitor

    Samsung Ultrawide 34 inch monitor

    I managed to snag a real bargain on an ultra-wide curved monitor in last year's Amazon Black Friday deal and am now the proud owner of a Samsung 34 inch ultra-wide beauty! This is a major upgrade over my Dell Ultrasharp, which is by no means a bad monitor; I just felt I needed more screen real estate.

    Being Thunderbolt-compatible is a bonus as my MacBook Pro can charge and transmit data simultaneously over a single cable. Makes cable management that little bit easier.

    Mouse

    I have a Logitech MX Master and it’s the most comfortable mouse I’ve ever used. Fits very comfortably in the palm of your hand and is very customisable. I don’t generally like wireless mice as they can be fiddly to connect and I always question the usage time in between charges.

    This mouse works for weeks and that's with me leaving it switched on all the time. When it comes to charging, just connect the cable and carry on using it.

    Keyboard

    I've always been a big fan of mechanical keyboards and prefer them over Apple's over-priced ones. You just can't beat the nice responsive "clickity-clack" from every keypress. I'm still using the Ducky DK9008 Shine 2 my Dad got me in 2013. It's still going strong, unlike the many Apple keyboards that have failed previously.

    Just be careful when using it on a Zoom call, as you will notice how noisy it can come across. The amount of noise emitted by a mechanical keyboard depends on the type of switches used. You can get some really good mechanical keyboards across a variety of price points. If I didn't already have one, I'd choose one from the range offered by Keychron.

    Speaker

    I have a Google Home Max smart speaker sat in the corner of the room that packs a real punch. Even though the speaker isn't in close proximity to my desk, I can issue commands without having to raise my voice.

    Google Home Max speaker

    Plants

    An office space can quite quickly start to look very sterile and I like a little bit of greenery, which is thought to improve productivity and relieve stress. I'm not sure if that's true. All I know is it makes my working space that little bit nicer to be in. The plants I went for are very low maintenance and consist of:

    • Sansevieria: Known as “mother-in-law's tongue” due to its sharp, upright leaves. It emits oxygen and filters toxins from the air.
    • Succulents: Really cheap and small enough to fit into any space.
    • Orchid: Not so low maintenance. Looks very cool when alive though! Mine is currently making its way back from the dead.

    What's Next?

    I think I'm done for the moment. It'll be nice to get some LED strips to fix to my desk and behind my monitor for subtle accent lighting.

  • You probably haven't noticed (and you'd be forgiven if this is the case!) that my site now has the ability to search through posts. This is a strange turn of events for me as I decided to remove search capability from my site many years ago because I didn't feel it added any benefit for the user. This became evident from Google Analytics stats, where searches never hit high enough numbers to warrant having it. The numbers don't lie!

    So what caused this turnaround?

    I've noticed that I'm regularly referring back through posts to refresh myself on things I've done in the past and to find solutions to issues I know I've previously written about. Having a search would make trawling through my few hundred posts a lot easier. So this is more of a personal requirement than a commercial one. But there is an exciting aspect to this as well - experimenting with Algolia. Using Algolia search is something I've been meaning to look into for a long time, along with integrating it into GatsbyJS.

    The thought of having the good ol' magnifying glass back in the navigation makes me nostalgic!

    Note: In this post, I won't be covering the basic Algolia setup or the plugins needed to install as there is already a great wealth of information online. Check out my "Useful Links" section at the end of the post.

    Basic Setup

    Integrating Algolia into GatsbyJS was relatively straightforward thanks to the wealth of information that others have already written and the plugins themselves. The plugins make light work of rendering search results quickly, allowing enough customisation of the HTML markup for easy implementation within any site. By default, the plugins contain the following components:

    • InstantSearch
    • SearchBox
    • Hits
    import algoliasearch from 'algoliasearch/lite';
    import PropTypes from 'prop-types';
    import { Link } from 'gatsby';
    import { InstantSearch, Hits, Highlight, SearchBox } from 'react-instantsearch-dom';
    import React from 'react';
    
    // Get API keys from the environment file.
    const appId = process.env.GATSBY_ALGOLIA_APP_ID;
    const searchKey = process.env.GATSBY_ALGOLIA_SEARCH_KEY;
    const searchClient = algoliasearch(appId, searchKey);
    
    const SearchPage = () => (
      <InstantSearch
        searchClient={searchClient}
        indexName={process.env.GATSBY_ALGOLIA_INDEX_NAME}
      >
        <SearchBox />
        <Hits hitComponent={Hit} />
      </InstantSearch>
    );
    
    function Hit(props) {
      return (
        <article className="hentry post">
          <h3 className="entry-title">
            <Link to={props.hit.fields.slug}>
              <Highlight attribute="title" hit={props.hit} tagName="mark" />
            </Link>
          </h3>
          <div className="entry-meta">
            <span className="read-time">{props.hit.fields.readingTime.text}</span>
          </div>
          <p className="entry-content">
            <Highlight hit={props.hit} attribute="summary" tagName="mark" />
          </p>
        </article>
      );
    }
    
    Hit.propTypes = {
      hit: PropTypes.object.isRequired,
    };
    
    export default SearchPage;
    

    InstantSearch is the core component that directly interacts with Algolia's API and takes two properties: "searchClient", created from the Application ID and Search Key acquired from the Algolia account setup, and "indexName", the name of the index to query. It contains two child components: SearchBox, the search textbox, and Hits, which displays results from the search query.

    It is the Hits component where we can customise the HTML with our own markup by using its "hitComponent" attribute. In my case, I created a function to generate HTML where I access the properties from the search index. What's really cool here is that we can also highlight our search term wherever it occurs in the results by using the Highlight component (also provided by the Algolia plugin) and adding a "tagName" attribute.

    Removing The SearchBox Component

    The standard implementation may not suit all scenarios as you may want a search term to be sent to the InstantSearch component differently. For example, it could be from a custom search textbox or (as in my case) read from a query-string parameter. It wasn't until I started delving further into the standard setup that I realised you cannot just remove the SearchBox component and pass a value directly, but there is a workaround.

    I have expanded upon the code snippet above to demonstrate how my search page works...

    import algoliasearch from 'algoliasearch/lite';
    import PropTypes from 'prop-types';
    import { Link } from 'gatsby';
    import { InstantSearch, Hits, Highlight, connectSearchBox } from 'react-instantsearch-dom';
    import Layout from "../components/global/layout";
    import React, { Component } from "react";
    
    // Get API keys from the environment file.
    const appId = process.env.GATSBY_ALGOLIA_APP_ID;
    const searchKey = process.env.GATSBY_ALGOLIA_SEARCH_KEY;
    const searchClient = algoliasearch(appId, searchKey);
    const VirtualSearchBox = connectSearchBox(() => <span />);
    
    class SearchPage extends Component { 
      state = {
        searchState: {
          query: '',
        },
      };
    
      componentDidMount() {   
        // Get "term" query string parameter value.
        let search = window.location.search;
        let params = new URLSearchParams(search);
        let searchTerm = params.get("term");
    
        // Send the query string value to a "searchState" object used by Algolia.
        this.setState(state => ({
          searchState: {
            ...state.searchState,
            query: searchTerm,
          },
        }));
     }
    
      render() {
          // Default "instantSearch" HTML to prompt user to enter a search term.
          var instantSearch = null;
          
          // If there is a search term, utilise Algolia's instant search.
          if (this.state.searchState.query) {
            instantSearch = <div className="entry-content">
                              <h2>You've searched for "{this.state.searchState.query}".</h2>
                              <div className="post-list archives-list">
                              <InstantSearch
                                  searchClient={searchClient}
                                  indexName={process.env.GATSBY_ALGOLIA_INDEX_NAME}
                                  searchState={this.state.searchState}
                                >
                                  <VirtualSearchBox />
                                  <Hits hitComponent={Hit} />
                                </InstantSearch>  
                              </div>
                            </div>
          }
          else {
            instantSearch = <div className="entry-content">
                              <h2>You haven't entered a search term.</h2>
                              <p>Carry out a search by clicking the <em>magnifying glass</em> in the navigation.</p>
                            </div>
          }
    
          return (
            <Layout>
              <header className="page-header">
                <h1>Search</h1>
                <p>Search the knowledge-base...</p>
              </header>
              <div id="primary" className="content-area">
                <div id="content" className="site-content" role="main">
                    <div className="layout-fixed">
                        <article className="page hentry">
                          {instantSearch}
                        </article>
                    </div>
                </div>
              </div>
          </Layout>
        )
      }
    }
    
    function Hit(props) {
      return (
        <article className="hentry post">
          <h3 className="entry-title">
            <Link to={props.hit.fields.slug}>
              <Highlight attribute="title" hit={props.hit} tagName="mark" />
            </Link>
          </h3>
          <div className="entry-meta">
            <span className="read-time">{props.hit.fields.readingTime.text}</span>
          </div>
          <p className="entry-content">
            <Highlight hit={props.hit} attribute="summary" tagName="mark" />
          </p>
        </article>
      );
    }
    
    Hit.propTypes = {
      hit: PropTypes.object.isRequired,
    };
    
    export default SearchPage
    

    My code is reading from a query-string value and passing that to a "searchState". The searchState object is created by React InstantSearch internally. Every widget inside the library has its own way of updating it. It contains parameters on the type of search that should be performed, such as query, sorting and pagination, to name a few. All we're interested in doing is updating the query parameter of this object with our search term.

    If the query parameter of the "searchState" object is populated, search results are rendered; otherwise, a message is displayed stating a search term is required.

    One thing to notice is the SearchBox has been replaced with a VirtualSearchBox, which uses the connector of the search box to create a virtual widget - in our case an empty span tag. This will link the InstantSearch component with the query. Having some form of search box component is compulsory.
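
    For completeness, here is a minimal sketch of how the search term can end up in the query string in the first place. The component below is hypothetical (mine lives in the site navigation), but it shows the general idea: rather than querying Algolia on every keystroke, submitting the form navigates to the search page with a "term" parameter.

    import React, { useState } from "react";
    import { navigate } from "gatsby";

    // Hypothetical navigation search box: redirects to the search page with the
    // entered term as a query-string parameter instead of searching inline.
    const NavSearchBox = () => {
      const [term, setTerm] = useState("");

      const onSubmit = event => {
        event.preventDefault();
        navigate(`/search?term=${encodeURIComponent(term)}`);
      };

      return (
        <form onSubmit={onSubmit} role="search">
          <input
            type="search"
            value={term}
            onChange={event => setTerm(event.target.value)}
            placeholder="Search posts..."
          />
        </form>
      );
    };

    export default NavSearchBox;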

    Conclusion

    I prefer not to use the out-of-the-box search box component as it potentially saves requests to Algolia's API, since searches aren't being made on the fly as a user enters a search term - which is the plugin's default behaviour.

    Passing a search term through a query-string may come across as a little backwards, especially when it's rather nice to see search results change before your eyes as you type letter-by-letter. However, the on-the-fly approach misses one key element: tracking in Google Analytics. Even though I will primarily be the person making the most use of my site search, it'll be interesting to see who else uses it and what search keywords are used.

    Useful Links

  • ASP.NET Core contains a variety of useful Tag Helpers to enable server-side code to participate in creating and rendering HTML elements in our Views. One Tag Helper, in particular, has the ability to cache bust links to static resources such as image, CSS and JavaScript files by appending an asp-append-version="true" attribute.

    The asp-append-version attribute automatically appends a version query string to the file reference, generated from a SHA256 hash of the file's contents, so whenever the file is updated the server produces a new unique version. For a deeper understanding of how ASP.NET Core performs this piece of functionality, give the following StackOverflow post a read: How does javascript version (asp-append-version) work in ASP.NET Core MVC?.
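
    For example, a quick illustration of the Tag Helper in use (the file path and rendered hash below are purely illustrative):

    <!-- In a Razor View -->
    <script src="~/js/site.js" asp-append-version="true"></script>

    <!-- Rendered output - the hash value will differ and changes whenever the file changes -->
    <script src="/js/site.js?v=Kl_dqr9NVtnMdsM2MUg4qthUnWZm5T1fCEimBPWDNgM"></script>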

    This approach works perfectly if you're linking to your static resources using the relevant HTML tag, for example img, script or link. In my scenario, I'm using a JavaScript library called LabJS - a dynamic script loader that gives the ability to control the loading and execution of different plugins. For example:

    <script>
      $LAB
      .script("http://remote.tld/jquery.js").wait()
      .script("/local/plugin1.jquery.js")
      .script("/local/plugin2.jquery.js").wait()
      .script("/local/init.js").wait(function(){
          initMyPage();
      });
    </script>
    

    I need to be able to append a query string parameter to one of the JavaScript file references. One thing that came to mind was to use the application's last build time as the cache busting value. Whenever the application is updated, this value will automatically be updated, so no manual intervention is required.

    I found code examples from meziantou.net that demonstrated various approaches to acquiring an application's build date. I modified the "Linker timestamp" example to return a Unix timestamp in a newly created class called AssemblyUtils.

    public class AssemblyUtils
    {
        #region Properties
    
        public int UnixTimestamp { get; set; }
    
        #endregion
    
        /// <summary>
        /// Get timestamp in Unix seconds for the last build.
        /// </summary>
        /// <returns></returns>
        public static int GetBuildTimestamp()
        {
            const int peHeaderOffset = 60;
            const int timestampOffset = 8;
    
            byte[] bytes = new byte[2048];
    
            using (FileStream file = new FileStream(Assembly.GetExecutingAssembly().Location, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
                file.Read(bytes, 0, bytes.Length);
    
            int headerPos = BitConverter.ToInt32(bytes, peHeaderOffset);
            int unixTime = BitConverter.ToInt32(bytes, headerPos + timestampOffset);
    
            return unixTime;
        }
    }
    

    The code will only return a valid build timestamp if your Visual Studio .csproj file (from version 15.4 onwards) includes the following setting within the <PropertyGroup> settings:

    <Deterministic>False</Deterministic>
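
    For context, here is a sketch of where the setting sits in the project file; the target framework shown is only an example:

    <PropertyGroup>
      <TargetFramework>netcoreapp3.1</TargetFramework>
      <!-- Deterministic builds write a hash rather than a real timestamp into the PE header,
           so this must be disabled for the linker timestamp approach to work. -->
      <Deterministic>False</Deterministic>
    </PropertyGroup>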
    

    It would be a waste to constantly call the GetBuildTimestamp() method to acquire assembly information directly within the page View, when a better approach would be to make this call once on application startup.

    public void ConfigureServices(IServiceCollection services)
    {
        #region Assembly Utils - Build Time
    
        Action<AssemblyUtils> assemblyBuildOptions = (opt =>
        {
            opt.UnixTimestamp = AssemblyUtils.GetBuildTimestamp();
        });
    
        services.Configure(assemblyBuildOptions);
        services.AddSingleton(resolver => resolver.GetRequiredService<IOptions<AssemblyUtils>>().Value);
    
        #endregion
    }
    

    We can access the build timestamp value using Dependency Injection within a base controller that gets inherited by all controllers.

    public class BaseController : Controller
    {
        private int _buildTimestamp { get; set; }

        public BaseController(AssemblyUtils assemblyUtils)
        {
            _buildTimestamp = assemblyUtils.UnixTimestamp;
        }

        public override void OnActionExecuting(ActionExecutingContext context)
        {
            base.OnActionExecuting(context);

            // Assign build timestamp to a View Bag.
            ViewBag.CacheBustingValue = _buildTimestamp;
        }
    }
    

    The timestamp is assigned to a ViewBag that can then be accessed at View level.

    <script>
      $LAB
      .script("http://remote.tld/jquery.js").wait()
      .script("/local/plugin1.jquery.js")
      .script("/local/plugin2.jquery.js").wait()
      .script("/local/init.js?v=@ViewBag.CacheBustingValue").wait(function(){
          initMyPage();
      });
    </script>
    

    This will result in the following output:

    <script>
      $LAB
      .script("http://remote.tld/jquery.js").wait()
      .script("/local/plugin1.jquery.js")
      .script("/local/plugin2.jquery.js").wait()
      .script("/local/init.js?v=1609610821").wait(function(){
          initMyPage();
      });
    </script>
    
  • I normally like my last blog post of the year to end with a year in review. In light of being in Tier 4 local restrictions, there isn't much to do during the festive period, unlike previous years. So I have decided to use this time to tinker around with various tech stacks and work on my own site to keep me busy.

    Whilst making some efficiency improvements under the hood to optimise my site's build and loading times, I randomly decided to check the security headers on securityheaders.com and to my surprise received a grade 'D'. When my site previously ran on the .NET Framework, I managed to lock things down enough to be graded an 'A'. I guess one of my misconceptions on moving to a statically-generated site was that there isn't a need for such headers. How wrong I was.

    A dev.to post by Matt Nield explains why static sites need basic security headers in place:

    As you add external services for customer reviews, contact forms, and eCommerce integration etc., we increase the number of possible vulnerabilities of the application. It may be true that your core data is only accessed when you rebuild your application, but all of those other features added can leave you, your customers, and your organisation exposed. Being frank, even if you don't add external services there is a risk. This risk is easily reduced using some basic security headers.

    Setting security headers on a Netlify-hosted site couldn't be simpler. If, like me, your site is built using GatsbyJS, you simply need to add a _headers file in the /static directory containing the following header rules:

    /*
    X-Frame-Options: DENY
    X-XSS-Protection: 1; mode=block
    Referrer-Policy: no-referrer
    X-Content-Type-Options: nosniff
    Content-Security-Policy: base-uri 'self'; default-src 'self' https: ; script-src 'self' 'unsafe-inline' https: ; style-src 'self' 'unsafe-inline' https: blob: ; object-src 'none'; form-action 'self' https://*.twitter.com; font-src 'self' data: https: ; connect-src 'self' https: ; img-src 'self' data: https: ;
    Feature-Policy: geolocation 'self'; midi 'self'; sync-xhr 'self'; microphone 'self'; camera 'self'; magnetometer 'self'; gyroscope 'self'; fullscreen 'self'; payment 'self'
    

    When adding a "Content-Security-Policy" header be sure to thoroughly re-check your site as you may need to whitelist resources that are loaded from a different origin. For example, I had to make some tweaks specifically to the "Content-Security-Policy" to allow embedded Tweets to render correctly.
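
    To give a purely illustrative example (and not my exact policy), a policy that didn't broadly allow "https:" would need the Twitter widget origins whitelisting along these lines, since embedded Tweets load a script and an iframe from platform.twitter.com:

    Content-Security-Policy: default-src 'self'; script-src 'self' https://platform.twitter.com; frame-src 'self' https://platform.twitter.com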

    My site is now back to its 'A' grade glory!

    Useful Links

  • Year In Review - 2020

    7 min read

    Well, hasn’t this been an interesting year? I couldn't have described it as eloquently as Helen Rosner, who managed to sum up the thoughts that ran through my mind at the start of the pandemic:

    2020 In Words/Phrases

    Coronavirus, Covid-19, holiday less, lockdown, DIY, UniFi router, armchair, daily HIIT sessions, home office setup, GatsbyJS, 10th work anniversary, Netlify hosting, WordPress (begrudgingly!), Batman Begins, Failing MacBook Pro, Social Share Image Improvements, Hubspot, work from home, social hermit, ultra-wide curved monitor, smart-home automation, family, new forest, outside Christmas lights, Pixel 5, Azure Web Apps, Azure DevOps, Google Photos disappointment

    Holiday - A Paradise Lost

    Over the last couple of years, I started a tradition of writing about the one big holiday I like to take each year. This was supposed to be the year where I expanded my Maldives horizons (after visiting Vilamendhoo last year) by holidaying on another island - Cocoon Island!

    I wanted to go to Cocoon Island to celebrate my 35th year of being on this earth with family. But alas, it was not to be. Covid-19 cast uncertainty over travel throughout the year and I'm hoping (like many others) that I'll have the opportunity to travel once again in the coming year.

    My Site

    This has been the year where I fully transitioned my site onto the Gatsby framework and had an absolute ball in doing so! There is something liberating about having a website that doesn't rely on a conventional CMS platform and, as a bonus, I'm saving around £100 in yearly hosting costs after moving to Netlify.

    I need to pluck up the courage to update the front-end build as it not only looks dated but doesn't score very well on Google Lighthouse, which is something that should be easily achievable using GatsbyJS. Redeveloping this aspect of my website has always taken a back seat as writing content takes precedence. Strangely enough, looking back over the year I should have had time to write more, especially during the lockdown period, but I found this year to be mentally exhausting.

    Statistics

    When I look at the stats for this year, my older posts still seem to get the most traction. Maybe the numbers are trying to tell me that my more recent posts aren't that interesting. In all seriousness, I have had another positive bump, but not on the same level as in previous years. I am ok with that.

    As I stated in my last year in review post, I accepted that the figures will plateau. I’m surprised I managed to get any increase in stats as I lacked focus when it came to blogging and most importantly talking more about unique technical subjects with depth.

    2019/2020 Comparison:

    • Users: +11.45%
    • Page Views: +10.54%
    • New Users: +10.72%
    • Bounce Rate: -0.01%
    • Search Console Total Clicks:  +99%
    • Search Console Impressions: +91%
    • Search Console Page Position: +1.7%

    Experiencing The Missed Cinematic Experience of 2005

    On the 15th June 2005, a film was released that would forever redefine super-hero cinema - Batman Begins! There are certain films that must be seen on the big screen and for me, Batman Begins was one of them. It was unfortunate I gave it a miss on release as I fell out of love with the film interpretations of Batman after “Batman and Robin” scarred me for life.

    I instantly regretted this misstep when I finally watched the film on DVD over a year later. I yearned for the day I'd get an opportunity to see Christopher Nolan's Batman on the big screen. Fast-forward 15 years from its original release, and Covid-19 presented a small silver lining where a handful of films were re-released to fill the gaping hole in the cinemas' schedule, caused by film studios withholding their new releases.

    The screening itself couldn’t have been more perfect. Sitting in the VIP seating area and having the whole auditorium to myself, gave a somewhat immersive and intimate viewing experience.

    The MacBook Pro Engineer

    My 2015 MacBook Pro's battery has been failing for some time now. So much so that it's become a glorified desktop rather than a laptop, as any attempt to disconnect would result in the full loss of power. With my laptop out of warranty, and having even considered buying a replacement, I plucked up the courage to replace the battery myself. Some may call this madness, but I thought it would be the quickest way to get a new battery in when compared to the estimated time Apple quoted me - 2 weeks. Two weeks is a very long time to be without a laptop.

    There is such a wealth of online resources demonstrating how the battery can be replaced via DIY videos on YouTube and iFixit tutorials. I'll admit, it takes guts to rip out an existing battery mainly due to the heavy-duty adhesive. It's a slow and arduous process. After this is done, the rest is plain sailing.

    I wish I could say my laptop is fully operational but it’s still a glorified desktop as I am still getting battery health warnings, leading me to think some other component is playing up.

    Syndicut

    1st July marked my 10th anniversary at Syndicut. I always knew I wanted my 10th anniversary to be marked with something memorable... Covid made it memorable indeed, for all the wrong reasons. I would have preferred to celebrate with my workmates on a social outing of some sort; instead, it was a more low-key affair involving a raised glass of the finest Rioja to another successful 10 years!

    At this point, I have to really thank Steve and Nick (the directors of Syndicut) who managed to steer us through the choppy waters of the Covid-19 ripple effect. It’s thanks to them our jobs remained secure and I’m sure my fellow work colleagues would express the same gratitude that we came through the other side! For the first time in my life, I felt the possibility of facing financial insecurity.

    If this year has taught me anything, it's not to take one's job and career for granted, especially when words such as "furlough" and "unemployment" are so prevalent.

    Journey for Self Improvement

    Depending on how one looks at it, living on your lonesome during a lockdown can be a recipe for borderline insanity! You could while away the time watching excessive amounts of TV or playing Scrabble GO (my lockdown game of choice!) with friends and randoms across the world, or utilise this time improving one's self. As they say - idle hands make for the devil's work.

    With so much time on my hands, I became very conscious of ensuring I was being as productive as I could, whether that was doing DIY, learning a new programming framework/language or forcing myself to exercise more often using resistance bands with gyms being closed. Seriously, those resistance bands are worth every penny. I don't think I'll ever be going back to the gym.

    Home Improvement

    When in lockdown, I no longer had an excuse to put off all the DIY and general house jobs I'd previously been telling myself I was too busy to complete. The outcome has been very satisfying and I can finally say things are more homely.

    My most precious purchase is the new leather armchair which I've placed in the corner of the room along with some plants. It's since become a place where I can read, write and think... I've called it my "thinking space"! :-)

    Working from home gave me the extra push to properly kit out a small office space. Thankfully, this was already in motion before the lockdown and I already had a nice industrial desk (made out of re-purposed Indian mango wood and steel) and a leather chair. Over the months, I kept adding more items to make my work life more comfortable. Currently, I am awaiting some Displates to cover up the bare walls.

    I've also been delving into some smart-home automation, starting with the purchase of some smart plugs, which has left me wanting more! At some point in the future, I could hook up my smart devices to a Raspberry Pi for additional control through a mini touchscreen. Now that would be very cool!

    Google Pixel 5

    I didn't end up getting an iPhone to complement my iPad purchase from last year. Couldn't bring myself to do it. Even though I've been looking for a replacement for my Pixel 2 for some time, there weren't any Android phones I deemed a worthy purchase. Last year's Pixel 4 didn't tick the boxes I'd hoped it would, so I opted for this year's Pixel 5.

    The Pixel 5 isn’t what I’d class as the typical flagship. Google has redefined what they class as a “flagship” by not using the most up-to-date components when it comes to the processor and the camera. Strangely enough, the camera hardware hasn’t been updated since the Pixel 2, which is very odd. Nevertheless, I have found the Pixel 5 to be a fine phone. The battery lasts me two days on a single charge and (most importantly!) the camera picture quality cannot be faulted.

    Home Network Upgrade

    In light of having to work from home, I thought now might be a good time to give the network a little more stability, speed and security. My trusty old Billion 7800DXL router had started to wane and I found myself having to manually restart it on a daily basis. After failing to find up-to-date firmware to help remedy the issue, I thought it best to opt for an upgrade to a prosumer-grade router - the UniFi Dream Machine.

    At some point, I would like to beef up my network setup by getting a network switch cabinet filled with hardware from the UniFi range of products. Even though this would be overkill for my needs, it would be very interesting to set up.

    Final Thoughts

    I leave 2020 with an immense sense of gratitude that all those I consider close to me are safe and healthy. It's strange to think the last year has been something we have all borne witness to and experienced together. Covid-19 has changed things - the very fabric of our existence. It squashes a person's ego.

  • The older I get, the more obsessed I have become with preserving life’s memories through photos and video. With so many companies offering their storage solutions, we’re living in an age where storage is no longer something that comes at a premium. There are a wide variety of pricing and feature tiers for all, benefiting us as consumers. If you have full trust in the service provider, they are suited particularly well for the majority of consumer needs. But as a consumer, you need to be prepared to shift with potential service changes that may or may not work in your favour.

    For many years, I have always been conscious that I’m a photo hoarder and believe that there isn’t a bad photo one can take with the help of advancements in phone camera technology. If you ask any of my work colleagues, they’d probably tell you I have a problem. When we go on any of our socials, I’m the first person to whip out my phone and take pictures as they make nice mementoes to look back on and share.

    On a more personal note, during last year's Diwali, as we all sat down to my mum belting out her prayers, I came to the sudden realisation that this will not last forever and that these are truly special moments. Being an Indian who is culturally inept in all the senses and cannot speak his native tongue, I would be comforted knowing that I'll have photos and video to look back on in many years to come. From that moment, I decided to make an active effort to capture smaller moments like these. Maybe the pandemic has shown me not to take things for granted and to appreciate time with family more.

    I got a little serious in my crusade and took things a step further by acquiring as many family photos as possible by purchasing a photo scanner to digitise all prints for safekeeping. Prints fade in time, not in the digital world.

    Photo Backup Strategy

    Whether I take photos on my phone or my FujiFilm X100F camera, the end destination will always be my Synology NAS where I have the following redundancies in place:

    • RAID backup on all hard disks.
    • Nightly backups to the cloud using BackBlaze.
    • Regular backup to an external disk drive and stored off-site.

    As expected, my phone gets the most use and everything by default is stored within my Google Photos account using the free unlimited storage. I then use Synology Moments, which acts as my local Google Photos, where my photos are automatically stored on my Synology in original quality.

    My camera mostly gets used when I go on holiday and to events. I store the RAW and processed photos on my Synology. I still upload the processed photos to Google Photos as I love its AI search capability and it makes sharing easy.

    At the end of the day, the layers of redundancy you put in place depend on how important specific photos are to you. I like the idea of controlling my own backups. I take comfort knowing my data is stored in different places:

    • Synology
    • Backblaze
    • Google Photos
    • Offsite Hard Drive

    Cloud Storage and Shifting Goalposts

    The fear I had pushed to the back of my head finally came to the forefront when Google changed its storage policy.

    The recent news regarding the changes to the Google Photos service gives me a sense of resolve, knowing my local storage solution is already working in parallel with Google Photos. But I can't help but feel disappointed by the turn of events. Even though I can to some extent understand Google's change to their service, I can't help but feel slightly cheated. After all, they offered us all free unlimited storage in exchange for allowing them to apply data mining and analysis algorithms to improve their services. That's the price you pay for using a free service. You are the product (this I have no grievances with)!

    Now they have enough of our data, they can feel free to cut the cord. We all know Google has a history of just killing products. Google Photos may not be killed, but life has certainly been sucked out of it.

    It may come across as if I’m solely bashing Google Photos, when in fact this is a clear example of how companies can change their service conditions for their benefit and face no repercussions. We as users have no say on the matter and just have to roll with the punches. It just seems wrong that a company would entice so many users with a free service to then strip it away. This is a classic monopolistic strategy to grab market share by pricing out its competitors to now demand money from its users.

    For me, Google Photos provided a fundamental part of the photo storage experience by making things easily accessible to family and friends. No longer will I be able to invite friends/family to contribute to shared albums unless they opt for the paid plan. Now when you’re surrounded by iPhone users, this creates another barrier of entry.

    This has cemented my stance more so to ensure I have control of my assets and services, which is something I have been doing.

    Final Thoughts

    If I have carried out my photo archival process correctly, they should be accessible to future generations for many years to come and continue to live on even after I’ve expired. This should be achievable as I’ll continue to maintain this time-capsule as technology continues to evolve.

    The most important take-away: if you strip down my approach to the bare bones, I'm not giving in to the monopolistic behaviour of the tech giants - Google, Apple or Microsoft. I'm just using them as a secondary thought to complement my process. It's just my NAS doing the heavy lifting where I set the rules.

    These priceless heirlooms are a legacy and my gift for future generations to come.

  • I love my Google Nest and it truly is a revolutionary piece of kit. Not only does it display my photos but it also forms a key part of some basic smart-home automation. I really have no gripes. But there is one small area I feel it's lacking. The Radio Alarm. I'm the type of person who detests alarm sounds and prefer my sleep cycle to be shattered by something a little softer, like a radio station.

    How difficult is it for Google to add a feature that will allow one to wake up to their favourite radio station? I would have thought this key feature would be very easy to put in place; after all, the Nest Hub can carry out much more complex operations. There are varying reports that this feature is available only within the US, which I find very odd. Does Google not know that here in the UK we would also find this feature useful?

    In the meantime, whilst I await an official release (that may not come anytime soon!) I managed to concoct a somewhat preposterous way to get some form of radio alarm automation. You will require the following:

    • An Android phone with Google Assistant capability
    • Google Nest Hub (standard or max)
    • Phone stand to sit next to your Nest Hub (optional)

    The premise of the approach I detail is to get an Android phone to fire off the alarm at the desired time; when the alarm is dismissed manually, the phone utters a phrase that is picked up by your Google Nest Hub, which then plays your radio station.

    If you’re still here and intrigued by this approach, let's get to it!

    The first thing we need to do is set up a “Good Morning” routine on the Google Nest Hub, which can be done through the Google Home app on your phone. It is here where we will carry out the following:

    1. Assistant will section: Adjust media volume to 40%.
    2. And then play section: Select Play radio and enter the name of a radio station.
    3. Save the routine.

    Now when you utter the magic phrase “Good morning”, the Google Nest Hub will do exactly what we set up in our routine. Now we need to add some automation to do this for us and this is where the alarm feature on your Android phone comes into play.

    I cannot be sure if the alarm feature on all newish Android phones gives the ability to define a Google Assistant routine. If it does, you should see this as an option when setting the alarm. We need to carry out a similar process to the one above when setting the "Good Morning" routine on the Google Nest Hub:

    1. When I dismiss my alarm: Adjust media volume to 50%.
    2. Select the “Add action” button and under the “Enter command” tab, enter the following text: Hey Google. Good Morning.
    3. Leave the “And then play” section to do nothing.
    4. Save the routine.

    Your phone will ideally be placed in close proximity to your Google Nest Hub for the "Hey Google. Good Morning" utterance to be heard. In my case, I have my phone right next to the Nest Hub on my bedside cabinet to make it easy to dismiss the alarm.

    I have to concede the approach I've taken comes across as quite lame. It just seems ridiculous that you have to rely on a phone to fire off a process so that the radio plays automatically. Why can't routines be more flexible at Nest Hub level?

    I'm unable to determine whether my approach comes across as naive or clever. Maybe it's somewhere in between.

  • Another day, another ASP.NET Core error... This time relating to JSON not being parsable. Like the error I posted yesterday, this was another strange one as it only occurred within an Azure environment.

    Let me start by showing the error in full:

    Application '/LM/W3SVC/144182150/ROOT' with physical root 'D:\home\site\wwwroot\' hit unexpected managed exception, exception code = '0xe0434352'. First 30KB characters of captured stdout and stderr logs:
    Unhandled exception. System.FormatException: Could not parse the JSON file.
     ---> System.Text.Json.JsonReaderException: '0x00' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.
       at System.Text.Json.ThrowHelper.ThrowJsonReaderException(Utf8JsonReader& json, ExceptionResource resource, Byte nextByte, ReadOnlySpan`1 bytes)
       at System.Text.Json.Utf8JsonReader.ConsumeValue(Byte marker)
       at System.Text.Json.Utf8JsonReader.ReadFirstToken(Byte first)
       at System.Text.Json.Utf8JsonReader.ReadSingleSegment()
       at System.Text.Json.Utf8JsonReader.Read()
       at System.Text.Json.JsonDocument.Parse(ReadOnlySpan`1 utf8JsonSpan, Utf8JsonReader reader, MetadataDb& database, StackRowStack& stack)
       at System.Text.Json.JsonDocument.Parse(ReadOnlyMemory`1 utf8Json, JsonReaderOptions readerOptions, Byte[] extraRentedBytes)
       at System.Text.Json.JsonDocument.Parse(ReadOnlyMemory`1 json, JsonDocumentOptions options)
       at System.Text.Json.JsonDocument.Parse(String json, JsonDocumentOptions options)
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationFileParser.ParseStream(Stream input)
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationFileParser.Parse(Stream input)
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationProvider.Load(Stream stream)
       --- End of inner exception stack trace ---
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationProvider.Load(Stream stream)
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.Load(Boolean reload)
    --- End of stack trace from previous location where exception was thrown ---
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.HandleException(ExceptionDispatchInfo info)
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.Load(Boolean reload)
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.Load()
       at Microsoft.Extensions.Configuration.ConfigurationRoot..ctor(IList`1 providers)
       at Microsoft.Extensions.Configuration.ConfigurationBuilder.Build()
       at Microsoft.Extensions.Logging.AzureAppServices.SiteConfigurationProvider.GetAzureLoggingConfiguration(IWebAppContext context)
       at Microsoft.Extensions.Logging.AzureAppServicesLoggerFactoryExtensions.AddAzureWebAppDiagnostics(ILoggingBuilder builder, IWebAppContext context)
       at Microsoft.Extensions.Logging.AzureAppServicesLoggerFactoryExtensions.AddAzureWebAppDiagnostics(ILoggingBuilder builder)
       at Microsoft.AspNetCore.Hosting.AppServicesWebHostBuilderExtensions.<>c.<UseAzureAppServices>b__0_0(ILoggingBuilder builder)
       at Microsoft.Extensions.DependencyInjection.LoggingServiceCollectionExtensions.AddLogging(IServiceCollection services, Action`1 configure)
       at Microsoft.AspNetCore.Hosting.WebHostBuilderExtensions.<>c__DisplayClass8_0.<ConfigureLogging>b__0(IServiceCollection collection)
       at Microsoft.AspNetCore.Hosting.HostingStartupWebHostBuilder.<>c__DisplayClass6_0.<ConfigureServices>b__0(WebHostBuilderContext context, IServiceCollection services)
       at Microsoft.AspNetCore.Hosting.HostingStartupWebHostBuilder.ConfigureServices(WebHostBuilderContext context, IServiceCollection services)
       at Microsoft.AspNetCore.Hosting.GenericWebHostBuilder.<.ctor>b__5_2(HostBuilderContext context, IServiceCollection services)
       at Microsoft.Extensions.Hosting.HostBuilder.CreateServiceProvider()
       at Microsoft.Extensions.Hosting.HostBuilder.Build()
       at Site.Web.Program.Main(String[] args) in C:\Development\surinder-main-website\Site.Web\Program.cs:line 11
    
    Process Id: 2588.
    File Version: 13.1.20169.6. Description: IIS ASP.NET Core Module V2 Request Handler. Commit: 62c098bc170f50feca15916e81cb7f321ffc52ff
    

    The application was not consuming any form of JSON as part of its main functionality. The only JSON being used were three variations of appsettings.json - each one for development, staging and production. So this had to be the source of the issue. The error message also confirmed this as Program.cs was referenced and it’s at this point where the application startup code is run.
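
    For context, configuration is loaded while the host is being built in Program.cs, which is why the stack trace points there. A stock .NET Core 3.x template (not necessarily my exact file) looks like this:

    using Microsoft.AspNetCore.Hosting;
    using Microsoft.Extensions.Hosting;

    public class Program
    {
        public static void Main(string[] args)
        {
            // Building the host loads appsettings.json and appsettings.{Environment}.json,
            // so a file that cannot be parsed throws before the app starts serving requests.
            CreateHostBuilder(args).Build().Run();
        }

        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                .ConfigureWebHostDefaults(webBuilder =>
                {
                    webBuilder.UseStartup<Startup>();
                });
    }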

    My first thought was that I must have forgotten a comma or missed a closing quote for one of my values. After running the JSON through a validator, it passed with flying colours.

    Solution

    After some investigation, the issue was caused by the encoding of the files. All the appsettings.json files were saved as "UTF-8" and, as a result, possibly contained metadata that stopped the application from reading them. Once this was changed to "UTF-8-BOM" through Notepad++, everything worked fine.
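
    If you want to verify what a file was actually saved as before deploying, a minimal sketch like the one below (assuming appsettings.json sits in the working directory) prints which byte order mark, if any, the file starts with:

    using System;
    using System.IO;

    class EncodingCheck
    {
        static void Main()
        {
            // Read the raw bytes so no text reader silently strips a byte order mark.
            byte[] bytes = File.ReadAllBytes("appsettings.json");

            if (bytes.Length >= 3 && bytes[0] == 0xEF && bytes[1] == 0xBB && bytes[2] == 0xBF)
                Console.WriteLine("UTF-8 with BOM");
            else if (bytes.Length >= 2 && ((bytes[0] == 0xFF && bytes[1] == 0xFE) || (bytes[0] == 0xFE && bytes[1] == 0xFF)))
                Console.WriteLine("UTF-16 - the 0x00 bytes in this encoding will upset System.Text.Json");
            else
                Console.WriteLine("No BOM detected (plain UTF-8 or ANSI)");
        }
    }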

    You gotta love .NET Core compilation errors! They provide the most ambiguous error messages known to man. I have noticed the same error message and accompanying error code can be caused by a multitude of factors. This error is no different, so I'll make my contribution in the hope it may help someone else.

    The error in question occurred quite randomly whilst deploying a minor HTML update to a .NET Core site I was hosting within an Azure Web App. It couldn't have been a simpler release - a change to some markup in a View. When the site loaded, I was greeted with the following error:

    Failed to start application '/LM/W3SVC/####/ROOT', ErrorCode '0x8007023e’.
    

    I was able to get some further information about the error from the Event Log:

    Application 'D:\home\site\wwwroot\' failed to start. Exception message:
    Executable was not found at 'D:\home\site\wwwroot\%LAUNCHER_PATH%.exe'
    Process Id: 10848.
    File Version: 13.1.19331.0. Description: IIS ASP.NET Core Module V2.
    

    The error could only be reproduced on Azure and not within my local development and staging environments. I created a new deployment slot to check if somehow my existing slot had got corrupted. Unfortunately, this made no difference. The strange thing is, the application was working completely fine up until this release. It's still unknown to me what could have happened for this error to occur all of a sudden.

    Solution

    It would seem that no one else on the planet had experienced this issue when I Googled the error message and error code. After a lot of fumbling around, the fix ended up being relatively straightforward. The detail provided by the Event Log pointed me in the right direction and the clue was in the %LAUNCHER_PATH% placeholder. The %LAUNCHER_PATH% placeholder is set in the web.config and is normally replaced when the application is run in Visual Studio or IIS.

    In Azure, both the %LAUNCHER_PATH% and %LAUNCHER_ARGS% placeholders need to be replaced with explicit values. The following line in the web.config needs to be changed from:

    <aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="false" startupTimeLimit="3600" requestTimeout="23:00:00" hostingModel="InProcess">
    

    To:

    <aspNetCore processPath=".\Site.Web.exe" arguments="" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="false" startupTimeLimit="3600" requestTimeout="23:00:00" hostingModel="InProcess">
    

    The processPath is now pointing to the executable generated by the project - in this case, "Site.Web.exe". Also, since no arguments are being passed in my build, the arguments attribute is left empty. When you push up your next release, the error should be rectified.

    As a side note, there was one thing recommended to me by Azure support regarding my publish settings in Visual Studio. It was recommended that I change the deployment mode from "Framework-Dependent" to "Self-Contained". This will ensure the application always runs on its current framework version on the off-chance framework changes happen at an Azure level.
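
    For reference, a self-contained publish can be produced from the CLI along these lines; the runtime identifier is an assumption (Azure App Service on Windows defaults to a 32-bit worker process):

    dotnet publish -c Release -r win-x86 --self-contained true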

  • The Kentico Kontent ASP.NET Core boilerplate contains a CustomContentLinkUrlResolver class that allows all links within your content to be transformed into a custom URL path based on the content-type a link is referencing. The out-of-the-box boilerplate solution works for most scenarios. But there will be times when links cannot be resolved in such a simplistic fashion, especially if your project is using dynamic page routing.

    What we need to do is make a small tweak to the CustomContentLinkUrlResolver class so we can use Kontent's DeliveryClient object, which in turn allows us to query the API and carry out a complex ruleset for resolving URLs.

    To give a frame of reference, the out-of-the-box CustomContentLinkUrlResolver class contains the following code:

    public class CustomContentLinkUrlResolver : IContentLinkUrlResolver
    {
        /// <summary>
        /// Resolves the link URL.
        /// </summary>
        /// <param name="link">The link.</param>
        /// <returns>A relative URL to the page where the content is displayed</returns>
        public string ResolveLinkUrl(ContentLink link)
        {
            return $"/{link.UrlSlug}";
        }
    
        /// <summary>
        /// Resolves the broken link URL.
        /// </summary>
        /// <returns>A relative URL to the site's 404 page</returns>
        public string ResolveBrokenLinkUrl()
        {
            // Resolves URLs to unavailable content items
            return "/404";
        }
    }
    

    This will be changed to:

    public class CustomContentLinkUrlResolver : IContentLinkUrlResolver
    {
        IDeliveryClient deliveryClient;
        public CustomContentLinkUrlResolver(DeliveryOptions deliveryOptions)
        {
            deliveryClient = DeliveryClientBuilder.WithProjectId(deliveryOptions.ProjectId).Build();
        }
    
        /// <summary>
        /// Resolves the link URL.
        /// </summary>
        /// <param name="link">The link.</param>
        /// <returns>A relative URL to the page where the content is displayed</returns>
        public string ResolveLinkUrl(ContentLink link)
        {                
            switch (link.ContentTypeCodename)
            {
                case Home.Codename:
                    return "/";
                case BlogListing.Codename:
                    return "/Blog";
                case BlogPost.Codename:
                    return $"/Blog/{link.UrlSlug}";
                case NewsArticle.Codename:
                    // A simplistic example of the Delivery Client in use to resolve a link...
                    NewsArticle newsArticle = Task.Run(async () => await deliveryClient.GetItemsAsync<NewsArticle>(
                                                                                new EqualsFilter("system.id", link.Id),
                                                                                new ElementsParameter("url"),
                                                                                new LimitParameter(1)
                                                                            )).Result?.Items.FirstOrDefault();
    
                    if (!string.IsNullOrEmpty(newsArticle?.Url))
                        return newsArticle.Url;
                    else
                        return ResolveBrokenLinkUrl();
                default:
                    return $"/{link.UrlSlug}"; 
            }
        }
    
        /// <summary>
        /// Resolves the broken link URL.
        /// </summary>
        /// <returns>A relative URL to the site's 404 page</returns>
        public string ResolveBrokenLinkUrl()
        {
            // Resolves URLs to unavailable content items
            return "/404";
        }
    }
    

    In the updated code, we are using the DeliveryClientBuilder.WithProjectId() method to create a new instance of the DeliveryClient object, which can then be used if a link needs to resolve a News Article content type. You may have also noticed the class now accepts a DeliveryOptions object as its parameter. This object is populated on startup with Kontent's core settings from the appsettings.json file. All we're interested in is retrieving the Project ID.

    A small update to the Startup.cs file will also need to be carried out where the CustomContentLinkUrlResolver class is referenced.

    public void ConfigureServices(IServiceCollection services)
    {
        ...
    
        var deliveryOptions = new DeliveryOptions();
        Configuration.GetSection(nameof(DeliveryOptions)).Bind(deliveryOptions);
    
        IDeliveryClient BuildBaseClient(IServiceProvider sp) => DeliveryClientBuilder
            .WithOptions(_ => deliveryOptions)
            .WithTypeProvider(new CustomTypeProvider())
            .WithContentLinkUrlResolver(new CustomContentLinkUrlResolver(deliveryOptions)) // Line to update.
            .Build();
    
        ...
    }
    

    I should highlight at this point that the changes illustrated above were made on an older version of the Kentico Kontent boilerplate, but the same approach applies. The only thing I've noticed that normally changes between boilerplate revisions is the Startup.cs file. The DeliveryOptions class is still in use, but you may have to make a small tweak to ascertain its values.