Blog

Blogging on programming and life in general.

  • Cookiebot was added to a Kentico 13 site a few weeks ago, resulting in unexpected issues with pages that contained Kentico forms. This led me to believe there was a potential conflict with Kentico Page Builder's client-side files.

    As all Kentico Developers are aware, the Page Builder CSS and JavaScript files are required for managing the layout of pages built with widgets, as well as for the creation and use of Kentico forms. They consist of:

    • PageBuilderStyles - consisting of CSS files declared in the <head> section of the page code.
    • PageBuilderScripts - consisting of JavaScript files declared before the closing </body> tag.
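
    For context, both declarations typically sit in the _Layout.cshtml file. A minimal sketch of the styles counterpart (element and method names as per the Kentico 13 documentation; the placement shown is an assumption based on the points above):

    <html>
    <head>
        ...
        <!-- Extension Method -->
        @Html.Kentico().PageBuilderStyles()
        <!-- or Razor Tag Helper -->
        <page-builder-styles />
    </head>
    ...
    </html>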

    In this case, the issue resided with Cookiebot blocking scripts that are generated in code as an extension method or as a Razor Tag Helper.

    <html>
    <body>
        ...
        <!-- Extension Method -->
        @Html.Kentico().PageBuilderScripts()    
        ...
        <!-- Razor Tag Helper -->
        <page-builder-scripts />
        ...
    </body>
    </html>
    

    Depending on the cookie consent given, Kentico Forms either failed on user submission or did not fulfil a specific action, such as conditional form element visibility or validation.

    The first thing that came to mind was that I needed to configure the Page Builder scripts to be ignored by Cookiebot. Cookiebot shouldn't hinder any key site functionality as long as you have configured the consent options correctly to disable cookie blocking for specific client-side scripts via the data-cookieconsent attribute:

    <script data-cookieconsent="ignore">
        // This JavaScript code will run regardless of cookie consent given.
    </script>
    
    <script data-cookieconsent="preferences, statistics, marketing">
        // This JavaScript code will run if consent is given to one or more of the options set in the "data-cookieconsent" attribute.
    </script>
    

    Of course, it goes without saying that data-cookieconsent should be used sparingly - only in situations where you need the script to execute regardless of consent and have employed alternative ways of ensuring that cookies are only set after consent has been obtained.

    But how can the Page Builder scripts generated by Kentico be modified to include the cookie consent attribute?

    If I am being honest, the approach I have taken to resolve this issue does not sit quite right with me, as I feel there is a better solution out there I just haven't been able to find...

    Inside the _Layout.cshtml file, I added a conditional statement that checks if the page is in edit mode. If true, the Page Builder scripts render normally using the generated output from the Tag Helper. Otherwise, all the scripts the Tag Helper would generate are output manually with the data-cookieconsent attribute assigned.

    <html>
    <body>
        ... 
        ...
        @if (Context.Kentico().PageBuilder().EditMode)
        {
            <page-builder-scripts />
        }
        else
        {
            <script src="/_content/Kentico.Content.Web.Rcl/Scripts/jquery-3.5.1.js" data-cookieconsent="ignore"></script>
            <script src="/_content/Kentico.Content.Web.Rcl/Scripts/jquery.unobtrusive-ajax.js" data-cookieconsent="ignore"></script>
            <script type="text/javascript" data-cookieconsent="ignore">
                window.kentico = window.kentico || {};
                window.kentico.builder = {};
                window.kentico.builder.useJQuery = true;
            </script>
            <script src="/Content/Bundles/Public/pageComponents.min.js" data-cookieconsent="ignore"></script>
            <script src="/_content/Kentico.Content.Web.Rcl/Content/Bundles/Public/systemFormComponents.min.js" data-cookieconsent="ignore"></script>
        }
    </body>
    </html>
    

    After the modifications were made, all Kentico Forms were once again fully functional. However, the main disadvantage of this approach is that issues may arise when new hotfixes or major versions are released as the hard-coded script references will require checking.
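
    One hypothetical way to avoid the hard-coded references - a sketch I haven't battle-tested, and the names and approach here are my own rather than anything Kentico provides - would be a wrapper Tag Helper that renders the nested <page-builder-scripts /> output as child content and injects the attribute into every script tag it finds:

    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Razor.TagHelpers;

    // Hypothetical usage in _Layout.cshtml:
    // <cookie-consent-scripts><page-builder-scripts /></cookie-consent-scripts>
    [HtmlTargetElement("cookie-consent-scripts")]
    public class CookieConsentScriptsTagHelper : TagHelper
    {
        public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
        {
            // Render the nested Tag Helpers first, then post-process their markup.
            var childContent = await output.GetChildContentAsync();
            var html = childContent.GetContent();

            // Remove the wrapper element itself from the rendered output.
            output.TagName = null;

            // Naive string replacement - assumes every <script> tag in the
            // Page Builder output should be ignored by Cookiebot.
            output.Content.SetHtmlContent(
                html.Replace("<script", "<script data-cookieconsent=\"ignore\""));
        }
    }

    It would still need registering via @addTagHelper in _ViewImports.cshtml, and the string replacement is admittedly crude, but it wouldn't need revisiting every time a hotfix changes the underlying script files.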

    If anyone can suggest a better approach to integrating a cookie compliance solution or making modifications to the page builder script output, please leave a comment.

    Useful Information

  • Banner Image by: pch.vector on Freepik

    I've been looking out for a side hustle to supplement my monthly stocks and shares investment contribution - trying to make up for lost time in the years I did not invest. As it was my first foray into the world of side hustling, I wanted to ease myself into things. So it was important for it to be flexible enough to work around office/personal hours and not require too much time.

    During the COVID era, I kept note of some side hustles I was planning to try out but never got around to doing so. Forgetfulness also played a part, and I was only reminded when I came across one of my notes from July 2021 stored in Evernote.

    Now was as good a time as any to try out one of them: Usertesting.com.

    What Is UserTesting?

    Usertesting.com provides a platform for businesses to get feedback on their products and services. Anyone can apply to be a contributor and provide feedback that consists of:

    • Accessibility
    • Usability
    • Live conversations with businesses
    • Pre-release platform feature review
    • Competitor benchmarking tests
    • A/B testing to compare different versions of a product or feature

    Before becoming an active contributor, UserTesting will require some basic information as part of the registration process and a practice test to be completed.

    Acing The Practice Test

    UserTesting will provide a test scenario to prove you're a legitimate person and can demonstrate good communication and analytical thinking. It sets a good standard for what is expected when carrying out real tests.

    The test itself is not complicated but you should be prepared to clearly think out loud so there is an understanding of your thought process as you're undertaking various tasks. It's always a good idea before performing a task to read the question out loud so your interpretation of what is being asked is clear. Most importantly, be honest in what you're reviewing.

    At the end of the test, provide a conclusion and thank them for their time in this opportunity.

    The fact that UserTesting.com forces users to take an assessment beforehand demonstrates the credibility of the service and sets the standard for the type of businesses they work with.

    UserTesting will respond to your practice test within 2-3 days, provide feedback and let you know if you will be accepted as a contributor.

    What To Expect From The Real Test?

    After completing the practice test, I didn't get real tests immediately. It took a good couple of weeks for them to start trickling in. Even then, I didn't qualify to take part in some tests as I didn't have experience in the area of expertise.

    Tests are performed on Windows, Mac, Android or iOS devices. There might be a requirement to provide feedback using a specific device. Access to a microphone and sharing your screen are strict prerequisites. Some tests ask for a face recording as well, but I decided to refuse tests that requested this.

    Tests vary in length and payout:

    1. Short tests - $4
    2. 10-20 minute tests - $10
    3. 30-minute test - $30
    4. 60-minute test - $60

    The 60-minute tests will always be a live conversation directly with the business and are scheduled in advance.

    The Type of Tests I've Contributed To

    I have been quite lucky with the tests offered to me as they seem to relate to the tech industry. Providing feedback for businesses such as Microsoft, Salesforce, GitHub, GitLab and Amazon has been insightful.

    Other tests have revolved around the sectors of AI, website accessibility, pre-release platform updates and cloud hosting.

    Payout

    This is the part you have all been waiting for. How much money have I made since starting at the beginning of June?

    Jerry Maguire - Show Me The Money

    I completed twenty tests, consisting mostly of $10 tests, one $60 test and a handful of $4 tests - totalling $232. Each test is paid out within two weeks to your linked PayPal account. Not so bad for an ad-hoc side hustle.

    UserTesting.com Payout - August 2024

    Twenty tests over the span of three months is not a lot when my contribution could have been higher. But when taking into consideration that this side hustle is only pursued outside of working hours and some tests do not apply to my expertise, it's not so bad.

    The majority of tests offered will be worth $10. Some may question whether they're even worth doing, to which I say: Yes! A $10 test can take anywhere between 5-15 minutes to complete on average. $10 converted to GBP equates to around £7.60 - compare that with the hourly UK National Minimum Wage of £11.44 and it's not bad. Easy money!

    The more you contribute, the higher the chance of more tests being offered to you, provided your feedback rating is good. There are some damn interesting ones as well.

    Conclusion

    Don't apply to UserTesting with the expectation of mass riches as you will be sorely disappointed. Think of it as petty cash to count towards a little "fun money".

    Apart from the monetisation aspect of using UserTesting, I feel I am getting an early insight into where certain industry sectors are going, including my own, which is almost as valuable as the payout itself.

    There will be some days or even weeks when there will be no applicable tests. Just stick with it as all it takes is a handful of 30 or 60-minute tests (which can be hard to come by) to get a nice chunk of change for the month.

  • Published on
    -
    4 min read

    Addressing The Lack of Kentico Content

    I spoke to one of my developer friends a while back and as conversations go with someone tech-minded, it's a mixture of talking about code, frameworks, and platforms entwined with the more life-centric catch-up.

    Both having been in the tech industry for over 15 years, we discussed the "old ways" and what we did back then that we don't do now, which led to Kentico - a platform that we used to talk about all the time, where we'd try and push the boundaries to create awesome websites in the hopes of winning the coveted site of the month or year award. It occurred to us that it's not something we talk much about anymore. Almost as if overnight it vanished from our consciousness.

    Looking through the archive of postings, it's evident I haven't published anything Kentico-related in a long time, my most recent post being in September 2020. Despite the lack of Kentico content on my site, it remains a key player in the list of CMS platforms I work with. The only difference is that the share of Kentico projects is smaller when compared to the pre-2020 era.

    In this post, I discuss my thoughts as to the reason behind my lack of Kentico-related output.

    NOTE: This post consists of my viewpoints alone.

    Licensing Cause and Effect

    A contributing factor was the substantial shift in their licensing model sometime in 2020. Moving to an annual subscription at an increased cost and ditching the base license created somewhat of a barrier to entry for small to mid-sized clients who just needed a reliable CMS platform with customisability. Someone like myself, who could provide Kentico solutions in a freelance capacity, was instantly priced out.

    I understand why Kentico needed to reassess its price structure. They offer one of the best .NET CMSs and to stay at the top, an increase in revenue is required to drive the business forward. In all honesty, I believe we had a good run on the old licensing model for over ten years, and it was only a matter of time until a pricing review was required.

    It's just a hard sell pitching a CMS with a £10,000 price tag before any development has even started.

    In light of this, it's only natural to look for alternatives that align with your own business strategy and development needs. The time originally spent developing Kentico has now been reallocated to alternative CMS platforms.

    A Stable Well-Rounded Platform

    Kentico is a mature product with many out-of-the-box capabilities (that get better with every release), which indirectly contributed to my lack of blogging on the subject. I usually only blog about a platform when I find useful workarounds or discover an issue that I was able to resolve.

    This is truly a compliment and a testament to Kentico's build quality. There is no need to write about something that is already well documented by active members of the community.

    Reassessing The Kentico Offering

    Kentico is still offered whenever possible. Both clients and developers alike have confidence in the platform. Clients enjoy the interface and security. Developers appreciate the customisability, clear architecture, quick hot fixing, and consistency between editions.

    The only question we now have to ask ourselves is whether Kentico is the right platform for the client's requirements. Prior to the change in licensing, you would be scoffed at for asking such a question. Kentico would be the front-runner before considering anything else.

    Nowadays, Kentico would only be put forward to a client if they had large-scale requirements where cheaper CMS offerings fall short and the licensing costs can be justified.

    I was recently involved in an e-commerce project that made for an ideal use case to carry out the build in Kentico, as the client's priorities included:

    • Enterprise-level security
    • Industry-standard compliance
    • All-in-one solution consisting of content management, e-commerce, and marketing automation
    • Scalability
    • Ability to handle large sets of data
    • Advanced customisability

    In my view, if a client is not too concerned about the above, then alternatives will be used and additional development will be carried out to fill in any gaps.

    The Alternatives

    The CMS sphere is ripe with offerings where we are spoilt for choice. I have whittled these down to:

    1. Umbraco
    2. Kentico
    3. Prismic
    4. Dato
    5. HubSpot

    In my view, this variety of CMSs covers all price points, technologies and levels of customisability.

    Conclusion

    I would always jump at the chance to develop in Kentico as I know a large, complex website can be built with almost infinite customisation. But we can't help but notice there is a lot of competition out there, each provider offering a range of features across different architectures and price ranges.

    Based on my own experience, demand for fully featured CMS platforms with a large hosting footprint is reducing with the advent of more API-driven (also known as headless) content delivery that works alongside other microservices.

    Investing in the Kentico eco-system (including its headless variant, Kontent) is always worth considering. It may just not be something I will be writing about consistently here as it requires a more corporate-level type of clientele.

  • Published on
    -
    2 min read

    CSS Zen Garden - The Road To Enlightenment

    Who remembers CSS Zen Garden? I do. As if it was yesterday... I remember first setting my sights on a simplistic but visually stunning webpage demonstrating what all websites of the future could look like.

    CSS Zen Garden broke the norm of the websites we were used to seeing at the time - crowded blocky tabular-based layouts that lacked personality. It was a revelation and a turning point in web design standards!

    As described within the content of every CSS Zen design, its ethos was clear:

    The Road To Enlightenment

    Littering a dark and dreary road lay the past relics of browser-specific tags, incompatible DOMs, broken CSS support, and abandoned browsers.

    We must clear the mind of the past. Web enlightenment has been achieved thanks to the tireless efforts of folk like the W3C, WaSP, and the major browser creators.

    The CSS Zen Garden invites you to relax and meditate on the important lessons of the masters. Begin to see with clarity. Learn to use the time-honored techniques in new and invigorating fashion. Become one with the web.

    CSS Zen Garden paved The Road To Enlightenment for me in an entirely different manner - deciding my career.

    Having completed my degree in Information Systems at university in 2006, I was at a crossroads as to which IT field I should specialise in. University seems to prepare you for anything apart from how to apply yourself when you exit the final doors of education into the real world.

    CSS Zen Garden changed the trajectory of my career. Originally, I considered entering the field of Consulting, before changing my mindset and garnering interest in becoming a Web Developer instead. This has had a lasting effect. Even after 18 years, I am still involved in Web Development, though I tend to focus more on backend functionality than look and feel.

    The CSS Zen Garden community spawned a variety of other designs from talented Web Developers with real design flair. But the design that started it all, 001, will always hold a special place in my heart - the unforgettable Japanese elements: the Itsukushima Shrine, water lilies, and a blossom tree.

    All the designs have stood the test of time; even in the age of modern web browsers and high-resolution screens, they still present a timeless look that fills me with nostalgia.

  • Published on
    -
    6 min read

    Year In Review - 2023

    Where Have I Been?

    Out of all the years I have been blogging, 2023 seems to be the year that has just flown by like a speeding train leaving us at the platform of memories before we have had a chance to truly comprehend what we have left behind.

    During the year, I am normally able to stop and take note of what I have accomplished (some end up as blog posts) but then also look towards the horizon for what is next. This year has been different - I was unable to transform my learnings and experiences into words. A combination of limited time and (if I am being honest) the lack of passion to write... I got tired.

    As a result, my blogging output has dwindled - totalling a mere seven posts for the year. This blog is important to me and is not something I plan on abandoning, as it is very important to own your words. I just have to be realistic in the sense that I won't be blogging as regularly as I have in the past.

    The lack of blog posts is not a reflection of having a lack of things to do.

    2023 In Words/Phrases

    Stocks and Shares, Investments, Upskilling, Hardwired Dashcam, Cornwall, Spirituality, Home Networking, Synology 1821+, Writer's block, Serverless functions, Azure CI, Shopify, Google Analytics 4, First Indian Wedding Anniversary, YouTube Content Curation, Green fingers, Lawn Enthusiast, Garmin Venu 2 Plus, The Flash IMAX, Patio/Garage Door renovation, Sky Garden London

    Stocks and Investments

    Since I began investing in the stock market at the beginning of 2022, it has quickly become a source of enormous interest for me, eventually becoming a passion and a skill I wanted to invest (no pun intended) more time in. This could be the main reason for the lack of blogging output.

    I decided to take things a step further by going back to basics with a six-week course provided by a very clever and patient coach, Vittorio, at StoicMoney. The course consisted of online learning material as well as weekly one-to-one private sessions.

    The course filled in the gaps where I was lacking in understanding crucial areas of the stock market as well as pointing me in the right direction to where my existing investment portfolio could be improved.

    Synology NAS Upgrade

    I am finally the owner of an 8-bay NAS powerhouse that is the Synology DS1821+, which is by far the best network-related purchase I have made since the UniFi Dream Machine router.

    Investing in my home network is part of a transition I am currently undertaking to be less reliant on third-party services. By having a NAS, I have been able to reduce my cloud-based subscriptions to just Google Photos.

    My NAS is not just used by myself; it is also regularly used by family members - all the more reason to invest in making my home network more reliable and efficient. A NAS will only run as efficiently as the network it is on.

    If the Synology DS1821+ is anything like the DS415Play (still fully functional), I expect it to last a very long time, especially since it offers more than enough upgrade options, such as:

    • Capability for up to 8 Hard Disk Drives
    • Upgradeable RAM
    • SSD Cache
    • Upgradeable Ethernet Card

    A Lawn Enthusiast Is Born

    Who could have seen it coming that 2023 would be the year I would take an interest in something gardening-related, such as lawn care?

    I can't exactly explain where this obsession for obtaining thick, green, luscious blades of grass came from. Throughout the summer months, I was on my hands and knees creating the perfect foundation for grass to grow by creating a mixture of topsoil and grass seed to sprinkle on bare patches and areas that required thickening.

    I can confirm this will not be a one-off obsession based on how disheartened I felt when I was unable to tend to the lawn as I would have liked during the colder months.

    Where my lawn is now compared to how it was previously is night and day. I was both proud and somewhat surprised that such a change could be made to what was a barren wasteland of dirt, turning it into something pleasing to the eye.

    I look forward to springtime next year to make my lawn even better!

    YouTube Video Curation

    My wife is quite the cook. She has the natural art of turning any group of ingredients into something damn tasty! So much so, her vegetarian cooking converted me from a regular meat eater to someone who enjoys eating meat-free alternatives.

    Meat-eater to vegetarian is quite the accomplishment!

    We had been toying with the idea of creating a YouTube channel to showcase her culinary skills for a while and back in June decided to train myself in video recording and editing.

    Together we made a very short test video on a simple subject matter (making Indian Chai) just to test the waters as to whether it's something we could do. The video had to accomplish the following to determine the feasibility:

    1. Recording content in good lighting conditions.
    2. Edit the video to a certain length (3 minutes).
    3. Apply narration throughout the video for cooking instructions.
    4. Add background music.
    5. Export the video as a YouTube Short and in its long-form variant.
    6. Overlay a logo.

    I am in no way a savvy video editor as this isn't something I've ever done before. Luckily, I managed to tick off all requirements and output a relatively slick (totally biased here!) video.

    Statistics

    I was hesitating as to whether it would be worth mentioning this section for a number of reasons:

    1. I know for a fact my visitors would have dipped due to lack of content output.
    2. The lack of ability to sufficiently compare 2022 and 2023 due to upgrading to Google Analytics 4 in May.
    3. I haven't been keeping track of any analytics for the majority of the year and wouldn't have a clue how I would fare.

    However, the resulting stats didn't look too bad...

    2022/2023 Comparison:

    • Users: -7.3%
    • Page Views: -2.7%
    • Search Console Total Clicks: +87.1%
    • Search Console Impressions: +84%
    • Search Console Page Position: +3.3%

    There is an evident drop in users and page views - this is to be expected as I haven't been cranking out relatable content for some time. I wasn't expecting the positive outcome reported by Search Console.

    As stated earlier, the transition to Google Analytics 4 could have skewed the results to some extent as further filter refinement is required. The true comparison will come in next year's end-of-year review.

    Goals for 2024

    I made a choice in 2022 to make next year's goals somewhat more achievable and less career/programming orientated. This has worked for me and those goals are still relevant to me - even a year on.

    I managed to get back into reading, made more of an effort to pick up the phone to loved ones and to be more present. One area I definitely need to improve on is fitness and exercise!

    I would also like to create a YouTube channel for my wife's cooking and will set a goal to publish at least one video. I already have a name for the channel and my wife has a few dishes she would like to showcase. This is out of the norm for me, and I'm quite looking forward to curating some video content.

    Final Thoughts

    My final thought is simply: Be happy. Be healthy. Be productive. Be kind.

  • I've owned my first Synology device - the DS415Play - for over eight years. It has been my true day-to-day workhorse, which never faltered whilst churning through multiple downloads and uploads 24 hours a day, 7 days a week, 12 months a year since 2015. It's been one of the most reliable pieces of computer-related hardware I've ever owned.

    Unfortunately, I started to outgrow the device from a CPU, RAM and (most importantly!) hard-drive capacity standpoint and the need for regular restarting became the norm to resolve performance-related issues.

    The performance issues are also caused indirectly by the fact that my NAS isn't solely used by me, but also by my parents and wife where photos are viewed/backed up, documents are stored and videos are streamed.

    The most natural route was to stick with Synology and move to one of their larger, expandable NAS devices. The DS1821+ ticked all my requirements:

    • Quad-core 2.2GHz processor
    • 8 bay Hard Drive capacity
    • Upgradeable RAM
    • NVMe Cache slots
    • Improved scalability

    Focusing on the potential for expansion should mean I won't hit the "hardware glass-ceiling" for many years, which was, unfortunately, the case with my DS415Play. I truly believe that if the DS415Play had the potential for some form of expansion, such as increasing the amount of RAM, it would solve the majority of my usage issues.

    Migrating From One Synology to Another

    DS415Play To DS1821+

    I was under the misconception that migrating from one Synology device to another would be as simple as moving the existing drives over. However, this was not the case due to a variety of factors:

    1. Age: Lack of straightforward compatible migration approaches from older NAS models.
    2. "Value" model discrimination: DS415Play is considered a "value" model and no straightforward migration route is available when upgrading to a "plus" model.
    3. Difference in Package Architecture: If the source and destination NAS use different package architectures, DSM configurations and package settings may be lost after migration. You can only migrate drives to models with the same package architecture as your source NAS.
    4. Direct Ethernet Connection: Data cannot be copied over via a direct connection between both devices.

    The How To Migration tutorial provided by Synology raised more questions about how I should move my data and configuration settings. Out of the three methods proposed (Migration Assistant/HDD Migration/Hyper Backup), there was only one approach that applied to me - Hyper Backup.

    Manual Copy and Paste To USB Drive

    Before settling on Hyper Backup, I decided to carry out a direct copy-and-paste of each user's home directory from the Synology to an external USB drive. I thought this might be a less process-intensive and quicker way to move the data - no Synology app-related overhead that could cause my DS415Play to grind to a halt.

    However, I quickly realised this could come to the detriment of the integrity and overall completeness of the backup. My NAS was still getting used daily and there was a high chance of missing new files and updates.

    With Hyper Backup, I could carry out a full backup initially and then schedule incremental backups nightly until I was ready to make the switch to DS1821+.

    Hyper Backup

    At the time, unbeknownst to me, this would prove to be a right pain. I knew from the start that moving around 5TB of data would be time-consuming but I didn't factor in the additional trial and error investigation time just to complete this task.

    To ensure smooth uninterrupted running, I disabled all photo and file indexing.

    Avoiding Slow Backup Speeds

    The backup procedure wasn't as straightforward as I'd hoped. Early on I experienced very slow backup speeds. This is down to the type of "Local Folder & USB" backup option selected in Hyper Backup. There is a vast difference in transfer speeds:

    • Local Folder & USB (single-version): 10 - 60MB/s
    • Local Folder & USB: 0 - 1.2MB/s, with long gaps of no transfer at all

    To reduce any further overhead, compression and encryption were also disabled.

    Additional steps were also taken, such as reformatting the external hard drive to ext4 format and enabling the "Enable delayed allocation for EXT4" setting from the Control Panel.

    What is delayed allocation?

    All byte writes are cached in RAM; only when all the writes have finished and the file is closed is the data copied out of the cache and written to the drive.

    The potential disadvantage of enabling this setting is the drive is more vulnerable to data loss in the event of a power outage.

    Make Use of The High-speed USB Port

    Older Synology models have front and rear USB ports. To further aid in faster data transfer, be sure to connect the external hard drive to the rear USB port as this will be USB 3.0 - a better option over the slower USB 2.0 port provided at the front.

    Backup Strategy

    Once I had Hyper Backup running in the most efficient way, I created three backup tasks so the restore process could be staggered:

    1. User Home Directories: Everything within the /homes path.
    2. Photos: DS Photo-related files that have yet to be properly migrated over to Synology Photos.
    3. Application Settings*: Settings and configuration for the key apps that I use. This doesn't include any physical files the app manages.

    * Only the "Local Folder & USB" backup type has the option to allow application settings to be solely backed up. Transfer speeds were not a concern as the settings have a very minimal file size.

    Once a full backup was completed, a nightly schedule was set to ensure all backups were up-to-date whilst I waited for some new hard drives for the DS1821+.

    Restore

    Restoring the backed-up data was a lot more straightforward than the backup process itself. The only delay was waiting for the new hard drives to arrive.

    New Hard Drives

    Due to the limitations posed by the only migration approach applicable to me, new drives had to be purchased. This was an unexpected additional cost as I hoped to re-use the 8TB worth of drives I already had in my DS415Play.

    I decided to invest in larger-capacity drives to make the most of the 8 bays now at my disposal. Two 8TB Western Digital Reds were just what was required.

    Setup and Restore Process

    Utilising new hard drives was actually a refreshing way to start getting things going with the DS1821+, as any missteps I made as a new Synology owner when originally setting up the DS415Play could be corrected.

    Once the drives were installed, the following restore process was carried out:

    1. Install DSM 7.1.
    2. Create Drive Storage Pools.
    3. Install applications.
    4. Re-create all user profiles using the same details and usernames.
    5. Using Hyper Backup, copy all files into each home directory.
    6. Ensure each user's home folder and child directories are assigned with the correct permissions and are only accessible by the user account.
    7. Restore the /photo directory.
    8. Log in to the Synology Account in Control Panel and restore all DSM configuration settings from the online backup - minus user profiles.
    9. Restore application settings (backup task number 3) using Hyper Backup.

    It was only after restoring the DSM configuration settings (point 8) that I realised user profiles, including permissions, could be restored.

    DSM Configuration Backup Items

    • File Sharing: Shared Folder, File Services, User & Group, Domain/LDAP
    • Connectivity: External Access, Network, Security, Terminal & SNMP
    • System: Login Portal, Regional Options, Notification, Update & Restore
    • Services: Application Privileges, Index Service, Task Scheduler

    Over Network File Restoration

    I decided to limit the use of over-network file copying to just the final leg of the restoration journey to handle some of the less important/replaceable files.

    I would only recommend over-network file copying if you have a fast and stable home network. My UniFi Dream Machine was more than able to handle moving that amount of data to the DS1821+.

    What Will Become of The DS415Play?

    There is still life in my old trusty DS415Play as it can still handle low-intensity tasks where background processes are kept to a minimum. Any form of file indexing on a large scale would not be suitable.

    I see the DS415Play being used purely as a network storage device, avoiding the use of Synology apps. For example, a suitable use case could be an off-site backup at my parents' house.

    Final Thoughts

    Even though the migration process wasn't as smooth as I hoped it would be, there was a silver lining:

    • A Considered Setup Approach: As a long-term Synology user, I am more experienced and understand more about the configuration options, allowing me to set up my new NAS in a better way.
    • Data Cleanse: When faced with limited migration routes, it makes you question what data is worth moving. I am somewhat of a data hoarder and being able to let go of files I rarely use was refreshing.
    • Storage Pools: I was able to set up Storage Pools and Volumes in a way that would benefit the type of data I was storing. For example, Surveillance Station recordings will write to a single hard disk, rather than constantly writing to multiple disks based on a RAID setup.

    After completing the full migration, the following thoughts crossed my mind: How long will this Synology serve me? When will I have to perform another migration?

    It has taken me eight years to outgrow the DS415Play. The DS1821+ doubles the drive capacity and is an even bigger step up from a specification perspective (thanks to its upgradeability). Maybe 10 to 14 years?

    As someone who has just turned 38, I can't help but feel a sense of melancholy thinking about where I will be after that duration of time, and whether the investment in preserving memories on my Synology will truly be the success I hope it will be.

  • While manually importing data into a Google Sheet to complete the boring chore of data restructuring, I wondered if there was any way that the initial import might be automated. After all, it would be much more efficient to link directly to an external platform to populate a spreadsheet.

    Google Apps Script provides a UrlFetchApp service, giving us the ability to make HTTP POST and GET requests against an API endpoint. The following code demonstrates a simple API request to a HubSpot endpoint that will return values from a Country field by performing a GET request with an authorization header.

    function run() {
      apiFetch();
    }
    
    function apiFetch() {
      // API request options, including headers.
      const apiOptions = {
        "method": "get",
        "headers": {
          "Authorization": "Bearer xxx-xxx-xxxxxxx-xxxxxx-xxxxx",
          "cache-control": "no-cache"
        }
      };
    
      // Fetch contents from the API endpoint.
      const apiResponse = UrlFetchApp.fetch("https://api.hubapi.com/crm/v3/properties/contact/country?archived=false", apiOptions);
    
      // Parse the response as a JSON object.
      const data = JSON.parse(apiResponse.getContentText());
    
      // Populate "Sheet1" with data from the API.
      if (data !== null && data.options.length > 0) {
        // Select the sheet.
        const activeSheet = SpreadsheetApp.getActiveSpreadsheet();
        const sheet = activeSheet.getSheetByName("Sheet1");
    
        for (let i = 0; i < data.options.length; i++) {
          const row = data.options[i];
    
          // Add a value cell in the Google Sheet.
          sheet.getRange(i + 1, 1).setValue(row.label);
        }
      }
    }
    

    When this script is run, a request is made to the API endpoint to return a JSON response containing a list of countries that will populate the active spreadsheet.

    The official Google Apps Script documentation provides even more advanced options for configuring the UrlFetchApp service to ensure you are not limited in how you make your API requests.

    With so little code, we have managed to populate a Google Sheet from an external platform. I can see this being useful in a wide variety of use cases to make a Google Sheet more intelligent and reduce manual data entry.
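    Writing one cell at a time with setValue can be slow for larger result sets. Below is a minimal sketch of a batched alternative, assuming the same "Sheet1" tab and response shape as above; toRows and populateSheet are hypothetical helper names, not part of the original script.

    ```javascript
    // Convert the API response's options into a 2D array, one row per
    // label, in the shape Range.setValues expects.
    function toRows(options) {
      return options.map(function (option) {
        return [option.label];
      });
    }

    // Hypothetical sketch: write every label in one batch call
    // instead of cell-by-cell.
    function populateSheet(data) {
      const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Sheet1");
      const rows = toRows(data.options);

      sheet.getRange(1, 1, rows.length, 1).setValues(rows);
    }
    ```

    A single setValues call means one round trip to the spreadsheet rather than one per cell, which Google's own Apps Script guidance recommends for performance.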

    In the future, I'd be very interested in trying out some AI-related integrations using the ChatGPT API. If I manage to think of an interesting use case, I'd definitely write a follow-up blog post.

  • Published on
    -
    3 min read

    Working From Home - A Three-Year Update

    It's 8:30pm on 23rd March 2020, and Boris has announced a stay-at-home order effective immediately. Absolutely befuddled at what is about to happen, I receive a call soon after the broadcast from my manager saying we should go to the office and collect any equipment that would allow us to start working from home.

    It was only after returning from the office later that evening that it dawned on me that things had changed. Thoughts about health, family and concerns about my livelihood all of a sudden came into question. Uncertainty of life as we knew it.

    As strange as it may sound, my basic home office setup gave me a sense of focus, purpose and security. I consider myself one of the lucky ones during the pandemic, as I had a room that could act as a dedicated home office.

    The Very First Work From Office Setup

    Three years on, I'm still happily working remotely. The only thing that has changed with each passing year since that fateful night is that my office is better equipped, and as of last year I have made more of an effort at hybrid working by appearing at my place of work a couple of times a week. Being able to choose when to work from the office on my own terms gives me the freedom to break up the week, and I do find it refreshing. Best of both worlds!

    The pandemic was an uncontrollable force for change in many people's lives and, in my case, both personally and professionally. When I reflect back to the time before the pandemic, I often wonder how I was able to work around a 47-hour week and yet still find time to cook, clean and relax. I can only surmise that this is all we have ever known and what was expected of us. A way of life ingrained in our DNA from the very moment we begin our careers.

    Productivity

    The pace of life has slowed to the point where, for the most part, I can schedule my working hours around my day. I now start my day earlier and undisturbed, allowing me to rip through emails and complete some tasks from the day before, all before my first meeting.

    I still work in a similar fashion to how I normally would within an office environment, sat at my desk over many hours, but with one main difference - I have more time!

    I’m able to get personal mundane tasks done, such as putting the washing on, prepping healthy food (now completely handled by my wife), or even cleaning the bathroom during my lunch break. These types of tasks help get me away from the desk for small moments of time and be productive in doing so.

    Even though I've gone to a hybrid working pattern, there are stark differences between being at home and the office. Suffice to say I am more productive within a home environment as I'm able to focus on the job at hand without distractions (apart from a few scheduled meetings), which is a blessing as a programmer.

    There just doesn’t seem to be time to work in an office anymore on a full-time basis. Working from home has proved to be a positive change from both a work and personal perspective.

    Minor Downsides

    Working from home gives me a lot of flexibility in how I work. Sometimes too much flexibility can be a detriment to when you feel you can have a break. Some are able to walk away from their desk during lunchtime and stick to the 9 to 5. Unfortunately, I'm not the type of person who can do that.

    I find it quite difficult to set boundaries at home even though I have a dedicated working space. At least when working from the office, the day ends from the moment you leave the building.

    The Future and Sustainability

    Regardless what employers think of the work from home phenomenon, it isn’t going anywhere soon. It’s what future employees expect. If your job can be done at a desk, does it matter where your desk is?

    Since we've returned to post-pandemic normality, I've become more conscious of how professional I and my surroundings may come across to new clients I talk to on Zoom calls, and I make an active effort to ensure everything is up to par. Some clients may interpret disturbances from family members and informal-looking office surroundings as unprofessional.

    Conclusion

    Working from home is just a small part of the bigger picture of how the pandemic has changed my life. It was a catalyst of positive change that forced me to reassess my priorities.

    Home really is where the heart is and there is no longer any doubt whether work and family life are able to mix under one roof. I wouldn't have it any other way.

  • There are times when you need to call multiple API endpoints to return different data based on the same data structure. Normally, I'd go down the approach of manually creating multiple Axios GET requests and then inserting the endpoint responses into a single object array. But there is a more concise and readable way to handle such tasks.

    With the help of DummyJson.com, we will be using the product search endpoint to search for different products to consolidate into a single array of product objects: /products/search?q=.

    As you can see from the code below, we start off by populating an array with a list of API endpoints, where a GET request can be carried out for each endpoint in our array. The requests variable contains an array of promises based on each of these GET requests.

    Finally, axios.all() allows us to make multiple HTTP requests to our endpoints altogether. This function can only iterate through a collection of promises. For more information regarding this Axios function, I found the following article very insightful for a better understanding: Using axios.all to make concurrent requests.

    // List all endpoints.
    const endpoints = [
      'https://dummyjson.com/products/search?q=Laptop',
      'https://dummyjson.com/products/search?q=phone',
    ];
    
    // Perform a GET request on all endpoints.
    const requests = endpoints.map((url) => axios.get(url));
    
    // Loop through the requests and output the data.
    axios.all(requests).then((responses) => {
      const data = [];
    
      responses.forEach((resp) => {
        data.push(...resp.data.products);
      });
    
      // Output consolidated array to the page.
      const template = $.templates("#js-product-template");
      const htmlOutput = template.render(data);
    
      $("#result").html(htmlOutput);
    });

    As we loop through each request, we push the response to our data array. It is here that we merge all requests together into a single array of objects. To make it a little easier to display the results on the page, I use the jsrender.js templating plugin.
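    Under the hood, axios.all is essentially a thin wrapper over the native Promise.all. As a minimal sketch of the same consolidation pattern using Promise.all directly, the fakeGet stub below is hypothetical - it stands in for axios.get so the merging logic can be seen in isolation:

    ```javascript
    // Hypothetical stub mimicking axios.get's response shape for the
    // DummyJSON search endpoint; swap in axios.get for real requests.
    function fakeGet(url) {
      return Promise.resolve({ data: { products: [{ title: url }] } });
    }

    const endpoints = [
      'https://dummyjson.com/products/search?q=Laptop',
      'https://dummyjson.com/products/search?q=phone',
    ];

    // Run all requests concurrently and flatten every response's
    // products into a single consolidated array.
    function fetchAllProducts(get) {
      return Promise.all(endpoints.map((url) => get(url)))
        .then((responses) => responses.flatMap((resp) => resp.data.products));
    }
    ```

    Passing the request function in as a parameter also makes the consolidation logic easy to test without touching the network.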

    A working demo can be seen on JsFiddle.

  • Published on
    -
    6 min read

    Hardwiring A Dash Cam Into An Audi A1

    Purchasing a Dash Cam has been a priority for me since my pride and joy was damaged when I was in an area of London I hate to drive in. The area in question shall not be named, but I deem it as "a world without rules".

    There are many varieties of dash cams on the market that differ in their size and features. From the outset, I had the following set list of requirements in mind:

    • Small form factor
    • LCD Display
    • Option to add a rear camera
    • GPS
    • High-quality night vision
    • Impact sensor - to automatically start recording if a collision occurs when away from the car

    The one that met all these requirements was the RedTiger 4K Dash Cam Front Rear Camera.

    Hardwire Installation

    All dash cams include a power connection for the 12V cigarette socket, which is absolutely fine if you want a quick setup or are not comfortable delving into a vehicle's fuses. The only disadvantage of this approach is that the 12V socket will always remain occupied and can look slightly untidy. Dangling wires - no thank you.

    I always had in mind that if I were to get a dash cam, I would go down the hardwire option for a more integrated and neater look. To achieve this, a hardwire kit needs to be purchased separately. There are many different varieties out there; it's just a matter of finding the right one with a suitable connector for the dash cam. In my case, a USB-C connector was needed.

    Understanding The Fuse Box

    Going down the hardwiring approach can be a little daunting, and it's recommended you do your due diligence by reading your car manual and researching online how to access the fuse box, as well as getting an understanding of what each fuse does. This took longer than the installation process itself.

    For my 2018 Audi A1, I found the following resources useful:

    1. Nextbase - How To Fit A Dash cam
    2. Fusebox Info - Audi A1
    3. Audi A5 / S5 2007 -2018 how to fit dash cam to fuse box
    4. Physical Car Owner Manual - Lookup the fuse section
    5. Car Fuse Guide

    Some of these resources were not specifically tailored to my Audi A1, but they gave me a point of reference on what to look out for during the installation process.

    Accessing The Fuses

    The fuses that need to be accessed are located inside the dashboard. This could be on either the passenger or driver side, depending on the most suitable fuse you wish to "piggy-back" onto.

    When you pop open the side-panel of the dashboard, you'll see something that looks like the following:

    In Dash Fuse Boxes

    IMPORTANT: When referring to the Owner Manual, do not make the same mistake I did and read the fuse box order wrong. My manual illustrated diagrams based on a left-hand-drive car, when mine is right-hand drive. This can be confirmed by the fuse box colour order. As you can see from the image of my fuse box (above), the driver's-side fuse boxes are ordered (left to right) - black, brown and red.

    "Piggy-back" A Fuse and Earth Connection

    The hardwire kit will contain various sized fuse wires that can be used based on the size of fuse you plan on "piggy-backing" off of.

    Fuse Wire Options

    For my use, I'm concentrating on the red fuse box where regular-sized fuses are present. I opted to connect the fuse wire to slot 2, where the 5A tan fuse is present. The 5A fuse is then slotted onto the fuse wire to "piggy-back" the connection.

    Piggy-backed and Earthed Fuse Wire

    One other thing to point out here is that the connection needs to be earthed to the vehicle chassis. This is done by connecting the earth cable (shaped like a hook) to a metal screw or bolt.

    Fuse Slot Update

    Whilst finishing the write-up of this post, I made a slight amendment to the fuse slot used. I decided to use slot 11 (7.5A red fuse) instead. Even though the first iteration worked absolutely fine in the weeks post-install, I preferred to piggy-back off the "Control unit for information electronics" rather than the "ABS Control Unit" for peace of mind.

    Rear Camera

    I wasn't planning on installing the rear camera as the initial hardwiring required quite a bit of effort. It just happened that I had a few hours free one weekend and thought I'd give it a shot. Connecting the rear camera to the main dash cam unit was the easy part, but finding an inventive way to neatly tuck the wires from the front of the car all the way to the rear took quite some time.

    The end result is pretty cool - if I do say so myself. Now my car has an additional layer of surveillance.

    Rear Dash Cam Unit

    You may have to be inventive as to how you mount the rear camera. The most ideal place would be close to the rear window on the sill. Due to the way my boot door opens, this was not an option, so I opted to mount it to the roof instead.

    Disclaimer

    The steps I have detailed in this section are based on an approach that worked for me. The information provided does not constitute professional installation advice. I cannot guarantee that the information is always up to date and will work for all vehicles.

    I am not liable for any personal injury that you may suffer, or vehicle damage, as a result of partaking in the installation of a dash cam. To do so will be at your own risk.

    RedTiger 4K Dash Cam Quick Review

    Overall, the RedTiger dash cam is a worthy addition to maintain the security of my car. I can't complain about the camera quality and was very pleased with the resolution output. For example, vehicle number plates are crisp, even from a distance. I was very much surprised at just how well it performed in all conditions - even on poorly-lit roads.

    My only gripe is the lack of a backup battery or onboard memory so that your own settings are saved. Unless the device is hardwired so that power is always provided, these settings will be wiped every time you turn off your car. This is a minor annoyance, and luckily the default settings suffice for my usage.

    The accompanying app is a little clunky, and I was expecting this. But it does its job very well, and I was surprised to see how much information is presented when playing back a recording. This specific RedTiger dash cam model has the ability to connect via WiFi from the app, allowing me to view all recordings and download selected videos directly to my phone.

    Conclusion

    Hardwiring a dash cam may come across as a daunting task. My advice is to invest time in the preparation and understand where the fuse boxes are located. Most importantly, during the installation process, take things slow and plan how the wiring will run along the inside of the car.

    I found the whole experience quite rewarding.

    Once you have a dash cam fitted, you'd wonder how you ever lived without it.