Blog

Blogging on programming and life in general.

  • This site has been longing for an overhaul, both visually and especially behind the scenes. As you most likely have noticed, nothing has changed visually at this point in time - still using the home-cooked "Surinder theme". This should suffice in the meantime as it currently meets my basic requirements:

    • Bootstrapped to look good on various devices
    • Simple
    • Function over form - prioritises content first over "snazzy" design

    However, behind the scenes is a different story altogether and this is where I believe it matters most. After all, half of web users expect a site to load in 2 seconds or less, and they tend to abandon a site that isn't loaded within 3 seconds. Damning statistics!

    The last time I overhauled the site was back in 2014, when I took a more substantial step towards current standards. What has changed since then? I have upgraded to Kentico 10, but this time using ASP.NET Web Forms over MVC.

    Choosing the ASP.NET Web Forms approach over MVC was a very difficult decision for me. It felt like I was taking a backwards step in making my site better. I'm the kind of developer who gets a kick out of nice clean code output, and MVC fulfils this requirement. Unfortunately, the new development approach for building MVC sites from Kentico 9 onwards will not work under a free licence.

    The need to use Kentico as a platform was too great, even after toying with the idea of moving to a different platform altogether. I love having the flexibility to customise my website to my heart's content. So I had the option to either refit my site in Kentico 10 or move to Kentico Cloud. In the end, I chose Kentico 10. I will be writing another post on why I didn't opt for the latter. I'm still a major advocate of Kentico Cloud and have started using it on other projects.

    The developers at Kentico weren't lying when they said that Kentico 10 is "better, stronger, faster". It really is! I no longer get the spinning loader for an obscene duration of time whilst opening popups in the administration interface, or lengthy startup times when the application has to restart.

    Upgrading from Kentico 8.0 to 10 alone was a great start. I have taken some additional steps to keep my site as clean as possible:

    1. Disabled view state on all pages, components and user controls (a minimal sketch of one approach follows this list).
    2. Cached static files, such as CSS, JS and images. You can see how I do this at web.config level from this post.
    3. Maximised Kentico's cache dependencies to cache all data.
    4. Took the extra step of exporting all site contents into a fresh installation of Kentico 10, resulting in a slightly smaller web project and database size.
    5. Restructured pages in the content tree to be more efficient when storing a large number of pages under one section.
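
    As a rough illustration of the first point, view state can be switched off in one central place rather than on each page individually. The snippet below is a minimal sketch only - the BasePage class is hypothetical, and a web.config setting (or Kentico's own base classes) can achieve the same result:

    using System;
    using System.Web.UI;
    
    // Hypothetical base class for Web Forms pages on the site. Any page
    // inheriting from it has view state disabled without setting it per page.
    public class BasePage : Page
    {
        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
    
            // Content-driven pages rarely need view state, so switch it off
            // to keep the rendered HTML output as lean as possible.
            EnableViewState = false;
        }
    }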

    I basically carried out the recommendations on optimising website performance and then some! My cache statistics have never been so high!

    My Kentico 10 Cache Statistics

    One slight improvement (which has been a long time coming) is better Open Graph support when sharing pages on Facebook and Twitter. Now my links look pretty within a tweet.

  • I can only speak about my experiences from working in the technical industry, but there isn't a week that goes by when I am not being spammed by recruitment agencies who don't seem to get the message that I'm not interested. I can "almost" deal with the random emails I get from various agencies, but when you get targeted by a single person on a daily basis it gets infuriating!

    I remember back in the day when I was fresh out of university and the sense of excitement I had whenever a recruiter phoned or emailed. The awesome feeling of being in demand! I can still remember my first job interview, where to my horror I found out that the recruiter "tweaked" my CV by adding skills that I didn't have any experience of, making me look like an absolute idiot in front of my interviewer. I thought they were my friend and only looking out for my best interests. In reality, this was not the case.

    As an outsider looking in, the recruitment industry seems to be a really cut-throat business where only one thing seems to matter: the numbers! Not whether a candidate is particularly right for the role. I am not tarring all recruiters with the same brush - there are some good guys out there, just not enough.

    A little while ago, I was "tag-teamed" by two recruitment agents from the same agency within a short space of time. What I'd like to highlight here is that I did not respond to any of their correspondence. But the messages still kept rolling in.

    From Emma...

    Emma sent me standard emails and was quite persistent: one brief LinkedIn message accompanied by three direct emails.

    From: Emma
    Sent: 06 July 2016 14:37
    To: 'surinder@doesntwantajob.com
    Subject: New Development Opportunities based in the Oxfordshire area!


    Dear Surinder,


    How are you?


    I just wanted to catch up following on from my prior contact last week over LinkedIn, as mentioned I did come across your profile on LinkedIn and I would be keen to firstly introduce myself, as well as to catch up to find out if you could be open to hearing about anything new.


    As mentioned in my prior email, I am also recruiting for a .Net Developer for a leading organisation based in South Oxfordshire! I would be extremely keen to discuss this position with yourself in further detail.


    Please can you give me a call or drop me an email to let me know your thoughts either way. Hope to hear from you soon!


    Kind Regards,
    Emma

    From: Emma
    Sent: 12 July 2016 14:45
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested in working for a leading Software House, Surinder?


    Hi Surinder,


    How are you?


    Could you be interested in new roles at present?


    If so as mentioned below, I am working on a new .Net Developer role for a leading company based in South Oxfordshire. Please can you give me a call or drop me an email through to let me know your thoughts.


    I will look forwards to speaking with you soon!


    Kind Regards,  
    Emma

    From: Emma
    Sent: 15 July 2016 14:14
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested in working for a leading Software House, Surinder?


    Hi Surinder,


    How are you?


    Are you on the look out for new roles?


    If so as mentioned below, I am working on a new .Net Developer role for a leading company based in South Oxfordshire. Please can you give me a call or drop me an email through to let me know your thoughts.


    I will look forwards to speaking with you soon!


    Kind Regards,
    Emma

    From Becky...

    Now Becky cranked things up a notch or two. She was determined to get my attention and very persistent, I'll give her that. Her strategy consisted of LinkedIn messages, following me on Twitter and (like Emma) sending me a few emails.

    From: Becky
    Sent: 07 August 2015 11:37
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested?


    Hi Surinder,


    I’ve come across your profile on LinkedIn and it made me think you could possibly be interested in a Web Developer opportunity that I’m currently recruiting for, based at a leading company in West Oxfordshire. 


    I appreciate that you may not be actively looking at the moment, but I can see that you have been at for over 5 years now, so I wanted to approach you about this as I thought you could possibly be interested in a change?

    I’ve included some more information about the role attached.

    <Omitted Job Description>

    I can see you’re working in an agency environment at the moment at <Omitted Company>, which is obviously great for the variety of sites you get to work on. This role will give you that opportunity as well, but with the chance to engage more with the projects your working on, having a deeper involvement in the entire process.

    You will get to work with the latest versions of ASP.Net and C#. My client also gives you full access to Pluralsight. On top of all this, the working environment is the most beautiful in Oxfordshire and there are excellent environmental benefits. The salary is up to <Omitted Salary>.

    That was the first email. A standard recruiter's email, but she went the extra mile to personalise things based on my current experience. The second email gets a little more to the point and I start to smell a sense of desperation.

    From: Becky
    Sent: 11 August 2015 12:34
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested?


    Hi Surinder,


    I just wanted to send you another email following up on my previous one below, regarding a Web Developer opportunity I am currently recruiting for based in West Oxfordshire. I have attached some more information for you to the email.


    I’d really like the opportunity to chat to you about this opportunity, as I think it could be a really great fit for you. The company are a great one to work for – they offer you fantastic environmental benefits, a beautiful working space, plus are keen to create an interesting and productive environment for developers with full access to Pluarlsight and the latest versions of ASP.Net and C#.
       
    If you are interested in discussing this with me further it would be great if you could get back to me! However, as I said in my previous email, if you’re not interested n pursuing new opportunities then please do just let me know and I shall remove you from our mailing list right away.


    Kind Regards
    Becky

    I think by the third and final email, Becky finally got the message and knew I wasn't going to take the bait. But she admirably tries to get something out of it by asking if I know of anyone else who might be interested in the position she is offering.

    From: Becky
    Sent: 13 August 2015 14:10
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested?


    Hi Surinder,


    I just wanted to send you a final email about the role below and attached to this email. I think with your experience at <Omitted Company> you would be a great fit for it, so I would love to have a chat with you about it if you think you could be interested. However, if you’re not looking for a new role – seeing as you are an expert in this field – would you know anyone else with your skill set who may be interested in this position? If so, please do not hesitate to pass on my details!


    Kind Regards
    Becky


    Is it acceptable to go to these great lengths to get someone's attention? A single email alone should suffice. I understand they have a job to do, but do they really think this approach works? They're clearly not engaging candidates in the right way. I truly question the mentality here. Recruiters remind me of Terminators... just without the killing part.

    Recruitment Terminator Reference - It Can't Be Reasoned With...

    Doesn't sound hopeful, does it? But there are two things you can do to lessen the headache and make yourself less of a target, while at the same time still keeping in the loop (if you feel ever so inclined) with the good opportunities that may arise:

    • First and foremost, do NOT ever respond! Even if it is to tell them you're not interested. As soon as they know the email address is active and see signs of life, you'll never get them to leave you alone.
    • If you want to enquire about a position via a recruitment agent, use a different contact email address. At least you can ditch it at times of need.

    Loved reading this article titled: Stop The Recruiting Spam. Seriously.. An insightful read covering some really good points on the state of the recruitment industry.

    Note to my current employer and any recruiters: I'm happy where I am.

  • My custom Salesforce library, which I readily use for any Salesforce integrations within my native .NET applications, consists of a combination of handwritten code and functionality from the Force.com Toolkit. Even though the Force.com Toolkit does pretty much everything you need for day-to-day activities, like basic read and write interactions, a custom approach is required when it comes to anything more.

    I have created an AuthenticationResponse class that contains two methods, so I can easily interchange between different authentication processes depending on my needs:

    • Rest - Retrieves access token to Salesforce environment in a traditional REST approach.
    • ForceCom - Retrieves authentication details when API calls using Force.com toolkit is used.
    public class AuthenticationResponse
    {
        /// <summary>
        /// Retrieves access token to Salesforce environment in a traditional REST approach.
        /// </summary>
        /// <returns></returns>
        public static async Task<Authentication> Rest()
        {
            HttpClient authClient = new HttpClient();
                
            // Set required values to be posted.
            HttpContent content = new FormUrlEncodedContent(new Dictionary<string, string>
                    {
                        {"grant_type","password"},
                        {"client_id", SalesforceConfig.ConsumerKey},
                        {"client_secret", SalesforceConfig.ConsumerSecret},
                        {"username", SalesforceConfig.Username},
                        {"password", SalesforceConfig.LoginPassword}
                    }
            );
                
            HttpResponseMessage message = await authClient.PostAsync($"{SalesforceConfig.PlatformUrl}services/oauth2/token", content);
    
            string responseString = await message.Content.ReadAsStringAsync();
    
            JObject obj = JObject.Parse(responseString);
    
            return new Authentication
            {
                AccessToken = obj["access_token"].ToString(),
                InstanceUrl = obj["instance_url"].ToString()
            };
        }
    
        /// <summary>
        /// Retrieves authentication details when API calls using Force.com toolkit is used.
        /// </summary>
        /// <returns></returns>
        public static async Task<ForceClient> ForceCom()
        {
            ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    
            AuthenticationClient auth = new AuthenticationClient();
    
            await auth.UsernamePasswordAsync(SalesforceConfig.ConsumerKey, SalesforceConfig.ConsumerSecret, SalesforceConfig.Username, SalesforceConfig.LoginPassword, $"{SalesforceConfig.PlatformUrl}services/oauth2/token");
    
            ForceClient client = new ForceClient(auth.InstanceUrl, auth.AccessToken, auth.ApiVersion);
    
            return client;
        }
    }
    

    All configuration settings, such as the consumer key, consumer secret, username and password, are read from the web.config via a "SalesforceConfig" class, but these can be replaced by reading directly from your own app settings. Both methods return the access token required for querying a Salesforce platform.
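
    As an aside, the "SalesforceConfig" class itself isn't included above. Purely as an illustrative sketch (the property names match the code above, but the appSettings keys are just placeholders), it could look something like the following, along with how the two authentication methods might be called:

    using System.Configuration;
    using System.Threading.Tasks;
    
    // Illustrative sketch of the SalesforceConfig class referenced above,
    // reading each value from appSettings in the web.config.
    public static class SalesforceConfig
    {
        public static string ConsumerKey => ConfigurationManager.AppSettings["Salesforce.ConsumerKey"];
        public static string ConsumerSecret => ConfigurationManager.AppSettings["Salesforce.ConsumerSecret"];
        public static string Username => ConfigurationManager.AppSettings["Salesforce.Username"];
        public static string LoginPassword => ConfigurationManager.AppSettings["Salesforce.LoginPassword"];
        public static string PlatformUrl => ConfigurationManager.AppSettings["Salesforce.PlatformUrl"];
    }
    
    public class AuthenticationExample
    {
        public async Task Run()
        {
            // REST approach - returns the access token and instance URL directly.
            Authentication restAuth = await AuthenticationResponse.Rest();
    
            // Force.com Toolkit approach - returns a ready-to-use ForceClient.
            ForceClient client = await AuthenticationResponse.ForceCom();
        }
    }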

  • As I have been writing the last few blog posts, I've been getting a case of "twitchy feet" during the writing process. I normally get "twitchy feet" when frustrated or annoyed by things in my life that I feel could be done more easily. In this case, my site has started to frustrate me and I feel that adding new posts has become a chore.

    Over the 10 years (has it really been this long!?) of owning and maintaining this site, it's grown into a bit of a beast compared to the initial outset. I've jumped from platform to platform based on my needs at the time:

    • Wordpress (2006)
    • BlogEngine (2007 to 2012)
    • Kentico (2012 to present)

    I feel that, at the grand old age of 31, I need a platform that nurtures my writing creativity without having to worry about general maintenance and somewhat restrictive editorial functionality. Ever since I tasted the pure nectar that is Markdown, my writing speed has gone through the roof and I love having full control through the simplistic editing interface - Markdown is the future!

    I am a certified Kentico Developer (you may have got that impression from my many posts on the platform) and specifically chose Kentico CMS because it gave me the full flexibility to build the site how I wanted. As great as the platform is, I've come to the conclusion that this site will never grow to be anything more than one thing: a blog. So I want to downsize, like a person getting on in years moving to a smaller house.

    Enter Ghost...

    Ghost

    The Ghost platform has garnered a lot of traction ever since its inception in 2012. I've been keeping an eye on it over the years and never really gave the platform much thought until I noticed quite a few popular bloggers making the move and experiencing lightning-fast performance. This is possibly down to those bloggers hosting their instances on Ghost Pro - I could be wrong. I am planning on going down the Ghost Pro hosting route and getting everything set up by the very nice people behind the scenes at Ghost HQ, who will lovingly host and look after my site.

    I opened up a dialogue on Twitter with Ghost, who were very kind in alleviating my initial migration worries:

    @SurinderBhomra We can upload images for you, if you send the upload directory in the format Ghost uses, i.e. /content/images/yyyy/mm/image-name
    — Ghost (@TryGhost) October 7, 2016

    @SurinderBhomra We can help with the redirects if you're coming over to Ghost(Pro). :)
    — Ghost (@TryGhost) October 6, 2016

    The only thing I will have to get over, which Ghost will not be able to help me with, is the mindset that I will not be able to tinker around with my site to the full extent that I do now. But this isn't necessarily a bad thing and will give me the opportunity to concentrate more on writing quality content. I just hate the thought of restricting myself.

    Ghost has put a framework in place that no other platform has done so well - giving power to write content anywhere:

    • Desktop browser
    • Mobile browser
    • Desktop application

    Looks like Ghost lives up to its main selling point:

    An open source blogging platform which makes writing pleasurable and publishing simple.

    What I also love is the SEO optimisation out of the box. God knows how many hours I've spent trying to get my site SEO friendly, not only from a search indexing standpoint, but a social sharing standpoint too, with all the Open Graph tags built in. No need for extra plugins or development from a code perspective.

    What's Next?

    As it currently stands, I am evaluating Ghost through their 14-day trial and need to send an email to their support team before I make a confirmed decision to move. I like what I am seeing so far. I just need to find the time to put a migration process in place to move the 200 posts on this site. Eek!

    Ghost is definitely not as scary as I once thought. Cue Ray Parker Jr...

  • The Pursuit Of Happiness

    So it's finally come to this... A point in my life where I'm questioning what I have done to get to the place I currently find myself standing in, wanting to make sense of an emotion that was so naturally built into my being from day one. But now, I am not too sure if it exists or ever did exist.

    The Sad Clown

    Before you read any further, I thought I'd just clarify that you won't be finding me talking about the performance of Will and Jaden Smith in the film The Pursuit of Happyness. The titles of the film and this post are purely coincidental.

    This year has been what I can only describe as turbulent. The complete opposite to what it should have been. It was going to be a year of pastures new. A seed of great things to come was planted, watered on a daily basis and nurtured to flourish into the start of something quite beautiful. Alas, like the state of my lawn, it's very much the case that no matter how much hard graft is invested in transforming something withered into greener pastures, it morphs back to its original state as nature intended. Some things cannot be changed.

    Why do I write this? That I do not know. Maybe writing my inner thoughts into words to stare back at me in its raw unforgiving form is the only way to come to terms with what I am facing. Let's call it: therapy.

    I look at my life and think I am a lucky person. I have nothing to complain about, yet I feel something is missing. As one day ends and another begins, I find myself wondering what I am trying to accomplish and questioning if I am doing everything in my power to remedy the wounds still open from earlier this year. Honest answer: probably not. Yesterday, I thought about what Friedrich Nietzsche said:

    If you stare into the abyss, the abyss stares back at you.

    By not confronting the wounds of yesterday, I'm consumed by being reminded of the painful events that have wedged themselves deep into my hippocampus, slowly eroding away my old self. But there is just enough of the small part of me that still exists to warn me that I am slowly edging mentally towards the point of no return. So I am here writing this very post.

    If I don't start the healing process now, what I fear the most may come to fruition - others around me will notice the gaping hole where my left ventricle used to be. I have come to the conclusion that I'm not so good at being the great pretender over a considerable duration of time.

    With every letter I type, I slowly regain consciousness and become self-aware once again, coming to the realisation that this year has changed me. No doubt about that. But I'm stronger for it.

    If a human being's thoughts and emotions are truly boundless, then it's in our nature to have the capacity to forgive, forget and learn. By doing this, I can only hope the resulting outcome will be... happiness. In time this will happen. As they say, "time is a great healer". I take great comfort in that.

  • Force.com Explorer is a really useful tool that gives you the ability to explore database tables within your Salesforce environment and run queries against them. Even though this tool has been retired since 2011, I still actively use it, purely because I prefer to have an application installed on my computer rather than using the web-based tool, Workbench.

    I am writing this post for two reasons: firstly, for Salesforce newcomers, and secondly, because one of my fellow developers working on the same project as me was having issues logging into Force.com Explorer. Judging by the title of this post, this may sound a little self-explanatory or dim-witted. Nevertheless, it's a worthy post!

    Before I get to it, I am assuming you know the following three things:

    • How to generate a Security Token.
    • How to create a Connected App.
    • How to generate a Client ID and Client Secret from your Connected App.

    Salesforce Force.com Explorer Login

    The easiest part of the login form is entering your login credentials and selecting the type of environment you are planning to explore. Just ensure you have user login credentials with sufficient access rights to explore Salesforce database objects.

    The Client ID field is a little misleading because it doesn't just accept the Client ID key generated from your Connected App alone. It can also accept the following combination: "<Client-ID><Security-Token>". So don't be misled into thinking that only the Client ID is accepted.

    As you probably know (if you've built apps using the Salesforce API), combining the Client ID and Security Token allows you to access Salesforce data from any IP. If you have whitelisted a specific IP in the Trusted IP Range at Connected App level, you might get away with using the Client ID alone.

  • Kentico 9 Certified Developer

    I hadn't done a Kentico certified exam for over two years - but this doesn't make me any less of an awesome and competent Kentico developer. Over the last two years, a lot has changed within the Kentico realm, resulting in the subject matter becoming a little more of a challenge to keep up to speed with. After all, two years ago we saw the dawn of a new age - the end of Kentico 7 and the start of Kentico 8. I am seeing the same thing happening again, except this time we're just about seeing Kentico 10 making its appearance over the horizon.

    What I noticed this time round was the increased number of questions revolving around macros. It felt like I was being bombarded at the time of carrying out the exam. I think the only thing that got me through was remembering the approach I took with the macros I wrote for a recent EMS project.

    The Kentico Certification Preparation Guide has greatly improved compared to previous versions, where the questions were pretty simple and a vast contrast to the real thing. This allowed me to gauge a lot more of the type of questions that would potentially be presented, and I did notice quite a few questions from the preparation guide cropping up in the real exam - although slightly re-worded.

    I highly recommend anyone who is interested in becoming a Kentico Certified Developer reads the following post by Jeroen Furst prior to taking the exam: Tips for becoming a Kentico Certified Developer. Jeroen brings up some really good points and guidance to prepare yourself. If only I had come across this post two years ago, when I wasn't too sure what to expect (it being my first Kentico exam), I would have felt more comfortable in understanding what was required.

    I was expecting there to be some questions relating to MVC due to all the effort made by the Kentico development team to make the MVC integration seamless within Kentico 9. Alas, this was not the case. Jeroen also states the following:

    If you are planning to take any older v6, v7 or v8 exams (if still possible) you might run into questions regarding Azure and the Intranet solution. These topics are not part of the v9 exam anymore.

    The Kentico 9 exam purely focuses on the core Kentico features as well as the platform architecture every Kentico developer should know in order to build high quality sites. You will also find yourself learning some new things in the process of preparing for the exam as well as brushing up on your existing knowledge.

    If you succeed, you can proudly display this badge in all its glory! ;-)

    Kentico 9 Certified Developer

  • My bookshelf was really in need of a good clear-out. Out of all the books I own, I noticed that I seem to have more technical/programming books than any other kind. I guess this makes me your typical nerd with a high interest in anything programming related. Then again, my blog posts may already show that.

    Bookshelf of Programming Books (Click for enlarged image)

    As I peruse my vast collection, I can't help but get in the mood to reminisce about a time when I was still trying to find my feet in the coding world. I am reminded of the confusing and somewhat challenging journey as a student at Oxford Brookes University, where I was trying to get a grip on the fundamentals of programming by sinking my teeth into books about Pascal, Delphi and C++.

    It was only when carrying out my year-long dissertation that I developed a profound interest in web development, as well as Microsoft development frameworks in general. This is probably the point in my life where my programming book purchases soared drastically. As you can see from the collection of books in this post, two things stand out:

    1. How outdated the subject matter is. Yes, there is a Classic ASP book in there.
    2. The thickness of each book. I think JavaScript Bible is probably the thickest!

    Collection of Programming Books (Click for enlarged image)

    The last programming book I purchased was around three years ago - C# In Depth by Jon Skeet. It was the first book purchase I had made in a very long time after studying, because I needed to up my game as well as demonstrate my C# prowess. I generally use developer blogs and forums to expand my knowledge and answer all my never-ending questions.

    So this leads me to the question that I will just throw out there. What is a better method to learning? Books or online resources?

    I think our way of learning has changed over the past few years and I don't think our old friend "the book" is as prominent as it once was as a learning aid, especially when there are far more accessible and interactive ways of learning.

    Pluralsight + Microsoft Virtual Academy + StackOverflow = My Learning Heaven

    Let's take training via Pluralsight as a fine example. Since registering, I find myself having the ability to learn on demand at a time of my own choosing. I am not restricted to lugging a thick programming book around as (believe it or not!) I once did. The flexibility of multiple learning paths guides me to all the courses I need to be proficient in a subject, all from the comfort of a laptop, phone or tablet. In addition, unlike book purchases that will inevitably go out of date, you have access to all the latest content at no extra cost. Big bonus!

    Pluralsight, alongside Microsoft Virtual Academy (if you're a .NET developer), is the most powerful learning resource a developer could have. As much as my argument sways towards the paperless approach, there is nothing like the satisfaction of flicking through the pages of a book. I don't think I could completely empty my bookshelf of all programming books. I have just too many timeless classics that I could never give away and will always go back to reach for, one of them being Code Complete.

    I came across an insightful article by Caroline Myrberg called Screen vs. paper: what is the difference for reading and learning?, in which she writes an interesting piece on what recent research has to say about the learning processes involved in reading on screen compared to on paper. Surprisingly, there isn't much of a substantial difference in how we absorb information regardless of medium. It's all about how information is presented to us. The article highlights a study where participants completed a knowledge test of 24 questions after one group was given learning material in paper format and another on an interactive web page. The outcome:

    ...the web page group scored better on 18 of those questions, and significantly better (90% or higher) on six. So enhancing the electronic text instead of just turning it into a copy of the printed version seems to have helped the students to score higher on the test.

    I think this is why online learning like Pluralsight works so well! At the same time, there will always be a need for books, no matter how far technology continues to immerse itself in our daily lives. We as human beings relate to things that are tangible - physical objects we can hold and touch. It's our default behaviour and the way we're wired. But you can't help but embrace the massive leaps in technology, making access to learning resources more convenient than it ever has been.

  • Early last month, I decided to make the move and finally run my site under a secure certificate. This is something I've been meaning to do over the last year, as it became apparent that Google will soon penalise your search rankings if an SSL certificate is not installed. Quite a few of the developer blogs I follow have already made the transition, so I thought I too should do the same. I was surprised how cheap it was to move to HTTPS. I pay around £25 a year, which consists of a basic Comodo SSL certificate and a dedicated IP. This is purely because my website is hosted with a shared hosting provider. It'll probably be even cheaper for those who manage their own hosting.

    I highly recommend anyone who still has qualms about making the move to HTTPS to read the following post by Scott Helme: Still think you don't need HTTPS?. He brings up some very interesting points and benefits that motivated me to make the move.

    The transition to HTTPS was painless and required no major downtime. But I did have to spend time ensuring all external requests from my site were secure, for example Disqus, Google Ads and some external JS references. However, something a little more pressing caught my eye and gave me quite a fright when I logged into Google Webmaster Tools yesterday. Unbeknown to me, ever since my site changed to HTTPS, both my clicks and CTR statistics had declined drastically over the month. Take a look at the blue and yellow lines:

    Google Webmaster Tools Clicks/CTR Decline

    At least this decline has not been reflected in my Google Analytics report. The number of visitors to my site has remained stable and I have even noticed a slight increase - I don't think the increase has anything to do with the SSL certificate. So what caused the rapid decline in Webmaster Tools? It seems I missed something in my haste. I in fact needed to create a new website inside Webmaster Tools containing my website URL prefixed with "https://". This is because "http://www.surinderbhomra.com" is considered a different URL to "https://www.surinderbhomra.com". Makes sense when I think about it. I wrongly presumed that as long as I had the correct 301 redirects in place, so that all pages on my site are served over HTTPS, there wouldn't be an issue.

    HTTP and HTTPS Sites In Google Webmaster Tools
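
    As an aside, 301 redirects to HTTPS can be handled in a number of ways - at hosting level, through URL rewrite rules or in application code. One minimal application-level sketch for an ASP.NET site (illustrative only, not necessarily the approach used here) is:

    using System;
    using System.Web;
    
    // Global.asax.cs - permanently redirect any request arriving over plain HTTP.
    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            if (!Request.IsSecureConnection)
            {
                string httpsUrl = "https://" + Request.Url.Host + Request.Url.PathAndQuery;
    
                // A 301 tells search engines the move to HTTPS is permanent.
                Response.RedirectPermanent(httpsUrl, endResponse: true);
            }
        }
    }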

    John Mueller wrote a FAQ post on Google+ that covers most of the important things you need to know and how to setup Webmaster Tools correctly on change to HTTPS: https://plus.google.com/+JohnMueller/posts/PY1xCWbeDVC.

    I won't lie to you: seeing that green padlock in the address bar whenever I visit my site gives me a new-found sense of euphoria!

  • For a site I'm working on, Facebook's Comments plugin is being utilised on all our article pages. There was a requirement to pull the latest comments into a listing page for each of these article pages, as well as the number of comments. Facebook's JavaScript library provides the ability to display a comments counter, but not the ability to pull out x number of comments. So we'll have to go server-side and use the Graph API to get the data we want.

    In this post, I will show you how you can get back all comments for a page by its full URL.

    Prerequisites

    Before we get into the main C# logic methods, you need to make sure you have a few things in place:

    • ApiWebRequestHelper Class
    • Newtonsoft Json
    • Facebook App Settings
    • Class Objects

    ApiWebRequestHelper Class

    Whenever I am making a call to Facebook's Graph API endpoints, I will be making references to an "ApiWebRequestHelper" helper class. This is something I developed last month to make it easier for me to deserialize XML or JSON requests into strongly-typed class objects. You can take a look at the full code here.

    Newtonsoft Json

    The Newtonsoft Json library is a key ingredient in any JSON web request. I'd be surprised if you've never heard of or used it. :-) Nevertheless, you can get it here: http://www.newtonsoft.com/json.

    Facebook App Settings

    I haven't created a Facebook App for quite some time and things have changed very slightly in terms of the interface and options presented. The key things you need to get out of your created App are:

    • Application ID
    • Application Secret
    • Client Token

    I set the security settings with the following modes, which can be found in Settings > Advanced >  Security.

    Facebook App Advanced API Settings

    Class Objects

    The following class objects will be used to deserialize Graph API requests into class objects.

    The FacebookPageInfo, FacebookPage and FacebookPageShare objects will hold the core information about the queried page, such as the Title and Description, as well as the comment and share counts.

    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageInfo
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("og_object")]
            public FacebookPage Page { get; set; }
    
            [JsonProperty("share")]
            public FacebookPageShare Share { get; set; }
        }
    
        public class FacebookPage
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("description")]
            public string Description { get; set; }
    
            [JsonProperty("title")]
            public string Title { get; set; }
    
            [JsonProperty("type")]
            public string Type { get; set; }
    
            [JsonProperty("updated_time")]
            public DateTime UpdatedTime { get; set; }
    
            [JsonProperty("url")]
            public string Url { get; set; }
        }
    }
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageShare
        {
            [JsonProperty("comment_count")]
            public int CommentCount { get; set; }
    
            [JsonProperty("share_count")]
            public int ShareCount { get; set; }
        }
    }
    

    All comments for a page will be stored in the following objects:

    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageCommentInfo
        {
            public int TotalComments { get; set; }
            public List<FacebookCommentItem> Comments { get; set; }
        }
    }
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookCommentItem
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("created_time")]
            public DateTime CreatedTime { get; set; }
    
            [JsonProperty("from")]
            public FacebookCommentFrom From { get; set; }
    
            [JsonProperty("message")]
            public string Message { get; set; }
        }
    
        public class FacebookCommentFrom
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("name")]
            public string Name { get; set; }
        }
    }
    

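    For completeness, the GetCommentsByPageId() method further down deserializes into a FacebookCommentInfo object, which isn't listed above. Assuming the standard Graph API response shape where the comments sit under a "data" property, a minimal sketch of that class would be:

    namespace Site.BusinessObjects.Facebook
    {
        // Wrapper for the Graph API comments endpoint, which returns the
        // collection of comments under a "data" property.
        public class FacebookCommentInfo
        {
            [JsonProperty("data")]
            public List<FacebookCommentItem> Comments { get; set; }
        }
    }
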
    Facebook Logic Class

    Now that we have the pre-requisites in place, lets get to the code that will perform the required functions:

    namespace Site.BusinessLogic
    {
        public class FacebookLogic
        {
            private string _accessToken;
    
            /// <summary>
            /// Uses default Client ID and Secret as set in the web.config.
            /// </summary>
            public FacebookLogic()
            {
                GetAccessToken(Config.Facebook.ClientId, Config.Facebook.ClientSecret);
            }
    
            /// <summary>
            /// Requires  Client ID and Secret.
            /// </summary>
            /// <param name="clientId"></param>
            /// <param name="clientSecret"></param>
            public FacebookLogic(string clientId, string clientSecret)
            {
                GetAccessToken(clientId, clientSecret);
            }
    
            /// <summary>
            /// Gets page info that has been shared to Facebook.
            /// </summary>
            /// <param name="pageUrl"></param>
            /// <returns></returns>
            public FacebookPageInfo GetPage(string pageUrl)
            {
                return ApiWebRequestHelper.GetJsonRequest<FacebookPageInfo>($"https://graph.facebook.com/{pageUrl}?access_token={_accessToken}");
            }
    
            /// <summary>
            /// Gets comments for a page based on its absolute URL.
            /// </summary>
            /// <param name="pageUrl"></param>
            /// <param name="maxComments"></param>
            public FacebookPageCommentInfo GetPageComments(string pageUrl, int maxComments)
            {
                try
                {
                    // Get page information in order to retrieve page ID to pass to commenting.
                    FacebookPageInfo facebookPage = GetPage(pageUrl);
    
                    if (facebookPage.Page != null)
                    {
                        return new FacebookPageCommentInfo
                        {
                            TotalComments = facebookPage.Share.CommentCount,
                            Comments = GetCommentsByPageId(facebookPage.Page.Id, maxComments).Comments
                        };
                    }
                    else
                    {
                        return null;
                    }
                }
                catch (Exception ex)
                {
                    // NOTE: Log exception here...
    
                    return null;
                }
            }
    
            /// <summary>
            /// Gets comments by Facebook's Page ID.
            /// </summary>
            /// <param name="fbPageId"></param>
            /// <param name="max"></param>
            /// <returns></returns>
            public FacebookCommentInfo GetCommentsByPageId(string fbPageId, int max = 10)
            {
                return ApiWebRequestHelper.GetJsonRequest<FacebookCommentInfo>($"https://graph.facebook.com/comments?id={fbPageId}&access_token={_accessToken}&limit={max}");
            }
    
            /// <summary>
            /// Retrieves Access Token from Facebook App.
            /// </summary>
            /// <param name="clientId"></param>
            /// <param name="clientSecret"></param>
            private void GetAccessToken(string clientId, string clientSecret)
            {
                // Use the values passed into this method so the overload accepting custom credentials works as expected.
                UriBuilder builder = new UriBuilder($"https://graph.facebook.com/oauth/access_token?client_id={clientId}&client_secret={clientSecret}&grant_type=client_credentials");
    
                try
                {
                    using (WebClient client = new WebClient())
                    {
                        // Get Access Token from incoming response.
                        string data = client.DownloadString(builder.Uri);
    
                        NameValueCollection parsedQueryString = HttpUtility.ParseQueryString(data);
    
                        _accessToken = parsedQueryString["access_token"];
                    }
                }
                catch (Exception ex)
                {
                    // NOTE: Log exception here...
                }
            }
        }
    }
    

    By default, on initiation of the FacebookLogic class, the Application ID and Secret values will be inherited from the web.config, or you can pass in these values directly with the class overload parameters.

    Out of all the methods used here, we're interested in only one: GetPageComments(). What you will notice from this method is that we cannot get the comments from one API call alone. We first have to make an extra API call to get the ID of the page. This ID is then passed to the GetCommentsByPageId() method to return all comments.

    Usage

    Comments for a page can be returned by adding the following in your code, where you will then be able to access properties to iterate through the comments:

    FacebookLogic fbl = new FacebookLogic();
    
    // Pass in the page URL and number of comments to be returned.
    var pageComments = fbl.GetPageComments("https://www.surinderbhomra.com/", 2);
    

    Whenever you call this piece of code, I would make sure you cache the results for 5-10 minutes, so you do not use up your API request limits.
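
    A minimal sketch of that caching (using MemoryCache from System.Runtime.Caching with an arbitrary 10 minute window - adjust to suit) could look like this:

    using System;
    using System.Runtime.Caching;
    
    public class CachedFacebookComments
    {
        private static readonly MemoryCache Cache = MemoryCache.Default;
    
        public static FacebookPageCommentInfo GetPageComments(string pageUrl, int maxComments)
        {
            string cacheKey = $"fb-comments:{pageUrl}:{maxComments}";
    
            // Return the cached result if these comments were fetched recently.
            FacebookPageCommentInfo cached = Cache.Get(cacheKey) as FacebookPageCommentInfo;
    
            if (cached != null)
            {
                return cached;
            }
    
            FacebookLogic fbl = new FacebookLogic();
            FacebookPageCommentInfo comments = fbl.GetPageComments(pageUrl, maxComments);
    
            if (comments != null)
            {
                // Cache for 10 minutes to stay well within API request limits.
                Cache.Set(cacheKey, comments, DateTimeOffset.UtcNow.AddMinutes(10));
            }
    
            return comments;
        }
    }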