Category: Tech Tips

  • Submit A Post via Front End using Gravity Forms

    The following is a Gravity Forms form that allows you to submit a post via the front end. It can submit as ‘draft’, ‘pending review’ or even ‘published’. Feel free to try it, and if you include a valid WordPress tip or other news I may even publish it.

  • How to Extract URLs from Google SERPs

    These steps show how to easily extract URLs from Google SERPs:

    1. Install Chrome.
    2. Install Chris Ainsworth’s Extractor ‘bookmarklet’: http://www.chrisains.com/seo-tools/extract-urls-from-web-serps/
    3. Install the gInfinity plugin for Chrome: https://chrome.google.com/webstore/detail/ginfinity/dgomfdmdnjbnfhodggijhpbmkgfabcmn
    4. Set Google Search to return 100 results per page.
    5. Run the query.
    6. Click on the bookmarklet.

    This summarises the steps detailed and provided by Chris Ainsworth at http://www.chrisains.com/seo-tools/extract-urls-from-web-serps/. If you would rather not use the bookmarklet, there is a console sketch below.
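    As a rough alternative to the bookmarklet, a similar extraction can be done straight from the DevTools console. This is a minimal sketch, not Chris Ainsworth’s method: the ‘a h3’ selector is an assumption about Google’s current result markup, which changes often, so check it in the inspector first.

      // Run in the DevTools Console on a Google results page.
      // ASSUMPTION: organic result titles are <h3> elements inside <a> tags.
      const links = $$('a h3')                           // $$ is the console shorthand; returns an Array
        .map(h3 => h3.closest('a').href)                 // climb to the enclosing anchor, take its href
        .filter(href => href && href.startsWith('http'));
      console.log(links.join('\n'));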

  • Backing up to gcloud storage from Linux/Virtualmin

    Using Google Cloud Storage as backup for your Linux server is inexpensive if you use the Nearline storage option. This is cheaper than Amazon Web Services (at the time of writing). The process I use is as follows:

    1. Back up locally.
    2. Create a storage bucket / folders, set a lifecycle.
    3. Script the move of the backups (a sketch of this step is below).
    4. Find a way to execute…
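    Here is a minimal sketch of step 3, using the @google-cloud/storage Node client rather than the gsutil command line. The bucket name and backup path are hypothetical, and credentials are assumed to come from a service-account key via GOOGLE_APPLICATION_CREDENTIALS.

      // move-backup.js: upload a local backup archive to a GCS bucket.
      // ASSUMPTIONS: bucket name and file path are placeholders; auth comes
      // from GOOGLE_APPLICATION_CREDENTIALS pointing at a service-account key.
      const { Storage } = require('@google-cloud/storage');

      async function main() {
        const bucket = new Storage().bucket('my-server-backups');    // hypothetical bucket
        const localFile = '/var/backups/backup-latest.tar.gz';       // hypothetical path

        // Upload under a backups/ prefix; the bucket lifecycle rule from
        // step 2 can then expire old objects without any extra scripting.
        await bucket.upload(localFile, { destination: 'backups/backup-latest.tar.gz' });
        console.log('uploaded', localFile);
      }

      main().catch(err => { console.error(err); process.exit(1); });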

  • Build and host a static website with Hugo and Google Cloud Storage

    WordPress is overkill for 90% of small business websites, especially if they are only ever going to ask the web developer to make changes. Static websites used to be just that, static, but now it is possible to create static websites that are (relatively) simple to update dynamically. Hugo is a Golang project that…
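    Once Hugo has generated the site into public/, publishing is just copying files into a bucket. A minimal sketch follows, assuming a bucket already configured for static website serving; the bucket name is a placeholder, and gsutil rsync would do the same job from the shell.

      // deploy.js: push Hugo's generated public/ directory to a GCS bucket.
      // ASSUMPTIONS: run `hugo` first so public/ exists; the bucket name is
      // hypothetical and already set up for static website serving.
      const { Storage } = require('@google-cloud/storage');
      const fs = require('fs');
      const path = require('path');

      // Recursively list every file under a directory.
      function walk(dir) {
        return fs.readdirSync(dir).flatMap(name => {
          const full = path.join(dir, name);
          return fs.statSync(full).isDirectory() ? walk(full) : [full];
        });
      }

      async function main() {
        const bucket = new Storage().bucket('www.example.com');      // hypothetical bucket
        for (const file of walk('public')) {
          // Object names mirror the path under public/, so about/index.html
          // ends up served at /about/.
          const destination = path.relative('public', file).split(path.sep).join('/');
          await bucket.upload(file, { destination });
          console.log('uploaded', destination);
        }
      }

      main().catch(err => { console.error(err); process.exit(1); });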

  • Google Chrome extract links

    Extract links from a page in Google Chrome:

    1. Inspect element.
    2. Go to the Console tab.
    3. Run:

      const urls = $$('a');
      urls.forEach(url => console.log(url.href));

  • Google not able to fetch robots.txt

    Recently I was getting a message from Webmaster Tools: “Googlebot can’t access your site! Over the last 24 hours, Googlebot encountered 87 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 64.4%.” Recommended action…
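    Before digging into server logs, it is worth confirming what a plain HTTP client sees when it asks for the file. A minimal check, assuming Node 18+ for the global fetch, with the domain as a placeholder:

      // check-robots.js: report the status robots.txt returns.
      // ASSUMPTION: Node 18+ (global fetch); replace example.com with your domain.
      const url = 'https://example.com/robots.txt';

      fetch(url)
        .then(res => {
          // Per the message above, errors on robots.txt make Googlebot postpone
          // its crawl, so anything other than a fast 200 (or a clean 404,
          // meaning "no robots.txt") deserves investigation.
          console.log(res.status, res.statusText);
          return res.text();
        })
        .then(body => console.log(body.slice(0, 500)))
        .catch(err => console.error('request failed:', err.message));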