Speeding Up WordPress

I started tinkering with my WordPress install by first adding a layer of security in Adding Nginx in Front of WordPress. After putting Nginx in front of WordPress, I decided to secure it further by also Building a Static WordPress. That's all great, but maybe it was time to make Nginx give me some performance gains rather than just security controls. That is exactly what we're going to do in this blog post. Now that Nginx is sitting in front of WordPress, we can use it to control some of the performance aspects of the site.

Generating a Baseline Performance Report

First things first, though: let's get a baseline of where the site stands and what needs work. Google's PageSpeed is a great tool for finding out what's slowing down your site. Below is the report for this blog.

PageSpeed mobile numbers

I guess those numbers aren’t terrible but I’m sure they could be better.

Figuring Out What to Fix

As you scroll down the report, there are a number of things to correct, such as the items in the Opportunities section:

Opportunities section of report

In addition, there are some diagnostic items that show up:

Example cache policy items

Fixing Some of the Items

Adding a Caching Policy

An initial step to correct some performance issues would be to enable caching policies on the Nginx server. Given that we're serving almost entirely static content now, there's no harm in letting clients cache the content we serve up. Nginx is already serving static data, so we don't need to rely on the backend. Let's modify the static path's caching policy for clients by adding the Cache-Control response header:

         location /status {
                 return 200 "healthy\n";
         }

         location / {
                 try_files $uri $uri/ /index.html;
                 add_header 'Cache-Control' "public,max-age=31536000,stale-while-revalidate=360";
                 #proxy_pass https://wordpress;
                 #proxy_ssl_verify off;
                 #proxy_set_header Host blog.shellnetsecurity.com;
                 #proxy_set_header X-Forwarded-For $remote_addr;
         }

         location /sitemap {
                 proxy_pass https://wordpress;
                 proxy_ssl_verify off;
                 proxy_set_header Host blog.shellnetsecurity.com;
                 proxy_set_header X-Forwarded-For $remote_addr;
         }

This example configuration snippet shows that we are adding the Cache-Control response header to responses for "/". This means we're doing what we planned: we only tell clients to cache content that isn't proxied to the backend WordPress server. Additional parameters that can be supplied to Cache-Control are documented here.
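As a quick sanity check, you can parse the header value we set and confirm what clients will see. Here's a small Node sketch; the parsing helper is just for illustration, not part of any library:

```javascript
// Parse a Cache-Control header value into an object for inspection.
function parseCacheControl(value) {
  const directives = {};
  for (const part of value.split(',')) {
    const [name, val] = part.trim().split('=');
    directives[name] = val === undefined ? true : val;
  }
  return directives;
}

const header = 'public,max-age=31536000,stale-while-revalidate=360';
const parsed = parseCacheControl(header);

console.log(parsed.public);                      // true
console.log(Number(parsed['max-age']) / 86400);  // 365 -- one year, in days
console.log(parsed['stale-while-revalidate']);   // '360'
```

That max-age of 31536000 seconds works out to a full year, which is reasonable for content that only changes when we regenerate the static site.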

Enable Gzip Compression

By default, even with gzip on, Nginx will not compress all file types. Let's add some additional content to our http config block (note the additional gzip directives listed below gzip on):

    http {
         proxy_set_header X-Real-IP       $proxy_protocol_addr;
         proxy_set_header X-Forwarded-For $proxy_protocol_addr;
         sendfile on;
         tcp_nopush on;
         tcp_nodelay on;
         keepalive_timeout 65;
         types_hash_max_size 2048;
         include /usr/local/nginx/conf/mime.types;
         default_type application/octet-stream;
         access_log /dev/stdout;
         error_log /dev/stdout;
         gzip on;
         gzip_vary on;
         gzip_min_length 1000;
         gzip_proxied expired no-cache no-store private auth;
         gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml;
         gzip_disable "MSIE [1-6]\.";
         resolver kube-dns.kube-system.svc.cluster.local;
         include /etc/nginx/sites-enabled/*;
    }

With those changes added to our Nginx configuration, restart Nginx for the changes to take effect.

Testing Our Page Again

Now that those changes are live in your Nginx, let's check how we did on PageSpeed again.

Updated PageSpeed details.

The numbers aren't amazingly stellar, but they are better. If you look at the overall scoring, we jumped from a 66 to a 72. The remaining problem isn't something we can correct using Nginx: there are a number of first- and third-party scripts loading that slow the site down. Next steps will involve researching those scripts and determining whether any can be removed. Until next time!

LED Lighting

Photo by Suzy Hazelwood from StockSnap

Disclosure: I have included some affiliate / referral links in this post. There’s no cost to you for accessing these links but I do indeed receive some incentive for it if you buy through them.

It’s time to get back to some lighting as I spent a little time enhancing my setup that I left off configuring in Making the Lights Dance. In my Building the RaspberryPi Christmas Light Box post, I blamed a friend for starting me down this path. Once again, I’m blaming a different friend for causing me to wander down the LED lighting road. This friend saw some of my posts regarding the simplistic lighting box I created, and they suggested that I tinker with WS2811 lights. Let the tinkering begin!

Hardware List

It’s always good to talk through the hardware that we’ll be using for this. To start, I extended my previous Building the RaspberryPi Christmas Light Box system to be able to do some LED lighting. Here are the items that I purchased from Amazon:

By the way, your hardware may vary slightly; I bought the 5V lights just because. There are LED tape strips and these 12mm bullets, indoor-only and IP68-rated versions, 12V lights, and many other options to choose from. I just happened to pick these because they were the cheapest at the time.

Tiny Electrical Lesson on the Hardware

I’m not an electrician by any stretch of the imagination but I know a guy that helped me get through it a little. He’s more AC than DC but we had a good chat about it all (ok enough babbling on that).

There are some very important concepts to keep in mind regarding the power requirements. These were things I had to dig up and learn from my electrician, so I figured I'd drop them here to help anyone else going down the LED lighting route as blindly as I did. You need to make sure the output voltage of the power supply matches the input requirement of the lights. Aside from voltage drop (I'll cover that in a later article when I go bigger on the system), the voltage remains constant. I am using 5V lights, so an adapter capable of outputting 5V was required.

Wattage is cumulative! This is a very important point to remember in all of this. The example lights have a spec of roughly 0.3W per LED bulb, and each strand has 50 bulbs. If you take 0.3W * 50 bulbs, you get 15W, which means a single strand of lights requires 15W of power. If you wanted to use two strands, you'd do 0.3W * 100 bulbs for 30W of power required. This means that when you purchase your power supply, it needs to support 5V and 15W * <the number of strands>. Given the power supply listed above in the hardware list, I can only run a single strand of lights :facepalm:. For those wanting to do more than one strand, I would suggest something like the SHNITPWR 4V – 12V Power Supply 10A 120W AC to DC Adapter, which is what I used when I went bigger with the LED lighting.
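The math above is simple enough to sketch as a little helper. The 0.3W-per-bulb figure comes from these example lights; check your own strand's spec sheet before trusting it:

```javascript
// Estimate the power supply wattage needed for WS2811 strands.
// wattsPerBulb comes from the strand's spec sheet (~0.3W here).
function requiredWatts(strands, bulbsPerStrand = 50, wattsPerBulb = 0.3) {
  // Round to one decimal to avoid floating-point noise.
  return Math.round(strands * bulbsPerStrand * wattsPerBulb * 10) / 10;
}

console.log(requiredWatts(1)); // 15  -> one strand fits my small 5V supply
console.log(requiredWatts(2)); // 30
console.log(requiredWatts(8)); // 120 -> the 120W supply maxes out at 8 strands
```

Notice that the 120W supply mentioned above tops out at exactly 8 strands of these lights; any more and you'd want to split across multiple supplies.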

The only other thing to keep in mind is the data channel on the lights. The lights have 3 wires (5 if you count the separate power leads at each end of the strand, but who's really counting?) that supply power and data. The power has your standard +/-, and the third, middle wire is data. The nice thing is that data does not suffer from voltage drop like the power does. Data is repeated at each bulb, so it doesn't lose signal (as long as you get good power to the first bulb; a warning I didn't fully heed until I built the bigger system. Again, more details later).

Connecting Everything

The awesome folks over at Adafruit have put together some really nice articles to help out. To get everything up and running, it was pretty simple to follow their NeoPixels on Raspberry Pi wiring guide. (FYI, this guide contained the warning that I ignored regarding the data signal's power requirements; they refer to it as level shifting. In my testing of a single strand, and later 8 strands, I had zero issues without level shifting. The moment I wired everything up outside for the bigger system, I did end up needing to level shift 🙂.) Ok, so I ignored all warnings and went straight for wiring the light strand directly to my Raspberry Pi 4 along with adding the power supply.

I started with my mess of goodies:

hardware pieces to be assembled

Something very important to know about the LED lights is that the data is a one way street. When you connect your data wire to the Raspberry Pi, you need to make sure the Pi is feeding “in” to the strand. These strands come with a little arrow to explain how the data is expected to flow (sorry the arrow is a little blurry).

Picture of data directional arrow on 12mm bulb

This isn't too terribly difficult to get right, to be honest. The example above shows the arrow pointing up, away from the wire, which means the data will come "in" from that wire to the bulb. This is the end you connect your power supply and Raspberry Pi to. Speaking of that!

Power supply connections and pigtail

The nice thing about these lights is that a pigtail was included that connected to the existing connectors. Also, the ground, aka "-", wire has a dotted line while the positive does not. The above picture shows you the 5 wires I talked about. I have the power connected directly to the light strand's separate "-" and "+" power wires, and those are connected to the proper terminals on the power plug connector. On the pigtail, the color scheme is like this:

  • Green == Data Wire
  • Red == +
  • White == –

With the power connected to the power plug adapter and the pigtail connected to the lights, I needed to connect everything to the Raspberry Pi. This is where the breadboard comes in:

Connections into the breadboard

On the left side of the image, you can see that I have my pigtail wired onto the breadboard. On the right side, the wires are destined for the Pi. Here's how the wires are connected:

  • Red wire from light strand pigtail: connected to the "+" rail on the breadboard. This serves no purpose whatsoever; I just didn't want a loose wire roaming around.
  • Green wire from light strand pigtail: connected to row "44" on the breadboard. This is the data wire to the light strand, and we'll need to connect it to GPIO18 on the Raspberry Pi via the breadboard.
  • White wire from light strand pigtail: connected to the "-" rail on the breadboard. This connects the ground wire from the light strand to the Raspberry Pi via the breadboard.
  • Red wire from Raspberry Pi pin 6 (GND): connected to the "-" rail on the breadboard. This connects the ground from the Raspberry Pi to the ground on the light strand via the breadboard.
  • Tan wire from Raspberry Pi pin 12 (GPIO18): connected to row "44" on the breadboard. This connects GPIO18 to the light strand via the breadboard.

GPIO18 is very important to use as our data connection to the light strand. Below is a full picture of the wiring.

full wiring diagram of breadboard and pi

Getting the Software

With all of the hardware in place, it is now time to fire everything up and get the software we need to run this light show! I'm assuming you can log in to your Pi and create a directory called LEDs; we'll use this directory to house our testing code. I'm also going to assume you're ok with using NodeJS (sorry, I've been on a Javascript kick lately; there's Python code available to do this as well). Let's get into that directory and install the rpi-ws281x-native library we'll need to get the lights running:

 $ cd LEDs/
 pi@raspberrypi:~/LEDs $ npm install rpi-ws281x-native
 npm WARN npm npm does not support Node.js v10.21.0
 npm WARN npm You should probably upgrade to a newer version of node as we
 npm WARN npm can't make any promises that npm will work with this version.
 npm WARN npm Supported releases of Node.js are the latest release of 4, 6, 7, 8, 9.
 npm WARN npm You can find the latest version at https://nodejs.org/
 > rpi-ws281x-native@0.10.1 install /home/pi/LEDs/node_modules/rpi-ws281x-native
 > node-gyp rebuild
 make: Entering directory '/home/pi/LEDs/node_modules/rpi-ws281x-native/build'
   CC(target) Release/obj.target/rpi_libws2811/src/rpi_ws281x/ws2811.o
   CC(target) Release/obj.target/rpi_libws2811/src/rpi_ws281x/pwm.o
   CC(target) Release/obj.target/rpi_libws2811/src/rpi_ws281x/dma.o
   CC(target) Release/obj.target/rpi_libws2811/src/rpi_ws281x/pcm.o
   CC(target) Release/obj.target/rpi_libws2811/src/rpi_ws281x/mailbox.o
   CC(target) Release/obj.target/rpi_libws2811/src/rpi_ws281x/rpihw.o
   AR(target) Release/obj.target/rpi_libws2811.a
   COPY Release/rpi_libws2811.a
   CXX(target) Release/obj.target/rpi_ws281x/src/rpi-ws281x.o
   SOLINK_MODULE(target) Release/obj.target/rpi_ws281x.node
   COPY Release/rpi_ws281x.node
   COPY ../lib/binding/rpi_ws281x.node
   TOUCH Release/obj.target/action_after_build.stamp
 make: Leaving directory '/home/pi/LEDs/node_modules/rpi-ws281x-native/build'
 npm WARN saveError ENOENT: no such file or directory, open '/home/pi/LEDs/package.json'
 npm WARN enoent ENOENT: no such file or directory, open '/home/pi/LEDs/package.json'
 npm WARN LEDs No description
 npm WARN LEDs No repository field.
 npm WARN LEDs No README data
 npm WARN LEDs No license field.
 + rpi-ws281x-native@0.10.1
 updated 1 package in 13.568s
 pi@raspberrypi:~/LEDs $  

With that all set and ready to go, I suggest grabbing the example scripts hosted in the rpi-ws281x-native GitHub repo. Note that you will need to modify the require lines in those scripts from:

var ws281x = require('../lib/ws281x-native');

to something like this:

var ws281x = require('rpi-ws281x-native');

From there, you can try out one of the scripts. Remember, you must specify the total number of pixels in the strand to be tested; by default, the code will only light up 10 bulbs. The example below shows how you would run the command for all 50 bulbs in our example strand:

 $ sudo node rainbow.js 50
 Press <ctrl>+C to exit.
 pi@raspberrypi:~/LEDs $  

It's very important to run these using "sudo" because the code requires root access in order to properly signal the strand.
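If you want to poke at the effect logic without the hardware attached, the rainbow animation boils down to a position-to-color mapping. Here's a rough sketch of that math; this is my own simplified version for illustration, not the exact code from the repo:

```javascript
// Map a position 0-255 onto a smooth red -> blue -> green -> red cycle,
// returned as a 24-bit 0xRRGGBB integer like rpi-ws281x-native expects.
function colorwheel(pos) {
  pos = 255 - (pos & 255);
  let r, g, b;
  if (pos < 85) {
    [r, g, b] = [255 - pos * 3, 0, pos * 3];
  } else if (pos < 170) {
    pos -= 85;
    [r, g, b] = [0, pos * 3, 255 - pos * 3];
  } else {
    pos -= 170;
    [r, g, b] = [pos * 3, 255 - pos * 3, 0];
  }
  return (r << 16) | (g << 8) | b;
}

// Fill a 50-pixel "strand" with one frame of the rainbow; an animation
// loop would add a per-frame offset before handing the array to ws281x.
const NUM_LEDS = 50;
const pixels = new Uint32Array(NUM_LEDS);
for (let i = 0; i < NUM_LEDS; i++) {
  pixels[i] = colorwheel((i * 256 / NUM_LEDS) | 0);
}

console.log(pixels[0].toString(16)); // ff0000 -> pure red at position 0
```

On the real hardware, that pixels array is what gets handed to the library's render call each frame; here it's just numbers you can inspect.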

Video: The Example Scripts in Action

Here is an example of me running those scripts and the lights in action

Automating Static WordPress Updates

Photo by Alex Knight from StockSnap

In my previous post, Building a Static WordPress, I set up Nginx in front of WordPress to load static content from a private repo. This is great but could become tedious long term, most notably as you begin to post more content. Each time content is posted, we need to fetch all of the updated pages, including category updates and any new images. Sure, we could just run a few wget commands manually, update our repo, and all is better; but just because you "can" doesn't mean you should.

From that previous post, you'll note that I had a bunch of unanswered questions. Some of those questions might remain unanswered, though by the time you get to the end of this post, you might be able to address them yourself. I'm going to focus on automating static WordPress updates whenever a new post is published. Similar logic should be replicable for updating static content after WordPress and WordPress plugin updates.

Setting Up Slack Notifications

I guess the secret is out: I'm going to be using Slack as part of this. I'll assume you have your own Slack workspace with a channel dedicated to WordPress notifications; in my case, that's #wordpress. Given the way my site is configured, I figured Slack notifications would be the best trigger for my automation. I'm not going to reinvent the wheel here, so please refer to the wpbeginner post, How to Get Slack Notifications From Your WordPress Site, for how to configure the Slack Notifications plugin.

After walking through the wpbeginner post, you should have Slack Notifications installed, configured, and successfully tested. For the purposes of this post, you’ll want to create a Posts Published notification like below:

Building the Automation

I wanted to keep everything self-contained in my Kubernetes cluster, so I decided to build a nice little Node service to do everything I wanted. The service needs to connect to the #wordpress Slack channel and watch for updates; depending upon the type of update it receives, it should commit the new article content to our private repo.

Follow the Existing Tutorial

Slack has created a JavaScript bot framework called Bolt. For this reason, I'm not going to spend too much time explaining how to build a bot when Slack has already created a great tutorial, Building an app with Bolt for JavaScript. When setting up the bot, make sure you add the following event subscriptions:

  • message.channels
  • message.im
  • message.groups
  • message.mpim

Go ahead, get familiar with Bolt. I’ll wait while you make the example bot. Once you’re done, come back and continue to the next section.

Extending the Existing Tutorial

This next section takes the above tutorial and extends it to include what we actually need for our bot to handle Post Published notifications. The first step is to generate an SSH key with read/write access to the static content repo. Remember, this was created as a private repo, so we'll need a key with read/write access for our bot to use. If you need a refresher on creating SSH keys and adding them to your private repo, check out the previous post on this topic, Creating a Private GitHub Repo. You can store the key wherever you like; just keep it handy.

With the SSH key created, you need to add nodegit to the bot project. Make sure you are in the project root of the Bolt bot project you created above:

$ npm install nodegit

Next, we’ll add some variables and constants to the app. Edit your app.js and add in the following:

 const BLOG_HOSTNAME = 'blog.shellnetsecurity.com';
 const WORDPRESS_CONTAINER_URL = 'wordpress_container.default.svc.cluster.local';
 const cloneURL = "git@github.com:my_account/wordpress_content.git";
 const clonePath = "/tmp/clone"; 
 const sshEncryptedPublicKeyPath = "/opt/slackapp/testKey.pub";
 const sshEncryptedPrivateKeyPath = "/opt/slackapp/testKey";
 const sshKeyPassword = "abcd1234";
 const { exec } = require("child_process"); 
 var NodeGit = require("nodegit");
 var Repository = NodeGit.Repository;
 var Clone = NodeGit.Clone;
 const fs = require('fs'); 

You’ll see how we leverage these variables later but the below table explains how we’ll use them.

  • BLOG_HOSTNAME == the URL of your blog
  • WORDPRESS_CONTAINER_URL == the Kubernetes DNS hostname of your WordPress container
  • cloneURL == the SSH link to your static content repo
  • clonePath == the path used for staging our replicated repo
  • sshEncryptedPublicKeyPath == path to the public SSH key you created earlier
  • sshEncryptedPrivateKeyPath == path to the private SSH key you created earlier
  • sshKeyPassword == password for the SSH key you created earlier

The remaining items should be self-explanatory, so we'll move on to adding some useful functions. We'll start by adding a new feature to our bot with the botMessages function:

 // Listener middleware - only passes along messages with subtype 'bot_message'
 // that match our filter
 async function botMessages({ message, next }) {
   if (message.subtype && message.subtype === 'bot_message' && validBotMessageByText(message.text) === true) {
     removeRepo(clonePath, cloneURL, BLOG_HOSTNAME, WORDPRESS_CONTAINER_URL);
     await next();
   }
 }
Since the notifications will be coming in from a bot, we check that the message contains a subtype of “bot_message”. The validBotMessageByText function is used to confirm that the message is supported by our flow:

 function validBotMessageByText(text) {
   let re = RegExp('The post .* was published'); // Test for a post published message
   if (re.test(text)) {
     return true;
   }
   return false;
 }
This is a simple function that contains a regex looking for the Post Published message. If the message is valid, then botMessage executes removeRepo:

 function removeRepo(clonePath, cloneURL, blogHostname, wpUrl) {
   // delete the staging directory recursively, then re-clone
   try {
     fs.rmdirSync(clonePath, { recursive: true });
     console.log(`${clonePath} is deleted!`);
     clonePrivateSite(cloneURL, clonePath, blogHostname, wpUrl);
   } catch (err) {
     console.log(`Error while deleting ${clonePath}.`);
   }
 }
The removeRepo function attempts to delete the clonePath directory if it exists and then runs clonePrivateSite.

 function clonePrivateSite(cloneURL, clonePath, blogHostname, wpUrl) {
   var opts = {
     fetchOpts: {
       callbacks: {
         certificateCheck: () => 0,
         credentials: function(cloneURL, userName) {
           // Use the SSH key paths and password defined in the constants above
           return NodeGit.Cred.sshKeyNew(
             userName,
             sshEncryptedPublicKeyPath,
             sshEncryptedPrivateKeyPath,
             sshKeyPassword);
         }
       }
     }
   };
   Clone(cloneURL, clonePath, opts)
     .catch(function(err) { console.log(err); })
     .then(function() { mirrorSite(blogHostname, 'https://' + wpUrl, cloneURL, clonePath); });
 }
clonePrivateSite creates an options object to configure nodegit with SSH credentials created earlier. The Clone command clones the latest version of the repo to the clonePath. Next, mirrorSite is used to pull down a copy of the current dynamic WordPress site running on our backend:

 function mirrorSite(blogHostname, blogURL, cloneURL, clonePath) {
   var cmd = `wget -q --mirror --no-if-modified-since --follow-tags=a,img --no-parent --span-hosts --domains=${blogHostname} --directory-prefix=${clonePath}/html/ --header="Host: ${blogHostname}" --no-check-certificate ${blogURL}`;
   console.log('Executed Command : ' + cmd);
   var child = exec(cmd,
     function (error, stdout, stderr) {
       fixUrls(cloneURL, clonePath);
       if (error !== null) {
         console.log('exec error: ' + error);
       }
     });
 }

This is the lazy man's method, but it does work. mirrorSite simply calls the wget command I had in the previous article and saves the output to the html directory of our clonePath. I still haven't taken the time to fix my site, so fixUrls is doing that for me:

 function fixUrls(cloneURL, clonePath) {
   var cmd = `find ${clonePath} -type f -print0 | xargs -0 sed -i'' -e 's/http:\\/\\/blog/https:\\/\\/blog/g'`;
   console.log('Executed Command : ' + cmd);
   var child = exec(cmd,
     function (error, stdout, stderr) {
       commitPrivateRepo(cloneURL, clonePath, 'Some Commit Message Here');
       if (error !== null) {
         console.log('exec error: ' + error);
       }
     });
 }
Because of a rip in the space-time continuum, all of my wget output contains URLs like http://blog.shellnetsecurity.com, so fixUrls runs find and sed to swap those out for https://blog.shellnetsecurity.com. Once it completes, it calls commitPrivateRepo to commit all of our changes.
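If you want to sanity-check the substitution pattern before letting sed loose on the whole clone, the same rewrite is a one-liner in Node (just the string replacement, not the file walking):

```javascript
// Equivalent of the sed expression: rewrite http:// links for the blog
// host to https://, leaving every other URL untouched.
function fixBlogUrls(html) {
  return html.replace(/http:\/\/blog/g, 'https://blog');
}

// Hypothetical sample markup standing in for a mirrored page.
const page = '<a href="http://blog.shellnetsecurity.com/post">post</a> ' +
             '<img src="http://blog.shellnetsecurity.com/img.png">';
console.log(fixBlogUrls(page));
// -> both links now start with https://blog.shellnetsecurity.com
```

With the URLs fixed, the changes are ready to be committed.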

 function commitPrivateRepo(cloneURL, clonePath, commitMsg) {
   var repoFolder = clonePath + '/.git';
   var repo, index, oid, remote;
   Repository.open(repoFolder)
     .then(function(repoResult) {
       repo = repoResult;
       return repoResult.refreshIndex();
     })
     .then(function(indexResult) {
       index = indexResult;
       var paths = [];
       // Collect every new/modified file path in the working tree
       return NodeGit.Status.foreach(repo, function(path) {
         paths.push(path);
       }).then(function() {
         return Promise.resolve(paths);
       });
     })
     .then(function(paths) {
       return index.addAll(paths);
     })
     .then(function() {
       return index.write();
     })
     .then(function() {
       return index.writeTree();
     })
     .then(function(oidResult) {
       oid = oidResult;
       return NodeGit.Reference.nameToId(repo, 'HEAD');
     })
     .then(function(head) {
       return repo.getCommit(head);
     })
     .then(function(parent) {
       var author = NodeGit.Signature.now('Slack App', 'author@email.com');
       var committer = NodeGit.Signature.now('Slack App', 'commiter@email.com');
       return repo.createCommit('HEAD', author, committer, commitMsg, oid, [parent]);
     })
     .then(function(commitId) {
       return console.log('New Commit: ', commitId);
     })
     /// PUSH
     .then(function() {
       return NodeGit.Remote.createAnonymous(repo, cloneURL)
         .then(function(remoteResult) {
           remote = remoteResult;
           // Create the push object for this remote; assumes the repo's
           // default branch is master
           return remote.push(
             ['refs/heads/master:refs/heads/master'],
             {
               callbacks: {
                 credentials: function(url, userName) {
                   return NodeGit.Cred.sshKeyNew(
                     userName,
                     sshEncryptedPublicKeyPath,
                     sshEncryptedPrivateKeyPath,
                     sshKeyPassword);
                 }
               }
             });
         });
     })
     .then(function() {
       console.log('remote Pushed!');
     })
     .catch(function(reason) {
       console.log(reason);
     });
 }

This function goes through the clonePath directory and adds all new and changed files to the commit. After committing the changes to the local repo, it pushes them to the remote repo, again using the SSH credentials created previously.

What’s Next?

After making all of the above changes, restart the Bolt bot and do some testing. When you publish a post, you should receive a notification in Slack and, shortly after, an update to your repo. You should also be able to extend this bot to handle other types of WordPress notifications coming into Slack.

Samsung Phone Dropping WiFi

Image by IO-Images from Pixabay

Disclosure: I have included some affiliate / referral links in this post. There’s no cost to you for accessing these links but I do indeed receive some incentive for it if you buy through them.

I was getting so frustrated with my new phone. I got on the preorder list and was all excited to get my brand new Samsung Note20 Ultra. After it was delivered, I did the standard switch to the new phone. Thank you AT&T and Samsung for making the upgrade from my Note8 to the Note20 so easy!

5G is Great, WiFi is Useless

The new 5G was great and working just fine, but the problem was that I was using the mobile network way more than my WiFi connection. Anytime I picked up my phone, the WiFi would be dead; I would either need to wait several minutes for the connection to return or toggle my WiFi off and on. This great new awesome phone was useless in the house! More importantly, I have a few Google Home devices, so I was unable to cast to them.

I searched all over the Internet and felt like I was the only one with this problem; nobody appeared to be having the same issue. I found countless articles on how to "repair" your WiFi, and just about every one equated to the "did you reboot it?" question I used to ask customers when they were having problems connecting to their MindSpring accounts (I had to add this for nostalgia purposes from my days working technical support; also, that company rocks and its memory should never die).

The details around the problem are as follows:

  • Use phone on WiFi
  • Phone works for a period of time but then WiFi stops responding
  • The WiFi icon on the phone only makes “up arrow” requests meaning that it was sending requests but not getting responses
  • My DNS server would stop seeing DNS requests from my phone
  • Nothing WiFi related would work on my phone
  • After some random period of time, my phone would go wild catching up, making a ton of queued-up DNS requests, and everything would start working again.

Everything Got Better or Did it?

For reasons I’ll discuss later (*cough* house full of kids with gaming consoles “LAG” *cough*), I decided to buy a new WiFi router. In the house, I already have the following WiFi devices:

  • Verizon FiOS WiFi – This is so my wife and I can watch TV on our iPads (this is mesh capable and I had issues here as well)
  • Xfinity WiFi – This is my connection for my work gear (Never tried using this one)
  • Google WiFi Mesh – The kids and their devices + guests are permitted to use this
  • Google Nest Mesh – This is where all of the home automation devices live

Why would I even consider adding another WiFi router to this house?!?! My original plan was to buy the baddest device available at the time and then consolidate my networks a little. I bought an ASUS ROG Rapture GT-AX11000 Tri-Band 10 Gigabit WiFi 6 Gaming Router as the replacement. I added this router to my WiFi arsenal and connected my phone to it to configure everything and test connectivity. My phone worked great! I didn't drop WiFi ALL day!

Problem solved, move on, right? A new router clearly fixed my WiFi problem and seemed an obvious solution. But why didn't my Google Nest Mesh fix the problem? That hardware was rather new, and I opt in to early-access firmware, so I should be bleeding edge and without worry.

The Real Solution, I Think

I stayed on my ASUS router and never had an issue. I still wanted to be able to talk to my Googles, so every now and then I jumped back to my mesh, only to find the same problems. I didn't give up trying to find the real solution, and I think I have FINALLY found the problem and the fix in a post on the Samsung community forums. This appears to be a problem with Google Location Accuracy; I believe it is the Wi-Fi scanning feature, to be exact. I didn't want to completely disable Google Location Accuracy, so I started by disabling just the "Improve accuracy" setting, Wi-Fi scanning. After disabling this feature, I haven't seen an issue with my Google Nest Mesh.